Concurrency Problem in Access

pikachew

Hi,

Can anyone tell me how to solve the concurrency problem in MS Access in a multi-user environment?
Or is it best to use a database like Oracle or MS SQL Server, since they are more robust client/server databases?
I have read somewhere that it is OK to use MS Access in a multi-user environment of not more than 20 users.
My problem is that my application requires an unlimited number of users to access the DB.
I am facing some concurrency problems, and even implementing recordset locking (rs.LockType) does not seem to be a foolproof method.

Please advise.
:)
 
I also believe that performance suffers above 20 users, but I believe the book says 255 concurrent users.

It becomes very important to ensure that you have a record locking strategy in place.

Ensure that all Enquiry forms are opened Read Only, and that the Record Locks property on any edit form is set to Edited Record.

You also, of course, need to keep people out of the tables.
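For what it's worth, here is a minimal VBA sketch of those two settings done in code rather than in the property sheet; it is only one way of doing it, and the event procedures would live in the respective forms' modules:

**********************
' In the enquiry form's module: force read-only behaviour on open
Private Sub Form_Load()
    Me.AllowEdits = False
    Me.AllowAdditions = False
    Me.AllowDeletions = False
End Sub

' In the edit form's module: lock only the record being edited
' (equivalent to setting Record Locks = Edited Record in Design view)
Private Sub Form_Load()
    Me.RecordLocks = 2   ' 0 = No Locks, 1 = All Records, 2 = Edited Record
End Sub
**********************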

HTH

Len B
 
Hi,
Thanks for replying
Performance with 20 users does suffer sometimes. 255 concurrent users? Where did that come from?
But I can't fully comprehend what you are trying to say.
Would you mind elaborating?

As for the locking mechanism, I am using:
**********************
' Assumes rs is an ADODB.Recordset and cn is an open ADODB.Connection
rs.CursorLocation = adUseServer
rs.CursorType = adOpenDynamic
rs.LockType = adLockPessimistic
' The connection was missing, and adCmdTable belongs in the fifth
' (Options) argument; use adCmdText instead if sSQL is a SQL string
rs.Open sSQL, cn, , , adCmdTable
**********************

Pessimistic locking is used because the accuracy of the data is of the utmost importance, although it slows down the performance of the application.
However, when the traffic is heavy, the data I get is quite often incorrect, which means it is not working correctly.
Can you advise on how to implement the locking?

My last alternative is to migrate the database to MS SQL Server, which most people say is more robust.

Please help... I seem to have tried everything.

Thanks
 
My problem now is that very often I have many users trying to update/edit the same record in the DB.
I implemented the pessimistic locking strategy in an attempt to solve the problem of one user updating a record while another user is trying to update the same record at the same time.
Please help.
 
255 users is the "official Microsoft" number of concurrent users, I believe.

The way in which I set up an application is to split the various functions into:

Enquiry
Add Data
Edit Data
Delete Data
Reports

I then use forms, and every form is based on an appropriate query.

I do not use the type of code in your post.

To open a form I use
DoCmd.OpenForm "Form Name", , , , acFormEdit (the fifth argument, DataMode, controls Edit/Add/Read Only)
This is typically contained in an event procedure of a command button or combo box.

The DoCmd allows me to open a form where the data is in a controlled condition, Read Only for example.

If I open a form to edit data, then I can do this within the DoCmd; additionally, the form properties can be used to lock that particular record and thus totally avoid the problem of multiple users attempting to edit the same record.

Worth checking out DoCmd.OpenForm in the Help.
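As an illustration, here is a minimal sketch of that approach; the form name "frmOrders" and the two command buttons are made up for the example:

**********************
' Hypothetical button event procedures on a switchboard-style form

' Enquiry: open the form read-only
Private Sub cmdViewOrders_Click()
    DoCmd.OpenForm "frmOrders", acNormal, , , acFormReadOnly
End Sub

' Edit: open the same form for editing; with its Record Locks property
' set to Edited Record, only the row being edited is locked
Private Sub cmdEditOrders_Click()
    DoCmd.OpenForm "frmOrders", acNormal, , , acFormEdit
End Sub
**********************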

Also Form Properties

HTH

Len B

PS: SQL Server will, I believe, give you more concurrent users, but you still must address the record locking.
 
I see, thanks.
I do not use Forms though.
By the way, do you encounter concurrency problems as well?
:)
If you do, could you explain or describe what they are?
 
I think you got the wrong idea.
My fault, as I didn't elaborate.
I am building a web application in which multiple visitors to my website would often end up trying to update/edit/access the same record at the same time (simultaneously).
I am not manually keying data into the database, nor do I have multiple users entering data into the DB manually.
:)
 
Okay

I do not have concurrency problems because all my applications run over a network and use forms, all within the Access environment.

I do not know about the web implications, only that you do need to lock out a record when the first person opens it for editing.

Sorry, I cannot help with the web side of things.


Len B
 
My problem is that my application requires an unlimited number of users to access the DB.

Multiple visitors to my website would often end up trying to update/edit/access the same record at the same time (simultaneously).

By demanding essentially real-time data accuracy and then allowing users concurrent access to the same record, you are making totally incompatible choices that will doom you to abject failure. You'll NEVER get this to work in Access. You'll NEVER get this design to work right in any other DB either.

To make this work, you must do a simple yet profound thing. Choose a new data-flow design that does not require an update of the same record at the same time from multiple threads. Destructive interference is a fact of life that DOES and WILL happen. Your web users can NEVER EVER BE ALLOWED TO UPDATE THE RECORD IN REAL TIME!

You simply cannot linearize "unlimited" web access without killing the performance of your site IF you insist on allowing direct updates of that record. I am using "direct" in another sense of the word than you are, perhaps - but trust me, what you describe is direct access. There would BE no locking problem if the access were not essentially direct, no matter how many layers are involved.

You need to linearize the algorithm. Build individual records that CAN be stored without destructive interference. Then, perhaps in a background process, apply the individual records to the counters you described in your earlier post. Do it as a linear process that cannot result in competition for that record from another thread.
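To make the idea concrete, here is a minimal sketch in the ADO style used earlier in the thread. All of the names (tblHitLog, tblCounters, LogID, PageID, HitCount) are illustrative, and cn is assumed to be an open ADODB.Connection; this is one possible shape of the design, not a prescription:

**********************
' 1) Web request: each visitor only INSERTs its own row, so there is
'    no competition for a shared record.
Sub LogHit(cn As ADODB.Connection, lPageID As Long)
    cn.Execute "INSERT INTO tblHitLog (PageID, HitDate) " & _
               "VALUES (" & lPageID & ", Now())", , adExecuteNoRecords
End Sub

' 2) Background process (run on a schedule): fold the logged rows into
'    the counter record in one linear pass, so only this process ever
'    updates tblCounters.
Sub ApplyHits(cn As ADODB.Connection, lPageID As Long)
    Dim vMaxID As Variant, lNew As Long
    vMaxID = cn.Execute("SELECT MAX(LogID) AS M FROM tblHitLog " & _
                        "WHERE PageID=" & lPageID).Fields("M").Value
    If IsNull(vMaxID) Then Exit Sub          ' nothing to apply

    lNew = cn.Execute("SELECT COUNT(*) AS N FROM tblHitLog " & _
                      "WHERE PageID=" & lPageID & _
                      " AND LogID<=" & vMaxID).Fields("N").Value

    cn.BeginTrans
    cn.Execute "UPDATE tblCounters SET HitCount = HitCount + " & lNew & _
               " WHERE PageID=" & lPageID, , adExecuteNoRecords
    cn.Execute "DELETE FROM tblHitLog WHERE PageID=" & lPageID & _
               " AND LogID<=" & vMaxID, , adExecuteNoRecords
    cn.CommitTrans
End Sub
**********************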

Ask yourself some questions first:

What information in the database do the Web users need to see? In particular, do they need to see the contents of the record you are updating? (In a previous post on what appears to be this same subject, they did not. It was important only for your business, not for your Web users, to know the statistics.) If not, then why are they even touching this record? As a matter of security, users should NEVER be able to trigger an event that causes data loss. Your current design invites that.

What aspect of the information you capture requires you to do a real-time update at your end? Is the real-time aspect so important that you will spend countless hours and bunches of money to migrate to another DB and maybe even a more robust platform? In other words, is the cost of getting this feature implemented worth the gain?

I suspect that someone in your management structure "has a dream." Wake that person up to reality by tossing in some cost estimates for time, software license fees, implementation costs, purchasing a more robust server to host said software, hiring another DB specialist & a new systems administrator for the high-powered server you want to run, .... The costs mount up quickly.

Forgive me if my response sounds a bit harsh. Sometimes it is necessary to be very blunt to bring people to a new and different mind-set. Sometimes that bluntness is mistaken for having a mean spirit. The latter was not my intent.

I am trying to get through to you that your problem is insoluble IN ITS CURRENT FORM. But I am NOT saying that you can't do something constructive that comes close. The art of problem solving is SOMETIMES the art of building something that comes CLOSE ENOUGH to what you wanted to satisfy all parties concerned. I.e. striking a balance between desire and possibility. In the end, peace is all about that kind of compromise.
 
