More of a higher-level question, but what's a good way to organize total structure?

Access9001

By that I mean I have processes that create tables, processes that extract data, others that export files, and so forth. But there are always issues if someone else is using a given database or file (thereby tying up other processes that either need to access said object for whatever reason).

Is there a general way to avoid such issues? On a modular, macro-level, what is the best way to typically arrange these processes in order to ensure things run smoothly?

Typically I try to separate backend from frontend, but when people use the frontend, it still locks the backend tables, for instance. Or if I need to overwrite/edit/access an Excel file, and someone has it open, there will be problems.
 
Sounds to me like you have issues with the form lock settings.

You always want to convert forms to use queries rather than directly using the tables. You always want optimistic locking on anything form-driven UNLESS it is a review-only form, in which case you want NO locking.
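A minimal VBA sketch of those two settings, assuming hypothetical names frmOrders (editable form) and qryOrders (its source query). Note that in Access's property sheet the "No Locks" setting is what gives the optimistic behavior — the record is only checked at save time:

```vba
' In the editable form's module: bind to a query, not the table,
' and use optimistic locking.
Private Sub Form_Open(Cancel As Integer)
    Me.RecordSource = "qryOrders"   ' a saved query, not tblOrders directly
    Me.RecordLocks = 0              ' 0 = No Locks (optimistic)
End Sub

' In a review-only form's module: block edits entirely, so the form
' can never take a write lock at all.
Private Sub Form_Load()
    Me.AllowEdits = False
    Me.AllowAdditions = False
    Me.AllowDeletions = False
End Sub
```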

FE/BE splits also help a lot with the issues of conflicting locks. But index the tables you use most on the fields you search most. There IS a balancing point, because the more indexes (indices?) you have, the longer an update or bulk delete takes. But the fewer indexes you have, the slower your selections become. Access allows up to 32 indexes on a table, but you don't want to get even slightly close to that number unless there is a really big, STATIC table used for lookups.

The problem with creating tables and performing extracts is that by their very nature, you are pretty much forced to take out pessimistic locks, which is to be shunned as often as you would shun a frumious bandersnatch. (Sorry, Lewis Carroll...)

It is the nature of the "non-user" processes that if you could do them at a time when users are NOT on the system, you are better off. At our site, when we have massive changes to make that affect metadata/structural stuff, we set aside a time to do it when users know they will not be allowed in. Only you can define your project's timing needs, but we have always found that wholesale updates and piecemeal user actions are almost never comfortable together.

We organized our batch updates into a "script" (based on a macro sort of thing) where we would do the RunCode action on a series of subroutines in VBA. They did all the staging for us. "Macro sort of thing" means we started it as a macro that served as our scaffold until we could go back and convert it to VBA later. Access can convert a macro to VBA, after which you can add error handling and other bells and whistles.
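A sketch of what such a converted batch driver might look like, assuming the subroutines StageExtract, StageTransform, and StageExport already exist (all hypothetical names) — the error handling is the sort of thing you add after the macro conversion:

```vba
' Batch "script" driver: runs the staging steps in order, with the
' error handling a plain macro scaffold could not give you.
Public Sub RunNightlyBatch()
    On Error GoTo ErrHandler
    StageExtract        ' pull raw data into working tables
    StageTransform      ' clean and reshape the staged data
    StageExport         ' write the Excel/PDF outputs
    Exit Sub
ErrHandler:
    MsgBox "Batch failed: " & Err.Description, vbCritical
End Sub
```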

In the programming arena, "divide and conquer" is good advice whether you are Julius Caesar fighting the Gallic Wars or a programmer attacking a complex - but divisible - problem. Separating the batch-like and interactive functions to different times of day is therefore at least one way you could solve your locking problem.

Without further knowledge of your problem I can only offer the generalities you see above. Good luck on tackling this bear of a problem.
 
Thanks for the detailed response.

Why Optimistic locking for forms? Why no locking for data review/viewing?
 
Typically I try to separate backend from frontend, but when people use the frontend, it still locks the backend tables, for instance. Or if I need to overwrite/edit/access an Excel file, and someone has it open, there will be problems.

I'm not the best at this...yet... so one of the elders in the forum might correct me here, but wouldn't a recordset method avoid some of these problems?

Get some code that reads the back end, loads up the data into the recordset stored in the front end, and then after updates send the new data back to the back end.
That way the back end should only be locked for the few seconds the data is being retrieved and/or saved.
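A sketch of that pull-to-local idea using DAO, assuming a linked back-end table tblCustomers and a local front-end copy tblCustomersLocal (both hypothetical names); the back end is only touched while the copy runs:

```vba
' Refresh a local working copy from the back end in one pass, so any
' back-end lock lasts only as long as the two statements below.
Public Sub PullToLocal()
    Dim db As DAO.Database
    Set db = CurrentDb
    db.Execute "DELETE FROM tblCustomersLocal", dbFailOnError
    db.Execute "INSERT INTO tblCustomersLocal SELECT * FROM tblCustomers", _
               dbFailOnError
End Sub
```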
 
I'm not the best at this...yet... so one of the elders in the forum might correct me here, but wouldn't a recordset method avoid some of these problems?

Get some code that reads the back end, loads up the data into the recordset stored in the front end, and then after updates send the new data back to the back end.
That way the back end should only be locked for the few seconds the data is being retrieved and/or saved.

This seems like a pretty good idea as well -- although I don't know if this would circumvent any issues pertaining to the actual files themselves (Excel files, PDFs, etc) that may need to be edited but face the risk of having other people using them at the same time.
 
Access Guy, using a "staging" method works only in very rare cases. Besides which, a recordset doesn't have the properties that you seem to be attributing to it. A recordset is ANYTHING that produces a set of records. Moving something to a recordset, though, means you are copying data from one place to another and that a table MUST be part of the destination - because a query cannot hold anything by itself. The trade-off with this approach is that you blow away any benefits to normalization because you have redundant data as a result of the copy. If this information is for review only, you are reviewing a snapshot, not reality, and would not see dynamic changes if any occurred.

Access9001 - You have "NO" locking for data review-only forms (i.e. cannot perform an edit) because you don't care who is using that form to look at something. They can literally do no harm. Any number of users can safely share through a review-only form because nobody will be doing anything to the data anyway.

You use "Optimistic" locking when you are doing spot changes. A user can look at a single record, make a quick change, and be in and out very quickly. The reason you choose that is because your computer is VASTLY faster than you are. You pick optimistic locking to give you the smallest non-zero window of time during which a record is locked. Optimistic locking means "assume that no locks are needed until you find out otherwise." You do that on a form because you (typically) can only look at one record, so you leave it unlocked until the last minute, reach in, commit your changes, and let go. On a fast machine that might be 10 msec or less, counting the disk operations.

Pessimistic locking should be reserved for cases where you are doing a massive overhaul and the structure or content of the table is known to be the target of extensive changes. You use pessimistic locking when you know that you can't get in and out quickly because you are doing a lot of things all at once. Batch-oriented processes are often good candidates for this kind of locking simply because in batch you are not waiting for a person to type some sort of input. You would in that case fritter yourself to death taking out, using, and releasing a lock on a single record at a time for a large fraction of your table. So you take out one big lock, make your changes, release one big lock, and go away.
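A sketch of the one-big-lock approach in DAO, assuming a hypothetical table tblStaging: opening a table-type recordset with the deny options keeps everyone else out for the duration of the batch instead of churning record-level locks:

```vba
' Take one exclusive lock for the whole overhaul, then release it once.
Public Sub BulkRework()
    Dim rs As DAO.Recordset
    Set rs = CurrentDb.OpenRecordset("tblStaging", dbOpenTable, _
                                     dbDenyWrite + dbDenyRead)
    ' ... wholesale changes go here, with no per-record lock traffic ...
    rs.Close
End Sub
```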
 
Just to clarify, setting Access to No Locks does NOT mean no checking at all.

Instead of locking the record and preventing other users from accessing it, Access REREADS the record just before saving your changes.

If the underlying record has changed, it tells you - otherwise it just saves your changes.

This is generally adequate.
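A sketch of how that save-time reread surfaces when you do the same thing in DAO code, assuming a hypothetical tblOrders with OrderID and Quantity fields; run-time error 3197 is the "data has changed" signal you get when someone else saved first:

```vba
' Optimistic edit: nothing is checked until .Update, where a conflict
' (another user saved the same record first) raises error 3197.
Public Sub BumpQuantity(lngID As Long)
    On Error GoTo ErrHandler
    Dim rs As DAO.Recordset
    Set rs = CurrentDb.OpenRecordset( _
        "SELECT * FROM tblOrders WHERE OrderID = " & lngID, dbOpenDynaset)
    rs.Edit
    rs!Quantity = rs!Quantity + 1
    rs.Update            ' the reread/conflict check happens here
    rs.Close
    Exit Sub
ErrHandler:
    If Err.Number = 3197 Then
        MsgBox "Someone else changed this record first; please retry."
    Else
        MsgBox Err.Description, vbExclamation
    End If
End Sub
```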
 
