Improving Split DB Performance

Weekleyba

Registered User.
I have a split database, with the FEs on each user's computer and the BE on a networked server.
When one user is on, the performance is great.
When two users are on, the performance decreases noticeably. What took a fraction of a second with a single user now takes 1-2 seconds when two users are in the database.

Is there something I can do to improve this?

As a side note, when two users are on and one of them closes out of the database, the remaining user's speed stays the same - that is, still slow.
If that last user closes out of the database and then immediately reopens the database, the performance speeds up. Why is this?
 

theDBguy

I’m here to help
Staff member
First question: Are you using a persistent connection?
 

Weekleyba

Registered User.
Well, I don't fully understand what that is, but isladogs sent me this (related to a different problem in a different post):

"To improve performance create a persistent connection with an always open but hidden form which has as its record source a small BE table.
Ideally use a single field, single record BE table for this purpose."

I did this and it solved the issue of my BE lock file (.laccdb) opening and closing every 3 seconds when my main form was open.

Is that using a persistent connection?
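For reference, a minimal VBA sketch of that hidden-form approach - tblPersist and frmPersist are placeholder names for the small BE table and the form bound to it, so adjust to whatever you actually used:

Code:
' Minimal sketch of a persistent connection via a hidden bound form.
' Assumes a linked one-field, one-record BE table named tblPersist and a
' form frmPersist whose Record Source is tblPersist (placeholder names).
' Call this from an AutoExec macro or the startup form's Open event.
Public Sub OpenPersistentConnection()
    ' Keeping the hidden form open holds a connection to the back end for
    ' the whole session, so the .laccdb lock file is created once at
    ' startup instead of being created and deleted every few seconds.
    DoCmd.OpenForm "frmPersist", acNormal, , , , acHidden
End Sub

An alternative is to keep a DAO.Database object open on the back-end file for the life of the session; either way the point is that the connection is opened once and not dropped until the app closes.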
 

isladogs

MVP / VIP
Yes. The BE lock file remains in place because the connection to the BE has been made persistent.
That is the point I was making in the other thread.
 

CJ_London

Super Moderator
Staff member
With regards to performance, significant degradation can occur due to a) lack of indexing and/or b) having forms/reports based on whole tables or large datasets which may or may not be subsequently filtered (as in using the WhereCondition parameter of DoCmd.OpenForm, which applies a filter, not a criteria).
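For illustration, a rough VBA sketch of that difference - frmOrders, tblOrders and OrderID are placeholder names. The first call filters a form that is still bound to the whole table; the second restricts the record source so only the needed rows are requested from the BE:

Code:
' Filter approach: the WhereCondition argument of DoCmd.OpenForm is
' applied as a filter on a form still bound to the whole table.
Public Sub OpenOrderFiltered(lngID As Long)
    DoCmd.OpenForm "frmOrders", acNormal, , "OrderID = " & lngID
End Sub

' Criteria approach: pass the key in OpenArgs and restrict the form's
' record source in its Open event, so only the matching rows are fetched.
Public Sub OpenOrderByCriteria(lngID As Long)
    DoCmd.OpenForm "frmOrders", acNormal, , , , , lngID
End Sub

' In the frmOrders code module:
Private Sub Form_Open(Cancel As Integer)
    If Not IsNull(Me.OpenArgs) Then
        Me.RecordSource = "SELECT * FROM tblOrders WHERE OrderID = " & Me.OpenArgs
    End If
End Sub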
 

Weekleyba

Registered User.
CJ London,
So the two causes you mentioned could be behind this specific problem?
It just seems strange for a single user to have great performance with the DB and for a second user (in the DB at the same time as the first) to experience a substantial lag.
I will start looking at implementing this in my database.
Thanks.
 

CJ_London

Super Moderator
Staff member
See this link, which covers most areas of performance.


The problem may be due to file-locking activity going on in the background; a single user does not need file locking. You haven't said what happens when user 1 opens the db and then user 2 does - does user 1 continue with good performance, or does it slow when user 2 opens their front end? Just suggesting that lack of indexing will potentially slow the locking down, whilst bringing larger datasets through potentially increases the locking 'workload'.

Another issue might be your network capacity (bandwidth). Bringing large datasets through takes more bandwidth - and you're doing it for another user as well.
 

Weekleyba

Registered User.
Thanks CJ!
And yes, user 1 slows down when user 2 opens the FE.
I’m not familiar with indexing, so I know I haven’t done any. I will learn.

I’m not sure what you consider a large dataset. One of the tables has 5,800 records, but it's by far the largest; all the other tables have no more than 100 records.
 

The_Doc_Man

Immoderate Moderator
Staff member
Indexing helps when you are doing searches either explicitly OR using a WHERE clause that somehow filters out some records. I'll take a shot from the hip here. For a table of 5800 records, the log (base 2) of 5800 is between 12 and 13. (2^12 = 4096, 2^13 = 8192, and your table size is between those numbers.) The exact value doesn't matter - you use the larger number.

Access uses binary trees for indexing. Using standard mathematical analysis, a table of 5800 records that has no index will require you to test, on average, half the table in a search for a random value. That is 2900 records to be tested. If you use a binary tree, by having indexed the table, the expectation is about 13 probes to find the record you want. That is why you index.
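A hedged sketch of how such an index could be added from VBA - tblParts, PartNumber and the back-end path are placeholders, and DDL can't be run against a linked table, so the back-end file is opened directly (the same index can simply be set in the table's Design View instead):

Code:
' Build an index on the frequently-searched field so a lookup costs
' roughly log2(5800) ~ 13 probes instead of scanning ~2900 rows on average.
Public Sub AddSearchIndex(strBackEndPath As String)
    Dim dbBE As DAO.Database
    Set dbBE = DBEngine.OpenDatabase(strBackEndPath)
    dbBE.Execute "CREATE INDEX idxPartNumber ON tblParts (PartNumber);", dbFailOnError
    dbBE.Close
    Set dbBE = Nothing
End Sub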

As to why databases get slower when shared? Besides the issue of the persistent connection, there is also the matter of remote file locking. See, every time you open a file, you take out a WINDOWS file lock on it. Then there is the Access lock file (.laccdb / .ldb) that tracks what you are using inside the DB. Depending on how your DB is implemented, you might be "thrashing" the lock file - that is, creating and releasing it frequently. This beats the hell out of the file locks on the back-end machine, which is where locks for the back-end file are managed.

Your front end file's locks are on your workstation computer, but those locks are local and NOT shared, 'cause everyone has their own copy so all of the FE locks are invisible to other users. The BE file is shared multiple times. If you have to open and close the lock file, you are doing lock and folder operations very often. If you use the "persistent connection" method though, the lock file stays open until you close the overall application. So NO file thrashing of the lock file and the Windows file locking ALSO doesn't engage every time someone does something. That is because the persistent connection maintains the file locks AND the lock file continuously open until the app exits. (Yes, I said that correctly.)

I would be VERY surprised if establishing a persistent connection and adding indexes on the fields that are most frequently searched doesn't help you a lot.
 

CJ_London

Super Moderator
Staff member
indexing helps when you are doing searches either explicitly OR using a WHERE clause that somehow filters out some records.
I would also say it helps where you are using joins in a query.
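As a brief follow-on sketch - tblOrders.CustomerID joined to tblCustomers.CustomerID are placeholder names; the primary-key side is indexed automatically, so it is usually the foreign-key side that needs the index:

Code:
' Index the foreign-key field used in the join.
Public Sub AddJoinIndex(strBackEndPath As String)
    Dim dbBE As DAO.Database
    Set dbBE = DBEngine.OpenDatabase(strBackEndPath)
    dbBE.Execute "CREATE INDEX idxCustomerID ON tblOrders (CustomerID);", dbFailOnError
    dbBE.Close
    Set dbBE = Nothing
End Sub

' A query such as
'   SELECT o.*, c.CustomerName
'   FROM tblOrders AS o
'   INNER JOIN tblCustomers AS c ON o.CustomerID = c.CustomerID
' can then resolve the join through the indexes rather than by scanning.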
 
