Query slow on split database on network

Jupie23
I haven't used Access much in several years, so I am rusty and never knew the full ins and outs, so I may need a thorough explanation. I am hoping to clean up a 5+ year old database that hasn't been used for a while, but the team is picking it up again. The database is split, with the backend on the company network drive, and it has always been somewhat slow.

There is a form where the user fills out fields to create a letter, then pends it to revisit in 14 days, when it shows up in the Pending form. The Pending form is based on the following query and takes 2-3 minutes to load. When I run the query on its own, it takes a long time there too. Is there anything I can do differently here to speed this up? Thank you!

Code:
SELECT tblState2State.ID, tblState2State.CustomerFirstName, tblState2State.CustomerLastName, tblState2State.Customer2FirstName, tblState2State.Customer2LastName, tblState2State.CustomerAddress, tblState2State.CustomerCity, tblState2State.CustomerState, tblState2State.CustomerZip, tblState2State.DMVName, tblState2State.DMVAddress, tblState2State.DMVCity, tblState2State.DMVState, tblState2State.DMVZip, tblState2State.DealerName, tblState2State.DealerAddress, tblState2State.DealerCity, tblState2State.DealerState, tblState2State.DealerZip, tblState2State.VIN, tblState2State.VehicleYear, tblState2State.VehicleMake, tblState2State.VehicleModel, tblState2State.LicensingAgent, tblState2State.DateSent, tblState2State.Sentby, tblState2State.Pend1, tblState2State.Pend1Date, tblState2State.Comments, tblState2State.Completed, DateDiff("d",[Pend1Date],Date()) AS [Days Pended], tblState2State.Pend1By, tblState2State.CompletedDate, tblState2State.CompletedBy, tblState2State.AccountNumber, tblState2State.Pend2, tblState2State.Pend2Date, tblState2State.Pend2By, tblState2State.ATTNDMV, tblState2State.ATTNDealer, tblState2State.InternalComments, tblState2State.CustomerAddress2, tblState2State.DocsRecdCust, tblState2State.DocsRecdCustDate, tblState2State.DocsRecdDMV, tblState2State.DocsRecdDMVDate, tblState2State.DocsRecdDealer, tblState2State.DocsRecdDealerDate, tblState2State.eTitle
FROM tblState2State
WHERE (((tblState2State.Pend1)=True) AND ((tblState2State.Completed)=False) AND ((DateDiff("d",[Pend1Date],Date()))>14) AND ((tblState2State.Pend2)=False) AND ((tblState2State.eTitle)=False));
 
That is ENTIRELY too long. This is a query on a single table, with a DateDiff function but otherwise nothing special. That should take a second or so.

First test would be to take the network out of the equation: copy the BE to your local machine, reattach, and try again.
If fast, I would do a PING in a Command Prompt window:
PING [yourservername]
and look for the "time" component to be "<1ms". If it is not, get a competent network engineer to look into it.

We're not on a wireless network, right?
 
First suggestion, just for future ease of editing. You have a single-table SELECT, so qualifying each field (with tblState2State.) is not required. There is no other table source possible to provide the fields.

Second suggestion, as a MINOR technical issue that won't change anything except typing: The WHERE clause had entirely too many parentheses, which means it was probably created by the query grid design method. (The grid is notorious for LISPing... Lots of Insipid, Silly Parentheses.) I removed a few but based on operator precedence rules, could probably have removed one more layer in most cases.

Code:
SELECT 
    ID, 
    CustomerFirstName, CustomerLastName, 
    Customer2FirstName, Customer2LastName, 
    CustomerAddress, CustomerCity, CustomerState, CustomerZip, 
    DMVName, DMVAddress, DMVCity, DMVState, DMVZip, 
    DealerName, DealerAddress, DealerCity, DealerState, DealerZip, 
    VIN, VehicleYear, VehicleMake, VehicleModel, 
    LicensingAgent, DateSent, Sentby, 
    Pend1, Pend1Date, Comments, Completed, DateDiff("d",[Pend1Date],Date()) AS [Days Pended], Pend1By, 
    CompletedDate, CompletedBy, AccountNumber, 
    Pend2, Pend2Date, Pend2By, 
    ATTNDMV, ATTNDealer, InternalComments, CustomerAddress2, 
    DocsRecdCust, DocsRecdCustDate, DocsRecdDMV, DocsRecdDMVDate, DocsRecdDealer, DocsRecdDealerDate, 
    eTitle
FROM tblState2State
WHERE 
    (Pend1=True) AND 
    (Completed=False) AND 
    (DateDiff("d",[Pend1Date],Date())>14) AND 
    (Pend2=False) AND 
    (eTitle=False);

Reorganizing it this way, I can see things worthy of mention for a down-the-road fixup. You have a LOT of fields that seem to repeat except for a qualifying number. Pend1 & Pend2 for one example; Customer1 & Customer2 is another example. A few others show up as well. You have a TON of Address fields that could be relegated to separate address tables. Looking at this, you have probably between 500 and 1000 characters per record. Perfectly legal but incredibly cumbersome to manipulate.

This single-table query should not, of itself, take that long to load. Even with a native Access backend (as opposed to something like SQL Server), this is not a complex query. But it MIGHT be complicated by the size of the database file. How many records are involved here? A downer of sorts: apart from the DateDiff expression, your criteria are all Yes/No fields, so even indexing would not help with those four.

Normalizing the table to address the many repeated field types might help in another way: if you divided the table so that its record size shrank significantly, it would take less time to load the main table, and the sub-forms would then load only the relevant child records (where you could take advantage of indexing).
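As a sketch of that down-the-road normalization (every table and field name here is hypothetical, not taken from your database), the repeated Pend1/Pend2 field groups could move into a child table keyed back to the main record:

```sql
-- Hypothetical child table for the repeated PendN field groups:
-- one row per pend event instead of Pend1*/Pend2* columns.
CREATE TABLE tblPendEvents (
    PendID        AUTOINCREMENT PRIMARY KEY,
    State2StateID LONG NOT NULL REFERENCES tblState2State (ID),
    PendNumber    INTEGER,   -- 1, 2, ...
    PendDate      DATETIME,
    PendBy        TEXT(50)
);
```

A sub-form bound to such a table, linked on State2StateID, would load only the child rows for the current main record, and indexing State2StateID keeps that lookup fast.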
 
I asked, but it is buried in my 4th paragraph.
Thank you for the ideas, I will try them out as soon as I can - but just to fill in info: the main table was cleared out at the beginning of the year and currently has only 1645 records. Before the cleanout it was about 20,000. The query pulls 675 records. I have noticed in the past that the database is quite a bit faster when the backend is on my own machine, but I will try that again now. I don't know that I am high enough on the totem pole to influence any changes to the network, but I can ask. I am on wireless at home, but the users in the office are wired in and see the same slowness.
 
With that number of rows, the result should be near-instantaneous whether or not the query engine uses an index at all. There is something else at play. Compact both databases and try again. You need to get your network people to figure out the slowness.

Are you running the FE on the server? Each user should have his own FE loaded on his C drive and run from there.
 
Probably not this issue, but when I worked for a Building Society in Coventry, we were based in Birmingham.
We had our own server and about 5 PC desktops.

However everything was still slow, despite the server being alongside my desk.

Turns out the network people were sending all the traffic up to Coventry and then back down to us. :)
Once this was corrected, it was as fast as you would expect for a local server.
 
> Turns out the network people were sending all the traffic up to Coventry and then back down to us
This could probably be ascertained by running a TraceRoute in a Command Prompt window:
TRACERT [yourservername]
You'll see more hops than you want to see.
 
> Turns out the network people were sending all the traffic up to Coventry and then back down to us
This could probably be ascertained by running a TraceRoute in a Command Prompt window:
TRACERT [yourservername]
You'll see more hops than you want to see.
Not only that, but they might get some timeouts as well?
[attached screenshot: tracert output showing request timeouts]
 
With that number of rows, the result should be near-instantaneous whether or not the query engine uses an index at all. There is something else at play. Compact both databases and try again. You need to get your network people to figure out the slowness.

Are you running the FE on the server? Each user should have his own FE loaded on his C drive and run from there.
The users have been instructed to save it to their desktops and always open it from there. Clarification - this has always been the case, I did not mean that I just instructed them now.
 
The users have been instructed to save it to their desktops and always open it from there.
This is not the kind of thing you should leave to users. Create a shortcut that runs a batch file. The batch file should be stored on the server for ease of maintenance; it copies the master version of the FE to a local folder and opens it. This ensures that the users are always running the current version of the app. I also use internal version checks to make sure that the FE and BE are always in sync. Some people prefer to use a loader app to start the Access database.
 
This is not the kind of thing you should leave to users. Create a shortcut that runs a batch file. The batch file should be stored on the server for ease of maintenance; it copies the master version of the FE to a local folder and opens it. This ensures that the users are always running the current version of the app. I also use internal version checks to make sure that the FE and BE are always in sync. Some people prefer to use a loader app to start the Access database.
Interesting - I had no idea of this method - I am not in a tech role, more of a "citizen developer" type. I will look into this option. Thanks!
 
There is lots here written on the topic.
Here is the simplest version of a batch file that does what you need.
Code:
md c:\DwgLog
del c:\DwgLog\DrawingLog.accdb
copy "\\BSCCTIMBERLINE1\Timberline Office\AccessApps\Data\CommonFE\DrawingLog.accdb" c:\DwgLog
start c:\DwgLog\DrawingLog.accdb

Copying a fresh version of the FE each time the app is opened doesn't add a lot of extra time since the files tend to be small but it does prevent bloat.
 
Update: I did a compact and repair on front and back ends. I changed the query as suggested to Pend1Date<date()-14 and indexed the field. I made a copy of the backend, saved to my computer and linked tables. It was significantly faster. I then re-linked to the original backend on the network and it is much faster! So I'm not sure which of those things made a difference, or if it's a fluke, but I will test again on Monday. I clocked it before the changes as taking 1 minute 22 seconds to open the Pending form, and right now it's 5 seconds. Thank you for your help!
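For anyone landing here later: the criterion change described above helps because it moves the calculation off the field, letting the new index do the work. A sketch of the two forms (DateDiff and Date are the standard Access/VBA functions):

```sql
-- Original: DateDiff must be computed for every row,
-- so an index on Pend1Date cannot be used.
WHERE DateDiff("d", [Pend1Date], Date()) > 14

-- Rewrite: Pend1Date stands alone on one side of the comparison,
-- so the index on Pend1Date can satisfy the criterion directly.
WHERE [Pend1Date] < Date() - 14
```

The two are equivalent for date-only values: more than 14 days elapsed means the pend date falls before Date() - 14.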
 
There is lots here written on the topic.
Here is the simplest version of a batch file that does what you need.
Code:
md c:\DwgLog
del c:\DwgLog\DrawingLog.accdb
copy "\\BSCCTIMBERLINE1\Timberline Office\AccessApps\Data\CommonFE\DrawingLog.accdb" c:\DwgLog
start c:\DwgLog\DrawingLog.accdb

Copying a fresh version of the FE each time the app is opened doesn't add a lot of extra time since the files tend to be small but it does prevent bloat.
Thank you - where does this code go?
 
The code I posted was a batch file. Do NOT use it without understanding what each line does since you will need to change the code to reflect your path and file names. It is saved with a .bat extension and edited with Notepad. Since a .bat file is an executable, it runs when you double click on it so to edit it, you MUST right click on the file name and choose the program to use to edit it OR open Notepad first, then select the .bat file to edit it.

You then create a shortcut and the shortcut runs the .bat file which you saved on a shared folder on the server.
 
You might know them as cmd files?
They do the same job as batch files do or did.

Just like directories are now known as folders.
 
