Question: Poor performance on one network

andrewmrichards

Registered User.
Local time
Today, 18:07
Joined
Mar 15, 2012
Messages
18
Hi all

I'm developing a database for a client. It's currently split into an Access front end and Access back end, although the plan is to migrate to a SQL back end next year.

To give an idea of scale, the back end contains 40 tables, and is just over 10MB. As you can tell, it doesn't yet contain a huge amount of data, as we're still in development.
The front end is around 16MB. It has 8 local temp tables, plus a couple of dozen queries and around 80 forms.

All the tables are properly related and have indexed key fields.

Some of the queries are fairly complex, and include calls to custom functions.

The database includes code to maintain a constant connection to the back end, so that the .laccdb file isn't constantly being created and destroyed.
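For reference, the usual form of that trick is to open a recordset on a small back-end table at startup and keep it open for the session. A minimal sketch, assuming a small back-end table called tblDummy (the table name is illustrative):

Code:
Option Explicit
' Module-level recordset stays open for the whole session,
' so the back end's .laccdb lock file is created only once.
Private rsPersist As DAO.Recordset

Public Sub OpenPersistentConnection()
    Set rsPersist = CurrentDb.OpenRecordset("tblDummy", dbOpenSnapshot)
End Sub

Public Sub ClosePersistentConnection()
    If Not rsPersist Is Nothing Then
        rsPersist.Close
        Set rsPersist = Nothing
    End If
End Sub

I call the open routine from the startup form and the close routine at shutdown.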

Here's the issue. I've developed the database on a laptop, and it performs very well, with screens opening instantly, as you'd expect. Obviously, you would also expect to find some difference in performance when putting the database into the live environment, where the back end sits on a networked drive. But we've been seeing a HUGE performance hit, to the point where the database becomes unworkable.

The first screen (after logging on) displays a list of information, along with various filtering tools. Yes, the query behind it is fairly complex, calling on other queries, and (as mentioned) one or two UDFs. But that screen is taking 3 minutes to load! Selecting an item from the list in order to view the details in another form (a much simpler form, albeit with a subform) results in a delay of another 1.5 minutes. Switching to a different list form (a much simpler one) takes anything from 1 to 3 minutes, depending on which list is selected.

I fully accept that it may be that I need to look at optimising some of these queries. But I've developed a lot of databases over many years, often with complex queries, for various clients in various environments, and never seen performance like this, even with databases with >1m rows in tables being queried.

So here's the testing we've so far undertaken at my client's office:
1. Put the front end on a different laptop for a different user to test (everyone at this organisation uses laptops) - no change to performance
2. Switch the front end laptops between wired and wireless connections to the network - no change to performance
3. Move the back end to a different network drive, on a different server - no change to performance

I then tried a completely different environment. I set up a networked environment in my office, with a "server" PC and a "client" laptop connected via a domestic router. In this environment, a simple copying of the back end file from the server to the client (to test network speed) took 10 seconds. The same test in my customer's environment took less than 1 second, so obviously they don't have an issue with raw network speed. But running the database in my office with this test environment resulted in the initial (complex) list form opening in around 3 seconds, and other forms responding more or less instantly.

Finally, I've tested in another office, with a much more realistic network setting. Both my client and this other office use Active Directory environments for managing logins, and both have several hundred users logged in, with proper commercial-grade switches, routers, cabling etc. This other "proper" office environment (albeit one where my "front end" machine is actually a virtual PC, which typically doesn't run particularly quickly) once again produced pretty much instantaneous response times for the database forms.

So now I'm stumped. As I said, I can accept that I might want to go back and see if I can optimise some of my queries, or some of my code. But I'd be expecting that this might result in shaving a couple of seconds off a form loading, taking it from 5 seconds to 3, perhaps. But what I'm seeing is in a different league, and I just don't buy that it's anything in my database design that's causing it.

Has anyone else come across issues like this? Can anyone offer suggestions? The people with whom I'm working have asked another guy in their office who's designed one or two databases, and his response was that he'd seen something similar with one of his databases, so he moved the back end tables onto SharePoint, but I'm really not wanting to do that. The IT department have said that there are "no known network issues at present" which was really helpful...

I'd be really grateful for thoughts or advice. Could it be virus scanning? Firewall issues? Help!

Many thanks
Andrew
 

JHB

Have been here a while
Local time
Today, 19:07
Joined
Jun 17, 2012
Messages
7,732
Just a couple of dumb questions: how fast does the first form load if you take out the recordsource?
And what if you have both the front end and back end on the same laptop?
 

static

Registered User.
Local time
Today, 18:07
Joined
Nov 2, 2015
Messages
823
If it's just a poorly configured / heavily used network, there's not going to be much you can do about it.

How many concurrent users?
I had a database where performance was good for a single user, but as soon as somebody else fired it up it dropped through the floor.
I had another database that worked fine until the team moved from one room to another, same building and network.
Networks are funny things.

Run your tests on a copy of the back end to make sure all users are out.
Try querying a table from Excel.
Try moving the file closer to the root of the drive and query that. I've heard that lots of folder nesting can degrade performance.
Use UNC paths.

Since you want as few concurrent users as possible in an Access database, keeping a constant connection is probably false economy.

Remember that Access doesn't have a server. To run a query, the entire table gets dragged down from the network before the data can be queried. Make sure all of your tables are properly normalised and indexed.
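You can script the Excel test so the timing is repeatable, something like this from the Excel VBA editor (the share path and table name are placeholders for your own):

Code:
Sub TimeBackEndQuery()
    Dim cn As Object, rs As Object, t As Single
    Set cn = CreateObject("ADODB.Connection")
    ' Substitute the real UNC path to the back-end file
    cn.Open "Provider=Microsoft.ACE.OLEDB.12.0;" & _
            "Data Source=\\myserver\myshare\dbdata.accdb"
    t = Timer
    Set rs = cn.Execute("SELECT Count(*) FROM Table1")
    Debug.Print rs(0) & " rows counted in " & Format(Timer - t, "0.00") & "s"
    rs.Close
    cn.Close
End Sub

If that's slow too, the problem is the file share, not anything in your Access front end.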
 

andrewmrichards

Registered User.
Local time
Today, 18:07
Joined
Mar 15, 2012
Messages
18
Just a couple of dumb questions: how fast does the first form load if you take out the recordsource?
And what if you have both the front end and back end on the same laptop?

I haven't tried taking out the recordsource, and I can't test that right now, as I'm only in that client's office once a week (the rest of the time I'm either doing development from my own office or with other clients). But the very first screen to appear (the splash screen) does open quickly, and has no data. The second screen is a logon screen. There's no security required for the database; it simply provides a drop-down list of people in a combo box, so that a certain amount of tailoring can go on based on who's logged in. But even that form - an unbound form with a single combo box connected to a table with about 8 rows of data - still takes around 10 seconds to load...

And yes, if the FE and BE are on the same laptop, performance is excellent.

Thanks
Andrew
 

andrewmrichards

Registered User.
Local time
Today, 18:07
Joined
Mar 15, 2012
Messages
18
If it's just a poorly configured / heavily used network, there's not going to be much you can do about it.

How many concurrent users?
I had a database where performance was good for a single user, but as soon as somebody else fired it up it dropped through the floor.
I had another database that worked fine until the team moved from one room to another, same building and network.
Networks are funny things.

Run your tests on a copy of the back end to make sure all users are out.
Try querying a table from Excel.
Try moving the file closer to the root of the drive and query that. I've heard that lots of folder nesting can degrade performance.
Use UNC paths.

Since you want as few concurrent users as possible in an Access database, keeping a constant connection is probably false economy.

Remember that Access doesn't have a server. To run a query, the entire table gets dragged down from the network before the data can be queried. Make sure all of your tables are properly normalised and indexed.

As we're currently only testing, only two people have a copy of the FE, and there's only ever one of them using the system at a time.

The BE sits on the server around 3 levels deep in the folder structure. I've also heard that long paths can cause issues, and so this is deeper than I'd like, but unfortunately we can't change that, as IT policies dictate that this is the first folder to which this department have read/write access.

I have added the constant connection recently as part of the testing. Performance was just as poor without it, and it was added after I did some reading around and came across this link to the FMS website, so thought I'd give it a try.

All the tables are properly (and not overly) normalised, and have appropriate indexes.

I've not tried converting to a UNC path for the linked tables, so that is something I can try when I'm back there next Tuesday. I'd be delighted to find that this worked, but I'd somehow be surprised if it makes the huge difference we're looking for.

I can also test the connection to the queries via Excel then too. Stepping through the code in testing, there wasn't one single thing that seemed to slow it down - just every time the data is requested, there's a delay.
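If the UNC switch does help, relinking can be scripted rather than done through the Linked Table Manager; something like this (the share path here is a placeholder for wherever the back end actually lives):

Code:
Sub RelinkTablesToUNC()
    Dim db As DAO.Database
    Dim tdf As DAO.TableDef
    Set db = CurrentDb
    For Each tdf In db.TableDefs
        ' Local tables have an empty Connect string; only touch the links
        If Len(tdf.Connect) > 0 Then
            tdf.Connect = ";DATABASE=\\myserver\myshare\backend.accdb"
            tdf.RefreshLink
        End If
    Next tdf
End Sub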

Related question. In many of the forms, the recordsource property looks something like this:
SELECT * FROM qryMyQuery WHERE 1=0
This produces an empty recordset, minimizing data transfer. Then the code in the open event replaces the WHERE clause with the specific data that's required, based on run-time factors. It's an approach I've used successfully for a long time without a problem.
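In outline, the open event does something like this (the field name and the variable holding the criterion are illustrative):

Code:
Private Sub Form_Open(Cancel As Integer)
    ' Swap the always-false WHERE clause for the real run-time criteria
    Me.RecordSource = "SELECT * FROM qryMyQuery WHERE PersonID = " & lngSelectedID
End Sub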

But there still seems to be a "double-hit" on the data source with this approach (judging by the delay before we even reach the open event). And I can't simply empty the recordsource property, as that produces "Field not found" errors for every reference to any field. Is there a better way to do this? And is there a tool that will let me see exactly what is being requested of the back end file on each request - a bit like a SQL Profiler tool?

Thanks
Andrew
 

Minty

AWF VIP
Local time
Today, 18:07
Joined
Jul 26, 2013
Messages
10,371
Are the laptops connecting over wireless? If so, that's the problem, unless they have a mega-good (read: cutting edge, with the latest hardware) wireless infrastructure.

We notice the difference in performance between 100Mb and 1Gb wired networks internally, and on wireless it's very poor.
 

andrewmrichards

Registered User.
Local time
Today, 18:07
Joined
Mar 15, 2012
Messages
18
Hi

As per my original posting:

So here's the testing we've so far undertaken at my client's office:
...
2. Switch the front end laptops between wired and wireless connections to the network - no change to performance

So yes, we've tried both options, on two different laptops - one cutting edge, one less so...

Andrew
 

Minty

AWF VIP
Local time
Today, 18:07
Joined
Jul 26, 2013
Messages
10,371
Sorry, I did quickly read your OP but managed to miss that bit.
I do find it odd that there was no performance change between wired and wireless, particularly on a complicated form / query load; in my experience the difference is pronounced.
This makes me think it's a network issue, not DB design.

Is the wired network sound? E.g. hardwired Gigabit end to end - and did you turn off the wireless connection on the laptops to ensure traffic was taking the correct route?

Edit - and yes, I have seen virus scanning cause some serious issues.
 

andrewmrichards

Registered User.
Local time
Today, 18:07
Joined
Mar 15, 2012
Messages
18
TBH, I can't really answer the questions re their network infrastructure - I've no idea what their setup is (this is a large government body, whose IT is outsourced to a company based in India, and who are not particularly responsive), but I can try to find out.

I hadn't thought to turn off the wireless connection when testing over the wired connection, so that's certainly something I can try.

As I said, it's making me question my own DB design, but I've done this kind of thing a lot before without these issues, and the fact that it performs fine in another networked environment makes me think it's not likely to be the database design (I certainly hope not!!). But I'm a bit lost as to what else to look for...

Thanks for your thoughts
Andrew
 

static

Registered User.
Local time
Today, 18:07
Joined
Nov 2, 2015
Messages
823
But there still seems to be a "double-hit" to the data source going on

Perhaps you could keep the empty local copies of the tables in the front end, and use saved queries in place of linked tables:

qryTable1: SELECT * FROM [\\myserver\myshare\dbdata.accdb].table1

Link your form to the local table, then switch out the table as well as the WHERE clause:

RecordSource = SELECT * FROM Table1;
Tag = SELECT id, field1 FROM qryTable1 WHERE...;

Code:
Private Sub Form_Open(Cancel As Integer)
    Me.RecordSource = Me.Tag
End Sub

Since the table is local and empty there shouldn't be a delay. Just a thought.
 

andrewmrichards

Registered User.
Local time
Today, 18:07
Joined
Mar 15, 2012
Messages
18
Hmmmm.... That's an interesting thought - I hadn't thought of doing that.

I might try giving that a go next time I'm there.

Thanks a lot.
Andrew
 

The_Doc_Man

Immoderate Moderator
Staff member
Local time
Today, 12:07
Joined
Feb 28, 2001
Messages
27,188
I had an abysmally slow response speed that was time-of-day dependent for a remotely hosted BE file. I was in New Orleans; our BE was hosted on a SAN in Charleston. If you got in early enough (before 8 AM), performance wasn't really bad. If you tried to do something at about 2 PM to 4 PM, take a nap!

The big issue for us was drive contention. Even though there were very few users of our app on that drive, it was shared among many hundreds of users for the other files and apps in other folders on that same drive, because every department had a folder in that big bucket. We have to remember that when doing access arbitration, you need network bandwidth for the ACLs on each layer of the BE host disk's directory structure - for EVERY user. Further, Windows file-lock arbitration often requires each attempt to touch a drive to be vetted against ALL OTHER USERS, even if they are all doing simple sharing. For that reason, I respectfully disagree that keeping a back-end connection open is false economy. (Please, static, no offense is intended; I have just observed different results, perhaps.) Far more important is to ensure optimistic locking for all query and recordset operations - including the recordset that you keep open to the BE file.

Now, this next observation was totally unexpected. I noticed a humongous improvement in the quality of connections when I started using single-table queries, defined in the front end, to access data linked from the back end. In essence, I had one query to return every field from a given table, with no modification of the fields (i.e. no reformatting, with VERY few exceptions) and no WHERE clause. Just SELECT field1, field2, ..., fieldn FROM table; - for every table in the BE, I had a corresponding FE query. The query, of course, named the table - but everything else that would have named the table instead named the FE query. The problem I was having at the time LOOKED LIKE the form using a direct table reference was somehow having trouble, whereas using the query was no problem at all.

Does (or could) this sort of issue apply to your situation (i.e. FE form directly names BE table rather than a query as an intermediate)?

I know you found some combinations of events that made performance better. Making the BE reside on a less contentious network seemed to improve your performance, which might be related to file-system lock performance. Making the test on an LDAP-enabled network is not proof of anything except that contention for the resource was different.

Minty, as to "using the correct route" - if the network is set up correctly, then the cost per hop for each possible network hop will already be defined such that as long as the faster path is enabled, you don't have to turn off the slow path - you'll never use it because of the "hop-cost" factor. Most routing algorithms will invariably choose the cheapest network route.
 
