Addressing performance issues

CJ_London

Super Moderator
As your db expands and there are more active users, it is quite possible you will experience performance issues and may be considering upsizing to SQL Server or MySQL.

Before considering such a transition, you should review your current design:

Overview
From a performance perspective, Access should be considered more like a web application than an Excel workbook - and should be designed with the same considerations in mind - primarily to minimise network traffic.


Tables
1. are the tables properly normalised? If not, queries will be more convoluted and slower
2. have all lookups and multivalue fields been removed from tables? - they are not compatible with anything else anyway so will need to be removed if you are upsizing. In the meantime they are a drag on performance.
3. are the tables properly indexed? (any field which is used in joins or regularly used for searches and/or sorting should be indexed - but not those which mainly contain nulls or a limited range of values such as boolean fields) - see this link for more information - http://www.access-programmers.co.uk/forums/showthread.php?p=1516326#post1516326
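As a simple illustration of point 3 (the table, field and index names are made up for the example), an index can be added in table design view or with a little DDL run from code:

Code:
' Index a field that is regularly used for joins, searches or sorting.
' tblCustomers / LastName / idxLastName are example names only.
CurrentDb.Execute "CREATE INDEX idxLastName ON tblCustomers (LastName)", dbFailOnError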

Queries
4. have all domain functions been replaced with subqueries/sub tables? - domain functions are slow and do not use indexing - see this link for alternatives https://www.experts-exchange.com/articles/1921/Access-Techniques-Fast-Table-Lookup-Functions.html (a simple illustration is sketched after this list)
5. are UDFs (user-defined functions) designed efficiently and reviewed to see if they can be moved to SQL? - UDFs are generally slower than SQL but sometimes cannot be avoided - see the link in 4 above re the use of Static variables, which can improve performance
6. have the initial wildcards used in searches (e.g. Like "*something*") been removed and users trained to use them only when required? (the initial * negates the use of indexes, with a resultant impact on performance) - see link in 3 above
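To illustrate point 4 (table and field names are made up): the first query below calls DLookup once per row, while the second returns the same information via an indexed join and will normally be far quicker:

Code:
SELECT OrderID, DLookUp("CustomerName","tblCustomers","CustomerID=" & [CustomerID]) AS CustName
FROM tblOrders;

SELECT O.OrderID, C.CustomerName
FROM tblOrders AS O INNER JOIN tblCustomers AS C ON O.CustomerID = C.CustomerID;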

Forms/Reports
7. Have these been designed to return only the records and fields required before being populated? - i.e. do not just have a table as a recordsource
8. have these been designed to not populate the sub forms/reports until required? - as a minimum on or after the main form current event
9. have any controls with domain functions been reviewed to see if the data can be included in the recordsource?
10. Are combo/listbox rowsources designed to list only what is required, in terms of both width and depth? - having a combo listing 20,000 customer names is not efficient and slows the loading of a form. Instead, leave the rowsource blank until the user has typed a few characters, then populate the rowsource based on what was typed (see the sketch below).
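A minimal sketch of the combo technique in point 10 (control, table and field names are made up):

Code:
' Combo starts with an empty rowsource; populate it only once
' the user has typed at least three characters.
Private Sub cboCustomer_Change()
    If Len(Me.cboCustomer.Text) >= 3 Then
        Me.cboCustomer.RowSource = "SELECT CustomerID, CustomerName FROM tblCustomers " & _
            "WHERE CustomerName Like '" & Me.cboCustomer.Text & "*' ORDER BY CustomerName"
    Else
        Me.cboCustomer.RowSource = ""
    End If
End Sub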

Modules
11. do they have Option Explicit just below Option Compare Database?
12. are they compiled?
13. have you done a decompile?
14. are user front ends .accde and each user has their own copy?

15. has currentdb been assigned to a database object which is then referred to in code, rather than calling currentdb each time?
16. does the front end maintain a persistent connection with the back end?
- if 15 is done right, it will

Access
17. Has name autocorrect been turned off?
18. has the front end been compacted?
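For point 17, Name AutoCorrect can be turned off under File > Options > Current Database, or from code - the option strings below are from memory, so check them against your version of Access:

Code:
' Turn off Name AutoCorrect for the current database.
' Option string names quoted from memory - verify in your Access version.
Application.SetOption "Track Name AutoCorrect Info", False
Application.SetOption "Perform Name AutoCorrect", False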

Network
19. does it have sufficient bandwidth for the number of users and level of traffic?
20. can bandwidth be increased?


Much of the above would need to be addressed anyway to take advantage of the performance benefits of upsizing the backend - and, if there is not another reason for upsizing such as db size, improved security or a high volume of concurrent users, fixing these points may remove the need to upsize at all.
 

static

Registered User.
Nice post.

Access sql isn't slower than SQL Server sql per se. A badly designed database will be slow whatever.

6. I know what you're saying but don't understand your point from a design perspective. If a user is able to design queries, performance is their own concern, not the developer's. As a developer you shouldn't want users to have to/be able to create anything.
9. There is never a good reason to use domain functions. Learn SQL.
11. has no impact on performance.
12. negligible
15. don't understand this. Dbengine(0)(0) would be faster than currentdb anyway.
16. Persistent connection only prevents recreation of .ldb file. negligible impact on performance


The biggest benefit of having a server is that the amount of data crossing the network is highly reduced. And that's just not something you can get around with Access.
 

CJ_London

Super Moderator
@static Thanks for the comments:

6. I was really referring to search forms. Many developers design a search form and in code set the filter or criteria as

"[somefield] like '*" & me.searchcontrol & "*'"

sometimes even when the values are numeric.

this automatically makes for a slow response because indexing cannot be used. And in many instances users know the first characters, so the initial * is not required. I actually do train users (and it's also there in the manual/help page) about using wildcards, and my code inspects the search field for them and builds the string accordingly (i.e. = or Like). Some struggle but most 'get it'.
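A minimal sketch of that idea, using the placeholder names from above (simplified - not the actual production code):

Code:
Dim crit As String
' Only use Like when the user has typed a wildcard themselves;
' otherwise use = so the index on the field can be used.
If InStr(Me.searchcontrol, "*") > 0 Or InStr(Me.searchcontrol, "?") > 0 Then
    crit = "[somefield] Like '" & Me.searchcontrol & "'"
Else
    crit = "[somefield] = '" & Me.searchcontrol & "'"
End If
Me.Filter = crit
Me.FilterOn = True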

11. probably not - I just included it because many users don't use it, so there is potential for weird things to happen which might impact on performance

12. as for 11

15. The point is that each time currentdb is referenced a new instance is created (which is why the RecordsAffected property always returns 0) and, depending on the code flow, this can lead to memory issues; whereas declaring a global curDB as DAO.Database and assigning currentdb to it when the first code runs overcomes this. I've not tried dbengine in this context so no idea whether recordsaffected will work or not.

16. Disagree - a persistent connection can make a difference because there is still an overhead in opening the file

I fully agree with your comment that Access sql isn't slower than SQL Server sql per se. But I would say that sql server, because of the nature of the beast, is often running many different concurrent applications, which can impact on performance. Last year I was asked by a client to move a highly interactive application off sql server and back to access because the performance was about four times slower. This was due to insufficient resources being available for the app and IT being unwilling to address the issue in any meaningful way.
 

Galaxiom

Super Moderator
4. have all domain functions been replaced with subqueries/sub tables? - domain functions are slow and do not use indexing

Many years ago I had a discussion on this site about the effect of indexing on DCount().

Testing (see post 17) showed that an index on a field facilitated an almost instant response while several seconds were required to run the DCount on the same field with no index.

We concluded that DCount does use indexes.
 

The_Doc_Man

Immoderate Moderator
Static - in re #11 - using Option Explicit has a potentially major effect for beginners but the effect is less for experienced programmers who learn to turn on Option Explicit to help them debug.

If you DON'T use Option Explicit and have an implicit declaration, the badly declared variable is a Variant. The problem then becomes that the pseudo-code has to determine the current type of the variable because Variants are nearly "faceless." You are dealing with a data structure rather than a data element. Although I've not gone as far as to reverse-engineer the code, I have seen some evidence via system calls that Variants are examples of a general Descriptor such as occurs in another version of Basic with which I am intimately familiar.

The Variant structure in question CONTAINS the data element, but it contains a lot more that has to be tested first. In linear VBA code, this is a small cost of doing business. For a loop? Not so cheap.

I suppose that in theory, you COULD do the debugging such that your code is tight and clean, after which you could remove the statement. But it's the kind of statement that only makes a difference if you don't compile your code first.

As to #15 - a DAO.Database object: While DBEngine(0)(0) might be slightly faster - and I'll emphasize MIGHT in this context, it is harder to read. The libraries for DAO support the use of the { dbDAO.Execute SQL, dbFailOnError } construct, which is incredibly efficient when you are doing lots of queries. The ability to then get dbDAO.RecordsAffected makes for a great pay-off for the minor annoyance of remembering to instantiate the DAO object.

Regarding #16 - persistent connection: MASSIVE on performance when you are dealing with a back-end on a slow network. Unfortunately, we can't all ride 100 Gb Ethernet all of the time. With the Dept. of Defense, we USUALLY had hot networks, but some of the issues were in almost constant transition and we invariably had slow multipoint hops in some cases. If you have a persistent connection, everything rides the open connection and you skip access arbitration on every level of every directory for the BE. Even more important, you skip EVERY FIREWALL test because of the persistent connection. If you then have a slow network, the effect of re-negotiating the circuit is the modern equivalent of re-inventing the wheel for every foot of travel for your wagon.

Now share that back-end and put every file lock, every directory lock, and then every lock on the contents of the LDB file in a situation of lock contention. It's generally called a nightmare and you DON'T want to be in this situation.
 

The_Doc_Man

Immoderate Moderator
The obvious reason that it is faster to use DCount on an indexed field is that Access only needs to read the index, not the whole table, and that means that you fit more records in a disk buffer. Older versions of Jet might not have done this, but Galaxiom's experiment would quickly detect the difference. Heck, you could try it too.

Take a table with a No Dups index. Use the Timer() function to get the seconds since midnight (it returns fractions of a second) before and after you run the DCount() function on the indexed field. Then repeat the experiment on any non-indexed field. The difference between the after and before times is the time it took to run the function. If the tables in question are big enough, you WILL see a difference.
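A minimal sketch of that test (table and field names are made up):

Code:
Dim t As Single

t = Timer()
Debug.Print DCount("IndexedField", "tblBigTable")           ' field with a No Dups index
Debug.Print "Indexed:     " & Format$((Timer() - t) * 1000, "0") & " ms"

t = Timer()
Debug.Print DCount("PlainField", "tblBigTable")             ' field with no index
Debug.Print "Not indexed: " & Format$((Timer() - t) * 1000, "0") & " ms"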
 

CJ_London

Super Moderator
@Galaxiom - your link is comparing DCount for a whole table rather than one with criteria - and DCount in that situation can just refer to the TableDef to get the value, so indexing is not required. But I take your point, perhaps indexing is used - but still not as efficient as a count in sql. I'll see if I can set up a test later to run the code in post to get some definitive results.
 

static

Registered User.
As to #15 - a DAO.Database object: While DBEngine(0)(0) might be slightly faster - and I'll emphasize MIGHT in this context, it is harder to read. The libraries for DAO support the use of the { dbDAO.Execute SQL, dbFailOnError } construct, which is incredibly efficient when you are doing lots of queries. The ability to then get dbDAO.RecordsAffected makes for a great pay-off for the minor annoyance of remembering to instantiate the DAO object.

Regarding #16 - persistent connection: MASSIVE on performance when you are dealing with a back-end on a slow network. Unfortunately, we can't all ride 100 Gb Ethernet all of the time. With the Dept. of Defense, we USUALLY had hot networks, but some of the issues were in almost constant transition and we invariably had slow multipoint hops in some cases. If you have a persistent connection, everything rides the open connection and you skip access arbitration on every level of every directory for the BE. Even more important, you skip EVERY FIREWALL test because of the persistent connection. If you then have a slow network, the effect of re-negotiating the circuit is the modern equivalent of re-inventing the wheel for every foot of travel for your wagon.

15. Currentdb just returns a copy of DBEngine(0)(0).

16. So the 'connection' isn't a data connection it's just acting as a file opener.

Of course, Access gets slower and slower with every active connection. If you have 20 users and they all have the FE open, a persistent connection means that the BE has 20 active connections even though the majority of the users probably aren't even using it, causing worse performance for the few that are. Swings and roundabouts IMO.
 

The_Doc_Man

Immoderate Moderator
Static - in re point #15: The DAO.Database variable is a copy of what you get by referencing DBEngine(0)(0). The object variable, by having a type of DAO.Database, has properties that DAO routines won't question and that won't confuse Intellisense, so speed or no speed, it has advantages in clarity and ease of use.

Speed difference in execution? Probably low though not necessarily zero difference. Remember, using the indexes - even if 0 - to traverse the implied pointers involved in that double indexing reference under the component object model also takes a couple of instructions. Admittedly, not very many, and we ARE splitting hairs that, on modern systems, are GHz-width hairs. For the DAO.Database variable, which is an object, the address lookup for the targeted object is done once. Remember that objects in Access are actually pointers to a data structure. So there is a slight speed difference in favor of the object that is multiplied by how many times you reference it.

In re #16 - yes, you are correct. The persistent connection's benefit has nothing to do with data per se. It has everything to do with getting to that data residing on the BE's host file-server (because for a pure Access setup, that's all it is.) But for split DB's, that speed of data fetching IS part of the performance thereof.
 

static

Registered User.
DAO.Database, has properties that DAO routines won't question and that won't confuse Intellisense, so speed or no speed, it has advantages in clarity and ease of use.

DBEngine is DAO.

Code:
Class DBEngine
    Member of DAO

CurrentDb = DBEngine.Workspaces(0).Databases(0)

Code:
Property Databases As Databases
    read-only
    Default member of DAO.Workspace

And Intellisense works as normal.

I'm not saying everyone should use DBEngine(0)(0) instead of currentdb.
I only mentioned it because this is a thread about performance. But as you say, nobody would ever notice the difference.
 

JMongi

Active member
I bumped this thread because I couldn't find anything to go with this quote:

15. has currentdb been assigned to a database object which is referred to in code rather than currentdb
16. does the front end maintain a persistent connection with the back end?
- if 15 is done right, it will
It might be helpful to explain exactly what the "done right" would be, no? Especially for us dummies! ;)

I've seen a few experienced users talk about using their user logging/session logging tables as the persistent connection. I'm not sure how to go about that exactly (I haven't really used recordsets or that whole paradigm unless I was copying a code example).

Just looking for some more explanations on this idea and figured this was as good a place as any to get it.
 

The_Doc_Man

Immoderate Moderator
@JMongi - I suspect that there is a lot of misinformation if you let your imagination run too wild here.

In re #16: One common way to have a persistent connection is to create a teenie-tiny little bound form (could even be single-field) that starts and ends its life invisible and for which the control is locked and not visible either. The form can be totally ugly since you will keep it invisible at all times. The trick will usually be to remember to close the form when your app is closing down if that is the approach you take.
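A minimal sketch of that approach (the form name is made up; frmPersist would be the tiny bound form described above):

Code:
' On startup - open the tiny bound form hidden so it holds a connection to the BE
DoCmd.OpenForm "frmPersist", , , , , acHidden

' On shutdown - remember to close it again
DoCmd.Close acForm, "frmPersist"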

It absolutely does not matter what you do to item #16 (specifically in regard to comments mentioned in item #15) as long as the bound object stays open even if minimized and invisible. What matters is that a connection is opened. You can pick the clunkiest or the most elegant syntax. As long as that connection is open and STAYS open, Access can ride the connection to the BE file easily because it is managed at the network level where the syntax of the query isn't an issue.

The idea of opening a recordset to a logging table will also cause a persistent connection to exist. Again, the trick is to never close the connection until the DB is ready to close down.

The explanation for #15 (for CurrentDB vs. another object) is that CurrentDB re-evaluates every time you use it whereas if you create a DB object and set it to CurrentDB once, it is more like a snapshot. For this reason, if you use CurrentDB.Execute to execute an action query, the next time you use CurrentDB.RecordsAffected, you lose your answer due to that re-evaluation. Whereas if you create a DAO.Database variable and call it something (like maybe DD) then if you do a DD.Execute on an action query, you can in the next instruction issue a call to DD.RecordsAffected to see how busy SQL really was.
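A minimal sketch of that difference (the table and SQL here are made up):

Code:
Dim DD As DAO.Database
Set DD = CurrentDb      ' evaluate CurrentDb once and keep the reference

DD.Execute "UPDATE tblOrders SET Archived = True WHERE OrderDate < #01/01/2020#", dbFailOnError
Debug.Print DD.RecordsAffected          ' reports the number of rows updated

CurrentDb.Execute "UPDATE tblOrders SET Archived = True WHERE OrderDate < #01/01/2020#", dbFailOnError
Debug.Print CurrentDb.RecordsAffected   ' always 0 - this CurrentDb is a brand new instance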
 

CJ_London

Super Moderator
I agree with Doc - there are potentially many ways you might do this which is why I didn't go into detail:-

A form - which might be the login form or a navigation form for example which opens a BE table and keeps it open.

my preferred method is to have a global db in a module

Code:
Option Compare Database
Option Explicit

Public BEdb As DAO.Database 'can be any name but seems a reasonable name if all tables are linked. Could use AppDB or ProjDB for example

Public Function openBE()

    If BEdb Is Nothing Then Set BEdb = CurrentDb

End Function

call openBE from whatever form opens first in your app

and as Doc suggests, don't use Currentdb.Execute or currentdb.Openrecordset, use BEdb.Execute or BEdb.Openrecordset

This also has the benefit that you can get the RecordsAffected value for action queries.
 

JMongi

Active member
@The_Doc_Man , @CJ_London - Thanks for the replies!

I've moved away from rolling my own user login form. I'm letting Active Directory handle user authentication and then I'm just grabbing the username when they connect to the database. There eventually will be some additional database specific permissions/groups that I manage within the database, but that is for another day.

So, just to make sure I understand:
1. You are creating a self-healing database object that refers to the current database. This has performance benefits when using the CurrentDB methods with this object.
2. You call this object from the first form.

#2 is where it is a little hazy. Are you saying that calling your openBE function (and thus instantiating the dB object) is enough to create a persistent connection?
 

The_Doc_Man

Immoderate Moderator
Addendum to my previous discussion: If you decide to open a recordset for logging purposes, you must do so by having a general module (perhaps one set aside for logging-related routines) and the recordset must be a public variable in that general module. Then call a routine from that module to open the recordset and have an entry point in the module for actual logging. (And one more to close the recordset for use during app shutdown.)
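A minimal sketch of that layout (the module, table and field names are made up):

Code:
' In a general module, e.g. modLogging
Public rsLog As DAO.Recordset

Public Sub OpenLogConnection()
    If rsLog Is Nothing Then
        Set rsLog = CurrentDb.OpenRecordset("SELECT * FROM tblSessionLog", dbOpenDynaset)
    End If
End Sub

Public Sub LogEvent(strEvent As String)
    OpenLogConnection
    rsLog.AddNew
    rsLog!EventText = strEvent      ' hypothetical field names
    rsLog!EventTime = Now()
    rsLog.Update
End Sub

Public Sub CloseLogConnection()
    If Not rsLog Is Nothing Then
        rsLog.Close
        Set rsLog = Nothing
    End If
End Sub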

Reading CJ's code, his answer and example were more along the lines of what to use when not using CurrentDB. Since there were two questions on the table, it is possible to confuse the applicability of the answers.

Merely assigning CurrentDB to another DAO database variable doesn't count as opening anything. Opening a DAO recordset on the other hand DOES count as opening a connection. He DID mention using a form to open and keep open a BE table, and any bound table does that. My rambling discussion just pointed out that you could do this invisibly. In fact, if you do this at all, you WANT to keep that form invisible, since if it is visible, some user will immediately try to close it because they don't know what it is. Users do that sometimes. Which is why we suggest hiding the navigation pane and the other works "behind-the-scenes."
 

JMongi

Active member
That makes more sense to me. Thanks!

I think your logging and my logging might be talking about different things? I have user logging code in a module. One routine uses an SQL statement to write to a table of users, changing the user status. It also writes a record in a related session logging table that notes the user and the db login time. Another routine writes new data on logout.

If I'm understanding you correctly, I could open a recordset to the session login table in my login sub. This would use the aforementioned currentdb function and that should keep the persistent connection until I close that recordset with the logout sub. Do I have the general idea correct?
 

CJ_London

Super Moderator
not quite sure what you are asking - easy thing to do is try it - look for the creation and destruction of the .laccdb file


There is a bit of confusion on my part - my style is to not have any tables (linked or otherwise) or queries in the FE. The simple bit of code I provided is what I use during initial development before I have split the app. I populate my forms and reports with recordsets rather than using recordsources. The recordsets may be created from queries in the (password protected) BE or dynamically created in vba.

what I should have provided is

Code:
If BEdb Is Nothing Then Set BEdb = DBEngine.OpenDatabase("pathtoBE") 'complete other parameters as required and error checking

at that point a persistent connection is made and will remain until the app is closed. If the app errors then the connection is lost - but using the function means it is self-healing if it is called on a regular basis

then in a form load event I might use

Code:
openBE
Set recordset = BEdb.OpenRecordset("SELECT * FROM sometable")

(I actually use a function to return that as well, but outside the scope of this thread)

My FE's are all supplied as .accde and although you can lock people out of the navigation window, there are those who can get round it and I really don't want users messing with direct input into tables or modifying queries. Can't stop them writing their own, but if they don't know the password, they have to do a lot more!
 
