Best approach to large data sets in Access

JoseO

How many rows of data can I have in my Access database table before I have to seriously start worrying about awful performance?

I know this topic/question has been floating around the Internet for a number of years, but it seems the answer is always to walk away from Access toward SQL Server or other platforms. What if walking away from Access is not an option? I am interested to know:

  • How much performance degradation are we talking about? Run a query/report, go make a cup of coffee, and ten minutes later the query/report is complete?
  • Will users experience all sorts of problems and weird glitches with the front end?
  • What other shenanigans do I have to worry about?
I appreciate any answers, many thanks.
 
In most cases, I think it's not the number of rows in a table but how the FE was designed that matters when it comes to performance. You could have a small amount of data, but if the forms are not designed to be efficient, you can still have performance issues. The opposite (large data set + efficient forms = good performance) can also be true. How many rows do you have now?
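For illustration (the form, control, and field names such as frmOrders and CustomerID are made-up placeholders, not anything from your app), a common pattern is to open a form against only the rows the user actually needs, instead of binding it to the whole table and filtering afterwards:

```vba
' Hypothetical sketch: open a form restricted to one customer's orders.
' The WhereCondition argument limits the rows Access pulls from the back end,
' rather than loading the entire table into the form.
Private Sub cmdOpenOrders_Click()
    Dim lngCustomerID As Long
    lngCustomerID = Me.cboCustomer   ' combo box on the calling form

    DoCmd.OpenForm "frmOrders", acNormal, , "CustomerID = " & lngCustomerID
End Sub
```

With that approach, the row count in the table matters much less than how many rows each form actually asks for.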
 
I had a table with 1 million records, and queries came back in a matter of seconds, though not a full minute. But I'd say it varies with PC speed, network speed, etc.

Currently our standard average is around 20k records, which is still very fast. If you are storing super-large volumes (millions of records), then I'd move to SQL Server.
 
That is a very subjective question, hinging on YOUR definition of "awful" performance.

Many other factors come into play as well as absolute number of records.

But give us some specifics that you are dealing with in your current situation. Perhaps having some context will help throw some light on the potential problem.
 
First, here are some published limitations.


Next, the answer to your question is "total size matters, piecemeal size matters less." Total size for a back-end file cannot exceed 2 GB, minus some overhead tables. However, whether you build wide records or narrow ones - or a combination of both - doesn't matter. I've had a narrow table frequently exceed 600K records, and many text-and-number tables have exceeded 100K records. So it really depends on what you are doing and how you divide things up. Sort of like the old game TETRIS, where different size-and-shape blocks have to be carefully dropped into place.

In practical terms, if your table is "update-centric" then you should not exceed 1.0 to 1.3 GB (very approximate) for the BE file, because the Compact and Repair operation has trouble with REALLY big files, and because "bloat" clogs up the DB more often. If your tables are updated once and then used only for lookups, you can use more table space than if you are doing frequent updates.
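As one hedged illustration of keeping an update-heavy BE under control (the paths below are placeholders, and nobody can have the BE open while this runs), a compact can be scripted with DAO's CompactDatabase method:

```vba
' Sketch: compact the back-end into a fresh file to reclaim space lost to bloat,
' then swap the compacted copy into place. Paths are placeholders.
Public Sub CompactBackEnd()
    Const strSource As String = "C:\Data\MainBE.accdb"
    Const strTemp As String = "C:\Data\MainBE_compact.accdb"

    DBEngine.CompactDatabase strSource, strTemp   ' write a compacted copy

    Kill strSource                 ' remove the bloated original
    Name strTemp As strSource      ' rename the compacted copy into place
End Sub
```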

One way to get around some of these limits is if you have a logging table that does not have to reference anything via formal relationships. Such tables can be in a second BE file dedicated to logs and other things that don't have declared relationships to the main data. That (a) gives you another 2 GB of table space in the logging BE and (b) reduces the strain on the main BE. NOTE, however, that you cannot establish relational integrity between two tables that live in different BE files.
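A minimal sketch of what that split can look like from the front end, assuming a separate Logs_BE.accdb on a share (the file, path, and table names here are invented):

```vba
' Sketch: link a logging table that lives in its own back-end file, then write
' to it like any local table. Names and paths are placeholders.
Public Sub LinkLogTable()
    ' acLink creates a linked table in the FE; the data stays in Logs_BE.accdb.
    DoCmd.TransferDatabase acLink, "Microsoft Access", _
        "\\Server\Share\Logs_BE.accdb", acTable, "tblEventLog", "tblEventLog"
End Sub

Public Sub LogEvent(strMsg As String)
    CurrentDb.Execute "INSERT INTO tblEventLog (LogTime, Msg) VALUES (Now(), '" & _
        Replace(strMsg, "'", "''") & "')", dbFailOnError
End Sub
```

Because there is no enforced relationship to the main data, moving the log table out costs nothing relationally and frees space in the main BE.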
 
**THANK YOU SO MUCH** everyone for all the great, great answers. I have not started developing my DB because I first wanted to get expert thoughts/opinions before I proceed.

Once again, thank you so much for all of the great responses. I really appreciate it!
 
I think poor performance is more often the result of indexing issues. What can be difficult is that Access might choose a different plan to resolve a query than makes the best sense.

If you want to find sales in October above a certain amount, you would expect the query to first find the October sales and then check the amounts. A query that works the other way around might not perform as well, but Access can be obstinate about being as smart as you want it to be.
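To make that October example concrete (tblSales and its fields are invented names), the usual first step is to index the columns the WHERE clause restricts on, so the engine can seek on the date range instead of scanning every row:

```vba
' Sketch: one-time DDL to index the filter columns, plus the query shape that
' benefits from it. Table and field names are made up for illustration.
Public Sub IndexSalesTable()
    CurrentDb.Execute "CREATE INDEX idxSaleDate ON tblSales (SaleDate);", dbFailOnError
    CurrentDb.Execute "CREATE INDEX idxAmount ON tblSales (Amount);", dbFailOnError
End Sub

' The query itself: restrict on the indexed date range, then on the amount.
' SELECT SaleID, SaleDate, Amount
' FROM tblSales
' WHERE SaleDate >= #10/01/2024# AND SaleDate < #11/01/2024#
'   AND Amount > 1000;
```

Whether Access actually applies the filters in the order you expect is up to the engine, which is the point above, but without the indexes it has no choice but to scan the whole table.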
 
