Is 32768 the maximum number of objects in an MS Access database?

xavier.batlle

The short answer is: No

So you might be wondering: “What is the maximum number of objects in a database?”

The short answer is: “Nobody knows!”

I’m going to explain what tests I’ve done and what the results are:

I have developed a simple procedure that can create any number of new forms, with or without the property HasModule=True, so I have only tested the maximum number of objects in a database by creating new forms; I think the same applies to reports, macros, queries, etc.
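A minimal sketch of that kind of form-generating procedure (the Sub name, control counts and positions here are illustrative, not the exact code used for the tests):

' Sketch: create HowMany blank forms, each with a couple of text boxes,
' optionally with a class module behind it (HasModule = True).
Public Sub CreateTestForms(ByVal HowMany As Long, ByVal WithModule As Boolean)
    Dim frm As Form
    Dim i As Long, c As Long
    For i = 1 To HowMany
        Set frm = CreateForm()                   ' new form, opened in Design view
        frm.HasModule = WithModule               ' True adds a class module to the form
        For c = 1 To 2                           ' add a couple of text boxes (twip coordinates)
            CreateControl frm.Name, acTextBox, acDetail, , , 100, c * 400, 2000, 300
        Next c
        DoCmd.Close acForm, frm.Name, acSaveYes  ' saved under its default name (Form1, Form2, ...)
    Next i
End Sub

Calling it as CreateTestForms 1000, False from the Immediate window builds 1000 module-less forms; rerun it (or raise HowMany) to keep pushing the object count.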

These are the databases I’ve created and their number of objects (mostly forms):

- Database 1: 15047 forms, each with 200 text controls and the property HasModule=False (the database size is about 1.2 GB)

- Database 2: 73815 forms with no controls and with the property HasModule=False

- Database 3: 44370 forms with about 20-30 controls each, and the property HasModule=False

I stopped all of the above tests myself after 15-20 hours of execution, so new objects could still be added to each of those databases.

- Database 4: 5450 forms with 2 text controls each and the property HasModule=True

This last test ended with an error saying that no more modules could be saved, so beyond that point you can’t create more modules, or more forms or reports with the property HasModule=True.

The more objects a database has, the slower it is. Opening some of the databases with thousands of objects may take about 5-10 MINUTES!

To sum up:
  • The stated maximum number of objects (32768) is not a hard limit, but the more objects a database has, the slower it gets, and at some point the database becomes unusable because of its slowness.
  • Form and Report controls are not counted in this Access specification limit.
  • Most forms and reports contain code, so in that case the maximum number of objects with code is about 5400.
  • Although it is technically possible to work with databases with a large number of objects, it is not recommended at all!
If you want to know more about this and other limitations of MS Access, I suggest you read this interesting and useful article by Colin Riddington:
 
I tried that once too, and I also went way beyond 40k forms. It worried me when I tried to rename them to consecutive numbers according to their order of appearance in the AllForms collection with a For Each loop: it skipped some forms. I would guess it becomes erratic when you have that many objects.
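If the skipping came from renaming while iterating the live AllForms collection, one workaround (a sketch; the naming scheme is just an example) is to snapshot the names into an array first and rename from that:

' Sketch: snapshot form names before renaming, since renaming objects while
' iterating AllForms directly can cause entries to be skipped.
' All forms must be closed before they can be renamed.
Public Sub RenameFormsSequentially()
    Dim names() As String
    Dim i As Long
    If CurrentProject.AllForms.Count = 0 Then Exit Sub
    ReDim names(0 To CurrentProject.AllForms.Count - 1)
    For i = 0 To CurrentProject.AllForms.Count - 1
        names(i) = CurrentProject.AllForms(i).Name
    Next i
    For i = 0 To UBound(names)
        DoCmd.Rename "frm_" & Format(i + 1, "000000"), acForm, names(i)
    Next i
End Sub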

Also, don't try to Compact & Repair, it's gonna take forever.

In my tests, I also managed to have almost 1,000 forms open at the same time (around 930 or so); they were all empty for that test.

I might have noticed a few other problems, but I don't remember them all.

Not recommended to work with that many objects.
 
Should try telling that to a user over in Utter Access. :)

 
I believe that over time, Access has been incrementally redesigned to allow more things to be created of any particular type. It has been moving towards the only REAL limit - which you hit when the addressing space is exhausted. I watched the Access Specifications numbers go up from version to version, but the 2 GB addressing limit has always been pretty much THE limit - until articles regarding the "Large Address Aware" option started coming out. This "LAA" option allows Access to store things anywhere within the 4 GB limit implied by 32-bit addresses. As long as Access does not undergo the final change - switching all internal addresses over to 64-bit - I think that the "LAA" 4 GB limit is going to be a hard one.

I doubt that Access will take the plunge to 64-bit internal addresses. First, that would put it in competition with other MS software. Second, with the greater implied amount of data but no internal security/permission structure, it would be a lousy big database.
 
Nope. But I do wonder why people wonder that and it scares me.
Call it laziness, expedience or lack of imagination. The only time I ever wondered about the number of objects I could create was when I was experimenting with creating dynamic menus on-the-fly and creating new command buttons on startup. I bruised my ego on the lifetime limit to the number of controls you could create on a form. That cured me of pondering other ways to try to exceed the limits of Access objects of any kind.
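That lifetime cap is one of the published Access specifications (754 controls and sections added over the lifetime of a form or report). A rough sketch of how it surfaces (the procedure name is illustrative and the form must already exist):

' Sketch: keep adding command buttons to a form in Design view until CreateControl
' fails. Deleting controls does not reset the lifetime counter; the usual workaround
' is to copy everything onto a brand-new form.
Public Sub ProbeControlLifetimeLimit(ByVal FormName As String)
    Dim i As Long
    DoCmd.OpenForm FormName, acDesign
    On Error GoTo Done
    For i = 1 To 1000
        CreateControl FormName, acCommandButton, acDetail, , , 100, 100, 1500, 300
    Next i
Done:
    Debug.Print (i - 1) & " controls added before stopping. " & Err.Description
    DoCmd.Close acForm, FormName, acSaveNo
End Sub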
 
The only serious limit I ever personally hit was when I was doing my genealogy database to build a family tree and had to employ recursion. I got it wrong on the first try and got a "Stack overflows heap" error. There is now a flag in my person entries that says "Family tree builder has already been here for branch inspection." So far I don't have any "cousin marriages" - which would have caused that kind of recursion error - people who do not share parents but who DO share a grandparent or deeper and aren't siblings. Doesn't happen often, but my mother's family IS from a part of the USA where cousin marriages were not unknown in the mid-1800s.
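That "already been here" flag is essentially a visited check in the recursive walk. A minimal sketch of the idea (the table and field names are hypothetical, not the actual genealogy schema):

' Sketch: walk a person's ancestors recursively, skipping any branch that has
' already been inspected, so shared ancestors can't cause runaway recursion.
Public Sub WalkAncestors(ByVal PersonID As Long, ByVal Visited As Object)
    If Visited.Exists(PersonID) Then Exit Sub     ' branch already inspected
    Visited.Add PersonID, True

    Dim rs As DAO.Recordset
    Set rs = CurrentDb.OpenRecordset( _
        "SELECT FatherID, MotherID FROM tblPerson WHERE PersonID = " & PersonID)
    If Not rs.EOF Then
        If Not IsNull(rs!FatherID) Then WalkAncestors rs!FatherID, Visited
        If Not IsNull(rs!MotherID) Then WalkAncestors rs!MotherID, Visited
    End If
    rs.Close
End Sub

' Usage:
'   Dim visited As Object
'   Set visited = CreateObject("Scripting.Dictionary")
'   WalkAncestors 1, visited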
 
Nope. But I do wonder why people wonder that and it scares me.
Perhaps you (and others) are missing the point of this thread? The Access team provides a list of specifications about supposed limits in Access databases. These provide useful guidance information for developers.

Some of the stated limits are accurate e.g. 2GB file size, 64 characters in object or control names, 255 characters in short text fields.

However other stated limits are misleading or completely inaccurate. For example, the maximum number of characters in a database password has been 20 since A2010 (not 14 as stated).
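For anyone who wants to check that on their own copy, a DAO sketch (the file path is a placeholder, the database is assumed to have no password yet, and it must be opened exclusively; the 20-character figure is the finding reported above, not the documented value):

' Sketch: set a 20-character database password via DAO.
Public Sub SetLongDatabasePassword()
    Dim db As DAO.Database
    ' Open exclusively (Options:=True) so the password can be changed.
    Set db = DBEngine.OpenDatabase("C:\Temp\Test.accdb", True, False, ";pwd=")
    db.NewPassword "", "ABCDEFGHIJKLMNOPQRST"   ' 20 characters
    db.Close
End Sub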

Another example: the specified maximum number of code modules = 1000. However I have several large databases which over the years have accumulated far more code modules (including form and report modules) than that. The largest has about 2500. I’m not trying to exceed limits. That just occurred naturally as the project expanded. And yes it is well designed…. 😏
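Counting them is easy enough; note that the second line requires "Trust access to the VBA project object model" to be enabled in the Trust Center:

' Sketch: count code modules. AllModules covers standalone modules only;
' VBComponents also includes the class modules behind forms and reports.
Public Sub CountCodeModules()
    Debug.Print "Standalone modules: " & CurrentProject.AllModules.Count
    Debug.Print "All VBA components: " & Application.VBE.ActiveVBProject.VBComponents.Count
End Sub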

So, like Xevi, I’ve tested several of the limits to see whether they are accurate or not. I’ve published my findings as information for other developers.

In some cases, stated limits are hard limits that cannot be exceeded. In other cases, they should be treated as guidelines. Although it may be possible to exceed them, doing so will have a negative effect on the database performance.

However there is a further set of specifications that are both incorrect and can be exceeded, where there is a good reason to do so, with no adverse effects.

The important distinction is to know which category any such ‘limit’ falls into.

Certainly nobody should be trying to circumvent limits for the sake of it though one particular member at AWF and another at UA do appear to see that as their goal.
 
I have never given any thought to hitting a limit with Access because I can't imagine it happening!
I've been of the opinion that from the days of XP/NT onwards, limits were a thing of the past.

Back in the day some worked on the idea that it shouldn't take longer than three seconds to locate a record on your PC. Never mind on a network.
It never ceases to amaze me that I can type a search on the internet and it comes back in milliseconds. Basically as fast as a search in Access.
 
It never ceases to amaze me that I can type a search on the internet and it comes back in milliseconds. Basically as fast as a search in Access.
We're still standing at the start line. Wait for the next 10 years to see how it'll be.

 
While I see the point in precision about the limits -- we are, after all working with databases where ambiguity can be deadly -- I'm not convinced that we need to be concerned about whether those limits amount to limiting factors on the great majority of production databases. We like to point to the outliers, of course.

The question always comes up when discussing them, though: What's the potential for creating a usable, working database application with tens of thousands of objects in it?
 
We're still standing at the start line. Wait for the next 10 years to see how it'll be.

Hm. My iRobot self-driving vacuum cleaner only connects to a 2.4 GHz Wi-Fi network and refuses to work with the 5 GHz Wi-Fi in my apartment. Now, I'm supposed to worry about 6G? If I have to "upgrade" my vacuum cleaner every couple of years, I might go back to the old manual Hoover.
 
The last robot vacuum cleaner we had was a hand-me-down gift from someone else who won it in a drawing and couldn't make it work. So we tried it. It kept on getting cornered - and unable to get out of the corner. But I will grant that the corner WAS pretty clean.
 
The last robot vacuum cleaner we had was a hand-me-down gift from someone else who won it in a drawing and couldn't make it work. So we tried it. It kept on getting cornered - and unable to get out of the corner. But I will grant that the corner WAS pretty clean.
Mine tends to get high-centered on carelessly discarded shoes and extension cords. Other than that, though, I've not had to rescue it from corners.
 
While I see the point in precision about the limits -- we are, after all working with databases where ambiguity can be deadly -- I'm not convinced that we need to be concerned about whether those limits amount to limiting factors on the great majority of production databases. We like to point to the outliers, of course.

The question always comes up when discussing them, though: What's the potential for creating a usable, working database application with tens of thousands of objects in it?

My article includes the following comments:
With the exception of modules, it may well be that there is no actual limit.
In reality, it is likely that other limits will be hit first, e.g. the total number of code modules or the total size of the database.

However in practical terms, a database with anything like this number of objects will be almost unusable. For example:
a) the database may take several minutes to open.
b) the navigation pane will become unresponsive making searching or sorting very slow.
c) running queries / code will become increasingly slow and corruption may become more likely.
d) compacting the database may take many minutes.

A well designed database should never need anything like the stated limit of 32768.
Just because you CAN exceed this number of objects doesn't mean you SHOULD do so.

In fact I would strongly recommend that no Access database should ever have more than a few thousand objects at most.
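For anyone wondering where their own database stands, the built-in All* collections give a quick census (AllTables also counts hidden MSys* system tables):

' Sketch: a quick census of the objects in the current database.
Public Sub CountDatabaseObjects()
    Debug.Print "Tables:  " & CurrentData.AllTables.Count
    Debug.Print "Queries: " & CurrentData.AllQueries.Count
    Debug.Print "Forms:   " & CurrentProject.AllForms.Count
    Debug.Print "Reports: " & CurrentProject.AllReports.Count
    Debug.Print "Macros:  " & CurrentProject.AllMacros.Count
    Debug.Print "Modules: " & CurrentProject.AllModules.Count
End Sub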

My largest FE database is about 130 MB with around 4000 objects including over 1800 code modules (almost double the stated limit).
This has been running in various UK schools with very large SQL Server BEs for over 20 years - typically with up to 200 users in each, often simultaneously.

It still works well, though it's unlikely that I will ever create anything that large again.
 
My largest FE database is about 130 MB with around 4000 objects including over 1800 code modules (almost double the stated limit).
This has been running in various UK schools with very large SQL Server BEs for over 20 years - typically with up to 200 users in each, often simultaneously.
Wow, you could say 2000 users and we would still believe you!
 
The last robot vacuum cleaner we had was a hand-me-down gift from someone else who won it in a drawing and couldn't make it work. So we tried it. It kept on getting cornered - and unable to get out of the corner. But I will grant that the corner WAS pretty clean.
We had a vacuum cleaner but we threw it out because it was only gathering dust. :)
 
Better keep it that way and not upgrade for a while. This news is only 6 days old.


Oddly enough, the only bad language currently involved is when I swear at the robot for getting tangled up in shoestrings, out of reach, under the bed.
 
