memory limits

neuroman9999 (Member)
Does anyone know what the limits are for Excel, in terms of declarations getting way out of hand and/or opening many instances of Excel? At what point (thread concurrency, memory usage, stack overflow, etc.) do application crashes become a risk due to overload? Does MS have a KB article on this?
 

The_Doc_Man (Immoderate Moderator, Staff member)
First, it will depend on whether you are talking about 32-bit or 64-bit Excel. Here is what I could find.


Many of the entries say "limited by available memory." Therefore, the combination of physical memory (RAM) and your system's virtual memory file size will set your limits. The virtual memory size is normally (i.e. by default) equal to your RAM size, though there is no reason you couldn't make it bigger if the disk has the room. You might need to do a "defrag" on the disk if you want to make a really big virtual file. If what you want to build is less than 8 GB total, a system with 8 GB RAM will probably have enough virtual space for you - though that is subject to variation depending on what else is running at the time.

Some of the "limit" entries say "limited by available memory and system resources." Those limits (e.g. "Open workbooks" is one such limit) also depend on registry settings like maximum number of open files and a couple of other non-Excel limits.

Excel ALSO has a 2 GB limit in the 32-bit version that applies to workbooks used for data modeling. That limit does not apply to the 64-bit version.
 

neuroman9999 (Member)
Richard,

I only asked this question because I just finished a job where the employer wanted specific tasks done that honestly made no sense, because the work actually made things less efficient. But he paid me for it, so be it. The code I wrote to his specs "to the T", which was the first version, opened a total of 342 workbooks over an execution time of 5 minutes and 45 seconds. I know, nuts right? LOL. When I realized it was totally impractical, and he concurrently complained that the VBE was freezing on him, I thought "well duh, of course it is!". I've had that happen many times, and even when it does, the code keeps executing and the process carries on; when it's all over, the freezing stops, even if that takes a long time. I fixed the issue by sorting the dataset of 342 records, checking for duplicate values in the loop, and keeping each workbook open until a new value was encountered. I attached the log files at the end of the post to show you how crazy the differences in execution times were.
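The fix described above (sort the records so rows for the same workbook are adjacent, then only open a workbook when a new value appears) can be sketched roughly as follows. This is a hypothetical Python stand-in, not the actual VBA; `open_workbook`/`close_workbook` are placeholders for the real `Workbooks.Open` / `Workbook.Close` calls:

```python
# Sketch of the "sort, then group" optimization: instead of one open per
# record (342 opens), each distinct workbook is opened exactly once.
# The open/close calls are represented by comments; in the real VBA they
# would be Workbooks.Open and wb.Close.

def process_records(records):
    """records: list of (workbook_name, payload) tuples. Returns open count."""
    opens = 0
    records = sorted(records, key=lambda r: r[0])  # duplicates become adjacent
    current = None
    for name, payload in records:
        if name != current:       # new value encountered: switch workbooks
            # if current is not None: close_workbook(current)
            current = name
            opens += 1            # open_workbook(name) would go here
        # ... process payload against the currently open workbook ...
    return opens

# 342 records spread over only two distinct workbooks
demo = [("book_a.xlsx", i) for i in range(200)] + \
       [("book_b.xlsx", i) for i in range(142)]
print(process_records(demo))  # -> 2 opens instead of 342
```

The point of sorting first is that a simple "did the key change?" check is enough to detect group boundaries, so at most one workbook is ever open at a time.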

Can you explain how ""open workbooks"", in terms of being related to available memory and system resources works? I'm under the assumption that open books take almost no memory with regard to their object size (slot usage). If a book is opened within the same instance of the XL APP, does it really matter? It would seem to me that the increase in mem usage for a simple book would be marginal at best.

And can you also explain what "data modeling" has to do with anything? The 2 GB limit applies to workbooks used for data modeling? How would I know whether a workbook was being used for that or not?
 

The_Doc_Man (Immoderate Moderator, Staff member)
Fair question. An open workbook (itself) takes up a relatively small amount of space inside Excel, probably less than 300 bytes counting a long file-name string, file path, and file properties. However, workbook content includes one or more worksheets, which contain some amount of data. When you open the workbook, Excel will try to load all of it, and that is where your memory gets consumed. If it is actually a "virgin" workbook, then you would be quite right that its load would be minor. But if it is a full workbook... not so minor.

The part about system resources is that Windows has a limit on the number of files it can have open at one time, and you would not believe how many files Windows itself will have open. Open the Windows Task Manager and look through the various Processes and Services. EVERY ONE of the entries in those panels of Task Manager represents at least one open file.

Windows reads the .EXE file to load its content before execution. So from the viewpoint of the file system, an image file is just another open file. I would have to get into Windows paging dynamics to tell you EXACTLY how it is used, but trust me when I say that it counts as an open file that STAYS open for the duration of the process that opened it.

Where those open files get significant is that this limit is not something Excel controls at all. It is a part of Windows, and unfortunately, Windows pre-allocates a fixed number of file slots (actually called "file handles"). If you open too many user-level files at once, you run out of a system resource: open slots in the open-file table. You can raise the limit by editing the registry entry, but the pre-allocated file table does not change until the next reboot, because it is a boot-time-only parameter.
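The same idea, a fixed per-process ceiling on open file handles, exists on every mainstream OS, and can be inspected programmatically. Windows manages its handle tables through the mechanisms Doc_Man describes; as an illustration only, here is how the analogous per-process limit can be queried on a Unix-like system (assumption: a POSIX platform, where Python's `resource` module is available):

```python
# Query the per-process open-file limit on a Unix-like system.
# This is an illustration of the general "fixed handle table" concept;
# Windows exposes its limits differently (registry / kernel settings).
import resource

soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print(f"soft limit: {soft}, hard limit: {hard}")
```

Exceeding the soft limit makes further `open()` calls fail, which is the same failure mode as exhausting the Windows open-file table: the application, not the OS, is the first thing to break.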

As to data modeling: other than saying that there is something on one of the ribbons that has something to do with data modeling, I really can't tell you much. I do my modeling in Access, as you might have guessed. However, for whatever reason, the Excel "limits and specifications" page makes this distinction about modeling, which means they think it makes a difference. I believe it has to do with what is pre-loaded when you launch a spreadsheet with data-modeling involvement. Beyond that, I'm out of my depth.
 
