system resources exceeded when exporting queries

Blueskies

Registered User.
Joined
Oct 23, 2009
Messages
69
Hi

I've got some code that loops through a bunch of queries that look at linked tables in my database. For each query it runs this:

Code:
Sub export_tab_to_csv(mytab As String)

    Dim qname As String
    Dim exportpath As String

    ' Each table has a matching query named "Export <table>"
    qname = "Export " & mytab
    exportpath = "C:\Access Exports\" & mytab & ".csv"

    ' Export the query's output as a delimited text file with a header row
    DoCmd.TransferText acExportDelim, , qname, exportpath, True

    Debug.Print mytab
    DoEvents        ' let Windows process pending messages between exports

End Sub
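The calling loop isn't shown in the post; a minimal sketch of what it might look like, assuming the tables to export can be enumerated from the database's TableDefs collection (the linked-table filter here is my assumption, not the poster's actual code):

```vba
' Sketch only: call export_tab_to_csv once per linked table.
Sub export_all_tabs()
    Dim tdf As DAO.TableDef
    For Each tdf In CurrentDb.TableDefs
        ' Linked tables have a non-empty Connect string;
        ' this also skips local and system tables.
        If Len(tdf.Connect) > 0 Then
            export_tab_to_csv tdf.Name
        End If
    Next tdf
End Sub
```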

I'm getting a 'system resources exceeded' message after 4 loops through this code.

There are 71 tables and they are hefty - just under 400,000 records each, and each query is looking at another query which looks at the data, so it's a little fiddly, but necessary for me to get all the fields in the way I want. I can provide more details on this, but thought I'd keep this simple for now!

All the data is linked from other databases, so I don't understand why I'm getting a resources issue. If it works for 4 loops, I would have thought it would be OK to carry on and export all the queries?

Can anyone suggest what I can do differently?

Thanks!

PS I'm using Access 2013
 

Ranman256

Well-known member
Joined
Apr 9, 2015
Messages
4,339
Try exporting to Excel and see if it fails: DoCmd.TransferSpreadsheet….
If it doesn't fail, just save as text from Excel.
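A minimal sketch of that suggestion, with the query name and path as placeholders (not the poster's actual names):

```vba
' Sketch: export the same query to .xlsx instead of .csv.
' "Export MyTable" and the path are placeholder names.
DoCmd.TransferSpreadsheet acExport, acSpreadsheetTypeExcel12Xml, _
    "Export MyTable", "C:\Access Exports\MyTable.xlsx", True
```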
 

The_Doc_Man

Immoderate Moderator
Staff member
Joined
Feb 28, 2001
Messages
27,001
First and foremost, a quick answer is often available if you look into the "Similar Threads" section at the bottom of any post's page - like this one.

Second, Windows Resources are defined this way:


Here is the reason you "run out of resources." Windows, at boot time, looks at every installed app file to see if it has a resource file reference inside it. A table is built during this boot process to record the location of each identified resource, so that when it gets activated in whatever way is appropriate for that particular type of resource, there is a "fast pointer" to it. Windows pre-allocates a memory-resident table of limited size for this purpose. The size of the table is limited because of the way they use memory management hardware to create the reference. When you run out of resources, you have actually run out of the pre-allocated resource slots that hold these resource pointers.

EACH APPLICATION FILE will have some kind of resource involved in some way. Opening that file adds a resource or two or six to the resources list. The problem is that closing the file doesn't REMOVE the resource entries. They stick around until your session ends, at the very least. In the worst cases, a reboot would be needed.

When you create a new Excel file, you consume as many resources as that file type naturally builds. That means you consume resources until there are no more to be consumed.

This link points back to a post that has several relevant links in it related to the error you reported. Some parts of it may be helpful. In particular, if you can increase the max locks per file, that is often the way around the problem at least in the short term.


You should be able to find other articles using an advanced SEARCH for "System Resources Exceeded" because this problem has come up often.
 

Blueskies

Registered User.
Joined
Oct 23, 2009
Messages
69
Hi - thanks for the posts guys.

Oddly enough I seem to have got around the problem - I put in an error handler that simply re-ran the 'export_tab_to_csv' sub when it got to a resources error.

I thought this would only be the start of solving this, but the error handler only triggered once during 71 exports, and seemed to fix it. After that, the rest of the loop exported all the queries without complaint. I suppose it's a bit of a brute force thing, but if it works....

I am running with max locks per file increased in another part of the code for when I'm running some update queries:

Code:
' Raise the Jet lock limit for this session only (the default is 9,500)
DAO.DBEngine.SetOption dbMaxLocksPerFile, 50000

... so maybe this helps. If I get any more problems, I'll do some more digging as you suggested Doc_Man.
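For anyone finding this thread later, the retry-on-error handler described above might look roughly like this. This is a sketch, not the poster's actual code: the retry cap is my addition, and 3035 is the Jet error code I believe corresponds to "System resource exceeded" - check Err.Number in your own session before relying on it.

```vba
' Sketch: wrap the export in a handler that retries on the
' resources error instead of dropping into Access's last-chance handler.
Sub export_tab_to_csv_safe(mytab As String)
    Dim attempts As Integer
    On Error GoTo handler
retry:
    export_tab_to_csv mytab
    Exit Sub
handler:
    If Err.Number = 3035 And attempts < 3 Then  ' assumed "System resource exceeded" code
        attempts = attempts + 1
        DoEvents                                ' give Windows a moment before retrying
        Err.Clear
        Resume retry
    End If
    Err.Raise Err.Number                        ' anything else: re-raise to the caller
End Sub
```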
 

The_Doc_Man

Immoderate Moderator
Staff member
Joined
Feb 28, 2001
Messages
27,001
Interesting. This COULD be a side effect of having an error handler. You see, the program doesn't necessarily stop because of the error itself. It often just stops because of which error handler was used. When you get the message box with an error and the "DEBUG" button as one of the options, that usually means you took the Access "Last Chance" handler, which is the error handler within MSAccess.EXE itself - as opposed to any other handler you wrote in any event routine. That is the most unforgiving of error handlers because Access has no clue as to how to recover from your error. It just assumes that your app is now officially hosed.

When you handled the error with your own handler code, you might not have actually remedied the problem - but you dismissed it before it got to that unforgiving handler. And to be honest, in the specific use you described, it probably doesn't matter that you ran into a limit. So you didn't actually register a resource you probably weren't going to use right away anyway. So what? The Access "attitude" on these things is that if an error isn't instantly fatal (like "Disk just caught fire and flew out of the box and rolled out the door") then it is continuable. By dismissing the interrupt via the RESUME NEXT or RESUME label option, you implicitly gave Access permission to continue through the error.

I joke with my error description sometimes, but there actually WAS a case about 45 years ago when an old Digital Equipment Disk had a profound mechanical failure. It was a brand new ultra-fast disk, very high RPM for that era, and it ran with a chain-drive because rubber pulley drives weren't strong enough. Damned disk platter was over 4 feet in diameter (remember, 45 years ago, so heads weren't as small as they are now) and the platter was fairly solid to prevent vibrational distortion that would cause a head crash. This was a head-per-track style disk, so we are talking massive by modern standards even for an external drive.

Now anyone who ever drove a chain-drive motorcycle knows that the drive chain CAN break. This one did, and the 4-foot diameter, 3/16" thick disk became a flywheel carrying a TON of momentum at 12K RPM. The shock of the chain suddenly jamming everything to a stop caused the drive motor to catch fire. The outer case broke, allowing the disk to roll out the door and down the hall for a short distance before it hit something strong enough to stop it. And at the same time, the console printed the message, "$DKB3 is off-line." Nobody could disagree with that assessment.
 

Blueskies

Registered User.
Joined
Oct 23, 2009
Messages
69
Fascinating!

Your explanation of the error handler behaviour makes sense - liking your idea of a fatal error too!

As a mere 51 years old I'm afraid your description of chain-drive disks is nothing I've ever encountered, but I loved reading about it!
 
