Database growing hugely

CedarTree

Hi - I have a front-end, and I store all my data in a back-end database. I run batch calculations that impact tens of thousands of records, and for every 10,000 records that go through the batch, I close and compact the back-end database. But for some reason the front-end is growing like crazy as well. I can't pinpoint what would cause that, since all my records are stored in the back-end. Any suggestions on how to begin debugging? Thanks!
 

shadow9449

Ok, how about the most obvious: do you have any temporary tables in the front end? Including any that get created dynamically when doing some processing?
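For example, something like this (a rough sketch, nothing more) lists the local, non-system tables in the front end so you can spot anything that gets created on the fly:

Code:
Sub ListLocalTables()
    ' List local (non-linked, non-system) tables in the front end.
    ' Anything here you didn't create deliberately is a temp-table suspect.
    Dim db As DAO.Database
    Dim tdf As DAO.TableDef
    Set db = CurrentDb
    For Each tdf In db.TableDefs
        ' Linked tables have a Connect string; "MSys"/"~" names are system objects
        If Len(tdf.Connect) = 0 And Left(tdf.Name, 4) <> "MSys" _
                And Left(tdf.Name, 1) <> "~" Then
            Debug.Print tdf.Name, tdf.RecordCount
        End If
    Next tdf
End Sub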
 

Micron

Are images or other types of files involved here? Even if you only store the path in tables, Access can cache files in a system folder if you use Access to open them.

How much bloat are we talking about here?
 

CedarTree

Enough bloat to hit the 2 GB limit. I keep looking through my code, and I don't believe I'm using temporary tables. Everything needed for the calcs gets loaded into the linked tables.

I'm trying to think of a way to monitor when the size of the database changes, so I can see at what step things change.
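Something like this is what I have in mind (a rough sketch; the BE path is a placeholder for my actual back-end) - call it between steps and watch the Immediate window:

Code:
Sub LogDbSizes(strStep As String)
    ' Log the current FE and BE file sizes so I can see which step bloats.
    Const BE_PATH As String = "C:\Data\Backend.accdb"   ' placeholder path
    Debug.Print strStep; _
        "  FE: "; Format(FileLen(CurrentDb.Name) / 1048576, "0.0"); " MB"; _
        "  BE: "; Format(FileLen(BE_PATH) / 1048576, "0.0"); " MB"
End Sub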
 

theDBguy

Hi. Can you elaborate on exactly what is involved in "batch calculations"? Are you using code? If so, can you post it?
 

Micron

If the 2GB info was directed to me, then 1 out of 2 ain't bad I guess. :)
It might be the updating operation itself, such as opening multiple recordsets without closing any of them - or other operations that are causing temporary space or memory to not be released. I'm with tDBg; the code for the operation(s) might be a good place to start.
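For what it's worth, the usual hygiene looks something like this (a sketch; the table name is made up):

Code:
Dim db As DAO.Database
Dim rs As DAO.Recordset

Set db = CurrentDb
Set rs = db.OpenRecordset("SELECT * FROM tblHistory", dbOpenSnapshot)
' ... work with rs ...
rs.Close              ' release the cursor
Set rs = Nothing      ' release the object reference
Set db = Nothing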
 

CedarTree

Relatively simple code... let me spend some time looking and see if I failed to close a recordset or something.

(But it's client-specific code, so I can't easily post it.)
 

CedarTree

Okay - one thing I'm noticing (which hasn't impacted other projects like this)... the size grows when I run a query inside the front-end database (CurrentDb.Execute(sql)) even if the SQL only impacts a table in the back-end. Maybe I should run the query in the back-end database itself?

(Or maybe this has impacted other projects too, but I never noticed it as much as on this project because the number of records in this case is much larger.)
 

jdraw

Compact the BE and FE and record their size. Run your "batch" and record their sizes after processing. Let us know what "growing hugely" actually means in this case by posting before and after sizes.

Can you describe the batch calculations enough to give us some idea of what is being done?
Do you change a bunch of queries during the processing of a Batch?
 

CedarTree

Basically I'm running through 100,000 records, and that quickly blows up both FE and BE to the 2 GB limit. But I think I found the culprit.

Subroutine A would pick the 100,000 records and call the calculation procedure, which would grab the history for each person and process it. The culprit: for those 100,000 records I was (1) grabbing records from BE tables using CurrentDb.OpenRecordset and (2) updating BE tables using CurrentDb.Execute. I replaced both CurrentDb references with operations run directly against the BE itself (since that gets compacted every 10,000 records). Now my FE doesn't change in size at all. Thanks guys!
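Roughly, the change looked like this (a sketch; the path and SQL are placeholders, not the actual client code):

Code:
Dim dbBE As DAO.Database
Dim rs As DAO.Recordset

' Open the back-end directly instead of going through CurrentDb
Set dbBE = DBEngine.OpenDatabase("C:\Data\Backend.accdb")   ' placeholder path

Set rs = dbBE.OpenRecordset("SELECT * FROM tblHistory WHERE PersonID = 123")
' ... process this person's history ...
rs.Close
Set rs = Nothing

dbBE.Execute "UPDATE tblResults SET Amount = 0 WHERE PersonID = 123", dbFailOnError
dbBE.Close
Set dbBE = Nothing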
 

Micron

Too bad you couldn't post pseudo code to give us an idea of things. Each time you invoke CurrentDb.DoSomething, you create a brand-new reference to the current db in memory, and the next such line creates yet another one. If instead you do

Set db = CurrentDb
db.DoThing1
db.DoThing2

and so on, it's always the same reference to the db in memory. I've never heard of this causing bloat, but who knows? I would think that would be more of a processor/memory problem, so it may not be related. It could be an indicator that you are doing things inefficiently, and thousands of recordsets may be one of them. I expect you'll find your BE becomes the bloat problem instead.
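Concretely, the difference is something like this (table names and SQL made up):

Code:
' Creates a new database reference on every call:
CurrentDb.Execute "UPDATE tblResults SET Done = True", dbFailOnError
CurrentDb.Execute "DELETE FROM tblScratch", dbFailOnError

' Reuses one reference:
Dim db As DAO.Database
Set db = CurrentDb
db.Execute "UPDATE tblResults SET Done = True", dbFailOnError
db.Execute "DELETE FROM tblScratch", dbFailOnError
Set db = Nothing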
 

The_Doc_Man

Here is the issue: The FE is where all the work occurs, which includes some operations behind the scenes. If you have a really big query with a huge JOIN, or (even worse) if there is some sort of combinatorial query (i.e. looking through combinations of things using a Cartesian-product JOIN), Access makes a temporary list representing what it thinks you will be touching. This can get quite large if the number of combinations is large.

When you did your scan of 100,000 records and did a sub-scan for the history, you were making 100,000 sub-scan lists. Even if this putative internal list was short, 100,000 of anything can mount up.
 
