VBA Recordset DB Corruption

VincilLabs

I built a process to separate first and last name and remove certain names from various tables that are imported into my database. It works great when the recordset is fairly small. But when I do, say, 250,000 records at one time, the database bloats nearly to the point that it corrupts (the dreaded 2 GB limit). I run compact and repair and all is fine, except I have to do this every time I run my process.

Here's the problem. Let's say I also need to standardize the address, city, state, zip at the same time. Then the database blows up to the 2GB limit and no longer functions until I repair it. The database starts at 470 MB currently and will hit 1.5 GB + when I do the name cleanup. The table I'm trying to clean is only about 250MB.

I'm not allowed to post my code but I can generalize.

Name:
' Add the working columns first
SQL: ALTER TABLE ... ADD cleanName, firstName, lastName

Open recordset to be cleaned (main)
Open recordset of bad names (groups)

Do Until main.EOF
    badName = False
    groups.MoveFirst
    Do Until groups.EOF
        If main!Name = groups!Name Then
            badName = True      ' bad name - skip this record
            Exit Do
        End If
        groups.MoveNext
    Loop

    If Not badName Then
        main.Edit
        main!cleanName = main!Name
        ' separate-name functions
        main!lastName = lastNameAlg()
        main!firstName = FirstNameAlg()
        main.Update
    End If
    main.MoveNext
Loop

Repeat the above for address, city, state, zip (pretty much)

' Clean up and close everything at the end
main.Close
groups.Close
Set main = Nothing
Set groups = Nothing
 
Well, without details there are only generic suggestions I can give, but

A) I would look and see if what you want to do can be accomplished through SQL queries. Likely they would be faster than recordset manipulations, especially in larger recordsets.

or

B) If queries won't work or aren't an option create a temp database and move the data over there, work in it, then clear out your local tables and copy over the completed/updated data when you're done. When you are done with the temp database simply delete it and recreate as needed.
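For option A, a single set-based query can replace the whole nested loop. A rough sketch, assuming DAO and made-up table/field names (tblMain, tblGroups):

```vba
Dim db As DAO.Database
Set db = CurrentDb

' One UPDATE handles the "copy the name where it's not in the bad-names list"
' step, instead of walking both recordsets row by row.
db.Execute "UPDATE tblMain SET cleanName = [Name] " & _
           "WHERE [Name] NOT IN (SELECT [Name] FROM tblGroups);", dbFailOnError
```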
 
Unfortunately, the algorithms I have to run the names through to separate them pretty much eliminate the use of SQL queries. If the files I got were in the same format every time I might be able to make it work, but we get random stuff with god-knows-what in them. I created this process to save time in cleaning up the files. Funny enough, I already did B); it was much worse before that, as I could not even run one of the processes before it corrupted.

The real question here is: does working in "large" recordsets always inflate Access databases like balloons?

*IMHO 250,000 records is not all that large...
 
You're right in that 250K records isn't that large... it depends on how wide your table is: 250K records by 10 columns is not bad at all; 250K records by 150 columns is a nightmare. Without the details of what your tables and code look like it's difficult to suggest anything specific, but if you are doing a lot of table updates, that will cause database bloat quickly, so try to minimize the actual table writes as much as you can.
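One easy way to cut writes if you stay in the recordset: batch all the field changes for a row inside a single Edit/Update pair, so each row is written once instead of once per field (a sketch, using the generalized names from the first post):

```vba
' Inside the main loop:
main.Edit
main!cleanName = main!Name
main!lastName = lastNameAlg()
main!firstName = FirstNameAlg()
main.Update        ' one write per row instead of three separate updates
main.MoveNext
```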
 
Yep - for the name I update 4 fields for each row. On the 250,000-record table I'm dealing with, I update every single one, so that's 1,000,000 updates for the name alone. Sheesh...

I have an idea that might work. I can use VBA to do the cleanup, dump the clean parts into variables, and then update the table using SQL queries rather than Recordset.Update. Do you think that will keep it from bloating as much as it does now?
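That idea might look something like this (a sketch only; tblMain, the ID key, and the helper functions are assumed/generalized names, and single quotes in the values need escaping):

```vba
Dim cleanFirst As String, cleanLast As String

Do Until main.EOF
    ' Do the parsing in VBA, into plain variables...
    cleanFirst = FirstNameAlg()
    cleanLast = lastNameAlg()

    ' ...then write with a targeted UPDATE instead of Recordset.Update.
    db.Execute "UPDATE tblMain SET firstName = '" & Replace(cleanFirst, "'", "''") & _
               "', lastName = '" & Replace(cleanLast, "'", "''") & _
               "' WHERE ID = " & main!ID & ";", dbFailOnError
    main.MoveNext
Loop
```

A saved parameter query (a DAO QueryDef with parameters) would avoid the per-row string building and quote escaping, and is usually a bit faster as well.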
 
That could work, won't really know unless you try. The less you can hit the tables the better in this case I think.
 
Word. I'll give it a try and let you know what happens. This may take a while.
 
It's been a while. I've been slammed!
Yesterday and this morning I rebuilt my large data cleanup process to update the records using SQL queries instead of the recordset. It works! It's running right now and I estimate it's about halfway through the name columns. Normally it would have bloated by more than 100% at this point. I started around 30-35 MB and it's at 36 MB at the moment (I reduced the initial size of the database by using linked tables and a temp table to do my cleanup). WOOHOO. Thanks for the help, DJKarl. See you around!
 
