Large data imports expanding file size

thommeread (New member, joined Dec 7, 2007, 2 messages)
Morning all,

I'm having a problem with mdb file size. I'm importing a large amount of data from a number of tab-delimited text files via a simple TransferText call. The process goes: empty the tables in the database, then import the data into the tables.

All this works fine, but the file size rockets to over 1.5Gb. When I then compact and repair, it goes down to 420Mb. I'm not deleting and recreating the tables, and at no point is there 1.5Gb worth of real data, so what's causing this?

N.B. I realise I can call compact and repair following the import, but this is going to take too long as they are user-initiated imports.
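For reference, the import process described above can be sketched in VBA like this (the table, file, and import-spec names are placeholders, not from the original post):

```vba
' Sketch of the import described above: empty each table, then
' reload it from a tab-delimited text file via TransferText.
' "tblImportData", "specTabDelim", and the file path are placeholders.
Public Sub ImportAllData()
    Dim db As DAO.Database
    Set db = CurrentDb

    ' Empty the destination table without deleting/recreating it
    db.Execute "DELETE FROM tblImportData", dbFailOnError

    ' Import using a saved import specification that defines
    ' the tab delimiter and column types; True = has field names
    DoCmd.TransferText acImportDelim, "specTabDelim", _
        "tblImportData", "C:\Imports\data.txt", True

    Set db = Nothing
End Sub
```

Note that Jet does not release the pages occupied by deleted rows until the file is compacted, which is one reason a delete-then-reimport cycle bloats the mdb.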
 

rsmonkey (Registered User, joined Aug 14, 2006, 298 messages)
From MS:

Running the Compact Database utility within Microsoft Access can also improve the performance of the database. This utility makes a copy of the database file and, if it is fragmented, rearranges how the database file is stored on disk. When completed, the compacted database has reclaimed wasted space, and is usually smaller than the original. By compacting the database frequently, optimal performance of the database application is ensured, and page corruptions due to hardware problems, power failures/surges, and so on are resolved.

If a primary key exists in the table, compacting re-stores table records into their Primary Key order. This provides the equivalent of Non-maintained Clustered Indexes, and makes the read-ahead capabilities of the Microsoft Jet database engine much more efficient.

Compacting also updates the table statistics within the database that are used when Jet optimizes queries. These statistics can become outdated as data is added, manipulated, and deleted from the various tables. Query speed will be enhanced significantly, because queries are now working with data that has been rewritten to the tables in contiguous pages. Scanning sequential pages is much faster than scanning fragmented pages. Queries are forced to recompile and re-optimize after each database compaction.

During compaction, you can use the original name for the compacted database file, or you can use a different name to create a separate file. If you use the same name and the database is compacted successfully, Microsoft Access automatically replaces the original file with the compacted version.

In Microsoft Access 2000, you can set an option to automate this procedure. To do so, on the Tools menu, click Options, click the General tab, and select (check) the box labeled Compact on Close. This will automatically compact and repair the database as it is closed.
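Compacting to a separate file, as described above, can also be done from code via DAO; a minimal sketch (the paths are placeholders, and the source database must not be open):

```vba
' Compact to a separate file from code rather than via the Tools menu.
' The source mdb must be closed; both paths here are placeholders.
Public Sub CompactToNewFile()
    DBEngine.CompactDatabase "C:\Data\backend.mdb", _
                             "C:\Data\backend_compact.mdb"
End Sub
```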

See http://support.microsoft.com/kb/209769/EN-US/ for more.
 

DCrake (Remembered, joined Jun 8, 2005, 8,632 messages)
Simple Software Solutions

Footnote to the above.

When Access went from 97 to 2000, Microsoft doubled the Jet page size from 2 KB to 4 KB, which is why a lot of mdbs bloat significantly. It may be worthwhile converting your back end to 97 format. This will not cause any problems with connectivity and may result in the performance increase you are searching for.

With regard to the Compact on Close option in 2000: this only compacts the open database, not the connected database. This is assuming you have an FE & BE setup.
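Since Compact on Close only touches the front end, a linked back end can instead be compacted from code, for example with a compact-swap pattern like this (paths are placeholders; every connection to the back end must be closed first):

```vba
' Compact the linked back end from the front end: compact into a
' temp file, remove the bloated original, then swap the copy in.
' Paths are placeholders; no user may have the back end open.
Public Sub CompactBackEnd()
    Const strBE As String = "C:\Data\backend.mdb"
    Const strTmp As String = "C:\Data\backend_tmp.mdb"

    DBEngine.CompactDatabase strBE, strTmp  ' compact into a temp file
    Kill strBE                              ' delete the bloated original
    Name strTmp As strBE                    ' rename the compacted copy
End Sub
```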


Code Master
 

thommeread (New member, joined Dec 7, 2007, 2 messages)
Hi, thanks for the replies, I still need a solution here:

The file with the size problem is the front end, not the back end, and I can't switch the front end to '97. Network speed problems have made it impossible to use anything but the smallest linked tables, so the bulk of the data imports comes into the front end - not ideal, but better than waiting 15 minutes for the imports to run.

My question is: is there a more efficient way of importing 100 MB of data into an Access 2000 database than a simple TransferText? Surely there must be a way to import data without having to compact and repair each time?!
 

DCrake (Remembered, joined Jun 8, 2005, 8,632 messages)
Simple Software Solutions

Hi

Thinking off the wall, there is another possible workaround. It may sound long-winded and I have not tried it, but hell, why not give it a go...

First pick up your text file and dump it into an array. Then copy that to an Excel file (see http://support.microsoft.com/kb/247412/en-us).

Then get Access to link to the spreadsheet. OK, the data won't be embedded into the mdb, but it should make the availability a bit faster and prevent the bloating.

You may be able to poke the text file directly into Excel without using an array?
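The linking step of the suggestion above can be sketched like this (the table name and workbook path are placeholders; acSpreadsheetTypeExcel9 is the Excel 2000 format constant):

```vba
' Link the workbook instead of importing it, so the data never
' lands inside the mdb and cannot bloat it. Names are placeholders.
Public Sub LinkWorkbook()
    DoCmd.TransferSpreadsheet acLink, acSpreadsheetTypeExcel9, _
        "tblLinkedData", "C:\Imports\data.xls", True
End Sub
```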

Code Master
 
