Data Import performance issues

toddbailey
We are using a simple C# program to clean up and then import 100,000 records into an Access table.

Import time is pathetic: it takes over 10 minutes, while the same data loads into SQL Server in 1:45.

We are using OleDb and run a simple INSERT INTO.
We basically open the database, create a SQL command, then prepare and execute it.

The table is empty before the insert process, and there are no keys or indexes; it's just raw data in text fields.

Any suggestions?
 
Rather than importing the data, would a linked table provide the same data availability to the Access database?
 
Thanks for the reply; however, linking to the source database is not an option. It's a licensed product (MfgPro), and I am told the only data I can have access to is the "canned reports" — hence the external C# program, which filters out the headers and footers and performs direct calls to the Access database via INSERT statements.

Is there any modifier I can use in the INSERT statement or the OleDb connection string to improve the response times?



Here are the basics of what the program does:

public const string Access_CONN_STRING = "Provider=Microsoft.ACE.OLEDB.12.0;Persist Security Info=False;Data Source=";

OleDbConnection conn = new OleDbConnection(Access_CONN_STRING + Access_DataBaseName);
conn.Open();

// a fresh SQL string is concatenated for every record:
string sqlstring = "Insert into MfgProImport1 values (" /* + all the values */ + ")";

OleDbCommand objCmd = new OleDbCommand(sqlstring, conn);

objCmd.Prepare();
objCmd.CommandTimeout = 90;
objCmd.ExecuteNonQuery();
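
For reference, the standard advice for Jet/ACE with this pattern is twofold: wrap the whole batch in one transaction, and prepare a single parameterized INSERT that is reused for every row instead of concatenating a new SQL string each time. A minimal, untested sketch of that shape (the method name, column count, and text-only parameter types are assumptions, not the actual program):

using System;
using System.Collections.Generic;
using System.Data.OleDb;
using System.Linq;

class BatchedImport
{
    public const string Access_CONN_STRING =
        "Provider=Microsoft.ACE.OLEDB.12.0;Persist Security Info=False;Data Source=";

    // rows: the already-cleaned records, one string per field.
    static void ImportRows(string dbPath, IEnumerable<string[]> rows, int fieldCount)
    {
        using (var conn = new OleDbConnection(Access_CONN_STRING + dbPath))
        {
            conn.Open();
            // one transaction around the whole batch
            using (OleDbTransaction tx = conn.BeginTransaction())
            {
                // one INSERT with ? placeholders, prepared once and reused
                string marks = string.Join(", ", Enumerable.Repeat("?", fieldCount));
                var cmd = new OleDbCommand(
                    "INSERT INTO MfgProImport1 VALUES (" + marks + ")", conn, tx);
                for (int i = 0; i < fieldCount; i++)
                    cmd.Parameters.Add("p" + i, OleDbType.VarWChar, 255);
                cmd.Prepare();

                foreach (string[] row in rows)
                {
                    for (int i = 0; i < fieldCount; i++)
                        cmd.Parameters[i].Value = (object)row[i] ?? DBNull.Value;
                    cmd.ExecuteNonQuery();
                }
                tx.Commit();
            }
        }
    }
}

The transaction is usually the bigger win: in autocommit mode the engine flushes to disk after every statement, so committing once per batch can take an import from minutes down to seconds.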
 
Hi Todd ... that code is a little above my pay grade I'm afraid!

However, what I can add is that we also use MFGPRO here at work, and we regularly export TXT files that can then be read or linked into both Access and Excel for subsequent manipulation and reporting.
 
When dealing with text reports I open them as a TextStream, read and parse them line by line and add to a recordset. All in VBA.

I have one that reads 30 separate text files from a server, extracting an average of over 1000 records with 27 fields from each. There are only 15 records to a page so there are plenty of lines to strip out as headers too.

Moreover the records are on two lines, must be decompressed and rejoined during parsing. The destination is a disconnected ADO recordset which is then loaded to Excel for display.

This processing of the files only takes about 20 seconds on a very ordinary PC so ten minutes for 100K records is certainly abysmal.

It appears your program is opening a new connection for each record. Writing to an open recordset is a lot faster and I expect integrating the parsing and writing into one process would be a much better approach.
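
In C# terms, the equivalent shape would be to parse everything into an in-memory table first, then push the whole thing to Access in one pass. A rough, untested sketch (my actual code is VBA against a disconnected ADO recordset; the DataTable/OleDbDataAdapter below, with its three placeholder text columns and tab delimiting, is just an illustration):

using System.Data;
using System.Data.OleDb;
using System.IO;

class ParseThenWrite
{
    static void Main()
    {
        // Buffer the parsed report in memory first; the DataTable plays
        // the role of the disconnected recordset.
        var table = new DataTable("MfgProImport1");
        table.Columns.Add("Field1", typeof(string));
        table.Columns.Add("Field2", typeof(string));
        table.Columns.Add("Field3", typeof(string));

        foreach (string line in File.ReadLines(@"C:\Reports\report.txt"))
        {
            // Strip page headers/footers; a real report needs real rules here.
            if (line.Trim().Length == 0 || line.StartsWith("Page"))
                continue;
            table.Rows.Add(line.Split('\t')); // tab-delimited is an assumption
        }

        // Then write the whole buffer to Access in one adapter pass.
        using (var conn = new OleDbConnection(
            "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\\Data\\Import.accdb"))
        {
            conn.Open();
            var insert = new OleDbCommand(
                "INSERT INTO MfgProImport1 (Field1, Field2, Field3) VALUES (?, ?, ?)", conn);
            insert.Parameters.Add("Field1", OleDbType.VarWChar, 255, "Field1");
            insert.Parameters.Add("Field2", OleDbType.VarWChar, 255, "Field2");
            insert.Parameters.Add("Field3", OleDbType.VarWChar, 255, "Field3");

            var adapter = new OleDbDataAdapter { InsertCommand = insert };
            adapter.Update(table);
        }
    }
}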
 
I only open the connection once, then loop through the text file, process each record, and execute an INSERT INTO. BTW, there are 130 fields that get loaded.

However, I'll research the above methods and see if I can get any improvements.

If not, then I'll tell the user community it is what it is.
Yes, I'm aware that MfgPro can be directly linked, but Corp. HQ says no, so I'll play the cards I've been dealt. I'll post the new benchmark numbers in case anyone is interested.


thanks
 
Linking to exported txt files from the program (as suggested by PaulO) is different from linking to the program itself.

It just cuts out the process of transferring the data from txt to mdb/accdb.
 
Interesting concept, but in this case I still need to import the data physically into the database.
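
If the rows must physically land in the table, one further option worth testing: ACE can read a delimited text file directly in SQL through its Text ISAM, which turns the whole import into a single statement. A hedged sketch (both paths, the file name, and the optional schema.ini that pins down the layout are assumptions):

using System.Data.OleDb;

class TextIsamImport
{
    static void Main()
    {
        // Hypothetical paths; C:\Reports\cleaned.txt is the report after the
        // C# cleanup pass, optionally described to ACE by a schema.ini file.
        using (var conn = new OleDbConnection(
            "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\\Data\\Import.accdb"))
        {
            conn.Open();
            // The Text ISAM reads the file inside the engine, so all 100,000
            // rows are appended by one statement instead of 100,000 calls.
            var cmd = new OleDbCommand(
                "INSERT INTO MfgProImport1 " +
                "SELECT * FROM [Text;FMT=Delimited;HDR=No;Database=C:\\Reports].[cleaned#txt]",
                conn);
            cmd.ExecuteNonQuery();
        }
    }
}

Whether this beats the batched inserts would need measuring, but it removes the per-row round trips entirely.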
 


Could you post a code sample of how you create the recordset (as described above) and execute the update?
 
