Question Slow performance with back-end on network share

tomazo

New member
Local time
Today, 14:54
Joined
Jun 23, 2010
Messages
1
Hello, this is my first post to these forums.

I have created a split Access 2007 database/VBA application. The application works well when placed locally, but becomes extremely slow when I place the back end on a network share (a mapped drive in Windows 7).

Basically, I have to read a text file and import all the data into an empty table. I read the file in a while loop and use SQL INSERT queries to insert the data, one INSERT for each line in the text file. The text file has about 3000 lines, thus generating 3000 inserts. Both the front end and the back end are default .accdb files.

After a bit of searching, I enclosed the loop in a DBEngine.BeginTrans/DBEngine.CommitTrans pair, believing this would eliminate most of the file I/O operations against the back end on the network share. However, the performance did not improve.

My code is something like this:

Code:
Dim dbs As DAO.Database
Dim fs As Object, a As Object
Dim lp() As String
Dim L As String, sqlQ As String
Dim numitem As Long, i As Long
Dim tic As Single

On Error GoTo ErrorHandler

Set dbs = CurrentDb

dbs.Execute "DELETE FROM NyKnutepunkt;", dbFailOnError

tic = Timer

Set fs = CreateObject("Scripting.FileSystemObject")

'[...] code to get filename etc

Set a = fs.OpenTextFile(strInputFileName, 1, False)

DBEngine.BeginTrans
numitem = -1
i = 0
Do While Not a.AtEndOfStream
    L = a.ReadLine()
    'split the comma-separated values
    lp = Split(L, ",")

    If lp(1) = "BUSES" Then
        numitem = Int(Val(lp(0)))
        'skip the next two lines
        L = a.ReadLine()
        L = a.ReadLine()
        'init progress bar
        SysCmd acSysCmdInitMeter, "Reading buses...", numitem
    End If

    If i < numitem Then
        'progress bar
        SysCmd acSysCmdUpdateMeter, i
        lp = Split(Trim(L), ",")
        i = i + 1

        'Split returns strings, so an empty field is a zero-length string
        If Len(Trim(lp(7))) = 0 Then
            lp(7) = "N/A"
        End If

        sqlQ = "INSERT INTO NyKnutepunkt (Nummer,Navn,Area,Zone,Code,BasekV,Voltage,Angle) VALUES (" & lp(0) & ",'" & lp(7) & "', " & lp(2) & ", " & lp(3) & ", " & lp(1) & ", " & lp(4) & ", " & lp(5) & ", " & lp(6) & ");"
        dbs.Execute sqlQ, dbFailOnError
    End If
Loop
a.Close
DBEngine.CommitTrans
Debug.Print Timer - tic

'lots of other code

dbs.Close
 
Basically, I have to read a text file and import all the data into an empty table.

I would suggest that you do the initial import into a local temporary table. Once all the data has been imported, append it to the desired table in the back end.

If you did this, you could do a simple import of the CSV file. Once it is in the local temp table, run update queries to clean up the data, then append the clean data to the desired table. This way you are working with all the records at once, and you avoid the 3000 individual inserts.
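A minimal sketch of that approach, reusing the strInputFileName variable and field names from the original code. The staging table name tmpKnutepunkt and the "BusImportSpec" import specification are assumptions for illustration; create your own specification with the import wizard, or omit that argument for a plain CSV:

```vba
'Import the whole CSV in one shot into a local staging table
DoCmd.TransferText acImportDelim, "BusImportSpec", "tmpKnutepunkt", strInputFileName, False

'Clean up the data in place, e.g. default empty names
CurrentDb.Execute "UPDATE tmpKnutepunkt SET Navn = 'N/A' " & _
    "WHERE Len(Trim(Navn & '')) = 0;", dbFailOnError

'Append everything to the linked back-end table in a single query
CurrentDb.Execute "INSERT INTO NyKnutepunkt (Nummer,Navn,Area,Zone,Code,BasekV,Voltage,Angle) " & _
    "SELECT Nummer,Navn,Area,Zone,Code,BasekV,Voltage,Angle FROM tmpKnutepunkt;", dbFailOnError
```

The single append query lets Jet move the rows across the network in one operation instead of 3000 round trips.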
 
To build on what has already been suggested: you are actually running 3000+ INSERT queries during the loop, one for each record. You may be better off using a recordset instead.
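For example, the inner loop could write through a DAO recordset rather than building an INSERT string per line. A sketch, assuming the same lp() array and the column-to-field mapping from the original INSERT statement:

```vba
Dim rs As DAO.Recordset
'Open the table append-only once, before the read loop
Set rs = CurrentDb.OpenRecordset("NyKnutepunkt", dbOpenDynaset, dbAppendOnly)

'Inside the read loop, instead of dbs.Execute sqlQ:
rs.AddNew
rs!Nummer = Val(lp(0))
rs!Code = lp(1)
rs!Area = Val(lp(2))
rs!Zone = Val(lp(3))
rs!BasekV = Val(lp(4))
rs!Voltage = Val(lp(5))
rs!Angle = Val(lp(6))
rs!Navn = lp(7)
rs.Update

'After the loop:
rs.Close
```

This skips the SQL parsing overhead of 3000 separate Execute calls, and it still works inside the BeginTrans/CommitTrans pair you already have.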
 
