Speed up SQL in VBA statement

Djblois

Registered User.
Local time
Today, 10:46
Joined
Jan 26, 2009
Messages
598
Hi,


I have this procedure that takes a few seconds to run. It simply inserts a single row of data into a table, so I don't think it should take that long.



How can I speed this up to be near instantaneous?



Code:
Sub TrackChanges(ByVal TableName As String, ByVal ProductKey As Long, ByVal FieldChanged As String, ByVal ValueChanged As String, ByVal ModifiedBy As String, ByVal ModifiedDate As Date)

    Dim sql As String
    
    On Error GoTo UnknownError
    DoCmd.SetWarnings False
    sql = "INSERT INTO tblTrackChanges(TableName, ProductKey, FieldChanged, ValueChanged, ModifiedBy, ModifiedDate) " & _
          "Values('" & TableName & "', " & ProductKey & ", '" & FieldChanged & "', '" & ValueChanged & "', '" & ModifiedBy & "', #" & ModifiedDate & "#) "
    DoCmd.RunSQL sql
    
endProcedure:
    
    DoCmd.SetWarnings True
    Exit Sub
    
UnknownError:
    OpenErrorForm Err.Number, Err.Description, "PublicSubs", "TrackChanges", ""
    GoTo endProcedure

End Sub
 

theDBguy

I’m here to help
Staff member
Local time
Today, 10:46
Joined
Oct 29, 2018
Messages
21,496
Hi. Just as a test, try moving the table to the local FE and see if the speed changes. If so, maybe there's something about the network traffic. If the speed is the same, then we can try another approach.
 

Minty

AWF VIP
Local time
Today, 18:46
Joined
Jul 26, 2013
Messages
10,371
If you have a lot of indexes on the target table, inserts will take longer as the amount of data in that table increases.

Also, I would probably use CurrentDb.Execute, which avoids having to turn the warnings off and on. I believe those steps may shave a few nanoseconds off the process.

Have a look for speed tests in the sample databases for some comparisons of different query methods.
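To illustrate, here is a minimal sketch of the posted insert rewritten to use CurrentDb.Execute (the table and field names come from the original post; the Replace and Format calls are my additions, since concatenated string and date literals otherwise break on embedded quotes and regional date settings):

```vba
' Sketch: same insert via CurrentDb.Execute -- no SetWarnings toggling needed.
' Replace() doubles embedded single quotes so the text literals stay valid;
' Format() writes an unambiguous date for the # # literal.
Dim sql As String
sql = "INSERT INTO tblTrackChanges(TableName, ProductKey, FieldChanged, " & _
      "ValueChanged, ModifiedBy, ModifiedDate) " & _
      "VALUES('" & Replace(TableName, "'", "''") & "', " & ProductKey & _
      ", '" & Replace(FieldChanged, "'", "''") & _
      "', '" & Replace(ValueChanged, "'", "''") & _
      "', '" & Replace(ModifiedBy, "'", "''") & _
      "', #" & Format(ModifiedDate, "yyyy-mm-dd hh:nn:ss") & "#)"
CurrentDb.Execute sql, dbFailOnError
```

With dbFailOnError, a failed insert raises a trappable VBA error instead of failing silently, so the existing error handler still gets called.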
 

Micron

AWF VIP
Local time
Today, 13:46
Joined
Oct 20, 2018
Messages
3,478
I would try declaring the procedure's parameters ByRef. It could be faster if the passed values are on the large side, although all the parameters here seem to be smallish. Not that I think you'd notice any difference from cycling warnings, but I do agree it's not the best practice. At least you properly constructed an error handler around it. So many do not.
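For what it's worth, that change only touches the parameter declarations (ByRef is also VBA's default when no keyword is given); the body of the sub from the original post stays the same:

```vba
' Sketch: ByRef passes a reference to each argument instead of copying it,
' which avoids a string copy for large values.
Sub TrackChanges(ByRef TableName As String, ByRef ProductKey As Long, _
                 ByRef FieldChanged As String, ByRef ValueChanged As String, _
                 ByRef ModifiedBy As String, ByRef ModifiedDate As Date)
    ' body unchanged from the original post
End Sub
```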
 

gemma-the-husky

Super Moderator
Staff member
Local time
Today, 18:46
Joined
Sep 12, 2006
Messages
15,661
Try timing it.
I use this API declaration to log times to thousandths of a second:

Code:
' GetTickCount returns milliseconds since system start
Public Declare Function my_gettickcount _
    Lib "kernel32" Alias "GetTickCount" () As Long

Store the count before and after, and see how long the operation takes. Maybe the bottleneck is somewhere else.

Then try some of the suggestions. I use this to execute a SQL statement:

Code:
CurrentDb.Execute sqlString, dbFailOnError

Code:
Dim timers(1 To 2) As Long
timers(1) = my_gettickcount
' run the insert statement here
timers(2) = my_gettickcount
MsgBox "Time taken = " & (timers(2) - timers(1)) / 1000 & " seconds"
 

Micron

AWF VIP
Local time
Today, 13:46
Joined
Oct 20, 2018
Messages
3,478
As long as we're talking speed here, note that when the line

CurrentDb.Execute mySqlHere

is processed, a database object is created and persists only as long as it takes to process that line. I would declare a db object if there will be multiple uses of CurrentDb, so that one instance of it persists from the beginning to the end of a process. In fact, it might be better to just get in the habit of doing so: future code modifications could end up with multiple actions, each one creating a new db object. Simply using CurrentDb inline isn't saving any memory allocation, as the allocation occurs whether you Dim a variable for it or not.
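A minimal sketch of that pattern (assuming a DAO reference; sql1/sql2 are placeholder statement strings):

```vba
' One database object, created once and reused for every Execute call.
Dim db As DAO.Database
Set db = CurrentDb

db.Execute sql1, dbFailOnError   ' each call reuses the same instance
db.Execute sql2, dbFailOnError

Set db = Nothing
```

Each bare call to CurrentDb would otherwise build and tear down a fresh object reference, so reusing one variable removes that per-call overhead.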
 

NauticalGent

Ignore List Poster Boy
Local time
Today, 13:46
Joined
Apr 27, 2015
Messages
6,367
Code:
endProcedure:
    
    DoCmd.SetWarnings True
    Exit Sub
    
UnknownError:
    OpenErrorForm Err.Number, Err.Description, "PublicSubs", "TrackChanges", ""
    GoTo endProcedure

End Sub

Just as a point of clarification, wouldn’t a Resume endProcedure be required to properly handle and clear the error?
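For clarity, swapping Resume in for the GoTo would look like this (same labels as the original post; Resume clears the active error state and re-arms the handler, which a plain GoTo does not):

```vba
endProcedure:
    DoCmd.SetWarnings True
    Exit Sub

UnknownError:
    OpenErrorForm Err.Number, Err.Description, "PublicSubs", "TrackChanges", ""
    Resume endProcedure   ' Resume, not GoTo, exits the error-handling state
End Sub
```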
 

The_Doc_Man

Immoderate Moderator
Staff member
Local time
Today, 12:46
Joined
Feb 28, 2001
Messages
27,226
Concur with NG. Using a GoTo to terminate the error handler is wrong, because the error state is never cleared. You WILL have issues if another error occurs. Among other things, I wonder if you can even take another event after taking the error trap and ending it that way.
 

isladogs

MVP / VIP
Local time
Today, 18:46
Joined
Jan 14, 2017
Messages
18,247
Agree with all the previous comments about shaving small amounts off the time.
Also with the Resume instead of GoTo in the error handling.

I have written several speed comparison test articles both here and on my website.
Particularly relevant is the article on optimising queries where I go through a number of different methods which can make a significant difference to the time taken. See http://www.mendipdatasystems.co.uk/speed-comparison-tests-8/4594556613

Hope you find it useful

EDIT I also compared 6 different methods of timing. See http://www.mendipdatasystems.co.uk/timer-comparison-tests/4594552971

From my tests, using either the Timer function or timeGetTime was the most consistent/reliable. Like GetTickCount, these can measure to millisecond intervals, though in reality accuracy is around 1/60 second for each method, as they are all based on the system timer. There is also a high-resolution timer that measures to less than 1 microsecond … but it is less accurate!
 

Cronk

Registered User.
Local time
Tomorrow, 03:46
Joined
Jul 4, 2013
Messages
2,772
Getting back to the original question, when was the last time the db was compacted?
 

The_Doc_Man

Immoderate Moderator
Staff member
Local time
Today, 12:46
Joined
Feb 28, 2001
Messages
27,226
You got comments about testing this locally in your front-end file, but my question is: what is the normal back end of this database, or is it even split at all? From what you showed, we cannot see the context, so we don't know what to expect as "normal."
 
