Append Query vs VBA Query

accessNator

Registered User.
Local time
Today, 02:59
Joined
Oct 17, 2008
Messages
132
When I run an Append Query, it takes only about 2 seconds to complete in testing. It appends 25 records to a linked SQL Server table.

But if I create the same type of query in VBA using

Code:
            Dim str2 As String
            str2 = "INSERT INTO dbo_CompanyToTypeOfElectionStatus ( FK_KCN_RefID, FK_TOES_RefID) "
            str2 = str2 & "SELECT qryResultsDifferences1_Insert.kcn_RefId, qryResultsDifferences1_Insert.election_id "
            str2 = str2 & "FROM qryResultsDifferences1_Insert;"
            
            CurrentDb.Execute str2, dbSeeChanges
This process takes about 25 seconds.

Any ideas on how to speed this up in VBA code? I can't have the user open the Append Query and run it manually.

Thoughts?
 

pr2-eugin

Super Moderator
Local time
Today, 08:59
Joined
Nov 30, 2011
Messages
8,494
I hope you know that you CAN run the already-built query through VBA, using DoCmd.OpenQuery?
 

vbaInet

AWF VIP
Local time
Today, 08:59
Joined
Jan 22, 2010
Messages
26,374
Why the dbSeeChanges?
 

accessNator

Registered User.
Local time
Today, 02:59
Joined
Oct 17, 2008
Messages
132
I hope you know that you CAN run the already-built query through VBA, using DoCmd.OpenQuery?

Thanks. I didn't think of that.

I don't fully understand the underlying reason why it takes longer, but coded the way I posted, it took 25 seconds.

When I changed it to an already defined append query and executed the

Code:
DoCmd.SetWarnings False
DoCmd.OpenQuery "NAMEOFQUERY"
DoCmd.SetWarnings True

It went all the way down to 3 seconds. A significant drop.

Care to explain the difference? I appreciate the help.
 

pbaldy

Wino Moderator
Staff member
Local time
Today, 00:59
Joined
Aug 30, 2003
Messages
36,137
To throw out an alternative: you can use a pass-through query, which forces the processing to happen on SQL Server.

Why the dbSeeChanges?

It is required for recordsets on SQL Server tables with Identity (autonumber) fields.
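
A minimal sketch of that alternative, assuming a saved pass-through query (the name qptAppendElection is hypothetical) whose Returns Records property is set to No:

Code:
    Dim qdf As DAO.QueryDef

    ' Execute a saved pass-through query; the INSERT...SELECT then runs
    ' entirely on SQL Server instead of being processed by Jet/ACE.
    Set qdf = CurrentDb.QueryDefs("qptAppendElection")  ' hypothetical name
    qdf.Execute dbFailOnError

Note that dbSeeChanges is not needed here, since no Access-side recordset is opened on the IDENTITY table; the statement runs server-side.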
 

pbaldy

Wino Moderator
Staff member
Local time
Today, 00:59
Joined
Aug 30, 2003
Messages
36,137
I was thinking of just executing a saved pass-through query, but if you're happy I'm happy.
 

Pat Hartman

Super Moderator
Staff member
Local time
Today, 03:59
Joined
Feb 19, 2002
Messages
43,565
Saved querydefs are static SQL, while SQL strings in VBA are dynamic SQL. The difference is that for a querydef, an execution plan is generated and saved when the querydef is saved, but for a SQL string in VBA, the execution plan is created each time the query runs. That generally adds a small amount of time, but nothing as noticeable as what you are experiencing. My guess is that for some reason the plan generated from the dynamic SQL is a seriously poor one, and it is actually the query running very slowly, rather than the plan generation itself, that is causing the problem.

Try compacting the BE to get Access to recalculate its statistics. That may improve the plan and get rid of the time difference.

Access uses cost-based optimization, so it is important that the stats of the DB be current, or it won't be able to accurately choose between using indexes (and which index), full table scans, etc.
 

Galaxiom

Super Moderator
Staff member
Local time
Today, 17:59
Joined
Jan 20, 2009
Messages
12,860
qryResultsDifferences1_Insert
Is it a table or a query?
 

accessNator

Registered User.
Local time
Today, 02:59
Joined
Oct 17, 2008
Messages
132
Saved querydefs are static SQL, while SQL strings in VBA are dynamic SQL. The difference is that for a querydef, an execution plan is generated and saved when the querydef is saved, but for a SQL string in VBA, the execution plan is created each time the query runs. That generally adds a small amount of time, but nothing as noticeable as what you are experiencing. My guess is that for some reason the plan generated from the dynamic SQL is a seriously poor one, and it is actually the query running very slowly, rather than the plan generation itself, that is causing the problem.

Try compacting the BE to get Access to recalculate its statistics. That may improve the plan and get rid of the time difference.

Access uses cost-based optimization, so it is important that the stats of the DB be current, or it won't be able to accurately choose between using indexes (and which index), full table scans, etc.

I don't fully understand the part about the stats, so I will take your word for it. I did compact the database per your advice, but saw no difference.
A caveat is that the Access DB is not split. I know that ideally I should have a split system, but for testing purposes this hasn't been done. All tables reside in this Access file, except for a few linked SQL Server tables (including the one from my original post), which are hosted remotely.
 

Galaxiom

Super Moderator
Staff member
Local time
Today, 17:59
Joined
Jan 20, 2009
Messages
12,860
It's a query.

I reckon that is the issue.

When saved queries take data from another query, Access has the opportunity to optimise across both stages before sending the consolidated query to the engine. Effectively it rebuilds it with an optimised subquery structure.

In the VBA scenario that you posted, the two levels are isolated and there is no opportunity to optimise. Essentially the entire data from the source query is returned by the engine to a hidden temporary table before the Execute is run on it.

If you rebuilt the VBA query using a subquery structure with only tables as the data source it would perform more like the saved query.
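
As a sketch of that restructuring (the table name tblResults and the WHERE clause are hypothetical, standing in for whatever qryResultsDifferences1_Insert actually does over its base tables):

Code:
    Dim str2 As String

    ' Inline the saved query's logic as a subquery over base tables only,
    ' so the engine can optimise the whole statement in one pass.
    str2 = "INSERT INTO dbo_CompanyToTypeOfElectionStatus (FK_KCN_RefID, FK_TOES_RefID) " & _
           "SELECT s.kcn_RefId, s.election_id " & _
           "FROM (SELECT r.kcn_RefId, r.election_id " & _
           "      FROM tblResults AS r " & _
           "      WHERE r.processed = False) AS s;"   ' hypothetical criteria

    CurrentDb.Execute str2, dbSeeChanges + dbFailOnError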
 

accessNator

Registered User.
Local time
Today, 02:59
Joined
Oct 17, 2008
Messages
132
I reckon that is the issue.

When saved queries take data from another query, Access has the opportunity to optimise across both stages before sending the consolidated query to the engine. Effectively it rebuilds it with an optimised subquery structure.

In the VBA scenario that you posted, the two levels are isolated and there is no opportunity to optimise. Essentially the entire data from the source query is returned by the engine to a hidden temporary table before the Execute is run on it.

If you rebuilt the VBA query using a subquery structure with only tables as the data source it would perform more like the saved query.

Thanks for the follow-up. So would you think it would be a better option if I took the results of the query qryResultsDifferences1_Insert, appended them to a temp table, and then ran an Insert from that?
 

Pat Hartman

Super Moderator
Staff member
Local time
Today, 03:59
Joined
Feb 19, 2002
Messages
43,565
No. Do not create temp tables unless there is no other option. They simply bloat the database. Access will attempt to optimize nested queries whenever possible. You remove that option by making a temp table. On the other hand, if the first query returns a hundred rows from a million row table and you are going to be doing lots of other queries on that small set, it's worth the bloat to avoid running the query against the million row table multiple times.
 

accessNator

Registered User.
Local time
Today, 02:59
Joined
Oct 17, 2008
Messages
132
No. Do not create temp tables unless there is no other option. They simply bloat the database. Access will attempt to optimize nested queries whenever possible. You remove that option by making a temp table. On the other hand, if the first query returns a hundred rows from a million row table and you are going to be doing lots of other queries on that small set, it's worth the bloat to avoid running the query against the million row table multiple times.

Okay. I guess the only real performance option is to keep using DoCmd.OpenQuery, as that gives me the best performance by far.
 

Galaxiom

Super Moderator
Staff member
Local time
Today, 17:59
Joined
Jan 20, 2009
Messages
12,860
Thanks for the follow-up. So would you think it would be a better option if I took the results of the query qryResultsDifferences1_Insert, appended them to a temp table, and then ran an Insert from that?

By doing that you would basically be emulating the slow performance of the Execute against the query, and probably considerably worse, because the table will be on disk instead of in memory.

As Pat said, temporary tables are best avoided if possible. Sometimes they are hard to avoid.

However, to avoid bloat and any potential for corrupting the FE, they should never be held in the Front End. This is a very common mistake, surprisingly even among experienced programmers. Temporary tables should be held in a separate local database. I call this a Side End, and you can read the discussions about them by searching this forum for that term.
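
A sketch of creating and filling such a Side End on the fly (the path and the table name tmpResults are hypothetical):

Code:
    Dim sidePath As String

    sidePath = Environ("TEMP") & "\SideEnd.accdb"   ' hypothetical location

    ' Recreate the side-end database each run so temp data never bloats the FE
    If Dir(sidePath) <> "" Then Kill sidePath
    DBEngine.CreateDatabase sidePath, dbLangGeneral

    ' Materialise the query results into the side end, not the front end
    CurrentDb.Execute "SELECT * INTO tmpResults IN '" & sidePath & "' " & _
                      "FROM qryResultsDifferences1_Insert", dbFailOnError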

I am finding myself attracted to using ADO recordsets for many such purposes. They do take a while to get familiar with, and they are different enough from DAO recordsets to involve a considerable learning curve.

They can also seem clumsy to work with because they are edited only record by record. However, this should not be mistaken for being slow: when they are disconnected they exist only in RAM, and without the disk being involved they are blisteringly fast to read or write.

Moreover, we should not mistake the simplicity of a query for what the database engine actually has to do. The engine converts the query into a set of instructions and ultimately has to write the values to the records one by one; we are just sheltered from that process.

The database engine does have facilities to optimise its processing, so it is still faster in most situations, but it has to access the disk a lot, which is where disconnected or fabricated ADO recordsets do have a distinct advantage for certain tasks.
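
For example, a fabricated, disconnected recordset built purely in memory (late-bound so no ADO reference is needed; the field names and sample values are illustrative):

Code:
    Dim rs As Object

    Set rs = CreateObject("ADODB.Recordset")
    With rs
        .CursorLocation = 3                 ' adUseClient
        .Fields.Append "kcn_RefId", 3       ' adInteger
        .Fields.Append "election_id", 3     ' adInteger
        .Open                               ' no connection: fabricated in RAM
        .AddNew Array("kcn_RefId", "election_id"), Array(101, 7)
        .Update
    End With
    Debug.Print rs.RecordCount              ' 1

Because the recordset never touches a connection or the disk, reads and writes against it are pure memory operations.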
 
