Solved: Query Execution Time

Bellage

New member
Hello All!

I'm executing the code below as part of a larger procedure and it's taking roughly 3-4 minutes to complete, which feels lengthy.
I'm wondering if there is a more efficient approach I could use, or if this sort of time is usual for iterating through ~100 update statements?

CurrentDb = linked tables to Azure SQL Server via ODBC Driver 17 for SQL Server.
TrayQty = 105 (typically)

Code:
For i = 1 To TrayQty
    uSQL = "UPDATE [Element Weighing] SET " & _
           "[Final Melts By] = '" & Me.[Operator].Value & "', " & _
           "[Date of Final Melts] = #" & Format(Date, "yyyy-mm-dd") & "#, " & _
           "[Retaining Cap] = '" & uDataSet(i, 1) & "', " & _
           "[Spacer Washer] = '" & uDataSet(i, 2) & "', " & _
           "[Mounting Frame] = '" & uDataSet(i, 3) & "' " & _
           "WHERE [Unique Ref] = '" & Me.[Order_Number].Value & "-" & Me.[Tray_ID].Value & "-" & i & "';"
                
    CurrentDb.Execute uSQL
Next i

Any help appreciated
 

Minty

AWF VIP
Hi, and welcome to AWF!

You could send the data to a stored procedure via a pass-through query to do the update on the server.
It would be almost instantaneous.
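
Something along these lines on the server side (the procedure name and parameter sizes are my guesses; the table and column names are taken from your post):

Code:
CREATE PROCEDURE dbo.usp_UpdateElementWeighing
    @UniqueRef     varchar(50),
    @FinalMeltsBy  varchar(50),
    @MeltDate      date,
    @RetainingCap  varchar(50),
    @SpacerWasher  varchar(50),
    @MountingFrame varchar(50)
AS
BEGIN
    SET NOCOUNT ON;
    -- One tray position per call; the client passes the values as parameters
    UPDATE dbo.[Element Weighing]
    SET [Final Melts By]      = @FinalMeltsBy,
        [Date of Final Melts] = @MeltDate,
        [Retaining Cap]       = @RetainingCap,
        [Spacer Washer]       = @SpacerWasher,
        [Mounting Frame]      = @MountingFrame
    WHERE [Unique Ref] = @UniqueRef;
END

From Access you would set the pass-through query's SQL to the matching EXEC dbo.usp_UpdateElementWeighing ... call - and you can concatenate all 105 EXEC lines into one batch so the connection is only crossed once.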

Alternatively, you can use a numbers table in a join and run one update statement based on a cartesian join. That would be significantly faster, as it would only run once.
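
Roughly like this - a sketch assuming a dbo.Numbers table with an integer column n. Note that a single statement like this can only set the columns whose values are the same for the whole tray; the per-row values would still need a staging table:

Code:
DECLARE @Prefix varchar(40) = 'Order123-Tray7';  -- made-up Order-Tray prefix

UPDATE ew
SET ew.[Final Melts By]      = 'OperatorName',
    ew.[Date of Final Melts] = CAST(GETDATE() AS date)
FROM dbo.[Element Weighing] AS ew
INNER JOIN dbo.Numbers AS num
    ON ew.[Unique Ref] = @Prefix + '-' + CAST(num.n AS varchar(10))
WHERE num.n BETWEEN 1 AND 105;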
 

Galaxiom

Super Moderator
Staff member
Alternatively, you can use a numbers table in a join and run one update statement based on a cartesian join. That would be significantly faster, as it would only run once.

Sounds like that would be a join between a local table and SQL Server, so I don't think it would perform very well.
 

Minty

AWF VIP
Sounds like that would be a join between a local table and SQL Server, so I don't think it would perform very well.
Possibly true - but you could do the same with a numbers table on the SQL Server side? I still think that one query would be much quicker than 100 separate updates.
 

Isaac

Lifelong Learner
I couldn't tell from the OP whether the query is already a pass-through or not, but if not, any time you're not seeing the performance you want, it's a great thing to try. It also never hurts to plop a sample (fully filled out) UPDATE statement into SSMS and see if the execution plan suggests an additional index on [Unique Ref].
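
For example, one of the generated statements with made-up values filled in, ready to paste into SSMS with "Include Actual Execution Plan" switched on:

Code:
UPDATE [Element Weighing]
SET [Final Melts By]      = 'JSmith',
    [Date of Final Melts] = '2020-07-17',
    [Retaining Cap]       = 'RC-01',
    [Spacer Washer]       = 'SW-02',
    [Mounting Frame]      = 'MF-03'
WHERE [Unique Ref] = 'Order123-Tray7-1';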
 

CJ_London

Super Moderator
Staff member
Presumably uDataSet is a two-dimensional array?

Other options:
1. Have a stored procedure; in the loop, pass the values as parameters and execute the update in Azure.
2. Have code to populate a temporary table, then run a single update query linking on your unique ref (a sketch follows below).
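
A rough server-side sketch of option 2, assuming the temporary table lives in Azure rather than locally (the staging table name and columns are illustrative):

Code:
CREATE TABLE dbo.TrayStaging (
    UniqueRef     varchar(50) PRIMARY KEY,
    RetainingCap  varchar(50),
    SpacerWasher  varchar(50),
    MountingFrame varchar(50)
);

-- after populating dbo.TrayStaging with the ~105 array rows:
UPDATE ew
SET ew.[Retaining Cap]  = s.RetainingCap,
    ew.[Spacer Washer]  = s.SpacerWasher,
    ew.[Mounting Frame] = s.MountingFrame
FROM dbo.[Element Weighing] AS ew
INNER JOIN dbo.TrayStaging AS s
    ON s.UniqueRef = ew.[Unique Ref];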

Performance-wise, what level of Azure licence do you have? Perhaps your licence has performance restrictions that you are now up against.
 

namliam

The Mailman - AWF VIP
Using linked tables is the main source of the performance trouble, along with going over the connection 100 times.

Create a pass-through query: YourQuery

Code:
uSQL = ""
For i = 1 To TrayQty
    uSQL =  uSQL & "UPDATE [Element Weighing] SET " & _
           "[Final Melts By] = '" & Me.[Operator].Value & "', " & _
           "[Date of Final Melts] = #" & Format(Date, "yyyy-mm-dd") & "#, " & _
           "[Retaining Cap] = '" & uDataSet(i, 1) & "', " & _
           "[Spacer Washer] = '" & uDataSet(i, 2) & "', " & _
           "[Mounting Frame] = '" & uDataSet(i, 3) & "' " & _
           "WHERE [Unique Ref] = '" & Me.[Order_Number].Value & "-" & Me.[Tray_ID].Value & "-" & i & "';" & chr(10) & chr(13)
               
Next i

currentdb.QueryDefs("Yourquery").SQL = uSQL

CurrentDb.Execute "YourQuery"

Not sure if this will work as-is; it's somewhat aircode... it has been a while since I used Access... but I hope you get the idea.

Another option is to create a table for your uDataSet in the target database; then you can fire one statement that does all the updates for you.
 

Bellage

New member
Thank you all for your responses; they have been very helpful and I have learned a few things!

My understanding so far is:
  • Linked tables, even over an ODBC connection, don't operate as a pass-through
  • Without a pass-through, queries use the MS Access database engine, which can be slower and is limited to executing one statement at a time
  • A pass-through query uses the SQL Server database engine, which is more powerful and can execute batches of statements
So based on your advice I have constructed a pass-through query, built by iterating through my array and appending SQL to a QueryDef object.
This builds a batch that can be executed over the connection once.
Assuming I have done it correctly, I now have the below.

Code:
Set qDef = CurrentDb.QueryDefs("uPST")

uSQL = ""

For i = 1 To TrayQty
    uSQL = uSQL & "UPDATE [dbo].[Element Weighing] SET " & _
                  "[Final Melts By] = '" & Me.[Operator].Value & "', " & _
                  "[Date of Final Melts] = CAST('" & Format(Date, "yyyy-mm-dd") & "' AS DATE), " & _
                  "[Retaining Cap] = '" & uDataSet(i, 1) & "', " & _
                  "[Spacer Washer] = '" & uDataSet(i, 2) & "', " & _
                  "[Mounting Frame] = '" & uDataSet(i, 3) & "' " & _
                  "WHERE [Unique Ref] = '" & Me.[Order_Number].Value & "-" & Me.[Tray_ID].Value & "-" & i & "';" & vbNewLine
Next i

uSQL = uSQL & "UPDATE [dbo].[Element Tray Data] SET " & _
              "[Status] = 'Complete' " & _
              "WHERE [Tray ID] = '" & Me.[Order_Number].Value & "-" & Me.[Tray_ID].Value & "';" & vbNewLine

qDef.SQL = uSQL

qDef.Execute dbFailOnError Or dbSeeChanges

Even if I haven't quite done things correctly, I can confirm that the above executed successfully.
However... the performance of the update did not improve. :(

Access shows the execution to have a similar duration: 3m26s
17/07/2020 12:01:55: (Start)
17/07/2020 12:01:55: (Validation)
17/07/2020 12:01:55: (Array Build)
17/07/2020 12:01:55: (Update Tables)
17/07/2020 12:05:21: (Batch Allocation)

Following this I copied the entire query into SSMS and ran it with an execution plan.
This too executed the batch in a similar time: 3m29s.

I have to confess that I do not know how to interpret the execution plan, but each line is returning the same info.
[Attachment: screenshot of the SSMS execution plan]


Assuming that I have done things correctly, and that there isn't much else in the way of other options, I'm beginning to wonder if CJ_London has a point.
I don't know the specification, but I am aware that the current Azure licence is one of the lower performance tiers, so perhaps I need to follow up this line of attack.

Thanks again for your help!
 

Isaac

Lifelong Learner
What you are seeing in the execution plan from SSMS is what I alluded to in post 5. Are you able to create the missing index? If so, I bet you'll see a big change.

I assume that the QEP's "1%" reference is due to just one of the Update statements out of many(?)
 

CJ_London

Super Moderator
Staff member
I agree with Isaac. It doesn't matter which RDBMS you use: with very few exceptions, any field used for sorting/filtering/criteria/joining on a regular basis has to be indexed, otherwise performance suffers, particularly for relatively large datasets. You might find this link of interest. It's about data retrieval, but that is effectively what you are doing - retrieving the records in order to update them: https://www.access-programmers.co.u...ing-is-important-for-good-performance.291268/

I note the execution plan provides you with the SQL to create the index - all you need to do is copy it, give it a name (which could be the field name prefixed with, say, 'idx') and then execute it. I've never tried that as a pass-through query, but you can always run it in Azure.
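
The shape of it would be something like the following - the exact key and include columns should come from the missing-index hint itself:

Code:
CREATE NONCLUSTERED INDEX idx_UniqueRef
ON dbo.[Element Weighing] ([Unique Ref]);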
 

Minty

AWF VIP
I agree with CJ and Isaac - add that missing index and things will massively improve.

However, I would look at creating a temporary SQL Server table with your array data, then, in a stored procedure, link to that and a numbers table and do the whole lot in one go.
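
A compact sketch of that idea (all names are placeholders; it assumes the staging table is keyed by tray position and that a dbo.Numbers table exists):

Code:
CREATE PROCEDURE dbo.usp_CompleteTray
    @Prefix varchar(40)  -- OrderNumber + '-' + TrayID
AS
BEGIN
    SET NOCOUNT ON;

    -- update all tray positions in one statement
    UPDATE ew
    SET ew.[Retaining Cap]  = s.RetainingCap,
        ew.[Spacer Washer]  = s.SpacerWasher,
        ew.[Mounting Frame] = s.MountingFrame
    FROM dbo.[Element Weighing] AS ew
    INNER JOIN dbo.Numbers AS num
        ON ew.[Unique Ref] = @Prefix + '-' + CAST(num.n AS varchar(10))
    INNER JOIN dbo.TrayStaging AS s
        ON s.TrayPosition = num.n;

    -- then flag the tray as done
    UPDATE dbo.[Element Tray Data]
    SET [Status] = 'Complete'
    WHERE [Tray ID] = @Prefix;
END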
 

Isaac

Lifelong Learner
On a funnier note, I've been guilty at least twice of running too fast with scissors (or copy-paste), and ended up creating indexes named "Name of Missing Index". Doing so does not help your professional appearance at work! 🤣
 

namliam

The Mailman - AWF VIP
The index will indeed make the biggest impact; an alternative is to make a (dummy) table and mass-update once...
Or use an intermediary table to copy the records into... The full table scan (without the index) is hurting you - you are doing 105 full table scans.

Though by their nature, indexes perform best on integer values, not on varchar values...
Also, your compound key gluing three pieces of information together is "bad practice"; you're best off having three separate fields.
 

Pat Hartman

Super Moderator
Staff member
There is a certain amount of overhead to binding and running a query. Why not open a recordset that selects the rows you want to update and loop through it using DAO or ADO?

Also, you are using a function. What does the function do?
 

Isaac

Lifelong Learner
There is a certain amount of overhead to binding and running a query. Why not open a recordset that selects the rows you want to update and loop through it using DAO or ADO?

Also, you are using a function. What does the function do?
Hmm... that sounds slower to me compared to executing this on SQL Server. I didn't see a function being used; I might just be missing it as I scan over the SQL.
 

namliam

The Mailman - AWF VIP
Also, you are using a function. What does the function do?
Do you mean the CAST?

The main problem remains finding the records, which is slowed considerably by:
1) the key being a varchar
2) not having an index.
 

Pat Hartman

Super Moderator
Staff member
You are correct, ~3-4 minutes is too long to update 100 records.
Since you are updating 100 records, you are running 100 queries along with all the overhead to create an execution plan for each one.

uDataSet(i, 1) -- looks like a UDF to me, and you are running it three times for each record. What is the function doing?
 

Galaxiom

Super Moderator
Staff member
Though by their nature, indexes perform best on integer values, not on varchar values...

I used to think that, but it is not always so. Indexes are often hashes, which perform really well no matter what data they are based on.

Also, your compound key gluing three pieces of information together is "bad practice"; you're best off having three separate fields.

In SQL Server, a covering index with the three fields works fine on compound keys.
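
If the three parts were held in separate columns (hypothetical names), the covering index would be along these lines:

Code:
CREATE NONCLUSTERED INDEX idx_Order_Tray_Position
ON dbo.[Element Weighing] (OrderNumber, TrayID, TrayPosition);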
 

namliam

The Mailman - AWF VIP
The difference between varchars and integers is far less noticeable nowadays, but the difference is definitely there... if only because varchars tend to be longer and less sequential than integers. Twenty years ago it was really, really noticeable...
 

Galaxiom

Super Moderator
Staff member
Since you are updating 100 records, you are running 100 queries along with all the overhead to create an execution plan for each one.

In SQL Server, queries that are similar enough will run reusing the same cached execution plan. The optimiser can easily recognise the similarities between ad hoc queries, well beyond just the values embedded in their SQL strings.
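
You can check whether that is happening from SSMS - a usecounts value above 1 means a cached plan is being reused (the LIKE filter is just to narrow the output to these statements):

Code:
SELECT cp.usecounts, cp.objtype, st.text
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
WHERE st.text LIKE '%Element Weighing%'
ORDER BY cp.usecounts DESC;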
 
