Hello,
I'm looking for some help improving the performance of a couple of VBA functions in Access. I have two functions that each take 40+ minutes to complete against a table with 300,000+ records. I'm running on a quad-core i5 laptop.
I'm not sure whether my queries can be improved or whether this runtime is to be expected.
My first function loops through the table and, for each record, queries a lookup table for some mapping data and updates the first table. I had to raise dbMaxLocksPerFile to make this work, since the table has so many records. The code looks like this (for brevity's sake I only show the major logic flow):
Code:
...
sql = "SELECT * FROM " & stagingTable
Set db = CurrentDb
Set rs = db.OpenRecordset(sql, dbOpenDynaset)
On Error GoTo Handler
DAO.DBEngine.SetOption dbMaxLocksPerFile, 750000 ' temporarily raise the lock limit for the large Aetna file
Do While Not rs.EOF
    ...
    ' look up the mapping row for this record's plan code
    tsql = "SELECT [Plan Code],[Status] FROM " & mapTable & " WHERE [Plan Code] = " & planCode
    ...
    Set rs2 = db.OpenRecordset(tsql, dbOpenDynaset)
    If Not rs2.EOF Then
        result = rs2![Status]
    End If
    rs2.Close
    ...
    ' write the lookup result back to the staging record
    rs.Edit
    rs![Validated] = result
    rs![error] = message
    rs.Update
ResumeLoop:
    rs.MoveNext
Loop
rs.Close
db.Close
...
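Would a single set-based UPDATE be equivalent here? Below is an untested sketch of what I mean. It only sets [Validated] and leaves the [error] message logic aside, and it assumes each [Plan Code] matches exactly one row in the mapping table:
Code:
' Untested sketch: one UPDATE...INNER JOIN instead of a per-record lookup loop.
' Assumes stagingTable and mapTable hold the real table names and that each
' [Plan Code] maps to a single row in the mapping table.
Dim usql As String
usql = "UPDATE " & stagingTable & " AS s " _
    & "INNER JOIN " & mapTable & " AS m ON s.[Plan Code] = m.[Plan Code] " _
    & "SET s.[Validated] = m.[Status];"
CurrentDb.Execute usql, dbFailOnError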
The next function also takes 40+ minutes to run. Here, I need to go back through the first table and compute sums of sub-totals, which I insert into a second table. The condensed code looks like this:
Code:
...
Dim db As DAO.Database
Dim rs As DAO.Recordset
Dim sql As String

' aggregate the staging table by division, plan code, and status
sql = "SELECT [Division],[Plan Code],[Status],Sum([Claim]) AS [SumOfClaim] FROM stagingTable " _
    & "WHERE ((([Recorded Date]) Between #" & startDate & "# And #" & endDate & "#)) " _
    & "GROUP BY [Division],[Plan Code],[Status];"
Set db = CurrentDb
Set rs = db.OpenRecordset(sql)
...
DoCmd.SetWarnings False
Do While Not rs.EOF
    ...
    ' build one INSERT per aggregated row
    tsql = "INSERT INTO " & outputTable & " (" & outputColumns & ") VALUES (" _
        & Format(startDate, "\#yyyy-mm-dd\#") & ",'" _
        & rs![Division] & "','" _
        & rs![Plan Code] & "','" _
        & rs![Status] & "','" _
        & rs![SumOfClaim] & "');"
    ...
    DoCmd.RunSQL tsql
    ...
ResumeLoop:
    rs.MoveNext
Loop
rs.Close
db.Close
...
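Likewise, could this whole second loop be replaced by a single append query? Another untested sketch; it assumes the columns listed in outputColumns line up with the SELECT list below, with the start date as the first column (the [StartDate] alias is just for illustration):
Code:
' Untested sketch: one INSERT INTO ... SELECT instead of inserting row by row.
' Assumes outputColumns lists the destination columns in the same order as
' this SELECT list (start date, division, plan code, status, summed claim).
Dim isql As String
isql = "INSERT INTO " & outputTable & " (" & outputColumns & ") " _
    & "SELECT " & Format(startDate, "\#yyyy-mm-dd\#") & " AS [StartDate]," _
    & "[Division],[Plan Code],[Status],Sum([Claim]) AS [SumOfClaim] " _
    & "FROM stagingTable " _
    & "WHERE [Recorded Date] Between " & Format(startDate, "\#yyyy-mm-dd\#") _
    & " And " & Format(endDate, "\#yyyy-mm-dd\#") & " " _
    & "GROUP BY [Division],[Plan Code],[Status];"
CurrentDb.Execute isql, dbFailOnError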
Can anyone see anything obvious that would improve the performance here, please?
Thank you