Elegant ways to handle multiple queries within a single sub

Richard Horne

Morning guys,

This is something of a general VBA/query/structure question as I'm interested to hear how other people approach this issue.

When you're writing a typical VBA sub which requires you to do multiple queries, how do you keep your code well structured, organised, clean and tidy?

Here's what I do - it seems messy to me, and I can't help but feel there has to be a better way.

For starters I have a default chunk of connection/query code which I recycle more times than I can remember.

Code:
'Set our DB connect variables
Dim db As ADODB.Connection
Dim rs As ADODB.Recordset
Dim str_sql As String
Set db = CurrentProject.Connection
Set rs = New ADODB.Recordset

'Dim the variables we will use to capture the query outputs
Dim str_my_var1 As String, int_my_var2 As Integer, dbl_my_var3 As Double
   
'Build our SQL query
str_sql = "SELECT * FROM WHATEVER"
Set rs = db.Execute(str_sql)

'Capture the results and convert to variables
If IsNull(rs(0)) Then
    str_my_var1 = ""
Else
    str_my_var1 = rs(0)
End If

If IsNull(rs(1)) Then
    int_my_var2 = 0
Else
    int_my_var2 = rs(1)
End If

If IsNull(rs(2)) Then
    dbl_my_var3 = 0
Else
    dbl_my_var3 = rs(2)
End If

'Do something with the vars

'Call my separate sub containing further queries or code
Call my_separate_sub
   
rs.Close
Set rs = Nothing
Set db = Nothing

If I need to run additional queries that output multiple fields, I usually put them in a separate sub (my_separate_sub) to keep things organised and readable, and to help with troubleshooting. If I'm querying 10+ fields multiple times the code gets pretty unwieldy, and of course this method often means creating public variables just so they can be referenced across multiple subs.

If I need to do multiple queries within the same sub, then I have to declare multiple recordset variables (rs, rs2, rs3) and multiple SQL strings (str_sql, str_sql2, str_sql3), and then remember to close them all.

Is this an acceptable way of working, or is there a neater, more efficient method? Is it inefficient to keep opening new recordsets across multiple subs?

For example:

Code:
'Set our DB connect variables
Dim db As ADODB.Connection
Dim rs As ADODB.Recordset, rs2 As ADODB.Recordset, rs3 As ADODB.Recordset
Dim str_sql As String, str_sql2 As String, str_sql3 As String
Set db = CurrentProject.Connection
Set rs = New ADODB.Recordset
Set rs2 = New ADODB.Recordset
Set rs3 = New ADODB.Recordset

Dim int_my_var As Integer, int_my_var2 As Integer, int_my_var3 As Integer
   
'Build our SQL query
str_sql = "SELECT"
Set rs = db.Execute(str_sql)

If IsNull(rs(0)) Then
    int_my_var= 0
Else
    int_my_var= rs(0)
End If

'Build our 2nd SQL query
str_sql2 = "SELECT * FROM OTHER_TABLE WHERE field = " & int_my_var
Set rs2 = db.Execute(str_sql2)

If IsNull(rs2(0)) Then
    int_my_var2 = 0
Else
    int_my_var2 = rs2(0)
End If

'Build our 3rd SQL query
str_sql3 = "SELECT * FROM OTHER_TABLE WHERE field = " & int_my_var2
Set rs3 = db.Execute(str_sql3)

If IsNull(rs3(0)) Then
    int_my_var3 = 0
Else
    int_my_var3 = rs3(0)
End If


rs.Close: rs2.Close: rs3.Close
Set rs = Nothing
Set rs2 = Nothing
Set rs3 = Nothing
Set db = Nothing
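
One idea I've toyed with (just a sketch, untested - the helper name and default-handling are my own invention) is wrapping all that boilerplate in a small function so each lookup becomes a one-liner, but I'm not sure if that's overkill:

Code:
'Hypothetical helper: run a query and return its first field,
'or a caller-supplied default if there are no rows or the value is Null
Public Function GetScalar(str_sql As String, Optional default_value As Variant = Null) As Variant
    Dim db As ADODB.Connection
    Dim rs As ADODB.Recordset

    Set db = CurrentProject.Connection
    Set rs = db.Execute(str_sql)

    If rs.EOF Then
        GetScalar = default_value
    ElseIf IsNull(rs(0)) Then
        GetScalar = default_value
    Else
        GetScalar = rs(0)
    End If

    rs.Close
    Set rs = Nothing
    Set db = Nothing
End Function

'Usage:
'int_my_var = GetScalar("SELECT my_field FROM SOME_TABLE", 0)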

Thanks in advance.
 

arnelgp

If you expect a field to contain a string and want to avoid Null values:

str_my_var1 = Nz(rs(0))

For a field containing numbers:

int_my_var2 = Nz(rs2(0), 0)

That way you avoid the If...Else...End If statements.
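
For example, the whole capture block from your first sample becomes:

Code:
'Capture the results, substituting defaults for Nulls
str_my_var1 = Nz(rs(0), "")
int_my_var2 = Nz(rs(1), 0)
dbl_my_var3 = Nz(rs(2), 0)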
 

Richard Horne

If you expect a field to contain a string and want to avoid Null values:

str_my_var1 = Nz(rs(0))

For a field containing numbers:

int_my_var2 = Nz(rs2(0), 0)

That way you avoid the If...Else...End If statements.

That seems like a very obvious oversight on my part and certainly helps minimise the code somewhat, so thank you. :)
 

Auntiejack56

Also FWIW, where the task is very complex, as yours seems to be, consider your main sub to be just an index or ToC for the work you are trying to do. This is easier where the main queries are snapshots and you can confidently pass the variables into other subroutines.
So for example
Code:
Private Sub cmdMainCommand_Click()
Dim dbs As DAO.Database, rs As DAO.Recordset, strSQL As String
On Error GoTo Catch_Error

    Set dbs = CurrentDb()
    
    'Step 1: Check the boondoggle
    strSQL = "SELECT * FROM tblBoonDoggle WHERE bID = 4;"
    Set rs = dbs.OpenRecordset(strSQL, dbOpenSnapshot)
    
    If Not goCheckBoonDoggle(rs(0), rs(1), rs(2)) Then
        MsgBox "Boondoggle failed"
        GoTo Proc_Exit
    End If
    
    'Step 2: Update the whatsis
    strSQL = "SELECT * FROM tblWhatsisController WHERE wBoonDoggle = " & rs(4) & ";"
    Set rs = dbs.OpenRecordset(strSQL, dbOpenSnapshot)
    
    If Not goUpdateWhatsis(rs(0), rs(3), rs(9)) Then
        MsgBox "Whatsis failed"
        GoTo Proc_Exit
    End If
    
    'Step 3: Find the hairy bits
    strSQL = "and so on"
            
            
Proc_Exit:
    Set dbs = Nothing
    Exit Sub
Catch_Error:
    MsgBox Err.Description & vbCrLf & "Cannot check boondoggle, update whatsis and report on hairy bits.", vbCritical, "Please contact support"
    Resume Proc_Exit
    Resume    'never reached; when debugging, set this as the next statement to jump to the failing line
End Sub
Then all your more complicated management of variable types and whatnot is contained in each of the subroutines. You can just keep reusing the same variables.
I always steer clear of public variables - if you have to query the same data repeatedly, and there is a logical reason to separate the queries, then just run the query again the second time, redundant as that may feel (taking speed issues into account, of course).
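To make the "no publics" point concrete, here is the rough shape of one of those worker functions - the body is invented purely for illustration, only the signature matters:

Code:
'Hypothetical worker: everything it needs arrives as parameters,
'so no public variables are required (body is illustrative only)
Private Function goCheckBoonDoggle(vID As Variant, vName As Variant, vLimit As Variant) As Boolean
    'Fail the check if key values are missing, otherwise apply the rule
    If IsNull(vID) Or IsNull(vLimit) Then
        goCheckBoonDoggle = False
    Else
        goCheckBoonDoggle = (vLimit > 0)
    End If
End Function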

Jack
 

Richard Horne

Also FWIW, where the task is very complex, as yours seems to be, consider your main sub to be just an index or ToC for the work you are trying to do. This is easier where the main queries are snapshots and you can confidently pass the variables into other subroutines.
So for example

It's interesting you should mention that, as that's exactly the approach I've taken with the complicated process I'm currently working on. There are 5+ separate checks that need to be run, each with its own queries, and trying to keep all of that in one sub was proving to be a nightmare - which is what prompted this question in the first place.
 

Pat Hartman

You are running action queries and you seem to be collecting the results of the action. What are you capturing and what are you doing with the data?
 

Richard Horne

You are running action queries and you seem to be collecting the results of the action. What are you capturing and what are you doing with the data?

I am currently building an importer that takes data from a spreadsheet and saves it into a temporary table, runs a boat load of checks and then creates firm orders once all checks are passed.

My process is to save the data from the spreadsheet in a temporary table, then query each line of this imported data and perform various checks against it. For example: is the price our agreed price, is the expected date within the expected window, have we had any discrepancy reports against the saved product in the past, do we have any stock of the product, and more. Each of these involves multiple queries against each line in the temp table.

This is probably among the most complex things I've had to code in VBA but you can appreciate trying to do all this from one sub gets very messy.
 

Pat Hartman

So, the code you posted bears no resemblance to the actual process?
 

Richard Horne

Not specifically, no, as my subs for this process are at 1,000+ lines at the moment. I was just simplifying for brevity. The logic is sound, though.
 

Pat Hartman

Except that your sample shows you running action queries, not opening recordsets, so that was confusing.

You would run each query once per table NOT once per row. I also would not import the data, I would link to it to avoid bloat or I would import it to a separate database that I can delete and replace.
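
For instance, linking is a one-liner (the file path and table name here are placeholders):

Code:
'Link to the spreadsheet instead of importing it, to avoid bloat
'(path and table name are placeholders)
DoCmd.TransferSpreadsheet acLink, acSpreadsheetTypeExcel12Xml, _
    "tbl_linked_sip_import", "C:\imports\orders.xlsx", True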
 

Richard Horne

Except that your sample shows you running action queries, not opening recordsets, so that was confusing.

You would run each query once per table NOT once per row. I also would not import the data, I would link to it to avoid bloat or I would import it to a separate database that I can delete and replace.

I'm mostly interested in how others approach similar multi-query tasks, that's all.

Also, maybe I didn't explain myself properly, but I have a spreadsheet with, for example, 100 orders on it, and I need to run checks against each individual line - hence why I'm running queries per row.
 

Pat Hartman

I think we are not speaking the same language. I've imported complex data dozens of times. I understand the concept. You don't run separate queries for each row. You might need to run multiple queries to validate and report on multiple types of errors but you don't run them for each row.

The sample code you posted was meaningless and had nothing to do with anything. If you want us to offer suggestions regarding specific code, you will need to post it.
 

Auntiejack56

... runs a boat load of checks
Your description is interesting. You know what I would be tempted to do? After importing the data, run a big wide query against it, including if possible all the fields from all the tables that you will be checking - that is, link in the products and stock, supply windows, prices, original order, in fact the whole box and dice.
Then, read through the query once, and just compare the fields on each record.
Even if you can't compare absolutely everything, you could probably significantly reduce the complexity of the process. For example, you might need to check that the same customer hasn't placed total orders above their limit. Well, you could give it a 'pass' in phase 1, then total the passed orders and find that it gets a 'fail' in phase 2. See the sketch below.
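Something along these lines - the reference table and field names are guesses based on your description:

Code:
'One wide query joining the import to the reference data, so each
'record can be checked field-against-field in a single pass
'(table and field names are guesses based on the thread)
strSQL = "SELECT i.nsn, i.price, i.req_date, p.product_id, " & _
         "c.agreed_price, c.lead_time_days " & _
         "FROM (tbl_temp_sip_import AS i " & _
         "LEFT JOIN tbl_products AS p ON i.nsn = p.nsn) " & _
         "LEFT JOIN tbl_contract_prices AS c ON p.product_id = c.product_id"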
 

Auntiejack56

Registered User.
Local time
Today, 20:03
Joined
Aug 7, 2017
Messages
175
It reminds me of a Treasury department job I had in a bank where there were many classifiable reportable transactions coming in from all over, and we had to do a very similar thing. I added five new booleans to the input data, and we ran five big queries against it. We scanned the data set of query1, checking field against field. If the checks passed, set tick box1 to true. Then the same for query2, setting tick box2, then query3 and so on. After running all five, the sixth process read thru the tick boxes looking for the necessary combinations and reported those records.
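
In today's terms, each "tick box" pass can be a single set-based update rather than a per-row query - something like this (the flag and table names are illustrative):

Code:
'One UPDATE per check, run once against the whole data set,
'setting a boolean flag on every row that passes (names illustrative)
CurrentDb.Execute "UPDATE tbl_temp_sip_import AS i " & _
                  "INNER JOIN tbl_products AS p ON i.nsn = p.nsn " & _
                  "SET i.chk_nsn_exists = True", dbFailOnError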
 

Auntiejack56

Registered User.
Local time
Today, 20:03
Joined
Aug 7, 2017
Messages
175
It was called the Data Dog, because it ran around burying bones in the data, and then when the bones were all buried we went thru and dug 'em up.
 

Richard Horne

Let me try and explain what I've done here in a bit more detail, please bear with me. (You'll see why I tried to condense/simplify)

First of all, these orders are always, and will always be from the same customer.

My order system comprises two main tables: tbl_orders and tbl_order_details, where tbl_orders stores order number, customer ref, date, customer id, payment terms, etc.

Meanwhile, tbl_order_details stores line items, so item id, item number, product code, quantity, price etc.

You can have multiple order_details records per order.

The spreadsheet I'm importing comes as is - I cannot tailor it as it's auto-generated from another company's system. Basically it's sub-optimal but I can only work with what I have and I have to make it fit our system.

Each line on the spreadsheet contains information that is saved in both the orders and order_details tables so I have to separate the data and save to the appropriate tables. This also means I have to determine how many orders per import, as well as add any missing fields such as user_id, order_creation_date, customer_id etc.

I have a main form containing two combos: cbo_contract_id which lets the user specify the contract ref to upload the orders against, and cbo_delivery_address_id as this customer has hundreds of delivery addresses and we have to choose the right one to assign to all of the orders. There's also a text box that shows the progress of the import as it works through the various stages.

Finally there's a button that when clicked allows the user to browse for the spreadsheet then imports the data.

Code:
Private Sub btn_sip_importer_Click()

'Check for and determine Contract
If IsNull(Me.cbo_contract_id) Then
    MsgBox "You need to select a contract"
    Me.cbo_contract_id.SetFocus
    End
End If

'Show our scrolling text which we use to show the user stuff is happening in the background while it appears Access has hung
Me.textbox.Visible = True

'Empty our temporary table, browse for spreadsheet and upload the rows to the temporary table
Call browse_for_spreadsheet

'Once the data is uploaded we run a Do While loop to step through the imported lines and against each one run the following subs to perform separate checks.

'Check all NSNs exist within our product table
Call check_nsns

'Check all NSNs match up with TL number - check that the nsn and TL number refs we have saved in our product table match up with the pairs provided/imported - we do this because occasionally we have multiple product codes with the same NSN and this will break the import if we don't check for it
Call check_nsn_and_tl_number

'Check all NSN exist on the selected contract - check that the NSNs imported are all saved against the contract specified on the combo box on the main form containing the import button - done because occasionally the customer makes typos and sends us incorrect NSNs that don't match our system
Call check_nsns_on_contract

'Determine number of orders - determine how many orders there are on this import - some orders are single line, some are multiple lines
Call determine_number_of_orders

'Calculate Dates and compare with spreadsheet
'To do but we are checking that the requested date is in line with the date the system would automatically calculate were this order processed manually
      
'Calculate Price and compare with spreadsheet
'To do but we are checking that the requested price is in line with the price the system would automatically set based on the saved contract price were this order processed manually

'Load Issue Number
'These orders are against our contracts - every product is made to spec. For each imported order line we need to look up the latest specification issue number

'Generate DR information report
'For each imported line we need to check to see if there have been any discrepancy reports and collate all this information into a concise report. Normally each order would be processed manually and individually and the user would receive a prompt when entering the product code; because this importer potentially imports tens/hundreds of orders simultaneously, we need a way to view the DR logs for all newly created orders

'Set Sales Order information - once we have passed all checks and logged all warnings, actually create the orders
Call generate_sales_orders

'Hide the loading message
Me.textbox.Visible = False

End Sub

As you can see there is a lot going on. I don't need any help building this specific system - it's mostly done, I was just curious to see how others would approach handling the multiple queries - and from the extensive description above you can see why I simplified it.
 

Richard Horne

It reminds me of a Treasury department job I had in a bank where there were many classifiable reportable transactions coming in from all over, and we had to do a very similar thing. I added five new booleans to the input data, and we ran five big queries against it. We scanned the data set of query1, checking field against field. If the checks passed, set tick box1 to true. Then the same for query2, setting tick box2, then query3 and so on. After running all five, the sixth process read thru the tick boxes looking for the necessary combinations and reported those records.

That's exactly what I've done.

Usually orders are processed manually, and these checks run when a product code or NSN is entered, via the AfterUpdate event on various combo boxes. As I'm importing the data I can't use those events, so I have to perform the checks separately against each imported line and save pass/fail as a boolean, which I then import into my orders tables and can combine with conditional formatting to draw attention to any issues. There's a lot going on.
 

Richard Horne

See below for an example of one of the subs mentioned above, check_nsns. This is one of the checks I perform against each line of the import.

Code:
Private Sub check_nsns()

Me.textbox = "............CHECKING NSNs.........................."

'Set our DB connect variables
Dim db As ADODB.Connection
Dim rs As ADODB.Recordset, rs2 As ADODB.Recordset
Dim strSQL As String, str_sql2 As String
Set db = CurrentProject.Connection
Set rs = New ADODB.Recordset
Set rs2 = New ADODB.Recordset

Dim str_nsn As String, int_number_of_products As Integer

'Query all NSNs from our temporary table
strSQL = "SELECT nsn FROM tbl_temp_sip_import"
Set rs = db.Execute(strSQL)

Do While Not rs.EOF

    'Capture the NSN against each row
    str_nsn = Nz(rs(0), "")
        
    'Count how many products exist with the matching NSN. We are looking for where no product exists so we can abort the import and warn the user.
    str_sql2 = "SELECT COUNT(*) FROM tbl_products WHERE nsn = '" & str_nsn & "'"
    Set rs2 = db.Execute(str_sql2)
    
    'Count the number of products matching the NSN to see if it exists
    int_number_of_products = Nz(rs2(0), 0)
    
    'Prompt the user if a mismatch is found and end the process
    If int_number_of_products = 0 Then
        MsgBox "There are products on this spreadsheet with NSNs that are not saved in DEAN. The first NSN that failed this check is: " & str_nsn & vbCrLf & vbCrLf _
        & "This process will now abort to give you time to resolve the issue."
        End
    End If

rs.MoveNext
Loop

MsgBox "NSN CHECK COMPLETE"

End Sub
 

CJ_London

On my phone so can't give a detailed answer, but from what I can see this could all be done in SQL. Your last post could simply be done with a left join that reports all missing products in one hit, rather than running the same query 200 times.
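
Roughly this - untested, but using the table names from your post it would replace the whole loop in check_nsns:

Code:
'Report every NSN in the import with no matching product - one query
'instead of one COUNT(*) per row (untested sketch)
strSQL = "SELECT DISTINCT i.nsn " & _
         "FROM tbl_temp_sip_import AS i " & _
         "LEFT JOIN tbl_products AS p ON i.nsn = p.nsn " & _
         "WHERE p.nsn IS NULL"
Set rs = db.Execute(strSQL)

If Not rs.EOF Then
    MsgBox "There are NSNs on this spreadsheet that are not saved in DEAN. " & _
           "The first one is: " & rs(0)
End If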

A couple of years ago I was involved with a project which, among other things, imported 1m rows monthly from a .csv file. The checking process took a couple of minutes and included data validation of every column (around 50). The user reviewed the rejects (typically around 100) and then either rejected the whole file or imported the good ones and sent the rejects back with the reason for rejection. The split of the data into around 10 tables was undertaken as part of the import process.

This was achieved by importing the .csv file to an unindexed table in a temp db, then adding indexing relevant to validation to the imported data.
 

Minty

I think what @Pat Hartman and @CJ_London are getting at is that you don't need to loop through the lines individually to get your query results. You can join to your correct data and simply produce a list of items that don't match.
 
