Query not reading all records

arc3636

I have a table that has multiple values in a field separated by commas (KS, MN, MO). I wrote a VB script that separates the values and inserts them into separate records in a new table. I'm calling the VB script from a query in MS Access that reads the records in the source table. This worked perfectly when I tested it on a handful of records. However, when I have a large dataset it only updates the records that show on the screen when I open the query in datasheet view. As I scroll down through the records in the query, it continues to load data into the destination table (via VB) as each row appears on the screen.

I'm fairly new to Access… is there a way to force the query to process all of the records, or a different way to call the VB script? I'd like to put the query in a macro and have it run when a button is clicked.
 
1) What is the script? (post the code here)

2) How are you calling or running it?

3) Why are you having to run this process more than once? Are you working with a denormalized data set that is repeatedly imported from an external source?
 
I've included the script below. The goal is to turn a single record:
KS, MN - 1.75
into multiple records (in this case 2):
KS - 1.75
MN - 1.75

The data is loaded multiple times throughout the day and I’d like to get it reformatted into something we can use for creating reports.
I'm calling the script from a query that pulls the combined records (see below).

Query:
SELECT tbl_Rates.Per_Mile_Rate, UnGroupLocs([Rate],[State]) AS States
FROM tbl_Rates

Script:
Public Function UnGroupLocs(rate As String, st As String) As String
Dim ctr As Integer
Dim sql As String
Dim StArray As Variant

'create an array with the grouped states
StArray = Split(st, ", ", -1)

'Loop through and insert a row for each state
ctr = 0
Do While ctr <= UBound(StArray)
    sql = "Insert into tbl_TempUngroupedSt values ('" & StArray(ctr) & "'," & rate & ")"
    DBEngine(0)(0).Execute sql
    ctr = ctr + 1
Loop
UnGroupLocs = "Rows inserted: " & CStr(ctr)
End Function
 
How many rows are we talking about here for the imported data?

Does the imported data have a unique key value for each row?

Also, are the rates constantly changing? It seems like once you had a table established with all of the state abbreviations and rates, it wouldn't need to be updated that often.

Just trying to get a better understanding of what you've got going on so we can point you in the right direction.
 
I am simplifying the dataset in this example. In the final version, I’ll also need to add the service type and provider the rate corresponds to (which will increase the amount of data). The file I have from the first provider has 1,000 rows and we expect to get 40-50 bids.

The rates will change some in the future, but that will also be easier to manage if I can store the data in a normalized format.

The key will be the states (along with the service provider and service type).

Appreciate you taking a look!
Andy
 
OK, sorry, I got busy today and couldn't get back to this until now. I'm not sure there will really be a fast way to do this since you have to read each row of the source table and write multiple rows to the new table before moving to the next record in the source table.

I don't think I would run this as part of a function in a select query; I would just use a recordset in VBA. I tested this using a table (tblImport) containing 3,000 rows of data like the following:

[Attached screenshot: sample rows from tblImport]


In this case it was writing 3 rows to the new table (tblTemp) for each row in the source table. The code looks like:

Code:
Dim strSQL As String, strInsert As String
Dim vArray As Variant
Dim i As Long

strSQL = "Select * From tblImport;"

With CurrentDb.OpenRecordset(strSQL)
    If .RecordCount <> 0 Then
        .MoveFirst
        Do While Not .EOF
            'Split the comma-separated State field into an array
            vArray = Split(!State, ",")
            'Write one row to tblTemp for each state in the array
            For i = 0 To UBound(vArray)
                strInsert = "Insert Into tblTemp (State, Rate, ServiceType, Provider) " _
                          & "Values (""" & Trim(vArray(i)) & """, """ & !Rate & """, """ _
                          & !ServiceType & """, """ & !Provider & """);"
                CurrentDb.Execute strInsert, dbFailOnError
            Next
            .MoveNext
        Loop
    End If
End With

Resulting in 9,000 rows in tblTemp:

[Attached screenshot: resulting rows in tblTemp]


This took 6 minutes, 27 seconds to run in a split db running over our corporate LAN. Not sure if this is acceptable performance for you or not. You could probably speed it up a bit by removing the Trim function if you don't need that for your source data (i.e. if the source State field does not contain spaces).
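
If you need it to run faster, one thing that often helps with lots of small DAO inserts is wrapping the loop in a transaction and reusing a single Database object instead of calling CurrentDb for every insert. Here's a rough sketch of that variation (untested against your data, so treat it as a starting point rather than a guarantee):

Code:
Dim db As DAO.Database
Dim rs As DAO.Recordset
Dim vArray As Variant
Dim i As Long

Set db = CurrentDb
Set rs = db.OpenRecordset("Select * From tblImport;")

DBEngine.Workspaces(0).BeginTrans    'batch all of the inserts into one transaction

Do While Not rs.EOF
    vArray = Split(rs!State, ",")
    For i = 0 To UBound(vArray)
        db.Execute "Insert Into tblTemp (State, Rate, ServiceType, Provider) " _
                 & "Values (""" & Trim(vArray(i)) & """, """ & rs!Rate & """, """ _
                 & rs!ServiceType & """, """ & rs!Provider & """);", dbFailOnError
    Next
    rs.MoveNext
Loop

DBEngine.Workspaces(0).CommitTrans   'commit once instead of once per insert
rs.Close

If you go that route you'd also want error handling that calls Rollback, so a failed run doesn't leave the transaction open.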

Maybe someone else will jump in on the thread with a better method.
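
As for your original question about running it when a button is clicked: you could move the loop into a Public Sub in a standard module and call that Sub from the button's Click event (the names UngroupRates and cmdUngroup below are just placeholders):

Code:
'In a standard module
Public Sub UngroupRates()
    'Clear the destination table first so a rerun doesn't duplicate rows (optional)
    CurrentDb.Execute "Delete From tblTemp;", dbFailOnError
    'Then run the recordset loop from the code above
End Sub

'Behind the form, in the command button's Click event
Private Sub cmdUngroup_Click()
    UngroupRates
    MsgBox "Finished splitting states into tblTemp."
End Sub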
 

Looks like it's working. The first file I ran with 12,000 records took less than 1 minute. Thanks again for your help!
 
