Append Query Advice

MattBaldry

Self Taught, Learn from the Forums
Local time
Today, 11:38
Joined
Feb 5, 2019
Messages
293
Hello all,

Is there a way of running an append query one row at a time, so I can build a counted progress label?

I have a table in Sage50 that I want to append to an SQL table so I can use the data faster. It is a transaction log that I use for some reports and it grows hourly.

As of the time of typing there are 173,786 lines. If I append them in one go, I have no idea how far the import has got. If I could find a way of looping through them one record at a time, I could have a label that said "# record of ###,### records" and update it after each record is imported.

This would also allow me to add a stop button that stops after the current loop. At the moment you can only stop once the entire query has finished.

Linked tables direct into Sage50 are stunningly slow, so appending this table (and others) to SQL will give me major benefits. None of the data needs to be changed, just read and used for reports/dashboards and such.

My database has always been used by 4 or 5 people, but now we are up to 12-15 users, many of them remote, and it is slowing down. I am looking at ways to speed up the Sage50 part.

Or, am I trying something that there is already an easier solution for?

~Matt
 

Minty

AWF VIP
Local time
Today, 11:38
Joined
Jul 26, 2013
Messages
10,371
I assume you are using the Sage ODBC driver. I'm sure it's been designed to be as slow as possible...

Adding an individual record loop will make the import take a month of Sundays if you have 100k+ records.
You may be able to break it down to do, say, 1,000 records at a time, assuming there is some key field you can count on?
Then you could get a progress indicator based on that.
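As a very rough sketch of the loop-with-progress idea (form, table and field names below are all made up), the key pieces are a batch counter, a label update, and a DoEvents call so the UI can repaint and a Stop button can fire:

Code:
' Rough sketch only - form, table and field names are made up.
Private mStopRequested As Boolean   ' set True by a Stop button's Click event

Public Sub AppendWithProgress()
    Dim total As Long, done As Long
    Const BATCH_SIZE As Long = 1000

    total = DCount("*", "tblSageAudit")   ' linked Sage table
    done = 0
    mStopRequested = False

    Do While done < total And Not mStopRequested
        ' ...append the next BATCH_SIZE records here...
        done = done + BATCH_SIZE
        If done > total Then done = total
        Forms!frmImport!lblProgress.Caption = _
            Format(done, "#,##0") & " of " & Format(total, "#,##0") & " records"
        DoEvents   ' repaint the label and give the Stop button a chance to fire
    Loop
End Sub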
 

MattBaldry

Self Taught, Learn from the Forums
Local time
Today, 11:38
Joined
Feb 5, 2019
Messages
293
I assume you are using the Sage ODBC driver. I'm sure it's been designed to be as slow as possible...

Adding an individual record loop will make the import take a month of Sundays if you have 100k + records.
You may be able to break it down to do say 1000 records at a time, assuming there is some key field you can count?
Then you could get a progress indicator based on that.
Hi Minty,

Thank you for your reply. There is an ID field, and breaking it into a 1,000 or even 10,000 record loop wouldn't be an issue.

Yes, the Sage ODBC is the bane of my life.

~Matt
 

Pat Hartman

Super Moderator
Staff member
Local time
Today, 06:38
Joined
Feb 19, 2002
Messages
43,263
I'm not sure I understand the request fully, but when I append data from external sources I assign a batch number, so that all the rows appended by each append query end up with an ID that connects them to each other. Would that concept work here?
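As a sketch (table and field names are made up), the append query just selects a constant batch number alongside the source columns:

Code:
' Sketch only - table and field names are made up.
Dim batchId As Long
batchId = Nz(DMax("BatchID", "tblLocalAudit"), 0) + 1
CurrentDb.Execute _
    "INSERT INTO tblLocalAudit (TransactionID, TranDate, Amount, BatchID) " & _
    "SELECT TransactionID, TranDate, Amount, " & batchId & " FROM tblSageAudit;", _
    dbFailOnError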
 

MattBaldry

Self Taught, Learn from the Forums
Local time
Today, 11:38
Joined
Feb 5, 2019
Messages
293
I'm not sure I understand the request fully but when I append data from external sources, I assign a batch number so that all the rows appended in each append query end up with an ID that connects them to each other. Would that concept work here?
Hi Pat,

The data I am appending already has a TransactionID. What I am trying to do is append them in smaller batches, rather than a single query doing them all in one go. I want a query to loop through 1-1000, then update my label, then run through 1001-2000, then update the label, and so on.

It is purely because the Sage ODBC is so slow that I need to track roughly how many records have been appended while the import is running.

~Matt
 

MajP

You've got your good things, and you've got mine.
Local time
Today, 06:38
Joined
May 21, 2018
Messages
8,527
The data I am appending already has a TransactionID.
I think Pat was suggesting that everything imported at one time gets the same import BatchID. I assume that could be a date time stamp. That would tell you that they are all related in that they were imported together.

If you have transaction IDs and they are increasing then you could find the max transaction ID in your database then Select Top 1000 from Source where TransactionID > YourMaxID into your database. I have no idea if that would help. Then you can call the code repeatedly until the Max TransactionID in the Source = the Max TransactionID in your DB (i.e. all imported)
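In VBA that loop might look something like this (table and field names are made up, and it assumes TransactionID is a numeric, increasing key):

Code:
' Sketch of the idea above - table and field names are made up.
Public Sub ImportByTransactionId()
    Dim maxLocal As Long

    Do
        maxLocal = Nz(DMax("TransactionID", "tblLocalAudit"), 0)
        ' stop when the local max catches up with the source max (all imported)
        If maxLocal >= Nz(DMax("TransactionID", "tblSageAudit"), 0) Then Exit Do

        CurrentDb.Execute _
            "INSERT INTO tblLocalAudit " & _
            "SELECT TOP 1000 * FROM tblSageAudit " & _
            "WHERE TransactionID > " & maxLocal & " " & _
            "ORDER BY TransactionID;", dbFailOnError

        DoEvents   ' keep the UI responsive between batches
    Loop
End Sub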
 

MattBaldry

Self Taught, Learn from the Forums
Local time
Today, 11:38
Joined
Feb 5, 2019
Messages
293
I think Pat was suggesting that everything imported at one time gets the same import BatchID. I assume that could be a date time stamp. That would tell you that they are all related in that they were imported together.

If you have transaction IDs and they are increasing then you could find the max transaction ID in your database then Select Top 1000 from Source where TransactionID > YourMaxID into your database. I have no idea if that would help. Then you can call the code repeatedly until the Max TransactionID in the Source = the Max TransactionID in your DB (i.e. all imported)
Thanks Maj,

I now have the import working in a flash. The issue now is deleting the old data from the SQL table. I need to find a way to TRUNCATE TABLE from Access, as a DELETE query takes upwards of 20 minutes each time. I have no need to log the records being deleted, as they are only being deleted to be replaced by fresh Sage data.

~Matt
 

jdraw

Super Moderator
Staff member
Local time
Today, 06:38
Joined
Jan 23, 2006
Messages
15,378
What is the issue with TRUNCATE Table "yourTableName";

Also, there have been a number of posts related to Access and SAGE (and other packages). It might be helpful to others if you could detail and/or post what you have done to have the import working in a flash.
 

MattBaldry

Self Taught, Learn from the Forums
Local time
Today, 11:38
Joined
Feb 5, 2019
Messages
293
What is the issue with TRUNCATE Table "yourTableName";

Also, there have been a number of posts related to Access and SAGE (and other pkgs).
It might be helpful to others if you could detail and/or post what you have done to have the import working in a flash.
Truncate doesn't work in MS Access, not as far as I can find.

With regards to the import, I have two queries: a make-table query moves the data locally, then an append query pushes the local table to my SQL table.

These two queries run faster than trying to append the Sage table straight to SQL.

Now I just need working VBA to truncate the SQL table rather than deleting from it row by row.

Because the data can only be edited in Sage, I will always be deleting all the data and copying it fresh from their (terrible) tables, so truncate is the fastest way.

Just as an additional note, this is only being done once or twice a day by certain people. I had been doing it with local tables, but if I can use SQL tables I can automate this "import" process on the server, so all these people can run off the same tables and not need to download the data locally.
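In outline (all names below are made up), the two steps are:

Code:
' Outline of the two-step import - all names are made up.
With CurrentDb
    On Error Resume Next
    .Execute "DROP TABLE tmpStaging;"   ' remove last run's staging table
    On Error GoTo 0
    ' 1) make-table query: pull the Sage data into a local staging table
    .Execute "SELECT * INTO tmpStaging FROM tblSageAudit;", dbFailOnError
    ' 2) append query: push the staging table into the linked SQL Server table
    .Execute "INSERT INTO dbo_AuditLog SELECT * FROM tmpStaging;", dbFailOnError
End With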

~Matt
 

Gasman

Enthusiastic Amateur
Local time
Today, 11:38
Joined
Sep 21, 2011
Messages
14,268
Why not have a template table, copy, populate, delete copy, rinse and repeat? :)
 

jdraw

Super Moderator
Staff member
Local time
Today, 06:38
Joined
Jan 23, 2006
Messages
15,378
Couldn't find an example of TRUNCATE with Access.
Here's how I use DROP Table
Code:
Sub testDROPTable(TblName As String)
    Dim sql As String
    sql = "DROP TABLE " & TblName & ";"
    CurrentDb.Execute sql, dbFailOnError
End Sub

Invocation:
testDROPTable "CopyOfActivity"

Also, I agree with Paul. Create a table template as he suggested.

What exactly is your environment? Local table vs SQL table??
 

jdraw

Super Moderator
Staff member
Local time
Today, 06:38
Joined
Jan 23, 2006
Messages
15,378
UPDATE:

I copied my table "CopyOfActivity", structure only and saved it as MySageTemplate

This revised code will drop the Table, then create a new instance of "CopyOfActivity" using
the structure of MySageTemplate.

Code:
' ----------------------------------------------------------------
' Procedure Name: testDROPTable
' Purpose: Drop a Table from this database and create new table based on MySageTemplate
' NOTE: MySAGETemplate is a template with structure ONLY same as Table "CopyOfActivity"
' Procedure Kind: Sub
' Procedure Access: Public
' Parameter TblName (String): Table to be dropped
' Author: Jack
' Date: 29-Apr-22
' ----------------------------------------------------------------
Sub testDROPTable(TblName As String)
10        On Error GoTo testDROPTable_Error
          Dim sql As String
20        sql = "DROP TABLE " & TblName & ";"
30        CurrentDb.Execute sql, dbFailOnError
40        DoCmd.CopyObject , "CopyOfActivity", acTable, "MySAGETemplate"
          
50        On Error GoTo 0
testDROPTable_Exit:
60        Exit Sub

testDROPTable_Error:

70        MsgBox "Error " & Err.Number & " (" & Err.Description & ") in procedure testDROPTable, line " & Erl & "."
80        GoTo testDROPTable_Exit
End Sub
 

Minty

AWF VIP
Local time
Today, 11:38
Joined
Jul 26, 2013
Messages
10,371
Just use a pass-through query to do the truncate on the server?
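Something like this (the connection string and table name are placeholders for your own):

Code:
' Temporary pass-through query - connection string and table name are placeholders.
Public Sub TruncateViaPassThrough()
    Dim qdf As DAO.QueryDef
    Set qdf = CurrentDb.CreateQueryDef("")   ' empty name = temporary QueryDef
    qdf.Connect = "ODBC;DRIVER={ODBC Driver 17 for SQL Server};" & _
                  "SERVER=MyServer;DATABASE=MyDb;Trusted_Connection=Yes;"
    qdf.ReturnsRecords = False               ' required for action statements
    qdf.SQL = "TRUNCATE TABLE dbo.AuditLog;"
    qdf.Execute dbFailOnError
    qdf.Close
End Sub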
 

MattBaldry

Self Taught, Learn from the Forums
Local time
Today, 11:38
Joined
Feb 5, 2019
Messages
293
Just use a pass-through query to do the truncate on the server?
How would I do this from Access, though? I need this to be able to be done from any client FE that needs to do it.

~Matt
 

MattBaldry

Self Taught, Learn from the Forums
Local time
Today, 11:38
Joined
Feb 5, 2019
Messages
293
I have this working now and the TRUNCATE via the pass-through works perfectly.

Finally I can make use of the Sage data without having to read those horrible Sage tables directly. Thank you all.

~Matt
 

Minty

AWF VIP
Local time
Today, 11:38
Joined
Jul 26, 2013
Messages
10,371
Glad you got it working.
A pass-through can be very useful in certain circumstances.

The reason a TRUNCATE beats a DELETE on SQL Server is that it doesn't write each deleted row to the transaction log; it only logs the page deallocations.
Hence the significant speed improvement.
 
