Transfer Access table to SQL Server (Azure) table via VBA

Derevon

Registered User.
Local time
Today, 05:42
Joined
Jan 14, 2014
Messages
50
Hello everyone,

What's the best way to use VBA to transfer the contents of an Access table to a Microsoft Cloud SQL server (Azure) on a daily basis?

I have a table with around 60-70 rows, and, for example, linking the SQL Server table and using INSERT INTO takes many hours.

Using some kind of stored procedure and sending multiple rows in each execution is reasonably fast at first, but for some reason it seems to slow down a lot after a while.

Thank you
 

theDBguy

I’m here to help
Staff member
Local time
Yesterday, 20:42
Joined
Oct 29, 2018
Messages
19,184
Just thinking out loud, but my first thought was maybe try using SSIS.
 

Pat Hartman

Super Moderator
Staff member
Local time
Yesterday, 23:42
Joined
Feb 19, 2002
Messages
36,908
"hours" for 60-70 rows??? You could retype them faster than that. What method are you currently using?
 

pbaldy

Wino Moderator
Staff member
Local time
Yesterday, 20:42
Joined
Aug 30, 2003
Messages
35,749

GPGeorge

Grover Park George
Local time
Yesterday, 20:42
Joined
Nov 25, 2004
Messages
783
HOW are you moving these records? Why is it done daily?
 

arnelgp

..forever waiting... waiting for jellybean!
Local time
Today, 11:42
Joined
May 7, 2009
Messages
16,833
just maybe create a pass-through query.
then run your stored proc against the pass-through query.
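A pass-through query can also be built on the fly in VBA via a temporary DAO QueryDef. A minimal sketch, assuming a hypothetical stored procedure `dbo.usp_LoadMyTable` and placeholder server, database, and credentials:

```vba
' Hedged sketch: runs a server-side stored procedure (name is a placeholder)
' through a temporary pass-through query, so the work happens on SQL Server.
Sub RunServerSideLoad()
    Dim db As DAO.Database
    Dim qdf As DAO.QueryDef

    Set db = CurrentDb
    Set qdf = db.CreateQueryDef("")   ' empty name = temporary QueryDef

    qdf.Connect = "ODBC;DRIVER={ODBC Driver 17 for SQL Server};" & _
                  "SERVER=myserver.database.windows.net;DATABASE=MyDb;" & _
                  "UID=myuser;PWD=mypassword;"     ' placeholders
    qdf.ReturnsRecords = False        ' action query, no recordset expected
    qdf.SQL = "EXEC dbo.usp_LoadMyTable;"

    qdf.Execute dbFailOnError         ' executes entirely on the server
End Sub
```

Because the SQL text is sent to the server verbatim, none of it is processed by the Access/Jet engine, which avoids the row-by-row chatter of a linked-table INSERT.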
 

Minty

AWF VIP
Local time
Today, 04:42
Joined
Jul 26, 2013
Messages
9,372
I've posted a vaguely possible answer in the original thread.
The why, and what the source of the 60k records is, are important, and may lead us to a better set of options.
 

Derevon

Registered User.
Local time
Today, 05:42
Joined
Jan 14, 2014
Messages
50
So the whole story goes something like this:

We have an Access database used by perhaps 50 employees in total (on average maybe 5-10 concurrent users), with a back-end on a local shared drive, and it has worked that way for years. Then Covid-19 came, and home office and VPNs became a thing. The shared drive has absolutely dismal performance over VPN, so data corruption is pretty much a daily occurrence, and IT wants to retire this shared drive altogether.

We now want to migrate the back-end to Microsoft SQL Server (Azure). The new data is extracted from our financial system inside the company network using ODBC pass-through queries in Access. Some additional data based on manually extracted reports is also incorporated into the database. Ideally, of course, the updating of the database itself would be done inside SQL Server, but for the time being the updating has to be handled by an Access database with VBA code. So basically the SQL Server data is first copied to Access, all the updates are made locally, and when ready, the data on the SQL Server is replaced with the data from the Access table.

This solution seemed to work pretty fast on a non-Azure SQL Server that I was experimenting on earlier (under 10 min), but with this new Azure server things seem to be a lot slower (1.5-2h).
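Against Azure, the dominant cost is usually the per-statement network round trip, so one common mitigation is to pack many rows into a single multi-row INSERT per pass-through execution. A hedged sketch of that idea, with placeholder table and column names and a placeholder DSN; it assumes numeric columns (text values would need quoting and escaping):

```vba
' Hedged sketch: batches rows from a local Access table into multi-row
' INSERT statements sent through a temporary pass-through query,
' reducing the number of round trips to the Azure server.
Sub UploadInBatches()
    Const BATCH_SIZE As Long = 500          ' rows per round trip
    Dim db As DAO.Database, rs As DAO.Recordset
    Dim qdf As DAO.QueryDef
    Dim values As String
    Dim n As Long

    Set db = CurrentDb
    Set rs = db.OpenRecordset("SELECT ID, Amount FROM LocalTable", dbOpenSnapshot)
    Set qdf = db.CreateQueryDef("")         ' temporary pass-through query
    qdf.Connect = "ODBC;DSN=MyAzureDsn;"    ' saved DSN, placeholder
    qdf.ReturnsRecords = False

    Do Until rs.EOF
        values = values & IIf(n > 0, ",", "") & _
                 "(" & rs!ID & "," & rs!Amount & ")"
        n = n + 1
        rs.MoveNext
        If n = BATCH_SIZE Or rs.EOF Then
            qdf.SQL = "INSERT INTO dbo.RemoteTable (ID, Amount) VALUES " & values & ";"
            qdf.Execute dbFailOnError       ' one round trip per batch
            values = "": n = 0
        End If
    Loop

    rs.Close
End Sub
```

Note that SQL Server caps a VALUES table constructor at 1000 rows, so keep the batch size below that; for larger or text-heavy loads, a server-side staging table or BULK INSERT-style approach tends to scale better.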

I'm kind of a beginner when it comes to Microsoft SQL Server, so feel free to suggest better solutions.

Thanks
 

Pat Hartman

Super Moderator
Staff member
Local time
Yesterday, 23:42
Joined
Feb 19, 2002
Messages
36,908
Access does not perform well over the internet. It is too chatty. If you want people to use your application remotely AND you want to stick with the Access FE, you have two viable options: Citrix and Remote Desktop. Both work very well and don't require any changes to the application. I once had a Citrix installation that had local users in Hartford, CT, and remote users via Citrix in a dozen cities around the US as well as in London and Paris. The Citrix users got equal or better performance than the local users did. The BE was SQL Server as it happened, but it was hosted on our servers in Hartford, as was Citrix.
 

Auntiejack56

Registered User.
Local time
Today, 13:42
Joined
Aug 7, 2017
Messages
168
When you move from a test SQL Server to the production version, you (obviously) have to change the connection string. One pitfall when doing so is using an IP address for the production connection.
In some circumstances, the IP address will need to be looked up in the domain controller before the connection is validated - and it will do this every single time. If you look in the SQL log, you'll see these 10-second pauses between each SQL command. I'm not saying this is a common occurrence, and of course it depends on your architecture, but I've encountered it more than once, and as a result I don't use IP-address connections anymore.
This is the note I made for myself at the time:

Very slow execution when connecting to the database using an IP address

Direct addressing refers to using the IP address of the database server when building a connection string in Access (instead of the server name).

After the database host is added to the domain, the IP address won't be accepted as a valid connection to the database host machine, but it doesn't throw an error. I don't know exactly how this piece works, but I suppose AD has to be consulted by the database host machine to see whether the requester has permission to connect. For some reason, this is time-consuming - it takes about 8-10 seconds.

The trick is to change the connection string to use SERVER=[MachineName]\SQLEXPRESS,1433 (the machine name rather than the IP). This must force the requesting machine to do a DNS lookup, which I'm guessing also validates the user in AD. So the connection is instant - the machine hosting the database no longer seems to do anything complicated to accept the connection whenever a SQL request of any type is serviced thereafter.
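For illustration, the two connection styles contrasted above might look like this as full ODBC connection strings - the driver version, IP, machine name, and database are all placeholders:

```
' Slow in the scenario described: direct IP addressing
ODBC;DRIVER={ODBC Driver 17 for SQL Server};SERVER=192.168.1.25\SQLEXPRESS,1433;DATABASE=MyDb;Trusted_Connection=Yes;

' Faster: host name, resolved via DNS (and, presumably, validated in AD up front)
ODBC;DRIVER={ODBC Driver 17 for SQL Server};SERVER=MyDbHost\SQLEXPRESS,1433;DATABASE=MyDb;Trusted_Connection=Yes;
```

The only difference is the SERVER= value; everything else in the string stays the same between the two forms.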
 
