DSN-less connection transition

Kheribus

Registered User.
Local time
Today, 04:28
Joined
Mar 30, 2015
Messages
97
Hello all,

I stumbled upon this link on the web that has some code to modify a linked table access project to use DSN-less connections.

http://www.accessmvp.com/djsteele/DSNLessLinks.html

However, it's not working on my project (Access 2010 front end, MySQL back end): the code breaks partway through and only converts some of my tables, not all of them.

Does anyone have any code like this that is updated, or may be able to help me to set up a DSN-less connection so that I can increase the performance of my queries?
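In case it helps anyone searching later, here's a minimal sketch of the usual TableDef relink loop that page describes. The driver name, server, database, and credentials are placeholders - you'd substitute whatever your installed MySQL ODBC driver is called and your own server details:

```vba
' Sketch: rebuild each ODBC-linked table with a DSN-less connect string.
' DRIVER/SERVER/DATABASE/UID/PWD below are placeholders - adjust to match
' your installed MySQL ODBC driver and your server.
Public Sub RelinkTablesDSNLess()
    Const cConnect As String = _
        "ODBC;DRIVER={MySQL ODBC 5.3 ANSI Driver};" & _
        "SERVER=myserver;DATABASE=mydb;UID=myuser;PWD=mypassword;"
    Dim db As DAO.Database
    Dim tdf As DAO.TableDef
    Set db = CurrentDb
    For Each tdf In db.TableDefs
        ' Only touch ODBC-linked tables; skip local and system tables.
        If Len(tdf.Connect) > 0 And Left$(tdf.Connect, 5) = "ODBC;" Then
            tdf.Connect = cConnect
            tdf.RefreshLink
        End If
    Next tdf
End Sub
```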

Thanks
 

Galaxiom

Super Moderator
Staff member
Local time
Today, 21:28
Joined
Jan 20, 2009
Messages
12,853
DSN-less connection won't make any difference to query performance.

From the linked page: "My advice is to create your linked tables using a DSN, and then convert them this way before you deploy the application."

This is not good advice. The easiest way to get a DSN-less connection is to link the tables using a File DSN: Access copies the connection details out of the File DSN into each table's connect string, so the links themselves are DSN-less from the start.
 

Kheribus
So DSN-less connections don't improve performance?

The situation right now is that I'm a relative neophyte with linked tables / MySQL / VBA, but I have built a pretty solid application that automates billpays/invoicing.

However, it takes a long time to run, and I'm trying to find whatever I can to increase the performance. I have read some tips on improving performance with linked tables over ODBC. For instance, I'm not using pass-through queries for inserts or updates, which I believe would speed things up.

The program logic goes like this:

1. Pull the distinct names from the client table into a recordset.
2. For each distinct name, iterate over each day in the invoice window and calculate the billing metrics.
3. Insert that day's billing calculations for that client into an invoice table.
4. Move to the next distinct name in the client recordset.

That's the barebones process. It takes too long: about 1 second per day, with 15 days in an invoice window and maybe 800 clients, so that's roughly 200 minutes for a full processing crunch!!! I will actually be running this every night, so it will really only take ~15 minutes a night, but it has been a real effing pain in the A during testing to have to wait so long for this to run.

I read that it's better to do smaller inserts to tables, but perhaps inserting each specific day into my invoice table for each client is not the most efficient way. I'm thinking there is a lot of overhead in pushing the data into the MySQL table one day at a time. Also, I'm not using pass-through queries on inserts or updates, as I thought a DSN-less connection was a prerequisite for pass-through queries.
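For reference, this is roughly what a pass-through insert looks like. A pass-through carries its own connect string on the QueryDef, so it doesn't depend on whether the linked tables are DSN-less. Table and column names here are made up for illustration:

```vba
' Sketch: run an INSERT as a pass-through so the server executes it
' directly, instead of going row-by-row through a linked table.
' The connect string, table and column names are placeholders.
Public Sub InsertViaPassThrough(ClientID As Long, BillDate As Date, Amount As Currency)
    Dim qdf As DAO.QueryDef
    Set qdf = CurrentDb.CreateQueryDef("")   ' temporary, unsaved QueryDef
    qdf.Connect = "ODBC;DRIVER={MySQL ODBC 5.3 ANSI Driver};" & _
                  "SERVER=myserver;DATABASE=mydb;UID=myuser;PWD=mypassword;"
    qdf.ReturnsRecords = False               ' an INSERT returns no rows
    qdf.SQL = "INSERT INTO invoice (client_id, bill_date, amount) VALUES (" & _
              ClientID & ", '" & Format$(BillDate, "yyyy-mm-dd") & "', " & Amount & ");"
    qdf.Execute dbFailOnError
End Sub
```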

Any insight on this would be greatly appreciated.
 

Galaxiom
The only time saving with DSN-less is the time it takes to read the DSN. Negligible.

The Access engine is quite intelligent. If it can, it will pass a query to the database engine for execution. I don't know how well this applies to MySQL, but it certainly works well with MS SQL Server. Consequently, defining a pass-through often doesn't make much difference over linked tables.

Wherever possible avoid iterating through recordsets. Especially avoid Filtering and Finding in recordsets. Better to open a new recordset with the specific criteria than to load a big one.
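As a sketch of that advice: rather than loading everything and using Filter/FindFirst, put the criteria in the SQL so only the rows you need come over the wire. Table and field names here are illustrative:

```vba
' Sketch: open a small, pre-filtered recordset per client instead of
' filtering/finding within one big recordset. "usage_detail" and
' "client_id" are placeholder names.
Public Function OpenClientDetail(ClientID As Long) As DAO.Recordset
    Dim strSQL As String
    strSQL = "SELECT * FROM usage_detail WHERE client_id = " & ClientID & ";"
    Set OpenClientDetail = CurrentDb.OpenRecordset(strSQL, dbOpenSnapshot)
End Function
```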

The best performance comes from Stored Procedures on the server. SPs are precompiled, so they don't need to build a query plan each time. They can be run with an Exec (CALL in MySQL) on a pass-through, but passing parameters means rewriting the SQL property of the QueryDef each time, which is a bit clumsy.

Better still is to execute the stored procedure with the parameters passed in an ADODB Command.
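A sketch of the ADODB approach, assuming a reference to the Microsoft ActiveX Data Objects library is set; the procedure and parameter names are hypothetical:

```vba
' Sketch: call a MySQL stored procedure with typed parameters through an
' ADODB Command. Connect string, procedure and parameter names are
' placeholders.
Public Sub RunBillingProc(ClientID As Long, BillDate As Date)
    Dim cn As ADODB.Connection
    Dim cmd As ADODB.Command
    Set cn = New ADODB.Connection
    cn.Open "DRIVER={MySQL ODBC 5.3 ANSI Driver};" & _
            "SERVER=myserver;DATABASE=mydb;UID=myuser;PWD=mypassword;"
    Set cmd = New ADODB.Command
    With cmd
        .ActiveConnection = cn
        .CommandType = adCmdStoredProc
        .CommandText = "usp_calc_billing"    ' hypothetical procedure name
        .Parameters.Append .CreateParameter("p_client_id", adInteger, adParamInput, , ClientID)
        .Parameters.Append .CreateParameter("p_bill_date", adDate, adParamInput, , BillDate)
        .Execute , , adExecuteNoRecords      ' no recordset needed
    End With
    cn.Close
End Sub
```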
 

Kheribus
Hmm. OK, so I'm a bit confused about how I would reduce the number of recordset.MoveNext operations I'm using.

I need a list of distinct clients, so I create a recordset of their distinct names from a client table. Those clients have subrecords tied to them, so I use their client ID to create another recordset of the records they own, which I use for generating billing metrics. I generate a client's metrics by iterating through that sub-recordset, then insert a line into the invoice. Then I move to the next client with recordset.MoveNext.

Most of my SQL-related procedures are inserts and updates. So you're saying I should perhaps create stored procedures on the MySQL server and pass them parameters, i.e. the field data for the insert or update statement?

Or perhaps you're saying that most of the overhead in my app isn't from the inserts/updates, but rather from the recordset iteration?

Thanks
 

Galaxiom
Just stepping through a recordset is fine but do you really need to use recordsets at all? The whole process would be much faster if it could be built as a query or set of queries.

What are the metrics you are generating? The data could probably be structured to generate them in a query.

Preferably do the entire job in a Stored Procedure on the server.
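To illustrate the set-based idea: the per-client, per-day loop can often collapse into a single INSERT ... SELECT that the server runs once. The schema and metric expression below are entirely illustrative, just to show the shape of it:

```vba
' Sketch: replace the per-client/per-day loop with one set-based statement
' run as a pass-through, so MySQL does all the aggregation server-side.
' Connect string, table names, columns and dates are placeholders.
Public Sub BuildInvoicesSetBased()
    Dim qdf As DAO.QueryDef
    Set qdf = CurrentDb.CreateQueryDef("")   ' temporary pass-through
    qdf.Connect = "ODBC;DRIVER={MySQL ODBC 5.3 ANSI Driver};" & _
                  "SERVER=myserver;DATABASE=mydb;UID=myuser;PWD=mypassword;"
    qdf.ReturnsRecords = False
    qdf.SQL = "INSERT INTO invoice (client_id, bill_date, amount) " & _
              "SELECT u.client_id, u.usage_date, SUM(u.units * u.rate) " & _
              "FROM usage_detail u " & _
              "WHERE u.usage_date BETWEEN '2015-04-01' AND '2015-04-15' " & _
              "GROUP BY u.client_id, u.usage_date;"
    qdf.Execute dbFailOnError
End Sub
```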
 

Kheribus
I will write out the pseudocode of my main program logic tomorrow as I've got to take off. Thank you very much for your support. More to come tomorrow.
 
