Optimizing a DB with an Access front end & MySQL back end

JJSHEP89

So the day was upon us where our database system(s) needed the back end upgraded to something that could handle all our data. The decision was made to consolidate our largest and most-used databases into one front end and one SQL server for the data. We're early on in the consolidation, but my big question is about the most efficient way of pulling data from the MySQL back end into the Access front end.

I currently have one main form driven by a large number of queries, so optimizing this process is crucial.

First off, which index(es) get used - the ones in MySQL or the ones in Access? Does Access require its own indexes at all?

Secondly, would pass-through queries be best, or would it be better to create views within MySQL and then link to those views in my front end?

Can I create a pass-through query in VBA? Currently most of the SQL is generated in VBA based on the user's selections and actions, then opened as a recordset or assigned to a RecordSource property. Can that same approach be used and then passed through?

Running a MySQL back end and connecting via ODBC is new territory for me, so I want to make sure I get this right. We're early on in this transition, and many of our forms with built-in queries are already slow enough to be frustrating.

I've googled for days now and haven't come across a good source for this kind of situation, so any suggested reading would be appreciated.

TIA!
 
Once the data is in another database, Access no longer manages any indexes at all; that is all handled by the DBMS you have switched to.

Provided you have a good network, and your previous database was also a split FE with a shared BE that worked okay, you should find the move to SQL Server pretty painless.

Pass through queries can make a difference if there is a lot of processing required on the server.

You will need to examine all your queries for any VBA functions you may have added in Access. These will not work efficiently in your new environment, because all the data has to be pulled back to Access, the local function run on it, and only then is the result presented to you as the query output - very slow.

That logic would be better moved back to the server and incorporated into views (stored queries) there, which can then be used just like a table in Access.

Also, try to learn to restrict the recordsets on your forms so they retrieve only the minimum data you need.

For instance, don't load a form with SELECT * FROM YourTable if you are then going to search for a single record. Load it with SELECT * FROM YourTable WHERE PrimaryKeyID = 0 instead, then reload it with your search criteria after user input.
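A rough sketch of that pattern in VBA (the table, key and control names here are placeholders, not anything from your app):

Code:
' On open, ask for a key that cannot exist so the form comes up instantly with no rows
Private Sub Form_Open(Cancel As Integer)
    Me.RecordSource = "SELECT * FROM tblOrders WHERE OrderID = 0"
End Sub

' After the user supplies a search value, pull back only the matching record
Private Sub txtSearch_AfterUpdate()
    If IsNumeric(Me.txtSearch) Then
        Me.RecordSource = "SELECT * FROM tblOrders WHERE OrderID = " & Me.txtSearch
    End If
End Sub

That way the ODBC link only ever fetches the handful of rows the user actually asked for.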
 
So it seems to me that building the views and adding the indexes all within the SQL server is best. Now, I have several instances where VBA generates the SQL statement (e.g. "SELECT Field1, Field2, etc. FROM Table WHERE ID = " & varID). I guess these would all need to become pass-through queries, right?

How would I specify a pass-through query in VBA? When I change a query to pass-through from within Access I don't see any difference in the SQL.
 
Ensure all your queries use the SQL server's language, and try to avoid using VBA functions such as IIf, DLookup, UDFs, etc.

They can be a killer over ODBC, because although the driver will make a good job of converting Access SQL to the host equivalent, anything that uses VBA functions will need all the relevant data passed back for VBA to evaluate.

If you have to use them, make sure the query is constructed in a way that minimises the volume of data VBA needs to work on.
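If you do decide a true pass-through is needed, a minimal sketch using a temporary DAO QueryDef looks like this (the connect string, table and criteria are placeholders - substitute your own ODBC details, and note the SQL must be written in the server's dialect because it is sent to the server verbatim):

Code:
Dim qdf As DAO.QueryDef
Dim rs As DAO.Recordset

' An unnamed QueryDef with a Connect string set becomes a pass-through query
Set qdf = CurrentDb.CreateQueryDef("")
qdf.Connect = "ODBC;DSN=MyServerDSN;"   ' placeholder - use your real DSN or driver string
qdf.SQL = "SELECT SomeField FROM SomeTable WHERE SomeID = 123"   ' server-dialect SQL, placeholder names
qdf.ReturnsRecords = True
Set rs = qdf.OpenRecordset(dbOpenSnapshot)

A pass-through QueryDef saved with a name can also be used directly as the recordsource of a form or report.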
 
The vast majority of Access applications work quite well using linked tables and Access querydefs. So start with just a straight replace of Jet/ACE with MySQL.

As you go through and test everything, you will see what needs tweaking and what doesn't. Don't make any change that is not necessary.

This is essentially where I am at right now......

The biggest problem with converting from Jet/ACE to ODBC is that if you haven't used good client/server techniques to build the Access app, you'll have a lot more stuff to change.

Being the novice that I am, I'm sure I haven't been using the best techniques.


One of the biggest issues I have right now is with one of the primary forms in my database. It's fairly large, with a tab control holding 8 tabs, 3 subforms, and roughly a hundred controls (all read-only except for one search box; edits occur in other pop-up forms). The form took a few seconds to open before the switch to SQL; now it takes almost a minute. Almost all the information is pulled from queries against the SQL back end, and all the SQL statements are generated in VBA because the search terms, login info, etc. are stored in global variables that are then inserted into the WHERE clause.

I've even gone through and broken the SQL statements into a Select Case based on which tab is open, hoping it would run a little faster if I waited to query the data when it was needed rather than all at once when the form opened (I didn't see much of a difference).
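Roughly, the per-tab pattern looks like this (the tab control, subform, table and control names below are all placeholders, and it assumes each subform starts with a blank recordsource):

Code:
Private Sub tabMain_Change()
    ' Only assign a subform's recordsource the first time its tab is selected
    Select Case Me.tabMain.Value   ' Value returns the index of the active page
        Case 0   ' first tab
            If Me.subHistory.Form.RecordSource = "" Then
                Me.subHistory.Form.RecordSource = _
                    "SELECT * FROM SomeLogTable WHERE ToolNumber = '" & Me.txtSearch & "'"
            End If
        Case 1   ' second tab
            ' ...same idea for the other subforms/tabs...
    End Select
End Sub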
 
There are many reasons that SQL Server/MySQL (which one, by the way?) can be slower.

The decision was made to consolidate our largest and most used databases into one front end and one SQL server for the data.
this is one of them - if many people are using the server - different apps perhaps - then the server can be slower. It needs to be sized and tuned. Indexes in particular need to be rebuilt on a regular basis on tables subject to many changes - similar to compact and repair.

A client of mine migrated the backend to sql server thinking it would be faster - in fact it was significantly slower, despite optimisation. Reason as above. One app (not access) was such a heavy user that all the other apps suffered. IT did not have the budget to increase resources and in the end, the backend was returned to the ACE environment.

With regard to your form and optimisation, consider whether you have addressed the issues raised in this link: https://www.access-programmers.co.uk/forums/showthread.php?t=291269&highlight=indexes
 

Currently I don't think server performance is the issue, since what's been moved over is still relatively small - less than 5 gigs (it will grow to 100+ gigs as we consolidate our other systems, hence the move). I think it's more the way I have things structured. I've been going through deleting sections of my main form just to narrow down the operations that are bogging things down, and I'm seeing drastic improvements. Currently my form takes 13-14 seconds to load; by removing a few things I've cut that down to 3 seconds (which is still too long).

Thanks for the link, I'll read through it and see what I can glean from it.
 
Consolidating database FEs into one isn't necessarily a good idea. If the same user group uses the individual FEs, it might be more convenient for them to open only a single application, but it could also complicate security and maintenance.

Consolidating the BEs is probably a bad idea unless there are relationships between tables where RI (referential integrity) would be beneficial.

Some of the front ends will be consolidated - the ones commonly used by the same people/department. The back ends all use the same data, but that data is currently stored and maintained in 30+ back-end databases all holding similar information; it's a huge mess and not normalized by any means.
 

So this is a good point. I've started going through some of my queries, and I have a few subforms based on queries with several IIfs and DLookup fields in them.

For example, here is one query that a subform is tied to. There is a series of checkboxes on the main form that are used to filter the data based on the checkbox selection(s). I'd like to move this query into VBA, have it generated there, and then pass it to the back end, but I'm not sure of the best method for doing that.

Here is the SQL currently:

Code:
SELECT presscalllog.PCL_ToolNumber, presscalllog.PCL_Date, presscalllog.PCL_Description, presscalllog.PCL_Employee, presscalllog.PCL_Operator, presscalllog.PCL_Category, presscalllog.PCL_Yank, presscalllog.[PCL_At Setup], presscalllog.PCL_Missfeed, presscalllog.PCL_DetailsReplaced, presscalllog.PCL_DetailsSharpened, presscalllog.PCL_MaterialThickness, presscalllog.[PCL_JOB#]
FROM presscalllog
WHERE (((presscalllog.PCL_ToolNumber)=[Forms]![Toolbook]![txtTM_ToolNumber]) AND ((presscalllog.PCL_Category)=IIf([Forms]![Toolbook]![ckSlugPulling],"Slug Pulling")))
OR (((presscalllog.PCL_Category)=IIf([Forms]![Toolbook]![ckWear],"Wear")))
OR (((presscalllog.PCL_Category)=IIf([Forms]![Toolbook]![ckMaterialBad],"Material Bad")))
OR (((presscalllog.PCL_Category)=IIf([Forms]![Toolbook]![ckMaterialVariation],"Material Variation")))
OR (((presscalllog.PCL_Category)=IIf([Forms]![Toolbook]![ckWorkInstructions],"Work Instructions")))
OR (((presscalllog.PCL_Category)=IIf([Forms]![Toolbook]![ckPressEquipment],"Press Equipment")))
OR (((presscalllog.PCL_Category)=IIf([Forms]![Toolbook]![ckOperator],"Operator")))
OR (((presscalllog.PCL_Category)=IIf([Forms]![Toolbook]![ckToolAndDie],"Tool & Die")))
OR (((presscalllog.PCL_Category)=IIf([Forms]![Toolbook]![ckChangeOver],"Change Over")))
OR (((presscalllog.PCL_Category)=IIf([Forms]![Toolbook]![ckSensorRelated],"Sensor Related")))
OR (((presscalllog.PCL_Category)=IIf([Forms]![Toolbook]![ckUnknownOther],"Other")))
OR (((presscalllog.PCL_Category)=IIf([Forms]![Toolbook]![ckDieTryOut],"Die Try Out")))
ORDER BY presscalllog.PCL_Date DESC;
 
WHERE (((presscalllog.PCL_ToolNumber)=[Forms]![Toolbook]![txtTM_ToolNumber])
Because you are referencing a form variable, the ODBC driver will return everything and then compare against the form variable locally.

You also have # in a field name - don't use non-alphanumeric characters (except underscore); they can cause issues.

With regard to the IIf - that's a bad bit of coding. I'm not going to try to understand why you are doing it that way, but I would suggest "Wear" etc. should be in a lookup table and PCL_Category should be a numeric field linked to the lookup table's PK.

With regard to the VBA, the build would be something like:

Code:
Dim sqlStr As String
sqlStr = "SELECT......FROM.....WHERE PCL_ToolNumber=" & Me.txtTM_ToolNumber & " AND PCL_Category IN (" & ……
Me.RecordSource = sqlStr
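To flesh that out a little, the IN() list can be built from the checkboxes first and the finished statement assigned in one go. A sketch only - it reuses the checkbox and field names from the query above, assumes PCL_ToolNumber is numeric (quote the value if it is text), and wraps the checkboxes in Nz in case they can be Null:

Code:
Dim sqlStr As String
Dim catList As String

' Build a comma-separated list of the ticked categories
If Nz(Me.ckSlugPulling, False) Then catList = catList & ",'Slug Pulling'"
If Nz(Me.ckWear, False) Then catList = catList & ",'Wear'"
If Nz(Me.ckMaterialBad, False) Then catList = catList & ",'Material Bad'"
' ...repeat for the remaining category checkboxes...

sqlStr = "SELECT * FROM presscalllog WHERE PCL_ToolNumber = " & Me.txtTM_ToolNumber
If Len(catList) > 0 Then
    sqlStr = sqlStr & " AND PCL_Category IN (" & Mid(catList, 2) & ")"   ' Mid drops the leading comma
End If
sqlStr = sqlStr & " ORDER BY PCL_Date DESC"

Me.RecordSource = sqlStr

Because the finished string contains only literal values, the whole filter can be resolved on the server instead of dragging rows back for IIf to evaluate.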
 

A lot of this code is pieced together from older and other databases, and some of it dates from back when I knew a lot less about what I was doing.... :confused:

I've significantly sped this form up just by eliminating a few IIfs and DLookups from some of my queries (I also deleted an entire tab that was running a pretty intense query). Now I'm down to less than 1.5 seconds to open, which I can live with for now.
 
Would the VBA run to compile a complete SQL statement and then send it to the server, or does it send a portion of the SQL, stop to execute the VBA function, then send the rest? So essentially the server is tied up the whole time waiting for the VBA to execute?
 
What Pat is saying is that SQL Server will return the entire unfiltered dataset to allow Access (via VBA) to perform the function locally.

Think of it like holding a coin you need to compare against a big bunch of coins in a bank. Normally you'd send the description to the bank and they'd use their clever coin-sorting machine to find a match. However, the coin you have also carries an infrared ink mark that you need to match, and the bank doesn't have the infrared glasses.

So they send you all the coins to look through yourself.
 

Awesome example - I can only laugh at the thought of what that would look like... it definitely helps me understand what is actually happening.

So in order to avoid getting truckloads of coins dumped on me, the best course of action would be to run my VBA, assign the resulting SQL statement to a string variable, and then send the full SQL string to the server, correct? Or is there a better way of doing it?
 
The ODBC drivers involved generally do a very good job of all of this.
So any already saved query will be "translated" into optimised T-SQL without you needing to do a lot.

The only time you really need to think cleverly is if you use Access-side VBA functions, or have local tables joined to linked tables. There are some easyish fixes though.

VBA functions: I have one, for instance, that gets exchange rates from our accounts package and displays the current Sterling-equivalent value. I simply rewrote the function on the SQL server and apply it there in a saved view that I use as the source for my form.

All the heavy lifting is done remotely on the server; I simply display the results in Access.

This does mean learning T-SQL (for SQL Server), but trust me, it's not that hard if you understand Access SQL.
 
I'd start by getting rid of ALL of those awful parentheses in that query. Access randomly adds them in to help, and I think they are hogwash.
 