How to accelerate code

mary.h

Hi,
I've built a database in Access, split into a frontend and a backend: every user has the frontend on their own PC and accesses data hosted in the backend on a server.

The database's task is to generate statistics as charts, which I've implemented as reports with an OLE Graph object. The code that retrieves and assembles the data runs in the frontend db.

When the database runs over the network it sometimes takes several minutes to generate such a report; with frontend and backend together on one PC it never takes more than 30 seconds.

Is there anything like "good practice" for making Access code faster?
Something like preferring saved queries to raw SQL (or vice versa)? Or should I run the code in the backend rather than in the frontend ...

I am really lost here and would be very happy if somebody could give me hints, or point me to books etc. where I can get more info on this.

Kind regards,
mary.h

P.S. Hope you understand what I mean. I am not a native English speaker, as you can probably guess :-)
 
I've already talked to you about speeding up queries on another forum (VBAExpress), so we're onto code now.

As I said there, post what code you have.

It could be that it's repetitive.
It could be that you are not employing the correct data type.
It could just be that you're taking the long way around something.
 
mary.h said:
Hope you understand what I mean. I am not a native English speaker, as you can probably guess :-)

Believe me when I say your English is better than that of a lot of the native English speakers on this forum, and of English speakers in general.
 
Mary -

Search this forum for keywords like "optimize performance", "database performance", "database speed", etc. There have been numerous posts on this subject, and you should find a bunch of really good posts on how to make your whole application faster in general.

HTH,
Kev
 
Thank you first for your replies.

First (to KenHigg), there are currently about 10,000 records, but people are just starting to put data in, so I need the same code to work for a db size of 100,000 records.

Second (to Mile-O-Phile):
a) I've attached modCharts.bas, which is the module in question (unfortunately without many comments, and mostly in German). As I have no idea where the black holes are, I'm submitting it as it is and will give detailed info on whatever questions arise.
b) Are you sure about preferring saved queries to SQL? I'm quite certain I read in a book that any query needs to be translated into SQL first, so if there are many queries this may take a lot of computing time. But I can't remember in what context that was written. Could you tell me more about this?

Third (also to Mile-O-Phile), a special thanks for your extra posting on my English. I lived in Glasgow for 5 months during a university course - that's where I learned my spoken English ...

Kind regards,
mary.h
 


One of the issues that always comes up is efficiency of Access in a network situation. The clue (to me) is your statement

When the database runs over the network it sometimes takes several minutes to generate such a report; with frontend and backend together on one PC it never takes more than 30 seconds.

If nothing else changes but the location of the b.e. and you see a big performance difference, you are probably looking at network slowdowns. Remember that to Access, the b.e. part of the DB sits on a file server, not an application server. So you are executing a select query on a workstation, but what REALLY has to happen is that the ENTIRE TABLE has to cross the network, because the filtration in the SELECT query that picks out your data executes ON THE WORKSTATION, not on the file server.

Now, how do you solve that problem? Well, technically, you cannot make it better without spending money: a faster network backbone, say, or making the file server an SQL server and running the b.e. DB on it. That changes the place where the query is executed.

Failing that, the ONLY other way to handle this depends on the frequency with which the same dataset is extracted. If a particular view (SELECTion) of a table is plotted VERY OFTEN but with different filtration criteria for different plots, you can perhaps "narrow" the table to the minimum number of fields it takes for plotting. If the table data gets plotted only a few times, it isn't worth the effort.

The "plotted very often" case is one of the few times where a properly normalized design should be tinkered a bit. Split the table into two parts, one of which contains only the raw plot data and the plot selection fields. If it turns out that this is the whole table anyway, game over. But if there exists a subset of the table, either subsetting by field or by records, you can do something useful.

In the case where you only use a subset of the fields for all plots, split the table into two sets of fields in a 1-to-1 relationship and use a JOIN query for the times when you need all the data together. But use only the table with the plot data when plotting. DON'T use a query of a query to select only certain fields when plotting; use only the one query that establishes your data set from that raw (but reduced) table. Here the goal is to reduce the width of a data row and ultimately the number of bytes sent across your network.
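To make the field split concrete, here is a minimal sketch - the table names tblPlotDaten and tblDetails and the field names are hypothetical stand-ins, since I don't know Mary's real structure:

Code:
' A minimal sketch, assuming hypothetical tables tblPlotDaten (narrow,
' plot fields only) and tblDetails (everything else), joined 1-to-1 on ID.
Dim db As DAO.Database
Set db = CurrentDb

' Saved query for the rare occasions when all the data is needed together.
db.CreateQueryDef "qryAlleDaten", _
    "SELECT p.*, d.* FROM tblPlotDaten AS p " & _
    "INNER JOIN tblDetails AS d ON p.ID = d.ID;"

' For plotting, a saved query that touches ONLY the narrow table.
db.CreateQueryDef "qryPlotDaten", _
    "SELECT ID, Datum, Wert FROM tblPlotDaten;"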

In the case where your subset is a selection by records and you can ignore data past a certain age, create a temp table for plotting only, update it on some reasonable basis, and run your plots from it. Here the goal is to reduce the number of rows and ultimately the number of bytes on the net.
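And a minimal sketch of refreshing such a temp table - again with hypothetical names (tblPlotTemp as the local temp table, Datum as the date field, a 12-month window):

Code:
' A minimal sketch, assuming a hypothetical local table tblPlotTemp,
' a date field Datum, and a 12-month window of interest.
Dim db As DAO.Database
Set db = CurrentDb

' Empty the temp table, then pull across only the recent rows.
db.Execute "DELETE * FROM tblPlotTemp;", dbFailOnError
db.Execute "INSERT INTO tblPlotTemp (ID, Datum, Wert) " & _
    "SELECT ID, Datum, Wert FROM tblPlotDaten " & _
    "WHERE Datum >= DateAdd('m', -12, Date());", dbFailOnError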

In either case, remember that this is what you do when you don't have SQL Server or Oracle or some other server-centric DB, and ONLY when you have to do a lot of plotting looking at the same subset of data many times. You do not do this if you have (1) a very small number of plots, (2) plots that can potentially select on any field, or (3) plots over the entire content of the table. If any of these three criteria holds, it ain't worth the effort and won't help your performance problem.

By the way, don't worry about your English. It is very good. My German is not very good; I don't have enough words, otherwise I would greet you more properly. But it has been too long since my college German classes.
Have a good day!
 
mary.h said:
Third (also to Mile-O-Phile), a special thanks for your extra posting on my English. I lived in Glasgow for 5 months during a university course - that's where I learned my spoken English.

You're welcome, Mary. Where did you live in Glasgow when you stayed here?
When I was twenty I stayed in Bremerhaven (in northern Germany) for sixteen weeks. It was great.
 
Regarding your question, why are you storing so much stuff in the Daten Variant array?

Also, there's an instance where you delete everything from tblStatistik - don't build that query in code. Create the query, save it, and - in the code - open it.

I'm quite certain I read in a book that any query needs to be translated into SQL first, so if there are many queries this may take a lot of computing time.

A saved query is fine - it's already SQL as far as you are concerned. It's compiled, and faster than creating a query at runtime.

i.e.

Code:
' Build and run the DELETE statement at runtime each time.
Dim strSQL As String
strSQL = "DELETE * FROM tblStatistik;"
DoCmd.RunSQL strSQL
strSQL = vbNullString

The code above is unnecessary, as the string has to be translated into an SQL query at runtime, which adds to database bloat and can slow the database. In such an instance it's easier and better to just create a query with that SQL, save it as qryDeleteStatistiks, and open it at runtime.

i.e.

Code:
DoCmd.OpenQuery "qryDeleteStatistiks"
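
Worth noting, as an aside: opening an action query with DoCmd.OpenQuery raises Access's usual confirmation prompts. If those are unwanted, the saved query can also be executed directly - dbFailOnError here is an assumption about how you'd want errors handled:

Code:
' Runs the same saved delete query without the confirmation dialogs.
CurrentDb.Execute "qryDeleteStatistiks", dbFailOnError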



Dimensioning variables as Variants will also slow your database down.
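
A minimal illustration of that point - the variable names are made up, not taken from modCharts.bas:

Code:
' Untyped declarations are implicit Variants, which cost memory and speed:
Dim Anzahl, Wert                  ' both Variants

' Prefer explicit types where the data allows it:
Dim lngAnzahl As Long             ' whole numbers
Dim dblWert As Double             ' plot values
Dim strMonat As String            ' labels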
 
Thank you all for the help.

1. I will think about splitting the main table into two - one housing the plotting data (I've just given it a minute's thought - I need to think the whole thing over)
(thank you the_doc_man)

2. The Daten variant holds all the data for the statistics before it is written into the table. I thought that was more efficient than opening the table every time. Am I wrong??

3. Will replace SQL statements in code with saved queries.

4. Will search the forum for more (thank you to Kevin_S for the keywords).

5. Thank you for all your German sentences. You're too kind. I lived in Carnarvon Street in Glasgow's Woodlands and worked at Glasgow University. My spoken English still has some "strange" accent, as people keep telling me.

Best regards,
Mary.h
 
That's OK, my German accent, since it is slightly tinged with Cajun, would drive you nuts in a heartbeat.
 
mary.h said:
2. The Daten variant holds all the data for the statistics before it is written into the table. I thought that was more efficient than opening the table every time. Am I wrong??

You only need to open a recordset on the table once (using DAO or ADO), and through it you can write your values directly to the table - see the sketch below.
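
A minimal sketch using DAO - tblStatistik comes from this thread, but the field name Wert and the loop shape are assumptions, since I don't have the real structure of the Daten array:

Code:
' A minimal sketch, assuming DAO; tblStatistik is from this thread, but
' the field name (Wert) and the loop are illustrative assumptions.
Dim db As DAO.Database
Dim rs As DAO.Recordset
Dim i As Long

Set db = CurrentDb
Set rs = db.OpenRecordset("tblStatistik", dbOpenDynaset)

' One open recordset, many writes - no need to reopen the table each time.
For i = LBound(Daten) To UBound(Daten)
    rs.AddNew
    rs!Wert = Daten(i)
    rs.Update
Next i

rs.Close
Set rs = Nothing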

Thank you for all your German sentences.

How bad was my grammar and word choice?


I lived in Carnarvon Street in Glasgow's Woodlands and worked at Glasgow University.

I know it. I used to spend the early mornings (3 to 5) in the nearby Cafe Insomnia. :)
 
