Database Speed

bstice

Hello All,

I am a somewhat self-taught Access user (with lots of help from forums like these). I am building my largest database yet in terms of size, and for the first time I am noticing some speed issues, so I would like some suggestions on how to improve the speed at which the db runs.

I have a form that displays financial data for a fairly large company. Several DSum functions on the form pull from a current results table, a plan table, and a prior year table. All in all, there are close to 200k rows that have to be searched through to display the right data. Here is what I have done so far to make it as quick as possible, but obviously more help is needed.

I have summarized and totaled the tables down to just the information that I need. I have also changed the field sizes to reflect the values that will actually be stored in them - i.e., for text fields I didn't leave the field size at 255, I changed it to the maximum length possible for that field.

What other help would you suggest? Thanks all in advance!

Brennan
 
Don't use DSum; it is utterly slow and inefficient.

For example, if you need 2 values from one table you have to run 2 separate DSums, each of which looks up its data on its own. Use queries instead (see the sketch below).
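For instance (the table and field names here are invented, so swap in your own), a single totals query can return every sum in one pass instead of one DSum call per value:

SELECT AcctCode, Sum(Amount) AS CurrentTotal
FROM tblCurrentResults
GROUP BY AcctCode;

You can bind the form or a subform to a query like this (or join the current, plan, and prior-year totals queries together) rather than firing a separate DSum from each control.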

Also, do you have indexes on the fields that you query a lot?
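If not, it is worth adding them on the fields used in your criteria and joins. You can do that in table design view, or with a DDL query like the one below (the names are just placeholders):

CREATE INDEX idxAcctCode
ON tblCurrentResults (AcctCode);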
 
I just put indexes on the fields and it really sped things up. I will also look into queries vs. DSum to see if that speeds things up further. Thanks Namliam
 
Yes, DSum does run slow!

You didn't say: is your database split into a back end (data tables) and a front end (GUI)?

I've been running a table holding time information that normally holds 200,000 records. Needless to say, I've run into performance issues with it.

Frontend/backend split (for reports?)
It has helped to find ways to query the big table as little as possible. I've built some "holding" tables: I transfer in just the segment of data I want to look at, run my queries off the data in the holding table, and "dump/delete" that data either when I run a new query or when I exit the database.

Most of my "slowness" happens when I run modules/queries against a linked table, and performance improves a lot once I make the data "local" to the machine; I'm fairly sure that's because the data no longer has to come across the link every time. There's a rough sketch of the holding-table idea below.
(Not suggesting you import ALL of the data from the table.)
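Roughly what I mean, written as plain Access SQL (the table and field names are invented for the example): clear the local holding table, then pull in only the slice you need from the linked table and run your queries against the local copy:

DELETE FROM tblHolding;

INSERT INTO tblHolding (AcctCode, PostDate, Amount)
SELECT AcctCode, PostDate, Amount
FROM tblLinkedResults
WHERE PostDate BETWEEN #1/1/2010# AND #12/31/2010#;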

Anyone else's thoughts?
 
