Why does my Query take so long to run?

Paulch
Evening All

I have a query which, when run as a select query, takes around 1 minute to run and returns a recordset on screen which looks correct. When, however, the same query is run as an Append query, it takes - well, so far it has been running for four hours and isn't even halfway yet.

Some details: the database is huge. The main records table has over 1.5 million records, and there are other supporting tables. The database is designed to produce budget information by taking a set of cost values and apportioning them over various cost centres, using an apportionment matrix held in a separate table. This results in an Output table of over 3.1 million records and a file size of nearly 2 GB. The original designers used Access (2003) rather than Excel (2003) because of the restriction on the number of rows per worksheet in Excel, but even Access is now at its limits.

The incoming data consists of a variety of Data Types, each of which has its own section in the apportionment table. The query in question deals with one Data Type. A filter on one of the selected fields reduces the number of records to around 48,000. There are three other selected fields and one calculated field, which is simply the product of the value field and its corresponding apportionment entries. This results in an output recordset of around 105,000 records.
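
For what it's worth, the append query is of roughly this shape - I've substituted illustrative table, field and Data Type names here rather than the real ones:

INSERT INTO tblOutput ( CostCentre, Period, DataType, ApportionedValue )
SELECT appn.CostCentre, costs.Period, costs.DataType,
       costs.CostValue * appn.ApportionPct AS ApportionedValue
FROM tblCosts AS costs
INNER JOIN tblApportionment AS appn ON costs.DataType = appn.DataType
WHERE costs.DataType = 'Payroll';

The select version is identical apart from the INSERT INTO line.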

Any suggestions or advice would be gratefully received. I have tried everything I can think of, and a few things that others have thought of: the number of output fields has been reduced to an absolute minimum, and all appropriate fields have been indexed. I'm at my wits' end. The most frustrating part is the disparity between the speed of the two types of query, and being able to see the result without being able to do anything with it.

Thanks

Paul
 
For starters, you are approaching the 2 GB size limit for an Access database, so that will cause you problems in the very near future. I would recommend doing a Compact and Repair to clear out the bloat from the temporary query data it may be storing. This alone might solve your problems.

I would also recommend archiving off some data if possible. Access really isn't designed to store millions of records in one table. You may see your speed increase greatly by moving to a larger platform, such as SQL Server.

You may also try breaking the query up so it doesn't have to do so many calculations in one pass. Try using a subquery (or a saved query) to reduce the number of records your final criteria and calculations have to work through; with that many records to calculate on, the run time can increase significantly.
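
In Access the simplest way is to save the filtered select as its own query and base the append on that. Something like this, with your real names in place of my made-up ones. First, save this as qryFilteredCosts:

SELECT Period, DataType, CostValue
FROM tblCosts
WHERE DataType = 'Payroll';

Then append from it:

INSERT INTO tblOutput ( CostCentre, Period, ApportionedValue )
SELECT appn.CostCentre, flt.Period, flt.CostValue * appn.ApportionPct
FROM qryFilteredCosts AS flt
INNER JOIN tblApportionment AS appn ON flt.DataType = appn.DataType;

That way the 1.5 million rows get filtered down first, and only the 48,000 are joined and calculated on.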
 
Is the field that reduces the number of records to 48,000 an indexed field?

Extract those records first, then do the linking to the other tables if necessary.
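
That is, something along these lines as the first step (names here are only illustrative) - a make-table query to pull the 48,000 rows out on their own:

SELECT ID, Period, DataType, CostValue
INTO tblCostsSubset
FROM tblCosts
WHERE DataType = 'Payroll';

Then run the append from tblCostsSubset joined to the apportionment table, rather than from the full 1.5 million-row table.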
 
Thanks for coming back.

Vassago:

Yeah, wrong tool for the job, eh? I'm compacting and repairing after virtually every test run. Unfortunately archiving isn't an option, as all the data is "live", and the client is reluctant to talk about SQL Server just yet (but I'm working on it).

highandwild:
The filter field is indexed, but I am trying a variation on your smaller-chunks suggestion, sketched below. I'll let you know how I get on.
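
The variation is to run the append in slices by key range rather than in one go, repeating with the next range each time. Roughly this, with illustrative names again, and assuming the extracted table carries a numeric ID:

INSERT INTO tblOutput ( CostCentre, Period, ApportionedValue )
SELECT appn.CostCentre, sub.Period, sub.CostValue * appn.ApportionPct
FROM tblCostsSubset AS sub
INNER JOIN tblApportionment AS appn ON sub.DataType = appn.DataType
WHERE sub.ID BETWEEN 1 AND 10000;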

Thanks.
 
