Calculations take ages to run

Tay

likes garlic
Local time
Today, 22:28
Joined
May 24, 2002
Messages
269
I wasn't sure of the best section for this, so opted for 'General'.

I have a db which records the results of accuracy checks across different workstreams carried out in our organisation. Aside from the accuracy side of things, it also logs date information. For three workstreams, I need to work out the clearance times between when a form is received and when it is actioned. Thanks to some working-days code I found linked to on here (it takes public holidays into account), I am successfully calculating this for each check. But it is running dreadfully slowly and, on occasion, it crashes my machine. At present I'm working on copies of the FE and BE of the db, rather than the live version, which is on the network in a multi-user environment.
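The working-days code itself isn't shown in this thread, but the general idea is straightforward: count the days between two dates, skipping weekends and any date in a holiday list. A minimal Python sketch (the function name and behaviour here are illustrative, not the actual fNetWorkdays implementation):

```python
from datetime import date, timedelta

def net_workdays(start, end, holidays=()):
    """Count weekdays from start to end inclusive, skipping holidays.

    A rough stand-in for a working-days function like the one
    described above; the real code in the db may differ.
    """
    if end < start:
        return 0
    days = 0
    current = start
    while current <= end:
        # weekday() < 5 means Monday-Friday
        if current.weekday() < 5 and current not in holidays:
            days += 1
        current += timedelta(days=1)
    return days

# Mon 4 Jul 2011 to Fri 8 Jul 2011 spans five weekdays
print(net_workdays(date(2011, 7, 4), date(2011, 7, 8)))  # 5
```

Note that a day-by-day loop like this is fine per call, but running it once per record inside a query is exactly the kind of repeated work that can add up over thousands of rows.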

I've compiled the db and found no problems. I don't know if it's running slowly because of the code, or the amount of calculations it needs to do. At present, there are around 4000 records to check and this grows by 1000 each month. The db has replaced an Excel spreadsheet which had the same function - it was replaced as it was so slow. I don't wish to impose another appallingly slow system on people, especially when Excel produced better reports!

The working days calculation runs in a query. However, I wonder if I should store the results in a table. The dates aren't going to change, so this shouldn't be a reason for not storing the calculation results this way. The problem is that in order to report in the way that managers want, I need to:

*calculate the working days to establish clearance time per check per month
*calculate the average clearance time per check per month
*calculate the number of checks cleared in more than 3 days per month
*calculate the % of checks cleared in under 3 days (this is then used on a cumulative basis) per month

Unless I'm being dense, I figured that I'd need to achieve this through a series of queries. So for the average clearance time, I first need to work out the clearance times, then sum them (per month), then take the mean. I can't do all that in one query.
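The chain of calculations in the list above can be sketched in miniature. The data and month keys here are invented purely for illustration; they stand in for the per-check clearance times the initial query produces:

```python
from statistics import mean

# Hypothetical per-check clearance times (in working days), grouped
# by month, standing in for the output of the initial query.
clearances = {
    "2011-06": [1, 2, 4, 2],
    "2011-07": [3, 1, 5, 2, 1],
}

for month, times in clearances.items():
    avg = mean(times)                                   # average clearance time
    over_three = sum(1 for t in times if t > 3)          # cleared in more than 3 days
    pct_under = 100 * sum(1 for t in times if t < 3) / len(times)  # % under 3 days
    print(month, avg, over_three, round(pct_under, 1))
```

All three aggregates come from one pass over the same per-month groups, which is why, in query terms, they can often hang off a single grouped query over the initial results rather than a long chain.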

This is where the slowness is happening. The initial query runs in about 35 seconds. But when I use this query as the basis of the next query, it gets a lot slower. I don't see this improving when I get to the next query (which will be based on the second).

Any advice would be gratefully received, as I'm unsure how to proceed with this.
 

MarkK

bit cruncher
Local time
Today, 14:28
Joined
Mar 17, 2004
Messages
8,185
Make sure any fields you use to search or sort are indexed. For you this'll be the dates. Go to table design, select the field and for the 'Indexed' property down below select Yes (Duplicates OK).
See if that makes a difference.
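In plain SQL terms, setting that property is equivalent to creating an index on the column, which lets the engine seek matching rows instead of scanning the whole table. A small sketch using SQLite (table and column names invented for the example, not taken from the thread's db):

```python
import sqlite3

# In-memory database with an indexed date column
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE tbl_Main (CheckDate TEXT, Result TEXT)")
con.execute("CREATE INDEX idx_CheckDate ON tbl_Main (CheckDate)")

# EXPLAIN QUERY PLAN shows the index being used for a date-range filter
plan = con.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM tbl_Main "
    "WHERE CheckDate BETWEEN '2011-07-01' AND '2011-07-31'"
).fetchall()
print(plan)
```

The plan output should report a SEARCH using idx_CheckDate rather than a full table SCAN; that seek-versus-scan difference is where the speedup comes from.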
Mark
 

Tay

likes garlic
Local time
Today, 22:28
Joined
May 24, 2002
Messages
269
Thanks Mark. I'm sorry to come back to this thread so late; I was off work and only returned today.

I checked the relevant fields and indeed you were right, I'd failed to index them. The initial query now runs in a couple of seconds, with the final query taking just over a minute. At least it's not crashing my machine, but I do fear what will happen if I put it into place on the server for a number of people to run it.

Should I consider storing the calculated field in a table, or endeavour to find another reason why the query is running so slowly?
 

Galaxiom

Super Moderator
Staff member
Local time
Tomorrow, 07:28
Joined
Jan 20, 2009
Messages
12,856
Another thing to watch out for with dates is where functions are applied. This example returns records for the current month.

Code:
WHERE Month([datefield]) = Month(Date())

However, clauses like this are very slow because the function is applied to every record in the database before deciding whether it is to be returned. They grow increasingly slower as the number of records increases.

This alternative is very much faster because the functions are only applied to the criteria and the date of the record can be read directly from the index:

Code:
WHERE [datefield] BETWEEN DateSerial(Year(Date()),Month(Date()),1) AND DateSerial(Year(Date()),Month(Date())+1,0)

This is just an example and your precise needs will obviously be different, but the universal principle with dates is to avoid processing the record value wherever possible and instead generate a date range for the criteria.
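The same principle in a runnable sketch (Python here, since Access SQL can't be run standalone): compute the month's endpoints once up front, then compare stored values against them directly. The day-0 trick in DateSerial above, giving the last day of the previous month, is mimicked by taking the first day of the next month and stepping back one day:

```python
from datetime import date, timedelta

def month_range(today):
    """First and last day of today's month, built once up front,
    mirroring the DateSerial trick (day 0 of the next month is
    the last day of this one)."""
    first = date(today.year, today.month, 1)
    if today.month == 12:
        next_first = date(today.year + 1, 1, 1)
    else:
        next_first = date(today.year, today.month + 1, 1)
    last = next_first - timedelta(days=1)
    return first, last

# First and last day of July 2011
print(month_range(date(2011, 7, 20)))
```

Because the two endpoints are constants by the time the comparison runs, an indexed date column can be read straight from the index instead of being fed through a function per record.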
 

Tay

likes garlic
Local time
Today, 22:28
Joined
May 24, 2002
Messages
269
At least it's not crashing my machine, but I do fear what will happen if I put it into place on the server for a number of people to run it.

Famous last words! Running the same final query today, my machine crashes every time.

Thanks, Galaxiom. SQL isn't my forte, but I don't think this is the cause of my problem? I've pasted the relevant bit from my initial query below.

Code:
WHERE (((tbl_Main.Date) Between [Type the start date:] And [Type the end date:]) AND ((fNetWorkdays([Date_Received],[Date],1))<3) AND ((tbl_PPC_Mat_Med.New_or_Retained_ID)=1));

When prompted, if I put in July's date range, the final query takes a couple of minutes to run. But when I choose 01/04/2011 to 31/07/2011 (from when the db was born to now), my pc crashes.
 

Galaxiom

Super Moderator
Staff member
Local time
Tomorrow, 07:28
Joined
Jan 20, 2009
Messages
12,856
Usually coding problems will simply throw an error within Access. Crashing is more serious.

I suspect you have a problem with your PC. Queries can take a lot of memory and the bigger date range may be pushing into parts of the memory that are playing up.

Can you try it on another PC?
 

gemma-the-husky

Super Moderator
Staff member
Local time
Today, 22:28
Joined
Sep 12, 2006
Messages
15,709
it's hard to tell without seeing your code - but are your tables similar in nature to your spreadsheet - or are they properly normalised?

that may be the issue.

4000 records should be very few for access to process.
 

Tay

likes garlic
Local time
Today, 22:28
Joined
May 24, 2002
Messages
269
I've got the day off work today, so will try it on another machine tomorrow (and will see how it runs at home).

I believe that my db is properly normalised (or as near to it as can be), and is nothing like the original spreadsheet. Hopefully, that's not the issue.

It's just odd, as the initial query (the one which works out the time elapsed between us receiving a form and then processing it) runs ok. It's the subsequent queries, which use the data from the initial query to calculate the average working time, the count of checks completed in more than 3 working days, and the percentage of checks completed within 3 working days, where Access has issues and crashes.
 

Tay

likes garlic
Local time
Today, 22:28
Joined
May 24, 2002
Messages
269
Right. Tried it on a couple of machines at home. On the oldest (with just 128 meg of RAM) it ran in 12 seconds, and on a newer, more capable pc, it ran in 9 seconds.

So, I shall try it on a pc other than mine tomorrow to see if there is any improvement. I will keep my fingers crossed that there is. Otherwise, it means that our appallingly slow network is the reason and I dread to think where that will leave me.
 

Tay

likes garlic
Local time
Today, 22:28
Joined
May 24, 2002
Messages
269
Back at work today. Ran the query on my machine = crashed it. Ran it on another with less memory than mine, ran in about 2 minutes. And ran it on another machine with the same spec as mine and it ran in around a minute and a half. Tried it a few times on the other machines and logged a call to get my pc fixed.

2 minutes isn't great, but at least it runs. Is this a reasonable amount of time for a query to run (when the db is on a network)? It definitely seems as though my pc is not very well, but do I have issues with the query too?
 
