slowness over network

cdoyle

Registered User.
Joined
Jun 9, 2004
Messages
383
Hi,
I've been developing a database (900k records so far), and with the DB installed on my local machine the queries I use for my reports work pretty well. They open in under 10 seconds.

But now that I'm about done with the DB, I've moved it to the network, and those same queries are taking forever to run, some over a minute to open!!

Here is an example of one of the queries
Code:
SELECT q_complete_list_IIR_2Apt1.Main_ID, q_complete_list_IIR_2Apt1.Code, q_complete_list_IIR_2Apt1.Code_Description, q_complete_list_IIR_2Apt1.Display_text, q_complete_list_IIR_2Apt1.Static_Text, q_complete_list_IIR_2Apt1.Effective_Start_Date, q_complete_list_IIR_2Apt1.Effective_End_Date, tbl_IIR_description.IIR_code, tbl_IIR_description.IIR_description
FROM q_complete_list_IIR_2Apt1 LEFT JOIN tbl_IIR_description ON q_complete_list_IIR_2Apt1.Information = tbl_IIR_description.IIR_ID
WHERE (((q_complete_list_IIR_2Apt1.Effective_Start_Date)<=[Forms]![frm_report_screen]![txt_date_completelist]) AND ((q_complete_list_IIR_2Apt1.Effective_End_Date)>=[Forms]![frm_report_screen]![txt_date_completelist]));

I've already tried to compact/repair the database, and that brings the overall file size down, but really doesn't help these queries at all.

I thought at first that, since I don't need to look at 'all' the records, maybe I could do a prefilter. So I created another query to get only certain records, and then used that in the first query. But that actually made it even slower. I guess it's because the new query still has to look at the main table to start with, so it really just added another step to the process.

Any tips, to help get this to speed up some?
 
Is there nothing that can be done by me?
 
My experience with trying to subquery using large tables is that it does tend to make matters worse rather than better with Access.
You could however look at using a derived table if you have some set criteria that would reduce your 900k table to the 100k that you're actually interested in.

Let's just pretend for a minute that you've got two yes/no columns in your main table; if they are both set to "no" then you are interested in the data, and that will filter out 500k records from your data. You can use a standard query:
Code:
SELECT *
FROM myTable
INNER JOIN someotherTable ON
  column = column
WHERE [column1] = no AND [column2] = no
AND someothertable.[column] = something
What you are basically doing here is querying your entire mytable contents and performing the JOIN, then filtering out the stuff you don't want.

Using a derived table in the first FROM selects only the initial information that you want from myTable to begin with, then performs the JOIN and does the remaining filtering:
Code:
SELECT * 
FROM (SELECT * FROM myTable WHERE [column1] = no AND [column2] = no) AS Alias
INNER JOIN SomeotherTable ON
  Column = Column
WHERE someothertable.column = something

While I don't recommend using Select * at either point, you do need to make sure that you Select all the columns you need from your derived table query (like the column you're going to join on, for example).
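For instance, spelling the columns out in both SELECTs might look like this; the table and column names are the same placeholders as above, with JoinColumn and SomeDate standing in for whatever your real columns are:

```sql
SELECT Alias.ID, Alias.SomeDate, SomeotherTable.SomeValue
FROM (SELECT ID, JoinColumn, SomeDate
      FROM myTable
      WHERE [column1] = No AND [column2] = No) AS Alias
INNER JOIN SomeotherTable
  ON Alias.JoinColumn = SomeotherTable.JoinColumn
WHERE SomeotherTable.SomeValue = 'something';
```

The derived table only needs to return the columns the outer query actually references, but the join column must be among them or the ON clause will fail.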


One of the main problems with sticking Access on a network server is that 9 times out of 10 it ends up on a file server, which is basically designed for availability rather than performance (and has every man and his dog hitting it to open their Word documents etc.).

You could also try using your current prefilter to drag the information down to a local "temporary" table and then running your report queries locally where you don't have the fileserver IO and network bottleneck to contend with.
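In Access terms that local staging step could be a make-table query run from the front end. A sketch, borrowing the prefilter used elsewhere in this thread (tbl_local_temp is a made-up name):

```sql
SELECT Main_ID, CodeID, Edit_Type_ID, Effective_Start_Date, Effective_End_Date
INTO tbl_local_temp
FROM T_tbl_main
WHERE Edit_Type_ID = 1;
```

Your report queries would then read from tbl_local_temp instead of the linked table; you'd re-run the make-table query whenever the data needs refreshing.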

You could also look at indexing the table in question, if it isn't already. Look at your WHERE clauses and JOINs: are those columns currently indexed? If not, does indexing them make any appreciable difference?
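If you'd rather not touch the table design directly, an index can also be added with a data-definition query. For example, to index the IIR_ID column used in the join earlier in the thread (assuming it isn't already the primary key):

```sql
CREATE INDEX idx_IIR_ID
ON tbl_IIR_description (IIR_ID);
```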
 
I kinda see how the derived table works, but I'm not sure how to write this.

Here is pt1 of my query.
If I can filter on Include_in_Web from tbl_edit_type being set to true, that should reduce the records.

Code:
SELECT T_tbl_main.Main_ID, tbl_IIR_description.IIR_code, tbl_Edit_Type_IRR.Issue, tbl_Edit_Type_IRR.Information, tbl_IIR_description.Static_Text, tbl_display_text.Display_text, T_tbl_main.Effective_Start_Date, T_tbl_main.Effective_End_Date, tbl_code_description.Code, tbl_code_description.Code_Description, tbl_edit_type.Include_in_Web
FROM tbl_display_text INNER JOIN (tbl_code_description INNER JOIN ((tbl_edit_type INNER JOIN T_tbl_main ON tbl_edit_type.Edit_Type_ID = T_tbl_main.Edit_Type_ID) INNER JOIN (tbl_Edit_Type_IRR INNER JOIN tbl_IIR_description ON tbl_Edit_Type_IRR.Issue = tbl_IIR_description.IIR_ID) ON T_tbl_main.Main_ID = tbl_Edit_Type_IRR.Main_ID) ON tbl_code_description.CodeID = T_tbl_main.CodeID) ON tbl_display_text.display_text_ID = tbl_IIR_description.display_text_ID
WHERE (((tbl_IIR_description.IIR_code)="2a"));


I've also been messing with the indexes, and it doesn't seem to be helping with the speed at all :(
 
Just to be sure, when you say DB is on network, do you mean the office LAN or across the WAN?
 
I've been working some more on this, and got this far

Code:
SELECT T_tbl_main.Edit_Type_ID AS Expr1, tbl_IIR_description.IIR_code, tbl_Edit_Type_IRR.Issue, tbl_Edit_Type_IRR.Information, tbl_IIR_description.Static_Text, tbl_display_text.Display_text, tbl_code_description.Code, tbl_code_description.Code_Description, alias.Main_ID, alias.CodeID, alias.Edit_Type_ID, alias.Effective_Start_Date
FROM tbl_display_text INNER JOIN ((([SELECT Main_ID, CodeID, Edit_Type_ID, Effective_Start_Date,Effective_End_Date from  T_tbl_main where Edit_Type_ID = 1]. AS alias INNER JOIN tbl_code_description ON alias.CodeID = tbl_code_description.CodeID) INNER JOIN tbl_Edit_Type_IRR ON alias.Main_ID = tbl_Edit_Type_IRR.Main_ID) INNER JOIN tbl_IIR_description ON tbl_Edit_Type_IRR.Issue = tbl_IIR_description.IIR_ID) ON tbl_display_text.display_text_ID = tbl_IIR_description.display_text_ID
WHERE (((tbl_IIR_description.IIR_code)="2a"));

This works and gives me the same results. I'm just not sure if it's written in a way that will get me the results the fastest.

This query will be used in another query.
 
Hi,

If you are not editing any of the records from your query, and only viewing them, then try setting the Recordset Type to Snapshot instead of Dynaset.

If this works, then when you wish to edit one of these records, run the query again as a Dynaset filtered on the ID of the record you want to edit; because you are only pulling one record, it should be faster.
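That single-record edit query might look something like this, with txt_edit_ID standing in for a hypothetical form control holding the chosen ID:

```sql
SELECT *
FROM T_tbl_main
WHERE Main_ID = [Forms]![frm_report_screen]![txt_edit_ID];
```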

Let me know how it goes.

Garry
 
These queries are for some reports, so there won't be any editing using these.

I've been trying to figure out how to switch to Snapshot, but I can't seem to find where to make this change.

Do I go into the query's design mode? Access Help only gives me a way to export reports as a snapshot, but nothing on changing a query.
 
If the query is for a report it is automatically Snapshot.
If you are viewing the data in form view, the Snapshot/Dynaset setting is on the Data tab of the form properties.
 

Does this look right?
Code:
SELECT Alias.Main_ID, Alias.Edit_Type_ID, Alias.CodeID, Alias.Effective_Start_Date, Alias.Effective_End_Date, tbl_edit_type.Static_Text, tbl_display_text.Display_text, q_active_codes.Code, q_active_codes.Code_Description
FROM tbl_display_text INNER JOIN ((q_active_codes INNER JOIN [SELECT Main_ID, CodeID,Edit_Type_ID, Effective_Start_Date, Effective_End_Date from T_tbl_main where Edit_Type_ID = 11]. AS Alias ON q_active_codes.CodeID = Alias.CodeID) INNER JOIN tbl_edit_type ON Alias.Edit_Type_ID = tbl_edit_type.Edit_Type_ID) ON tbl_display_text.display_text_ID = tbl_edit_type.display_text_ID
WHERE (((Alias.Effective_Start_Date)<=[Forms]![frm_report_screen]![txt_date_completelist]) AND ((Alias.Effective_End_Date)>=[Forms]![frm_report_screen]![txt_date_completelist]));

Access changes it a little after I make the joins etc.
 
OK, maybe I'm doing something wrong with the query I tried to make above. It will be working fine, and then all of a sudden it just stops working, saying it can't find a valid field in the SELECT.

So then I can delete any column from the query, put it right back, and it works again?

What gives?
It must have something to do with Access changing the SQL to what I posted above.
 
