Apply filter without requerying SQL back-end

dkmort

New member
I have a multiple items form, and I'd like the user to be able to click various buttons to apply different filters to the form's data. This is working via DoCmd.ApplyFilter.

However, every time this action is taken, the filter is actually sent as a query to the back-end Microsoft SQL Server, causing a significant delay, as these are pretty large tables that take about a minute to display in the first place.

What I'd like to do is run an unfiltered query of the form's data source initially, then apply various filters to the currently loaded recordset locally, on the front-end only. Is this possible?

I think it would really speed things up, as most of our desktops are actually significantly faster than the server. Plus, the server's query actually pulls from several tables to build the result, which takes more time. I would think that once the recordset is loaded into the Access client's front-end in RAM, it would be a much quicker and simpler process to run filters against what the Access front-end sees as a single table (since the initial source query is returned in the form of a simple table).

I appreciate your help :)

-
Doug
 

Galaxiom

Super Moderator
Staff member
Are you sure the recordset is being recreated by the server? Maybe the filter just takes a long time to apply.

If the recordset is larger than the available RAM in the PC, the overflow will be written to disk, and applying the filter will then be quite slow while it reads from the disk. It is probably better to limit the records that are returned and let the server be requeried against the smaller number of records.

What recordset type do you have for the form?

Long shot, but try applying the filter to the form like this and see if it makes any difference.

Me.Filter = "yourfilterstring"   ' e.g. "OrderStatus = 'Open'"
Me.FilterOn = True               ' apply the filter

Also, have you indexed all the fields in the tables that are used in any joins and in the WHERE clause of the query? It makes a huge difference to the speed of a query.
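
If you only have Access handy, a pass-through query will run the DDL on the server for you. A rough sketch (the DSN, index, table, and field names are just placeholders; check what's already indexed first, especially on a vendor database):

Dim qdf As DAO.QueryDef
Set qdf = CurrentDb.CreateQueryDef("")      ' temporary, unnamed QueryDef
qdf.Connect = "ODBC;DSN=MySqlServerDSN;"    ' assumed ODBC DSN for the back-end
qdf.SQL = "CREATE INDEX IX_Customer ON dbo.SomeTable (CustomerID);"
qdf.ReturnsRecords = False                  ' DDL returns no rows
qdf.Execute dbFailOnError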
 

dkmort

New member
There is no question that it is requerying my SQL Server. Any time I use DoCmd.ApplyFilter behind a button, the CPU on my SQL Server goes to 100% until the data finally displays on the form in my front-end.

I will check into your other suggestions.

Thanks.
 

jal

Registered User.
Out of curiosity, exactly how large are these "huge" tables? And exactly how long is the "significant delay"?

Without some kind of ballpark figures, it's hard for us to know how severe the problem is.
 

dkmort

New member
The filter just took 49 seconds to apply. During this time the Access front-end is unresponsive, and on the SQL 2005 server the CPU is at 100%, the bulk of it from sqlservr.exe. The moment Access becomes responsive again, the CPU on the server drops back down to nearly idle.

As for the table sizes:

I just checked and realized that the huge table we're using for this query is not actually a table; it's a view. It has 34,325 records. SQL Server Management Studio reports the following for the database size:
Size: 997.31 MB
Space Available: 253.05 MB

The SQL database itself actually has 823 tables. It is the SQL database for Microsoft Office Accounting (aka Small Business Accounting).

As far as RAM on my client computer is concerned (Access front-end), I have 8GB. Everything else on my PC is very snappy even while Access is not responding during the filter application.

Recordset type in the form is Dynaset.

I'm not sure on the indexed fields. I'll have to check.
 

boblarson

Smeghead
Sounds to me like the SQL Server tables and view need to be optimized first.

Also, once they are optimized, instead of applying a filter I would think it would be faster to just return a new recordset based on the criteria, so that the server does more of the work.
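
Something along these lines, for example (the view, control, and field names are just placeholders):

' Rebuild the form's record source with the criteria in the WHERE clause
' so SQL Server returns only the rows the user asked for.
Private Sub cmdShowCustomer_Click()
    Me.RecordSource = "SELECT * FROM dbo_vwOrders " & _
                      "WHERE CustomerID = " & Me.txtCustomerID
    ' Assigning RecordSource requeries the form automatically.
End Sub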
 

Banana

split with a cherry atop.
Bob's suggestion is probably what most of us, myself included, would do, because it's simple and effective. Generally, I think it's very inefficient to dump a pile of records that the user may not even want, so it usually makes more sense to provide a search form first and let the user specify the criteria for the set of records to download.

I did have one occasion where I decided it was actually more efficient to download a set of records and then permit filtering by the end users, because in that case the end users did need all the records but might use filtering & sorting to help with their workflow as they worked on different records. To do that I had to use an ADO recordset, which I could then disconnect and thus apply filtering & sorting without any round trip back to the server (there's a rough sketch below). However, this makes the form much more complicated, and there are quirks with ADO that need to be coped with.

Using a local temp table with DAO is arguably simpler to implement. I know some people create a "work database" as a place for temp tables, dump the data in there, and do whatever filtering & sorting they please; because it's still DAO, there are no surprises as there are with ADO.
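
For illustration, here is a rough outline of the disconnected ADO approach (it needs a reference to Microsoft ActiveX Data Objects; the connection string, view, and field names are made up):

Private Sub Form_Load()
    Dim cn As ADODB.Connection
    Dim rs As ADODB.Recordset

    Set cn = New ADODB.Connection
    cn.Open "Provider=SQLOLEDB;Data Source=MyServer;" & _
            "Initial Catalog=MyDatabase;Integrated Security=SSPI;"

    Set rs = New ADODB.Recordset
    rs.CursorLocation = adUseClient           ' client-side cursor so we can disconnect
    rs.Open "SELECT * FROM vwOrders", cn, adOpenStatic, adLockBatchOptimistic

    Set rs.ActiveConnection = Nothing         ' disconnect; all data is now local
    cn.Close

    Set Me.Recordset = rs                     ' bind the form to the local recordset
End Sub

' Later, filtering happens entirely in local memory, with no server round trip:
Private Sub cmdOpenOrders_Click()
    Me.Recordset.Filter = "OrderStatus = 'Open'"   ' hypothetical field and value
End Sub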

Of course, all of that only makes sense if you've remembered to index and optimize your SQL Server tables accordingly; otherwise we'd just be hiding a design flaw and complicating our application for no reason.

I hope this helps.
 
