Stored Parameter Query vs. Built SQL Statement in VBA (FE/BE Environment)

directormac

Dear Assorted Access Gods:

I'm (maddeningly) slowly working on my first split FE/BE project. This will be a distributed .mde FE that hooks to an Access BE stored on a web server (via, I THINK, providing the FE with linked tables that point at the server--but I haven't gotten around to splitting the thing yet).

I know that I should base all forms & reports on queries, rather than the actual tables. No problem there. However, I have a lot of VBA code to generate all sorts of reports, export Word docs, send e-mails, etc. Most of these start from a selected record--a booking, in this case. Some use that selected record to look up various bits of info in related tables (venues, contacts, performance dates & times, etc.) to work their mojo.

Now I already know that I should avoid DLookup(), since the number of records and concurrent users will (someday) be moderately high--and given Pat H.'s many firm opinions on the matter, I try to avoid DLookup() anyway! Likewise, I know Pat's penchant for stored QueryDefs. My own proclivity, however, has usually run toward building an SQL statement within VBA and using that as the condition on my .Open method. Still, Pat's continuing argument that stored queries are more efficient because they do much of their work when compiled, rather than at call time, is persuasive.
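
For what it's worth, my current build-the-SQL habit looks roughly like this (ADO, and the table/field/control names here are made up just for illustration):

Dim rs As ADODB.Recordset
Dim strSQL As String

' Build the WHERE clause from the selected booking (invented names)
strSQL = "SELECT v.VenueName, v.VenueCity " & _
         "FROM tblVenues AS v " & _
         "WHERE v.VenueID = " & Me.lstBookings.Column(2)

Set rs = New ADODB.Recordset
rs.Open strSQL, CurrentProject.Connection, adOpenForwardOnly, adLockReadOnly

If Not rs.EOF Then
    ' ...work the mojo on the returned fields...
End If

rs.Close
Set rs = Nothing

The string concatenation is the part I keep second-guessing, since (as I understand Pat's argument) the whole statement gets handed over and compiled at call time.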

So, here's my question (at last!) to my betters: What is the "best" method for retrieving one or a small set of records from one or more tables in a multi-user FE/BE setup? I want to factor in speed, efficiency, record-locking issues, and ease of coding. (Also, I try to avoid DAO whenever possible, on the grounds that ADO is "the future" - HA.)

Some methods I'm using now:
  • Saved queries whose criteria get all their values from forms (hidden or not).
  • Built-from-scratch SQL WHERE clauses applied to the .Open method.
  • A "temp table" - when the user selects record "X" for the above-described ministrations, a VBA function uses several WHERE'd recordsets to gather info from all the various tables, then writes it all as the single record in my temp table. The ministrations then work off that single record and delete it afterward. In the FE/BE environment, this temp table would be one of only two "local" tables not stored on the server.
Some methods I'm investigating:
  • Saved parameter queries where values are supplied from the opening call (rough sketch after this list).
  • Hidden form based on a long chain of saved queries to get the individual pieces of info in one place (then VBA ministrations pull their data from there).
  • QBF techniques.
  • Saved queries which then have filters applied (example: saved query returns all the venues, filter then narrows it to the venue related to the selected booking).
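
To be concrete about that first "investigating" item, here is the sort of thing I'm picturing--a rough sketch only, with invented query/field/parameter names, and yes, it's DAO:

' Saved query "qryVenueForBooking" defined roughly as:
'   PARAMETERS prmBookingID Long;
'   SELECT v.VenueName, v.VenueCity
'   FROM tblVenues AS v INNER JOIN tblBookings AS b
'     ON v.VenueID = b.VenueID
'   WHERE b.BookingID = prmBookingID;

Dim db As DAO.Database
Dim qdf As DAO.QueryDef
Dim rs As DAO.Recordset

Set db = CurrentDb
Set qdf = db.QueryDefs("qryVenueForBooking")
qdf.Parameters("prmBookingID") = Me.lstBookings.Column(0)
Set rs = qdf.OpenRecordset(dbOpenSnapshot)

If Not rs.EOF Then
    ' ...work the mojo...
End If

rs.Close
Set rs = Nothing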

Any comments, ideas, examples, what have you, VERY welcome!

--Curious Mac
 
Mac,

I know I'll catch a little bit of heat for this, but here goes...

I have some apps running on a real nice Windows 2000 server. There are about a hundred registered users, but never more than about 10 on at any one time. Financial tracking, personnel timecards, etc.

I generally tend toward the query-by-form methods and have quite a bit of recordset code (DAO). The software has the code/libraries to import/export to/from MS Project, a ton of reports, and exports to Excel for various things.

Performance has never been a problem. It was initially thought that a "lean and mean" front end might be put to use for the sole purpose of entering timecards, but performance issues never mandated it.

I know that the network is dismally slow at times (I mean really slow), but the Access app doesn't seem to be affected by it.

I think that, performance-wise, if your tables are defined properly and properly indexed, and you streamline the amount of data that you pull across, it doesn't matter much what tools you use. I build a lot of dynamic SQL strings and use DLookup and DSum, even DSum on a form's OnCurrent to display project/group budgets and expenditures, traversing 20-30 thousand records in a few tables.
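
Just to show what I mean by the OnCurrent thing (the table, field, and control names here are made up, but this is the shape of it):

Private Sub Form_Current()
    ' Running totals for whichever project record is current (invented names)
    Me.txtBudget = DSum("BudgetAmt", "tblBudgets", "ProjectID = " & Nz(Me.ProjectID, 0))
    Me.txtSpent = DSum("Amount", "tblExpenditures", "ProjectID = " & Nz(Me.ProjectID, 0))
End Sub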

Even with all of the Project code residing in the database, there isn't a big penalty on startup.

I've heard a lot of nightmare stories about Access performance over the network. I'll be watching this thread.

Wayne
 
Slightly off topic?

Thanks for opining, Wayne. Any tips/references on best methods to "streamline the amount of data that you access"?

--Beginner Mac
 
Because I initially spent so much time fighting with the syntax of the D-Something functions, and because they are convenient code one-liners (easy to come back to and read), I am now reluctant to give them up, even if they might be a clock tick or two slower than a mess of SQL. I have never had a perceptible performance problem using DSum, DCount, or DLookup (make sure the criteria fields are indexed) until I put a DLookup into the criteria of a query -- much slower than joining tables. So much depends on the specifics...
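
To show what I mean (all names invented), this kind of thing crawled for me, because the DLookup fires once per row:

SELECT b.BookingID, b.BookingDate
FROM tblBookings AS b
WHERE DLookup("VenueCity", "tblVenues", "VenueID = " & [b].[VenueID]) = "Chicago";

while the equivalent join runs in no time:

SELECT b.BookingID, b.BookingDate
FROM tblBookings AS b INNER JOIN tblVenues AS v ON b.VenueID = v.VenueID
WHERE v.VenueCity = "Chicago";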

Broadly, in my experience, the most important thing to keep in mind when it comes to performance is to make sure you are using indexed fields in the criteria of your queries. Many Access books, including The Access Developer's Guide, have a chapter or two on optimization.
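
And if a table turns out to be missing an index on a criteria field, you can add one in table design view, or with a quick bit of DDL from code (invented names again):

' One-off: index the foreign key that the query criteria use
CurrentDb.Execute "CREATE INDEX idxVenueID ON tblBookings (VenueID);", dbFailOnError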

Regards,
Tim
 
