Does anyone actually use D<Function>?

I'm curious whether anyone else has found a use for D-functions (e.g., DLookup(), DCount(), DMax()) where they were easier or better than a different approach, and in what cases?

Since I started using Access three years ago, my encounters with D-functions have been brief, and I quickly eschewed them: between the fragile syntax and the number of errors I hit while building them, I found it easier to write things out one way or another using SQL, VBA, or a combination of the two.

Of course, it does no good to be dogmatic, so I'm asking: in what cases do D-functions actually make more sense (however you define 'sense' - ease of use, performance, or whatever)?
 
I've used them only when I couldn't find a different approach (there might be one, but I am one person making decisions in a vacuum). To date, every D-function I have used has been reliable, but I also qualify them by trying to anticipate and prevent the errors they can produce. For instance, all of my DLookups have an IsNull check and a way to handle the null.
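
Roughly along these lines - a minimal sketch, with the table and field names invented purely for illustration:

Public Function GetTaxRate(lngCustID As Long) As Currency
    ' Guarded DLookup: handle the "no match" Null explicitly
    ' instead of letting it blow up later code.
    Dim varRate As Variant
    varRate = DLookup("TaxRate", "tblCustomers", "CustomerID = " & lngCustID)
    If IsNull(varRate) Then
        GetTaxRate = 0          ' sensible default, or warn the user instead
    Else
        GetTaxRate = varRate
    End If
End Function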

As far as decisions based on performance go - I have no idears ... :D

For instance, I might have a query that analyzes/manipulates a set of data, and then I perform a DLookup on that query because the app logic needs to test something before moving forward. Is this more or less optimized than opening the query through DAO or running the SQL directly? No idea; I've not tested it, and the applications I write are for very specialized use (< 5 users), so I don't see much of an impact. In that sense, I use them for ease of use and implementation.
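
Something like this, with a made-up query name, is what I mean by testing before moving forward:

Public Sub ContinueIfNothingOverdue()
    ' qryOverdueInvoices is a saved query that does the real analysis;
    ' the DLookup only asks whether it returned anything at all.
    If IsNull(DLookup("InvoiceID", "qryOverdueInvoices")) Then
        ' nothing overdue - carry on with the rest of the process
    Else
        MsgBox "There are overdue invoices to deal with first."
    End If
End Sub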

-dK
 
i use them occasionally, usually like dkinley, on the fly if i need to check something along the way. but i think i (almost) always create a query first and look up the needed value from that query. it's usually a count of some kind, so i set the first column's alias to 'Count' and the query's name to qryCountOfX, and it's easy to look up. can't think of another example offhand.
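
e.g. something along these lines (query name invented, but that's the pattern):

Public Sub ShowOpenOrderCount()
    ' qryCountOfOpenOrders is assumed saved as:
    '   SELECT Count(*) AS [Count] FROM tblOrders WHERE Closed = False;
    Dim lngOpen As Long
    lngOpen = Nz(DLookup("[Count]", "qryCountOfOpenOrders"), 0)
    Debug.Print lngOpen & " open orders"
End Sub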
 
i use a single row constants table

i have one row, no adds or deletions, and i regularly use dlookup to get values from that table

i also quite often use dlookup generally to read field values - but if i need more than 1 field, i will probably use a recordset
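
for example (made-up names, just to show the shape of it):

Public Sub ReadConstants()
    ' one value: a DLookup on the single-row table, no criteria needed
    Dim curVAT As Currency
    curVAT = Nz(DLookup("VATRate", "tblConstants"), 0)

    ' several values at once: one recordset open is less chatty
    ' than three separate DLookups
    Dim rs As DAO.Recordset
    Set rs = CurrentDb.OpenRecordset( _
        "SELECT VATRate, CompanyName, NextInvoiceNo FROM tblConstants", dbOpenSnapshot)
    If Not rs.EOF Then
        Debug.Print rs!VATRate, rs!CompanyName, rs!NextInvoiceNo
    End If
    rs.Close
    Set rs = Nothing
End Sub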
 
and i use dcount a lot, for belt and braces verification of insert/delete procedures

to effectively do this

dcount(table)
dcount(appendquery)
run appendquery
dcount(table)

check the new dcount to make sure the insert worked correctly
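
a sketch of that pattern in VBA (all names invented; the "to add" count here comes from a select query with the same criteria as the append):

Public Sub AppendWithCheck()
    ' belt and braces around an append query
    Dim db As DAO.Database
    Dim lngBefore As Long, lngToAdd As Long, lngAfter As Long

    Set db = CurrentDb
    lngBefore = DCount("*", "tblArchive")
    lngToAdd = DCount("*", "qryRecordsToArchive")   ' select query feeding the append

    db.Execute "qryAppendToArchive", dbFailOnError  ' the append query itself

    lngAfter = DCount("*", "tblArchive")
    If lngAfter <> lngBefore + lngToAdd Then
        MsgBox "append didn't insert the expected number of rows"
    End If
End Sub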
 
Gemma, if you're using .Execute, you can use the RecordsAffected property to get the count.
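
A sketch of that, assuming a DAO Execute of a hypothetical saved append query:

Public Sub AppendAndReport()
    Dim db As DAO.Database
    Set db = CurrentDb                 ' keep one reference: RecordsAffected
                                       ' must be read from the SAME object
    db.Execute "qryAppendToArchive", dbFailOnError
    Debug.Print db.RecordsAffected & " rows appended"
End Sub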
 
As is often the case, I have a differing view with respect to others who try to avoid the DFunc calls. Trust me, DFunc calls are anything but defunct.

Look at the following paragraphs and tell me if this makes sense to you. DFunc calls tend to take the form DFunc( [Field], recordsource, criteria ). So think about how such a call actually executes.

You start with an empty string. Drop in "SELECT ", then optionally insert a keyword and parentheses depending on the function: DMax drops in "Max(" and ")"; DMin drops in "Min(" and ")" - you can figure out the others. So drop in the leading keyword, the field specifier, the trailing parenthesis, and the keyword " FROM ". So far, you have "SELECT [Field] FROM " or "SELECT Max([Field]) FROM " ...

Now put in the recordsource name, which is usually a table or query, right? So now you have "SELECT Count( [Field] ) FROM recordsource"

Then if the criterion field isn't blank, drop in keyword "WHERE " and the literal or concatenated criterion field. Add in a semicolon. What do you have? An SQL string, "SELECT Avg( [field] ) FROM recordsource WHERE selection-criteria ;"

So the function can then open a recordset based on a dynamic SQL string. If this was specified correctly, you get back a unique entry for the dynamic SQL recordset, usually based on a single field. Now open the recordset and read the first (hopefully, ONLY) record. And there is your answer. (Don't forget to close what you open...) The DFunc calls (Domain Aggregate functions) just build a dynamic SQL string using SQL Aggregates.
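
Purely to illustrate the mechanics (this is a sketch of what the library is conceptually doing, not Microsoft's actual code), a home-grown DMax might look like this:

Public Function MyDMax(strField As String, strDomain As String, _
                       Optional strCriteria As String = "") As Variant
    ' Build the same kind of dynamic SQL a domain aggregate builds,
    ' then open a one-row recordset to fetch the answer.
    Dim strSQL As String
    Dim rs As DAO.Recordset

    strSQL = "SELECT Max(" & strField & ") AS TheValue FROM " & strDomain
    If Len(strCriteria) > 0 Then
        strSQL = strSQL & " WHERE " & strCriteria
    End If
    strSQL = strSQL & ";"

    Set rs = CurrentDb.OpenRecordset(strSQL, dbOpenSnapshot)
    If rs.EOF Then
        MyDMax = Null
    Else
        MyDMax = rs!TheValue
    End If
    rs.Close                ' don't forget to close what you open
    Set rs = Nothing
End Function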

Read Allen Browne's code and you will see that he creates the same sort of thing. He claims it is faster than the DFunc calls, but I'm hard-pressed to see how. Any time you create your own SQL string, you've already slowed yourself down because such a string is not pre-parsed.

Given that most computers are able to execute thousands of instructions in the time required for a single disk seek, the difference between "rolling your own" function and just using the DFunc functions is going to be overwhelmed (underwhelmed?) by the speed of the disk as it seeks the data you want. Trust me, if you have a 7200 RPM disk, which is typical, you are looking at a latency of 8.333 msec/rotation, which for a 1 GHz machine is still 8.33 MILLION clock cycles, probably 3 million instructions. You can do A LOT if you have 3 million instructions to do it in.

Therefore, from the pragmatic viewpoint, unless there is a very special situation to be considered, I always use the DFunc calls as opposed to "rolling my own" - because in most cases the return on the investment of my time is too small to be worth it. Stated another way, I only re-invent wheels where it does me a lot of good to do so, such as where the wheels I have are broken.

The DFunc tools are valid parts of your toolkit and I see no issue in using them when they correctly address the question you need to ask. Just think of a DFunc call as a "spot SQL query based on a dynamic string" and you know everything you need to know about the call.

Let's take a look at efficiency. The SQL aggregate functions for Max, Min, Count, Avg, etc. would all be compiled to machine code, I would think. If you try to roll your own version of the call using VBA, that is NOT compiled code - it is pseudo-compiled into p-code - so it cannot be faster than a compiled intrinsic function.

All that is left to consider is the amount of time it takes your function or Allen Browne's function or somebody else's function to build the dynamic string. How much is THAT going to cost you? Not a lot of time, but even there, DCount as an intrinsic library function is ALSO compiled to machine code. I'm at a loss to figure out how anyone thinks they will save that much time over the compiled functions already in the Access .DLL file libraries.

Not that I'm a fan of everything that Microsoft writes, you understand, but I just have a lot of problems with re-inventing wheels when the ones you have are rolling nicely already. Not to mention that re-inventing wheels might be an act of disloyalty to your employer, who wants you to avoid wasting time on things that you don't need to do.

Since I don't claim to know every problem, I can't claim to know the "line in the sand" where the DFunc calls become too cumbersome for your application. But if you take into account the time to program versus the minuscule advantage you get in most cases, I would suggest that you use the DFunc calls whenever you can.
 
Since Doc_Man has come onto the scene: I only use DLookup on descriptors. Rightly or wrongly, the database has multiple suppliers for one stock record, and it is superfluous for our needs to attach the supplier file (with the supplier's name) to the stock record, so a DLookup suffices.

Never use them in queries!

Simon
 
Simon's stated use is reasonable. As he points out, a domain aggregate lookup is useful only when it has some assurance of a unique return - which, he says, is not guaranteed for the problem he described, so it would be inappropriate to use one there. But this also demonstrates an important fact: know your data structure. Simon obviously does, because he knows when to expect non-unique results from a lookup. This is a sine qua non of good database design.

His admonition about queries is also important to consider, but I might choose to differ in certain limited cases. Reasons still exist as to why you might wish to use DFunc calls in limited cases in queries, but other strategies also exist.

Don't forget that in queries you have the SQL aggregate functions available. I might use SQL aggregate sub-queries that I then JOIN to my primary query, which obviates the need for DFunc calls in many (but not all) cases. And of course, using stored sub-queries means they were optimized when they were originally analyzed and saved.

As I have said many times before, just because a tool in your toolkit isn't the prettiest or best tool that could ever exist, that doesn't justify ignoring the tool when it is appropriate to use it. Forms and reports are places where an isolated DLookup or other DFunc call frequently makes sense. Also, sometimes in VBA code. Less often in queries.
 
I use them but I don't have special reasons for using them except: they work, they're pre-built, and they don't take a programmer-level skillset to get functioning. :)

Once I learned about explicitly dealing with data types, the syntax has seemed pretty easy to me.

That said, my databases are not the millions-of-records monstrosities that some of you have to deal with or I might be paying much more attention to shaving off those milliseconds.
 
doc man

as i say, if i need to read some bits from a table not bound to the recordset i am currently using

eg - i am doing sales order entry, but i need to read 4 or 5 fields from, say, a purchase account for which i do know the unique PK - then would you say it's quicker/more efficient to do several dlookups, or to open a recordset?

i tend to do the latter, as i assumed it was more efficient - but it may not be in the light of your comments - so is there a quantity of field lookups beyond which you WOULD stop using repeated dlookups?
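
i.e. the choice is between several dlookups and a single recordset open, roughly like this (names invented):

Public Sub ShowPurchaseAccount(lngAccountID As Long)
    ' one recordset open, several fields read - instead of 4 or 5 dlookups
    Dim rs As DAO.Recordset
    Set rs = CurrentDb.OpenRecordset( _
        "SELECT ContactName, Phone, PaymentTerms, CreditLimit " & _
        "FROM tblPurchaseAccounts WHERE AccountID = " & lngAccountID, dbOpenSnapshot)
    If Not rs.EOF Then
        Debug.Print rs!ContactName, rs!Phone, rs!PaymentTerms, rs!CreditLimit
    End If
    rs.Close
    Set rs = Nothing
End Sub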
 
One thing to note is that once you've created an Access query, or even a SQL statement stored in certain properties (e.g. RowSource/RecordSource), it is already optimized. (To be precise, the latter does not hold for earlier versions - before 2000? - where you had to save it as an Access query to get the benefit.) Therefore, any query that does a lookup will always be faster than any VBA or the Expression Service, and probably faster than a .DLL written in C. It's Jet's native language, so to speak. DAO is great, but it's still one more layer on top.

My personal issue with them, however, is not performance - computers are so fast that a cycle, or even a handful of wasted cycles, goes unnoticed - but rather ease of use. I can write out a SQL statement and be done with it, whereas with a DFunc I have to make sure my syntax is correct and *then* manage the cases where I could get an error. So to me, at least, a DFunc requires too much of my time and investment in contrast to what I can write out in SQL, test, throw into a control, and forget about. If it returns nothing, the control is blank as expected - none of those silly "#Error" or "#Name" results.

In regard to Gemma's question: whenever I have an object that I find myself referring to more than two times, I throw it into a Property procedure and deal with it only through that procedure. It's just my way of enforcing the one-definition rule and keeping consistency between Access objects and VBA. It also simplifies the entry for me by taking care of the plumbing, so I only need to get the object itself or pass parameters into it, and not worry about setting up database variables, DAO QueryDef objects, recordsets and such.
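
As a rough sketch of what I mean (names are made up):

' In a form's module (or a standard module): one place that knows
' how the value is fetched, cached after the first call.
Private m_VATRate As Variant

Public Property Get VATRate() As Currency
    If IsEmpty(m_VATRate) Then
        m_VATRate = Nz(DLookup("VATRate", "tblConstants"), 0)
    End If
    VATRate = m_VATRate
End Property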
 
i use them in VBA to count little recordsets or a cheeky dlookup..it's a lot less code

i'd never use them in queries or on big queries/tables in VBA

oh and i use them every now and again when i have an updatable form based on a complicated query and i can't link in any more tables as the query becomes too complicated to be updatable... would love to know a better way :)
 
Darth,

I had the same sort of problem (too many databases open), so I created building-block queries (sub-queries) and tabulated forms, which were still too slow, so I got rid of the row sources on the combo boxes on the form and made them load on demand only (On Enter).

Simon
 
Domain functions are perfectly useful to look up SINGLE values. A name, a minimum value, an average. The operative word is SINGLE. Each domain function runs a query to return the results. That means that they don't belong inside queries and preferably not inside code loops either. The vast majority of DLookup()s found in queries can be easily replaced with a join and the impact will be an amazing performance improvement. Many aggregates such as sums and averages can also be replaced by creating a totals query and joining to it. This is also significantly faster than using a domain function inside a query.
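
For example (all names invented), instead of a DLookup per row, save a totals query and join to it:

Public Sub CreateOrderTotalQueries()
    ' A saved totals query joined into the main query,
    ' instead of a per-row domain function.
    Dim db As DAO.Database
    Set db = CurrentDb

    db.CreateQueryDef "qryOrderQtyTotals", _
        "SELECT OrderID, Sum(Qty) AS TotalQty " & _
        "FROM tblOrderDetails GROUP BY OrderID;"

    db.CreateQueryDef "qryOrdersWithTotals", _
        "SELECT o.OrderID, o.OrderDate, t.TotalQty " & _
        "FROM tblOrders AS o INNER JOIN qryOrderQtyTotals AS t " & _
        "ON o.OrderID = t.OrderID;"
End Sub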

I occasionally use domain functions in forms and code, and more rarely in reports. The difference between a form and a report in this case is that a form shows only one or just a few records at a time, whereas a report is completely calculated and so shows all of its records - so you will see the slowness caused by the domain functions there, where you wouldn't notice it on a form while viewing one record at a time.
 
Darth,

I had the same sort of problem (too many databases open), so I created building-block queries (sub-queries) and tabulated forms, which were still too slow, so I got rid of the row sources on the combo boxes on the form and made them load on demand only (On Enter).

Simon

ooh ta

i can see the logic there...will give that a go when next in trouble
 
My 2 cents (if it's not too late).

I saw no speed loss using DLookup until I split my database and placed the back end on the network.

Then things were really sluggish - almost 2000 milliseconds on each record change. I replaced every DLookup with Allen Browne's ELookup and tweaked a few other things. In the end I got it down under 100 milliseconds.

So I think there is a huge speed difference that will show up in the right situations - and my database only had 1000 records.

Evan
 
gemma - your description of the scenario for the "how many is too many" question confuses me a little. I'll toss a thought your way anyway.

If you are in VBA where a recordset operation is not terribly expensive and you have the PK of the record hosting the lookup, stay with the recordset and do a single lookup for multiple values in the record.

Pat's comment is entirely appropriate but has TWO sides to the coin.

First, DLookup in an ugly query that is deep in a nest of nested queries will probably have to do a gazillion opens and closes of "spot" recordsets. Not efficient. On the other hand, where the DLookup is the top level of a query that will return only a few records, the cost might not be so very bad. Particularly if most of the filtering is done in a lower level (query of a query of a query...)

Second, if you have to look up a LOT of different items, that adds to the cost of the place that calls the lookup. I.e. if you want to know ONE thing from a recordset with a known PK, that's chump change. If it is inside a huge loop, you have to look at exactly how that loop is going to use the results. If you are looking up more than one field, or a LOT more than one field, you are giving yourself an invitation for trouble.

I rarely use DLookup when a JOIN sub-query is possible and I'm doing a lot of lookups, or anything where an SQL aggregate in a sub-query makes sense. But we also should consider time and ease of programming. On a form, which is a static stop-and-stare type of interaction, a DLookup for a couple of fields will be executed every few seconds at the fastest, and probably slower than that. Net system load is nil. Net ease of programming in that context becomes more important.

In a report or query, that same DLookup might be called upon a very large number of times per second. There, the efficiency can be an issue. A JOIN would be the better choice if it is available. Your comment about "knowing the PK" of the lookup makes me ask if this can be imparted to a sub-query to make the "translation JOIN" you would need or is this some isolated case where the known PK is actually known for the wrong reasons? I.e. not because of the primary record but just because you already know the answer from an oddball source.

If your case is not a standard JOINable case, then do what you think is best. There are no hard-and-fast rules except ( a ) it has to work right ( b ) in your lifetime ( c ) in a way that is reproducible on demand ( d ) that doesn't cause you to have cardiac arrest when the boss tells you the "real" schedule.
 
