dbEngine(0)(0) vs Currentdb

NauticalGent

CopyPaster of the First Order
Local time
Today, 17:43
Joined
Apr 27, 2015
Messages
4,180
Thanks Col, really appreciate you putting your antique through hell for our benefit!

I guess the way to truly test what Mr. Dettman is saying would be to implement his Sub and call it instead of setting a variable to CurrentDb.

I did just that but didn't notice any difference as a user. But then, when you're measuring in ms, the difference is going to be almost impossible to notice.

As a side note to this thread, I came across a similar thread on a Google group where two hotshots started off talking about DBEngine(0)(0) vs CurrentDb and it mutated into the need to destroy RS and DB variables by setting them to Nothing. Both agreed that they only lived for the duration of your routine unless you made them Global. The parting of the ways was whether or not it should be done for the sake of conceptually sound programming.

Got a little heated and I couldn’t stop reading. Rivaled our best water-cooler threads...
 

isladogs

CID VIP
Local time
Today, 22:43
Joined
Jan 14, 2017
Messages
15,417
Thanks Col, really appreciate you putting your antique through hell for our benefit!

It's been going through hell every day for 7 years & unlikely to get replaced any time soon ... unless I can first persuade lots of you to buy my apps!

Just had a reply from Jim Dettman (MVP), the original author of the article at EE.
He still believes DBEngine to be faster.
I've referred him to the same MS article stating CurrentDB is the preferred option (and has been since Access 2000!)

Following a comment Jim made about the tests, I repeated them on a large split database using an Access BE with the same data table (but with many additional fields)
Unsurprisingly the results were MUCH slower but the pattern remained much the same as before....

Code:
TestID  Workstation  TestType            TestTime
9       COLIN-PC     Set CurrentDB         200.03
10      COLIN-PC     Set DBEngine(0)(0)    201.7
11      COLIN-PC     Use CurrentDB         199.88
12      COLIN-PC     Use DBEngine(0)(0)    201.67

I rest my case...m'lud! ;)
 
Last edited:

The_Doc_Man

Immoderate Moderator
Staff member
Local time
Today, 16:43
Joined
Feb 28, 2001
Messages
20,878
Colin:

I think you may have misread part of the EE article:

Nope, read it carefully more than once.

Both my results & Minty's show it's no more than negligibly faster, even without a refresh.

Actually, that supports the point I was trying to make. Let's see if I can explain the scenario better.

Doing a test that runs an SQL statement performing 2.6 million record actions references the database once, when the SQL pre-processor does its thing with the .Execute statement. After that, THERE IS NO INVOLVEMENT with the setup method. My comment is more along the lines that the variation you saw is too large for the differences being tested. BUT if you repeated the experiment and compared times again, you might find that the numbers varied even for the "apples-to-apples" test. It would be tedious as all hell to do it, but if you ran the tests often enough, you could get a standard deviation of the timing, and I'll bet that ALL FOUR cases are within one standard deviation of each other.

I've done the "SET XDB = CurrentDB" trick a thousand times and the timing claimed in the EE article makes no sense.

If you really wanted to do this painfully and excruciatingly correctly, you should be able to do something like this:

A. Set up a loop.
B. Take a TimeStamp
C. Set up the intended SQL scenario (one of the four possibilities)
D. Execute the SQL
E. Get the Ending Timestamp
F. Next Step of the loop
G. At end of loop, determine the average and standard deviation of timing for steps B-F.
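A minimal VBA sketch of that A-G loop might look like the following. The table name, SQL, and run count are hypothetical placeholders; the point is capturing per-run timings and computing the mean and standard deviation at the end.

```vba
' Sketch only: assumes a DAO reference and a table named tblTest with field Fld1.
Sub TimingLoopSketch()
    Const NUM_RUNS As Long = 30
    Dim times(1 To NUM_RUNS) As Double
    Dim i As Long, t0 As Double
    Dim db As DAO.Database

    For i = 1 To NUM_RUNS                       ' A. Set up a loop
        t0 = Timer                              ' B. Take a timestamp
        Set db = CurrentDb                      ' C. One of the four scenarios
        db.Execute "UPDATE tblTest SET Fld1 = Fld1", dbFailOnError  ' D. Execute the SQL
        times(i) = Timer - t0                   ' E. Get the ending timestamp
        Set db = Nothing
    Next i                                      ' F. Next step of the loop

    ' G. Average and standard deviation for steps B-F
    Dim mean As Double, sumSq As Double
    For i = 1 To NUM_RUNS: mean = mean + times(i): Next i
    mean = mean / NUM_RUNS
    For i = 1 To NUM_RUNS: sumSq = sumSq + (times(i) - mean) ^ 2: Next i
    Debug.Print "Mean:", mean, "StdDev:", Sqr(sumSq / (NUM_RUNS - 1))
End Sub
```

Swapping line C for `Set db = DBEngine(0)(0)`, or dropping the variable entirely, gives the other scenarios.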

The odds favor that the numbers will converge and that ALL of the averages will overlap with each other within one standard deviation. That "5000 times slower" is, to my way of thinking, not measurable. (No "hooks" that I know of.) But it could be estimable if someone had access to the code underlying CurrentDB and just came close to violating a confidentiality agreement.

I actually agree that you should not have seen much difference for the test in question because the SQL part was too big with respect to the VBA part.
 

isladogs

CID VIP
Local time
Today, 22:43
Joined
Jan 14, 2017
Messages
15,417
Hi Doc

I agree totally about the nature of the tests & also that the EE article just doesn't seem to be based on evidence.

Anyway, I've modified the tests to add 10,000 records to a table in 10 loops of 1000 each. Perhaps this is a more appropriate way of testing this.

#1 Set db=CurrentDB followed by db.Execute followed by Set DB = Nothing - both after clearing the data & again after each 1000 record loop
#2 As above but with db=DBEngine(0)(0)
#3 No variable set/destroyed. Used CurrentDB.Execute in each step
#4 No variable set/destroyed. Used DBEngine(0)(0).Execute in each step
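For anyone following along, the four methods above reduce to two code shapes. This is a sketch with a hypothetical table and SQL, not Colin's actual test code:

```vba
' #1 / #2: set a variable, execute, then destroy it
Sub PatternWithVariable()
    Dim db As DAO.Database
    Set db = CurrentDb          ' #2 would use: Set db = DBEngine(0)(0)
    db.Execute "INSERT INTO tblData (Fld1) VALUES ('x')", dbFailOnError
    Set db = Nothing
End Sub

' #3 / #4: no variable; reference the object directly each time
Sub PatternDirect()
    CurrentDb.Execute "INSERT INTO tblData (Fld1) VALUES ('x')", dbFailOnError
    ' #4 would use: DBEngine(0)(0).Execute ...
End Sub
```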

I repeated each test 5 times and calculated the averages



As you can see, there's still not a huge difference in values between the tests
BUT overall method #1 using Set db=CurrentDB is faster than method #2 using Set db = DBEngine(0)(0)
Similarly #3 using CurrentDB.Execute is very slightly faster than method #4 using DBEngine(0)(0).Execute

And of course there is still the need to factor in refreshes as necessary using DBEngine(0)(0)

I've somehow ended up in a lengthy exchange at EE with the author of the article. He has just written:
And keep in mind, this was written almost ten years ago...Access was different back then as well as PC's and this was old news back when I wrote this. What was true back then might not be true today given that eighteen years have passed since A2000.
I haven't even thought about this in a long time....I've just used it in all my apps since then.

to which I replied:
Exactly the point I've been trying to make throughout.
I'm happy to accept your point that the performance benefits of using DBEngine(0)(0) may well have been true once.
However, so far I've seen no evidence that this is still the case now.

I've attached my new test db.
I'm more than happy for you or anyone else to modify the tests if these can be improved. If so, please post your tests & results

@Minty
I expect your shiny new PC will manage each of these in less than 5 seconds.
But of course it may not still be there when you return on Tuesday! ;)
 

Attachments

  • SpeedComparisonTests - NEW.zip
    118.3 KB · Views: 54
  • SpeedTest3Results.PNG
    SpeedTest3Results.PNG
    41.3 KB · Views: 378

The_Doc_Man

Immoderate Moderator
Staff member
Local time
Today, 16:43
Joined
Feb 28, 2001
Messages
20,878
Excellent contribution, Colin. Questioning what was once true is sometimes useful when the world of computer technology is one big moving target.
 

NauticalGent

CopyPaster of the First Order
Local time
Today, 17:43
Joined
Apr 27, 2015
Messages
4,180
Col, nicely done.

As to the other issue that Jim alleges regarding the aggregated effect of multiple Set calls as opposed to a Public Sub: your tests should prove that as well. If I find some time today I will put it through its paces and report back here.

With the wife being in the states for two months, it seems time is a rare commodity. Never realized just how much she did and how little my contributions were.

I could say something like “I’m going to make a change and be a better partner...” but I’m a little too old to start lying to myself!
 
Last edited:

NauticalGent

CopyPaster of the First Order
Local time
Today, 17:43
Joined
Apr 27, 2015
Messages
4,180
So...here are my test results. They are not as consistent as Ridders' or Minty's - in fact there is a HUGE difference in the results.

I added a 5th test, one that uses the EE article author's best-of-both-worlds workaround. I will let you all see for yourselves.

Also, you will notice that there appear to be two PCs, but there is only one. I noticed that Ridders used DoEvents after his action queries, so I ran his tests while my PC's name was DESKTOP-8DE4TUC. I had always wondered if using DBEngine.Idle dbRefreshCache was a better method, so I made the changes and ran the tests again after I renamed my PC to HomeSweetHome. For my specs, the latter is definitely the better way to go!

I added a command button named Broken Arrow which runs all five tests back-to-back because I was too lazy to run each test separately.



This is one of the few times the Use CurDb() tests clocked the fastest times. Normally Set db = CurrentDb and Set db = DBEngine(0)(0) were slightly faster.



My averages...no idea why the disparity is so much greater on my PC than Col's and Minty's



My laptop info



Access version info - for the record, I am on the God-forsaken Office 365 merry go-round.

In summary, if this were an episode of MythBusters, I would have to say the EE article is totally busted. Even with Col's and Minty's results, the idea that DBEngine is faster is just not supported.

And finally, kudos to Col (Ridders). We can all get on these threads and talk about what we THINK is accurate, but it is demonstrations like this that really prove the point. Again, nicely done!

John
 

Attachments

  • AboutAccess.PNG
    AboutAccess.PNG
    69 KB · Views: 336
  • AverageTime.PNG
    AverageTime.PNG
    20.2 KB · Views: 375
  • ClockedTime.PNG
    ClockedTime.PNG
    35 KB · Views: 335
  • SystemInfo.PNG
    SystemInfo.PNG
    58.5 KB · Views: 326
  • SpeedComparisonTests - NG.zip
    105.8 KB · Views: 50
Last edited:

isladogs

CID VIP
Local time
Today, 22:43
Joined
Jan 14, 2017
Messages
15,417
EDIT: I wrote this before seeing NG's results.
Although I don't get the disparity of his results, I do see similar patterns.
I've not bothered setting a CurDB variable as in the EE article as, frankly, I don't see any benefit in doing so.

Having read NG's results, I will also test for myself whether DbEngine.Idle dbRefreshCache is faster than using DoEvents

=========================================================

One further update from me.

I've made a few changes so I can easily adapt the speed tests form for other purposes in the future

a) The number of loops or the records per loop can easily be altered from the test form
b) Added a results form including average values
c) Added a report showing average values including a chart
d) Added a button so I could run all tests in sequence.
The order of the tests is varied to avoid possible bias

I also wanted to compare the times when setting the db variable:
i) multiple times - before each loop and clearing after each loop
ii) just once at the start of the test and clearing at the end

I've run all tests 20 times and, as expected, the differences are mostly fairly small but some patterns are clear.

1. Setting the variable db=CurrentDB once or repeatedly is faster than using CurrentDB.Execute
2. Similarly setting db=DBEngine(0)(0) once or repeatedly is faster than using DBEngine(0)(0).Execute
3. It makes little difference whether the variable is set once or repeatedly
4. Overall CurrentDB is slightly faster than DBEngine(0)(0)



This to me confirms that using CurrentDB rather than DBEngine(0)(0) is the correct approach

Please feel free to adapt the attached for your own tests
Or to just test your PC for speed compared to mine!

EDIT: I forgot to empty my data from the table tblSpeedTests in my first upload. I've now done so & replaced the zip file
 

Attachments

  • SpeedTest3 - AverageTimes.PNG
    SpeedTest3 - AverageTimes.PNG
    17.1 KB · Views: 364
  • SpeedTests3.PNG
    SpeedTests3.PNG
    51.4 KB · Views: 46
  • SpeedComparisonTests v3 UPDATED.zip
    171.5 KB · Views: 49
Last edited:

NauticalGent

CopyPaster of the First Order
Local time
Today, 17:43
Joined
Apr 27, 2015
Messages
4,180
Although I don't get the disparity of his results, I do see similar patterns.
I've not bothered setting a CurDB variable as in the EE article as, frankly, I don't see any benefit in doing so.

I wonder if me running 64 bit OS and Access with 32 bit code might be the issue?

Also, obviously the CurDb is not a factor. The only reason I tried it was because Jim Dettman (the EE author) claimed that repeated calls to CurrentDb, even with the db variable, caused an increasing hit on overhead resources. Again, the tests do not support that theory...
 

isladogs

CID VIP
Local time
Today, 22:43
Joined
Jan 14, 2017
Messages
15,417
I wonder if me running 64 bit OS and Access with 32 bit code might be the issue?

Also, obviously the CurDb is not a factor. The only reason I tried it was because Jim Dettman (the EE author) claimed that repeated calls to CurrentDb, even with the db variable, caused an increasing hit on overhead resources. Again, the tests do not support that theory...

I don't see why your setup would explain the disparity but for comparison I ran the tests on my Windows tablet (1.33GHz processor, 2GB RAM).


It's not really powerful enough to do intensive processing, as you can see!
Like you, the final 2 tests are dramatically slower...though I don't understand why.

Although I don't get a large variation on my desktop, it does mean I need to change my habits and always use Set db = CurrentDB from now on (at least in looping procedures)
 

Attachments

  • Windows Tablet Results.PNG
    Windows Tablet Results.PNG
    43.2 KB · Views: 391

The_Doc_Man

Immoderate Moderator
Staff member
Local time
Today, 16:43
Joined
Feb 28, 2001
Messages
20,878
I will step in for one related but slightly tangential comment.

The difference between DbEngine.Idle dbRefreshCache and DoEvents is that the former is a DBEngine-only event, whereas DoEvents allows any pending Windows events from ANY PROCESS a shot at the system resources.

Stated another way, the .Idle method waits for a specific and narrowly-crafted event to occur whereas the DoEvents action allows even lower priority processes a shot at the CPU resource. As such, even if nothing is pending, it involves an O/S non-I/O call. To be clear, the difference is the nature of the O/S process scheduler event. The .Idle call is a voluntary wait state based on your process priority but the DoEvents is not.

Granted, on any multi-CPU or multi-threaded system, the odds of having to wait very long will drop significantly with the number of CPU threads. I have almost NEVER seen my system with 3 CPUs busy at once. (It has four CPU threads.)

Therefore, the .Idle method is faster because less can happen while you are waiting AND because less code has to execute to get to and then back out of the wait state.
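In code, the two yield styles under discussion sit in the same spot; only the call differs. The loop body here is a hypothetical placeholder:

```vba
' Sketch of the two ways to yield inside a processing loop.
Sub YieldStyles()
    Dim i As Long
    For i = 1 To 1000
        ' ... do one unit of work here ...

        DoEvents                        ' yields to ALL pending Windows events
        ' versus:
        ' DBEngine.Idle dbRefreshCache  ' a narrower, engine-only pause
    Next i
End Sub
```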

As another side note, DoEvents is apparently not as friendly as some people might imagine. FMSINC often has useful tips. They suggest using something other than DoEvents and they offer some experimental evidence as to why.

http://www.fmsinc.com/microsoftaccess/modules/examples/avoiddoevents.asp
 

isladogs

CID VIP
Local time
Today, 22:43
Joined
Jan 14, 2017
Messages
15,417
Many thanks to both NG & the Doc for raising this important side issue of using DbEngine.Idle dbRefreshCache or DoEvents.

The main reason I use DoEvents is to build in a processing delay that allows the display to update progress.
This can be in the form of a progress bar or (in this thread) a message like 'Test 3 ; Loop 4 of 10'.
Either is useful so users know something is happening, but it does create a performance 'hit'. In other words, it makes the processing a bit slower.

Like many, some of my regular programming habits are because I've used that method for years without really questioning whether it was the best method.
That's why I like questions like this one as it allows me to check and where necessary update my approach
So far many of my coding habits have (luckily) turned out to be correct in terms of speed e.g.
1. Using Nz rather than Len or Trim to check for empty records - see post #13
2. Using CurrentDB rather than DBEngine(0)(0) - the focus of this thread

However this thread has raised an issue that I wasn't expecting & will force me to change my habits.
Using CurrentDB.Execute is slower (sometimes much slower) than setting db as a variable by using Set db = CurrentDB followed by db.Execute.

In this thread I had been using DoEvents after each record update building in 10000 pauses.
This was because I originally planned to show messages like 'Test 3 ; Loop 4 ; Record 352 of 1000'.
This caused significant flickering & processing delays so I scrapped the Record part of the message
BUT I forgot to move the DoEvents code from after each record to after each loop

Until now, I've rarely used DbEngine.Idle dbRefreshCache so I've adapted this test database to do so.
I again used 6 tests to compare the effect of using
1. DoEvents after each record added
2. DoEvents after each loop
3. dbEngine.Idle dbRefreshCache after each record
4. dbEngine.Idle dbRefreshCache after each loop
5. Using the Sleep API to build in a fixed 10 millisecond delay after each loop
6. No delay between events
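The structural difference between the six tests is just where (or whether) the pause sits in the nested loops. A sketch, with hypothetical names (the real test form drives this from controls):

```vba
' Sketch only: assumes a DAO reference and a table named tblSpeedTests.
Sub DelayPlacementSketch()
    Dim db As DAO.Database, loopN As Long, rec As Long
    Set db = CurrentDb
    For loopN = 1 To 10
        For rec = 1 To 1000
            db.Execute "INSERT INTO tblSpeedTests (Fld1) VALUES ('x')", dbFailOnError
            ' Tests 1 & 3 pause HERE, after each record:
            ' DoEvents                       (test 1)
            ' DBEngine.Idle dbRefreshCache   (test 3)
        Next rec
        ' Tests 2 & 4 pause HERE, after each loop of 1000:
        DoEvents
        ' Test 5 would call the Sleep API here (Sleep 10, via a Declare statement)
        ' Test 6 has no pause at all
    Next loopN
    Set db = Nothing
End Sub
```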

I used Set db=CurrentDB and db.Execute for each test.
I repeated each test 10 times. Here are the average results:

Results:



Summary:
As you can see using DoEvents after each record creates a significant delay. Unless you need that level of progress detail it is detrimental.
Using Idle dbRefreshCache after each record produces a smaller performance hit (for the reasons explained so clearly by the Doc)
Using this after each loop is the fastest of all, BUT both tests 3 & 4 make the progress indicator erratic - the CPU just doesn't have time to keep up.
As expected, having a fixed delay using Sleep is slow as it means the processor may be paused longer than needed
Perhaps surprisingly, having no delay isn't the fastest method even though I switched off the progress indicator for that test

I also tested this on a laptop (faster, with 8GB RAM) & a tablet (slow - 2GB RAM). The outcomes were the same in each case.

Conclusions:
If you want a progress indicator, use DoEvents at appropriate intervals (e.g. after each loop) which allow the display to be updated without a significant performance hit.
If progress displays aren't important, use dbEngine.Idle dbRefreshCache instead ... at suitable intervals. You can always use the hourglass and/or a fixed message 'Updating records....' so the user knows something is happening.

NOTE:
The attached database also includes code to give detailed computer information.



I've done a new thread in the repository to explain that in detail: Obtain detailed system information
 

Attachments

  • SpeedTests4.PNG
    SpeedTests4.PNG
    50.7 KB · Views: 56
  • SpeedComparisonTests v4 - PauseTime.zip
    208 KB · Views: 63
  • ComputerInfo.PNG
    ComputerInfo.PNG
    19.1 KB · Views: 348
  • SpeedTestAverages v4.PNG
    SpeedTestAverages v4.PNG
    18.1 KB · Views: 330
Last edited:

isladogs

CID VIP
Local time
Today, 22:43
Joined
Jan 14, 2017
Messages
15,417
As a follow up to the last post, I modified the test database v3 used in post #29 so that the DoEvents code was after each loop instead of after each record.

Once again, I ran each test 10 times and obtained averages.
The results were significantly faster for all tests but the order was much the same:



Conclusion:
Once again, using CurrentDB outperforms DBEngine(0)(0).
For best results, use Set db = CurrentDB then db.Execute instead of CurrentDB.Execute
To repeat, don't overdo the use of such built-in processing pauses or you will suffer a performance hit (as you would expect)

NOTE
1. All these test databases should work in both 32-bit & 64-bit Access
2. This example also includes the computer info code
 

Attachments

  • SpeedComparisonTests v5.zip
    207.8 KB · Views: 53
  • SpeedTestAverages v5.PNG
    SpeedTestAverages v5.PNG
    17.5 KB · Views: 333
  • SpeedTest5.PNG
    SpeedTest5.PNG
    54.4 KB · Views: 50
Last edited:

NauticalGent

CopyPaster of the First Order
Local time
Today, 17:43
Joined
Apr 27, 2015
Messages
4,180
Interestingly enough, it was Doc who first made me aware of the alternative to DoEvents on another thread where my Action Queries were not being allowed to finish before the next one would start.

Extremely frustrating
 

The_Doc_Man

Immoderate Moderator
Staff member
Local time
Today, 16:43
Joined
Feb 28, 2001
Messages
20,878
I will admit I still use DoEvents sometimes, particularly when the issue is a progress bar that doesn't update unless I "kick it in the pants" with a DoEvent at the end of a given loop. But if I don't need the progress bar, it IS a lot faster. Just not as pretty.

[Dramatic] Ah, what price vanity? [/Dramatic]
 

isladogs

CID VIP
Local time
Today, 22:43
Joined
Jan 14, 2017
Messages
15,417
LOL....! I didn't realise you were concerned about being pretty Doc.:D

I think I've probably now flogged this thread to death.
However, I'm very grateful to NG for raising the issue as it prompted some analysis I would never have done otherwise....
...and which the rest of you no longer need to do for yourselves :)

It also means I will change one aspect of my coding in future.

I've also now added this utility Obtain detailed system information to the code repository area.
I hope this is useful to others.

I'll now get on with doing other unrelated activities on this horrible rainy day.
Where did the UK heatwave & drought go.... :confused:
 

The_Doc_Man

Immoderate Moderator
Staff member
Local time
Today, 16:43
Joined
Feb 28, 2001
Messages
20,878
FYI - after careful consideration of the comments made by FMS, Inc. I have to add that they MAY have slightly misinterpreted their own results. Or at least, I think it is possible.
The reported performance spike can be attributed to the fact that they had a lot waiting to happen. If they didn't, the delay time for DoEvents to resume execution would be nil.

This still doesn't change my comment that DoEvents invites execution of more code than the DBEngine.Idle operation. When you use DoEvents, you are telling Windows to execute all pending events before returning to you. So OF COURSE the CPU is going to get blasted by whatever was pending but blocked by your execution priority. Event code is COMPUTABLE (i.e. can use the CPU resource). So OF COURSE the CPU load is higher right after a DoEvents call. The DBEngine.Idle code enters a specific class of wait-state, still voluntary in nature, but it is more like an I/O wait than a System Service call.
 

Minty

AWF VIP
Local time
Today, 22:43
Joined
Jul 26, 2013
Messages
8,831
Like you, the final 2 tests are dramatically slower...though I don't understand why.

I'm seeing a similar thing ;


My full specs;


@NG - have you been able to determine any possible reason for the last two tests slow down ?
 

Attachments

  • SpeedTestV5_TestAverage.PNG
    SpeedTestV5_TestAverage.PNG
    63.8 KB · Views: 273
  • SystemInfo.PNG
    SystemInfo.PNG
    67.5 KB · Views: 263

NauticalGent

CopyPaster of the First Order
Local time
Today, 17:43
Joined
Apr 27, 2015
Messages
4,180
I always (well, mostly) set a reference to CurrentDb. Using CurrentDb directly not only means overhead issues but also problems if you need to refer to what has just happened.


Eg
Code:
CurrentDb.Execute "DELETE * FROM Table1"
Debug.Print CurrentDb.RecordsAffected
will always give zero, no matter how many records were deleted in the previous line because it's a new instance.

Well Minty, the quick answer is "I don't know"...
However, from Cronk's input and the example he/she has given, I believe it is because the code establishes a new instance TWICE during each loop, so we incur a "hit" that adds up.

That's the story I'm going with!
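For completeness, the fix Cronk's example implies is to hold a single reference, so RecordsAffected is read from the same instance that ran the delete (table name taken from the quote above):

```vba
' One instance for both the Execute and the RecordsAffected check.
Dim db As DAO.Database
Set db = CurrentDb
db.Execute "DELETE * FROM Table1", dbFailOnError
Debug.Print db.RecordsAffected   ' reports the real count - same instance
Set db = Nothing
```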
 
