Early & Late Binding

Chris,

Great job. The first set of stats, to me, showed that
regardless of reference method, there isn't a "drastic"
performance hit in choosing coding styles.

But your last DB. Why on Earth would something update
that much quicker just because it was invisible? The
magnitude of that difference is very noticeable.

In my old VAX days, any sort of I/O device (disk, video,
or whatever) was WAY slower than the CPU. They had two
kinds of I/O call:

QIO - Send and go on with current code.
QIOW - Send and wait for an acknowledgement.

I figured Access was spending the extra time updating
the controls on the screen. My logic was as follows:

Access updates the control in memory, which takes
one unit of time.

If the control is invisible, it's done.

Otherwise the amount of time to ship something out to
the video card, display it, (and I think) wait for an
acknowledgement takes 20 units of time.
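If I had to put that model into rough VBA it would look
something like this (just a sketch of my thinking, with a
made-up control name, not your actual test code):

' Write the same value many times and compare the elapsed
' time with the control visible versus invisible.
Dim i As Long
Dim sngStart As Single

Me.Text0.Visible = True        ' set to False for the "invisible" run
sngStart = Timer
For i = 1 To 10000
    Me.Text0 = i               ' the one "unit" of in-memory work
Next i
Debug.Print "Elapsed seconds: "; Timer - sngStart

My guess was that the visible run pays those extra "video"
units on every pass and the invisible run doesn't.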

I went into your Form1 and cut the number of controls
in half, and the stats stayed consistent.

BUT, during all of that time waiting for the "visible"
update, the display doesn't change! All controls are
updated only at the end of that subroutine. I expected
to see the changed controls snaking their way down the
screen, but no. I even tucked a little Me.Refresh after
a couple of early ones, but no luck.

This isn't what I expected. I'll take a look again in
a little while.

Wayne
 
Thanks Wayne, and it is a bit of an eye opener.

As to why it is so much faster, it seems entirely due to the screen update time, even though the screen doesn’t appear to update. :confused: The same sort of timing difference occurs if the Form is invisible or screen echo is off.

I too would have expected to see the controls snaking, but it may be due to the fact they are all bound to the same field. I tried requerying each control after it was updated, with and without a DoEvents, but still no snake, just an increase in screen flicker and execution time.
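The sort of thing I tried looks roughly like this (illustrative only, not the code in the attachment):

' Requery each text box straight after writing to it, hoping to see it repaint before moving on.
Dim ctl As Control
For Each ctl In Me.Controls
    If ctl.ControlType = acTextBox Then
        ctl = 12345       ' write the value
        ctl.Requery       ' refetch and redraw just this control
        DoEvents          ' give the screen a chance to paint
    End If
Next ctl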

Would be nice to see the snake because if we have to wait so long we would at least have something to watch. :rolleyes:

Might have a bit more of a play with it. :D

Regards,
Chris.
 
Well, we can snake the back colour, but the first DoEvents seems to requery all the text boxes.
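For anyone who wants to play along before opening the attachment, the back colour snake is basically this idea (the colour and the control test are placeholders):

' Colour each text box in turn and let DoEvents paint it before moving on,
' so the colour "snakes" down the form.
Dim ctl As Control
For Each ctl In Me.Controls
    If ctl.ControlType = acTextBox Then
        ctl.BackColor = vbYellow
        DoEvents
    End If
Next ctl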

Version 4 attached.

Regards,
Chris.
 


Chris,

I made the controls unbound and got an amazing increase
in speed. This is not a display issue (i.e., the display
method is not twenty times slower).

Unbound Visible = 125
Unbound Invisible = 93
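For anyone who wants to time it themselves, a plain
millisecond stopwatch is all it takes. Something like this
(illustrative only, not the code in Chris's attachment):

' Millisecond timer around the control-writing loop.
#If VBA7 Then
    Private Declare PtrSafe Function GetTickCount Lib "kernel32" () As Long
#Else
    Private Declare Function GetTickCount Lib "kernel32" () As Long
#End If

Private Sub cmdTimeIt_Click()
    Dim lngStart As Long
    Dim i As Long

    lngStart = GetTickCount
    For i = 1 To 10000
        Me.Text0 = i        ' unbound here; bind it to compare
    Next i
    MsgBox "Elapsed: " & (GetTickCount - lngStart) & " ms"
End Sub

Bound, unbound, visible, invisible - change the form and
run the same button.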

Now it doesn't make sense that "bound" controls have a
wide time discrepancy with respect to their visibility.

"Unbound" controls evidently do not.

More research ...

Wayne
 
Good point.

Could be a combination of Bound and Visible.

Yep…more research.

Regards,
Chris.
 
Thanks Pat.

400 Bound Controls.
Me.Refresh method.
Echo On 397,632 milliseconds
Echo Off 118,521 milliseconds
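The Echo Off figure comes from nothing more exotic than wrapping the run like this (a sketch only; the control names are made up and the real code is in the attachment):

Private Sub RunEchoOffTest()
    ' Suppress screen painting for the duration of the run, then restore it.
    Application.Echo False
    On Error GoTo CleanUp

    Dim i As Long
    For i = 1 To 400
        Me("Text" & i - 1) = 12345   ' Text0 to Text399, all bound to the same field
        Me.Refresh                   ' the method being timed in this version
    Next i

CleanUp:
    Application.Echo True            ' always turn the screen back on
End Sub

Versions 6, 7 and 8 swap that Me.Refresh line for Me.Text0.Requery, Me.Repaint and Me.Recalc respectively.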

Version 5 attached.

Versions 6, 7 and 8 coming up.

Regards,
Chris.
 


Thanks Pat.

400 Bound Controls.
Me.Text0.Requery method.
Echo On 8,913 milliseconds
Echo Off 680 milliseconds

Version 6 attached.

Versions 7 and 8 coming up.

Regards,
Chris.
 


Thanks Pat.

400 Bound Controls.
Me.Repaint method.
Echo On 10,455 milliseconds
Echo Off 360 milliseconds

Version 7 attached.

Version 8 coming up.

Regards,
Chris.
 


Thanks Pat.

400 Bound Controls.
Me.Recalc method.
Echo On 550,812 milliseconds
Echo Off 118,390 milliseconds

Version 8 attached.

Regards,
Chris.
 


G’day all.

At this point in time (pun intended) the fastest is Version 3 for bound Controls, but that could change.

The fastest method so far was mentioned by Wayne…
Keep the Controls unbound AND (invisible IOR Echo Off).

But if the controls are unbound and their data still has to be written back to a table, then there will be an additional overhead in time. How much... I don't know.

Two things seem certain...
The method of writing data to a control is immaterial, and anything we assume about timing needs to be actually timed.

Regards,
Chris.
 
G’day all.

While I’m in the mood for writing something (whether or not it is important I must leave up to you), here is a link to something that may help. Sorry if you have read it before, but the first paragraph says what I think.

We can talk about the theory of timing all day long but, in the end, we must go for the demonstrated facts.

Anyone want to add an alternative method, proven or otherwise, to the discussion?

Regards,
Chris.
 
I'd like to add something...

Access still sucks. Whether you're giving me 100 seconds or 1000 seconds, it's still dirt slow. You can take that to the bank!!!

Jon

:D
 
ChrisO said:
Thanks Pat.

400 Bound Controls.
Me.Recalc method.
Echo On 550,812 milliseconds
Echo Off 118,390 milliseconds

Version 8 attached.

Regards,
Chris.

Heh, do you have a life? :D :D
 
Heh, do you have a life? :D :D

Version 8 took longer to run than it took to write. :(

Regards,
Chris.
 
G’day all.

OK, almost everyone seems to be getting a bit bored with timing their assumptions.

Apart from Wayne Ryan and myself, nobody has posted their own “test results”.

Would all others prefer to live with their assumptions or do the ‘hard yards’ to prove them?

Seems simple to me; postulate or prove.

Sound tough? Yes it is, but we are in a tough environment.

If you wish to modify the source code then look at this.
In version 3 the line was: -

strSub = strSub & vbNewLine & vbTab & strTemp

In version 8 it looked like this: -

strSub = strSub & vbNewLine & vbTab & strTemp & _
vbNewLine & vbTab & "Me.ReCalc"
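In other words, the test sub is generated as a string, one block per control, so you can splice in whichever line you want to time. Very roughly (the content of strTemp here is a made-up stand-in for whatever the generator really produces):

Dim strSub As String
Dim strTemp As String
Dim i As Long

For i = 0 To 399
    strTemp = "Me.Text" & i & " = 12345"      ' hypothetical generated assignment
    strSub = strSub & vbNewLine & vbTab & strTemp & _
             vbNewLine & vbTab & "Me.ReCalc"  ' the extra line for version 8
Next i
Debug.Print strSub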

Anybody, and that means anybody, who wishes to have their say and pontificate ought to be open to being questioned.

Would it not be more beneficial to know the facts than to simply play with our own imaginations?

Regards,
Chris.
 
Hey Chris,

Jon is "no longer with us" (Hi Jon!)

I hope that you have a life. Mine got me in trouble last night.

Yes, this has been a good thread. How Access works and how it
thinks is vital.

I haven't followed every second of this, but it's been informative.

Thanks,
Wayne
 
No problems here Wayne. :D

Yes I do have a life, a hobby, a favorite pursuit or pastime. ;)

And that is to try and get people into the habit of questioning things, in this case that which we are “told” about VBA and programming in general.

In the case of timing code, whatever form it takes, it is not sufficient only to find the fastest method; we must also try to be realistic about the benefits of doing so.

For example, we run into the trade-off of speed versus readability. If we really want speed then why not use assembly language? Two answers come to mind: one, it is too difficult to write and maintain, and two, it has too high a lead time to develop anything of substance.

Therefore, if we want raw speed, write in assembler; if we want to get the job done in a reasonable time, use something like VBA.

This I’m sure you will see as an exaggeration but the same sort of thing applies within VBA.

Things like the dreaded DLookups are dog slow, but if one is executed once from the click of a button which opens a report… who cares? If the report takes 50 milliseconds longer to open, again… who cares?
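For example, something like this is hardly worth optimising, even though DLookup gets such a bad rap (the names here are a throwaway illustration, not from any of the attached versions):

Private Sub cmdOpenReport_Click()
    ' One DLookup on a button click: slow in theory, invisible in practice.
    Me.lblManager.Caption = Nz(DLookup("ManagerName", "tblDepartments", "DeptID = 7"), "")
    DoCmd.OpenReport "rptDepartmentSummary", acViewPreview
End Sub

Put the same DLookup inside a loop over a few thousand rows and it is a different story.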

And please may I say in passing.

I wasn’t trying to be disparaging towards anyone who took part in the thread. More so towards those people who didn’t take part but sat back and watched the work that goes into it.

BTW, did you come to some understanding of the bound/unbound visible/invisible thing, because that’s where the real delay is hidden? :confused:

Regards,
Chris.
 
Chris,

No problems here either.

I appreciate the amount of work that you did on this.

In my Access dealings, I haven't seen any app where, based
on a user action (command button, etc.), the wait was
INCREDIBLY slow.

I've seen some non-Access apps (parts break-down) where
when you hit the "go" button it's coffee break time.

Now in SQL Server, when you hit the expand database button
it's gonna be a while. And that's with only like 30 databases
on a "fast" server.

I grew up with assembly language (and even machine code). I
used to be able to read PDP-11 machine code. But they have no
place in today's apps. A 'C' .dll is OK.

Most of my apps are financial and/or software problem tracking
and performance is not generally an issue. Even parsed-text
imports from such weird things as Word-Imperfect tables aren't
too bad.

This thread was, however, a worthwhile endeavor.

Hope to hear from you soon,
Wayne
 
Ah the old days. :rolleyes:

I got to a point where I could disassemble/understand 6800 hex at 300 baud; not a thing I would like to go back to.

If you’re into financial stuff, I recently had to write my first aging summary form, which was all nicely normalized: re-calculate everything on the fly.

What a dog to open.

The form was displayed in about 10 seconds, but the grand totals in the form footer took about 70 seconds… after all, the grand total is based on all Customer balances.

Try as I might, there seemed no way to overcome the delay.

Answer: when calculating individual invoice item totals in the items table for display purposes only, also save the calculated total back to the invoice table. The aging summary query can then simply sum the invoice totals.

Result: 70 seconds dropped to below 1 second. That which was technically correct, but unusable, became technically incorrect but very usable.
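In code terms it amounted to something like this (the table and field names are simplified stand-ins for the real schema):

' When an invoice's item total is calculated for display, also stamp it on
' the invoice record so the aging summary never has to recalculate it.
Dim curTotal As Currency
curTotal = Nz(DSum("Quantity * UnitPrice", "tblInvoiceItems", _
                   "InvoiceID = " & Me.InvoiceID), 0)

CurrentDb.Execute "UPDATE tblInvoices SET InvoiceTotal = " & curTotal & _
                  " WHERE InvoiceID = " & Me.InvoiceID, dbFailOnError

The aging summary query then just sums tblInvoices.InvoiceTotal per customer instead of recalculating every item row.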

Even what we are “told” about normalization needs questioning. It is the overall result that is important, not some theoretical position.

Hope that stirs some people into action. :D

Regards,
Chris.
 
