Why does autonumber start at 2, with 1 around the middle and 4 at the end?

Micron

AWF VIP
Local time
Today, 16:15
Joined
Oct 20, 2018
Messages
3,476
If you look at a Now value displayed as a Double, the time component is, AFAIK, precise to at least the millisecond. If you enter the two prompts (?), position the cursor at the end of the line below the second one, and execute both, you can run them within a second or less of each other. The values are quite distinct and more precise than just one second.

?cdbl(now)
43772.4315972222
?cdbl(now)
43772.4316087963
 

The_Doc_Man

Immoderate Moderator
Staff member
Local time
Today, 15:15
Joined
Feb 28, 2001
Messages
27,001
Micron, those ARE two different times as shown by experiment from my "Immediate Window."

Code:
Debug.Print FormatDateTime(43772.4315972222, vbGeneralDate)
11/3/2019 10:21:30 AM
Debug.Print FormatDateTime(43772.4316087963, vbGeneralDate)
11/3/2019 10:21:31 AM

You might indeed be getting 15 digits out of the DOUBLE variable, and it is ABSOLUTELY true that there is enough precision in there to give you milliseconds - in theory. But the pesky little conversion routines are the problem. If you expressed times to the fraction of a second, the conversion routines would deny that you had expressed a valid number. If you diddle with the fractions and convert them to dates, they will take you to the closest second, which is 1/87400.
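You can check that rounding with a few lines of arithmetic. A minimal sketch in Python (not the VBA under discussion, but the math is the same, using 86,400 seconds per day): scaling the fractional parts Micron posted to whole seconds lands them exactly one second apart, matching the FormatDateTime output above.

```python
# Day-fraction arithmetic behind the date serials above.
# (Python used for illustration; VBA's conversion routines do the equivalent.)
SECONDS_PER_DAY = 86400

def frac_to_hms(day_fraction):
    """Scale a fractional day to whole seconds, then split into h:m:s."""
    total = round(day_fraction * SECONDS_PER_DAY)  # rounds to nearest second
    h, rem = divmod(total, 3600)
    m, s = divmod(rem, 60)
    return h, m, s

print(frac_to_hms(0.4315972222))  # (10, 21, 30)
print(frac_to_hms(0.4316087963))  # (10, 21, 31)
```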

If I recall this correctly, a DOUBLE gives you 53 bits. As we can see from the numbers, we are in the range where 16 bits are adequate to express the day number, and will continue for the next 22,000 days or so. If we look at 1/87400, that requires 17 bits (minimum) because 87400 (seconds per day) requires 17 bits. So far we have accounted for 33 of the 53 bits. The remaining 20 bits are where the fractions get interesting. To express 1000, you need 10 bits, so you still have room for milliseconds. BUT you only have 10 bits left for fractions, and you would have to start using the high-precision timer and a totally different way of storing the result if you needed faster than millisecond timing. That fraction actually CAN store microseconds but that is pushing it.
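The bit accounting above can be verified directly. Here is a quick sketch in Python (the figures come out the same in any language; 86,400 seconds per day is used for the seconds count):

```python
# Bit budget for the 53-bit mantissa of a Double used as a date serial.
MANTISSA_BITS = 53

day_bits = (43772).bit_length()     # current day number fits in 16 bits
second_bits = (86400).bit_length()  # seconds per day needs 17 bits
remaining = MANTISSA_BITS - day_bits - second_bits  # 20 bits left over
ms_bits = (1000).bit_length()       # milliseconds need 10 of those 20

print(day_bits, second_bits, remaining, ms_bits)  # 16 17 20 10
```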

Before anyone asks, the old Digital Equipment / COMPAQ / HP OpenVMS system used the same concepts and had the same issues. How they are the same is a long story for a different venue.
 

Micron

AWF VIP
Local time
Today, 16:15
Joined
Oct 20, 2018
Messages
3,476
So if you're saying that using the system clock in some sort of a routine isn't practical, I'd agree. All I'm saying is that in the course of normal use, the chances of 2 or more date/time values being duplicates down to the precision of the Double data type are pretty much nil. This started with the supposition that going by Date alone was enough (IIRC), and I'm only saying it's not. One needs more precision.

As usual, your insights reflect a whole lot of experience in realms I've never dealt with, and are educational. Thanks for that!
 

isladogs

MVP / VIP
Local time
Today, 20:15
Joined
Jan 14, 2017
Messages
18,186
Both of you may be interested in my tests of 6 different methods of measuring time: http://www.mendipdatasystems.co.uk/timer-comparison-tests/4594552971.

These include the Timer function, several different uses of the system timer such as GetSystemTime & GetTickCount, as well as the High Resolution Timer.
Most of these do indeed measure to millisecond precision, and the HRT to less than 1 microsecond. However, it's worth stressing that greater precision does not necessarily equate to greater accuracy.
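The precision-versus-accuracy point can be seen by measuring the smallest observable tick of a clock: a clock may report many digits yet only advance in coarse steps. A minimal sketch in Python (not the Access/VBA timers in the linked tests, but the same idea) that spins until each clock advances:

```python
import time

def min_tick(clock):
    """Smallest observable increment of a clock: spin until it advances."""
    t0 = clock()
    while True:
        t1 = clock()
        if t1 != t0:
            return t1 - t0

# perf_counter is a high-resolution monotonic clock; time() is wall-clock
# and typically advances in much coarser steps on Windows.
print("perf_counter tick:", min_tick(time.perf_counter))
print("time() tick:      ", min_tick(time.time))
```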
 

CJ_London

Super Moderator
Staff member
Local time
Today, 20:15
Joined
Feb 19, 2013
Messages
16,553
87400 (seconds per day)
just a small correction - there are 86,400 seconds in a day - plus the occasional leap second if you want to count those
 

The_Doc_Man

Immoderate Moderator
Staff member
Local time
Today, 15:15
Joined
Feb 28, 2001
Messages
27,001
Thanks, CJ. I always let the system do the conversions for me, for an obvious reason. You are quite right - 86,400 seconds per day, not 87,400. But then again, my wife swears there isn't enough time in the day, so maybe I was subconsciously trying to help her?

Micron, normally you don't CARE that you have two times that are identical. But if you do, then my discussion becomes significant. And most of the time, precise to the second is good enough.

Colin, agree with you on the issues with high-precision timer. The catch is that Windows doesn't always sample that time on-demand. It has been my finding that if you need as fine as milliseconds, the TIMER() is OK. If you need more than that, you are probably better off getting a special timer of some sort.

In my days at the University of New Orleans, one of my colleagues in the Chemistry department was doing some exotic work on very high speed events. He had a device that performed data capture to a storage unit that was essentially an early form of flash memory. He would perform his experiment, which would be over in a millisecond. (Yes, literally that fast.) Then he would download his data, both observation and time-base, as a 2xN array.

The problem he was facing, of course, is that when the instructions managing the interval timer start to be commensurate with the interval being timed, you have a sort of "Heisenberg" effect that blurs the measurement. I know the fastest I could ever get with the machine we were using was about 250 microsecond intervals. Faster than that and the hardware clock would start to miss interrupts.
 
