Thanks, CJ. I always let the system do the conversions for me, for an obvious reason. You are quite right - 86,400 seconds per day, not 87,400. But then again, my wife swears there isn't enough time in the day, so maybe I was subconsciously trying to help her?
Micron, normally you don't CARE whether two of your timestamps are identical. But if you do, then my earlier discussion becomes significant. And most of the time, precision to the second is good enough.
Colin, I agree with you on the issues with high-precision timers. The catch is that Windows doesn't always sample the time on demand. My experience has been that if you need resolution as fine as milliseconds, TIMER() is OK. If you need finer than that, you are probably better off getting a dedicated timer of some sort.
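As an illustration of the point about timer resolution, here is a small sketch (in Python, just because it is easy to run anywhere; the principle is language-independent) that estimates the smallest observable tick of a clock by reading it back-to-back until it visibly advances. The function name `measure_resolution` is mine, not anything from the discussion above.

```python
import time

def measure_resolution(clock, samples=1000):
    """Estimate the smallest observable tick of a clock function:
    the minimum nonzero difference between successive reads."""
    smallest = None
    for _ in range(samples):
        t0 = clock()
        t1 = clock()
        while t1 == t0:          # spin until the clock visibly advances
            t1 = clock()
        delta = t1 - t0
        if smallest is None or delta < smallest:
            smallest = delta
    return smallest

# time.time() can be coarse (historically ~15 ms on some Windows setups);
# time.perf_counter() wraps the platform's high-resolution counter.
print("time.time()         tick ~", measure_resolution(time.time))
print("time.perf_counter() tick ~", measure_resolution(time.perf_counter))
```

Running this side by side makes the gap obvious: the wall-clock source may only advance in millisecond-or-worse steps, while the performance counter advances in sub-microsecond steps.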
In my days at the University of New Orleans, one of my colleagues in the Chemistry department was doing some exotic work on very high speed events. He had a device that performed data capture to a storage unit that was essentially an early form of flash memory. He would perform his experiment, which would be over in a millisecond. (Yes, literally that fast.) Then he would download his data, both observation and time-base, as a 2xN array.
The problem he faced, of course, was that when the time spent executing the instructions that manage the interval timer becomes commensurate with the interval being timed, you get a sort of "Heisenberg" effect that blurs the measurement. The fastest I could ever get on the machine we were using was about 250-microsecond intervals; any faster and the hardware clock would start to miss interrupts.
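That "Heisenberg" effect can be quantified directly: measure what one clock read itself costs, and compare it to the interval you intend to time. A minimal sketch (Python again, purely for portability; `timing_overhead` is my name for it):

```python
import time

def timing_overhead(reads=100_000):
    """Average cost of one clock read. When this overhead becomes
    commensurate with the interval under measurement, the act of
    timing perturbs the result."""
    start = time.perf_counter()
    for _ in range(reads):
        time.perf_counter()
    total = time.perf_counter() - start
    return total / reads

per_read = timing_overhead()
print(f"one clock read costs ~{per_read * 1e9:.0f} ns")
# If each read costs tens of nanoseconds, timing a 250-microsecond
# interval is safe; timing a 1-microsecond event is not.
```

The rule of thumb that falls out: keep the timer's own overhead at least two orders of magnitude below the interval being measured, or the readings are measuring the timer as much as the event.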