Yes, David, you are correct. However, the formatting routines ALWAYS assume the system reference date/time. Time-card summation, date differences, and other date-arithmetic methods produce elapsed times, which are measured from a different reference point, and therein lies the real problem. If you don't mean the "real" reference date, you have to roll your own formatting.
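To make that concrete, here is a minimal C sketch against the OLE Automation DATE (a DOUBLE counting days from 30 December 1899, which I'm assuming is the reference date in question). The serial values are illustrative, not canonical. Subtracting two dates yields an elapsed-day count, but handing that count back to the conversion machinery reinterprets it against the epoch:

/* Compile and link against OleAut32.lib. */
#include <stdio.h>
#include <windows.h>
#include <oleauto.h>

int main(void)
{
    /* Two absolute DATE serials (assumed): 14 Jan 2024 06:00 and 15 Jan 2024 18:00. */
    DOUBLE start = 45305.25;
    DOUBLE end   = 45306.75;
    DOUBLE elapsed = end - start;   /* 1.5 = one day, twelve hours ELAPSED */

    SYSTEMTIME st;
    /* The conversion treats 1.5 as an absolute date, not as a duration. */
    if (VariantTimeToSystemTime(elapsed, &st))
        printf("%04hu-%02hu-%02hu %02hu:%02hu\n",
               st.wYear, st.wMonth, st.wDay, st.wHour, st.wMinute);
    /* Prints 1899-12-31 12:00 -- a wall-clock "date", not the 36-hour span we meant. */
    return 0;
}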
On another fine point, the formatting routines do not believe in fractions of a second, even though a high-precision timer (available through an API call) can resolve single milliseconds or even microseconds.

Today's elapsed day count since the reference date is about 45.3K, which is less than 65,536, so it fits in 16 bits. There are 86,400 seconds per day, which exceeds 65,536 and so takes 17 bits to express. For the next twenty thousand days, then, you need 16 bits for days and 17 bits for seconds, or 33 bits total. Since DATE is merely a typecast of DOUBLE, which has a 53-bit mantissa, you have about 20 bits left for the fraction of a second, which equates to roughly microsecond precision. But don't ask the time routines to make use of that precision, even if you use the high-precision timer (HPT) API. The format routines won't even try.
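The arithmetic is easy to check. A minimal sketch, assuming only that DATE is an IEEE 754 double with a 53-bit significand and that the day count is around 45.3K:

#include <stdio.h>
#include <math.h>

int main(void)
{
    double days_since_epoch = 45306.0;      /* roughly today's count, ~45.3K */
    int day_bits = (int)ceil(log2(days_since_epoch));  /* 16, since < 65,536 */
    int sec_bits = (int)ceil(log2(86400.0));           /* 17, since > 65,536 */

    int mantissa_bits = 53;                            /* IEEE 754 double    */
    int frac_bits = mantissa_bits - day_bits - sec_bits;     /* 53 - 33 = 20 */

    /* 2^20 subdivisions of one second is just under a microsecond apiece. */
    double tick = 1.0 / ldexp(1.0, frac_bits);
    printf("days: %d bits, seconds: %d bits, fraction: %d bits\n",
           day_bits, sec_bits, frac_bits);
    printf("finest tick = %.3g s (about %.2f microseconds)\n", tick, tick * 1e6);
    return 0;
}

So the precision is there in the representation; it's the format routines that throw it away.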