I have a question about the accuracy of the timer-chip
frequency and the DOS Real Time Clock derived from it.
I noticed that the times reported by DOS on different computers
(using gettime() or the TIME command) drift apart significantly over
the course of a day.
I do the following:
1) Start two computers of the same brand and model (e.g. a
Dell 486DX2-66 tower)
2) Enter 'time<cr>' on both machines using DOS 5.00
3) Enter the same time on both machines, but don't press return yet
4) Press 'return' on both machines at the same time
5) Start a program that continuously writes the date and time on
the screen (both machines; a rough sketch of such a program follows
this list)
6) Monitor the output on both screens during the day
7) See that at the end of the day one PC appears 'slower' than the
other.
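
For reference, the step-5 monitor is nothing fancy. A minimal sketch,
assuming a Borland/Turbo C compiler (getdate()/gettime() and their
struct date/struct time live in dos.h there):

/* Repeatedly print the DOS date and time on one line until a key is
 * pressed.  Assumes Borland/Turbo C: getdate(), gettime(), kbhit(). */
#include <stdio.h>
#include <dos.h>
#include <conio.h>

int main(void)
{
    struct date d;
    struct time t;

    while (!kbhit()) {
        getdate(&d);                  /* DOS date                      */
        gettime(&t);                  /* DOS time, 1/100 s resolution  */
        printf("\r%02d-%02d-%04d  %02d:%02d:%02d.%02d",
               d.da_day, d.da_mon, d.da_year,
               t.ti_hour, t.ti_min, t.ti_sec, t.ti_hund);
        fflush(stdout);
    }
    return 0;
}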
I have done this experiment on a few types of Compaqs and on
our newest (Dell) machines. On the Compaqs I also noticed that
the difference increases faster if one PC is using disk I/O
extensively. The biggest difference I measured was several
(about six!) seconds in about half a day, which, if I calculate
correctly, is on the order of 140 parts per million.
I wonder whether this is the result of interrupts that are somehow
lost by the RTC interrupt handler, or whether the frequency at which
the RTC runs differs between machines due to slightly different
crystals.
I have ruled out temperature effects, because both machines have been
sitting next to each other for about three weeks. The machines are
stand-alone and configured in the same way (same AUTOEXEC.BAT and
CONFIG.SYS). Even if one machine has several TSRs running and the
other has none, it seems to me that this should not influence the
long-term accuracy of the clock, because a TSR should never cause
clock interrupts to get lost!
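
One check I am considering: compare the tick counter that the timer
interrupt maintains (readable through INT 1Ah, AH=00h) with the
battery-backed CMOS RTC (INT 1Ah, AH=02h, AT-class machines only).
The CMOS clock runs on its own 32.768 kHz crystal and keeps counting
even while interrupts are masked, so if the tick-derived time falls
behind the CMOS time during heavy disk I/O, that would point to lost
timer ticks rather than crystal tolerance. A rough, untested sketch,
again assuming Borland/Turbo C (int86() and union REGS from dos.h):

/* Compare the BIOS tick count (updated by the timer interrupt) with
 * the battery-backed CMOS RTC, both read via INT 1Ah.               */
#include <stdio.h>
#include <dos.h>

static unsigned bcd(unsigned char v)          /* BCD byte -> binary */
{
    return (v >> 4) * 10 + (v & 0x0f);
}

int main(void)
{
    union REGS r;
    unsigned long ticks, tick_secs, cmos_secs;

    r.h.ah = 0x00;                     /* INT 1Ah/00h: get tick count */
    int86(0x1a, &r, &r);
    ticks = ((unsigned long)r.x.cx << 16) | r.x.dx;
    tick_secs = (unsigned long)(ticks / 18.2065); /* ~18.2065 ticks/s */

    r.h.ah = 0x02;                     /* INT 1Ah/02h: read CMOS RTC  */
    int86(0x1a, &r, &r);
    cmos_secs = bcd(r.h.ch) * 3600UL + bcd(r.h.cl) * 60UL + bcd(r.h.dh);

    printf("tick counter since midnight: %lu s\n", tick_secs);
    printf("CMOS RTC since midnight    : %lu s\n", cmos_secs);
    printf("difference                 : %ld s\n",
           (long)tick_secs - (long)cmos_secs);
    return 0;
}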
Does anybody have any comments about this? Has anyone ever calibrated
their timing activities using a PC? Is there a difference in
(long-term) accuracy if I use the DOS clock (gettime() and such) or
if I reprogram the timer chip myself?
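
Related to that last question: instead of reprogramming the 8253/8254
I could also just count the ticks as they arrive, by hooking the user
tick vector (INT 1Ch, which the BIOS timer handler calls about 18.2
times per second). If that private counter and the DOS clock drift
apart, ticks are being lost in software; if they agree but both drift
from wall-clock time, it is the crystal. Another rough, untested
sketch, Borland/Turbo C assumed (getvect/setvect, interrupt keyword):

/* Hook INT 1Ch and count timer ticks independently of the DOS/BIOS
 * time of day.  The handler only increments a variable and then
 * chains to the previous handler.                                   */
#include <stdio.h>
#include <dos.h>
#include <conio.h>

static void interrupt (*old_1c)(void);
static volatile unsigned long my_ticks = 0UL;

static void interrupt new_1c(void)
{
    my_ticks++;                      /* count every tick ourselves    */
    old_1c();                        /* chain to the previous handler */
}

int main(void)
{
    old_1c = getvect(0x1c);
    setvect(0x1c, new_1c);

    printf("Counting INT 1Ch ticks, press a key to stop...\n");
    while (!kbhit())
        ;                            /* idle; ticks keep arriving     */
    getch();

    setvect(0x1c, old_1c);           /* always restore the vector     */
    printf("%lu ticks counted (expect about 18.2 per second)\n",
           my_ticks);
    return 0;
}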
I am very curious what explanations you guys can come up with...
X.400: C=NL,ADMD=400NET,PRMD=PTT Research, SURNAME=Hensbergen
Is there any life after the terminal?