AT Real Time Clock -> accuracy of timer?

Post by Eric Fehl » Wed, 11 Aug 1993 23:53:29



:I have a question about the accuracy of the timer-chip
:frequency and the DOS Real Time Clock derived from it.

:I noticed that the times reported on different computers by
:DOS (using gettime() or time) differ significantly during a day.
:I do the following:
:1) Start two computer platforms of the same brand (e.g.
:   Dell 486DX2-66 tower)
:2) Enter 'time<cr>' on both machines using DOS 5.00
:3) Enter the same time on both machines, but don't press
:   return yet.
:4) Press 'return' on both machines at the same time
:5) Start a program that continuously writes date and time on
:   the screen (both machines)
:6) Monitor the output on both screens during the day
:7) See that at the end of the day one PC appears 'slower' at
:   the end of a timing-interval.

:I have done this experiment on a few types of Compaqs and on
:our newest (Dell) machines. On the Compaqs I also noticed that
:the difference increases faster if one PC is using disk I/O
:extensively. The biggest difference I measured was several
:(about six!) seconds in about half a day!

:I wonder if this is the result of interrupts that are somehow
:lost by the RTC interrupt handler, or whether the frequency at which
:the RTC runs differs due to a slightly different crystal.

:I eliminated the temperature effect, because both machines have been
:located next to each other for about three weeks. The machines are
:stand-alone and configured in the same way (same autoexec and config).

:Even if one machine has several TSRs running and the other nothing,
:it seems to me that this should not influence the long-term accuracy of
:the clock, because a TSR should never cause clock interrupts to get lost!

:Does anybody have any comments about this? Has anyone ever calibrated
:their timing activities using a PC? Is there a difference in (long-term)
:accuracy if I use the DOS clock (gettime() and such) or if I reprogram
:the timer chip myself?

:I am very curious what you guys have for explanations...

When a PC with a real-time clock boots up, DOS reads the real-time clock
and uses it to initialize the time-of-day count stored in the BIOS data
area at 40:6Ch.  From then on, int-8 (the timer tick interrupt) increments
this 32-bit count; the real-time clock itself is not consulted again.
Typing the DOS commands time and date lets you change the time-of-day
value.  Any program that suspends interrupts (the CLI assembly instruction)
or disables int-8 by setting the appropriate bit in the PIC's IMR
(interrupt mask register) will prevent the time-of-day count from updating,
so those ticks are simply lost.  I suspect this is most noticeable during
disk operations, because data needs to be written without being
interrupted.  If your program needs the correct time, get it from the
real-time clock via int-1Ah/02h, or int-1Ah/04h for the date.
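
For illustration, here is a minimal sketch of reading the clock that way,
assuming a Borland/Turbo-C-style DOS compiler that provides int86() and
union REGS in <dos.h> (the bcd() helper is just made up for this sketch):

#include <stdio.h>
#include <dos.h>

/* int 1Ah returns its values in packed BCD; convert one byte to binary. */
static unsigned bcd(unsigned char b)
{
    return (b >> 4) * 10 + (b & 0x0F);
}

int main(void)
{
    union REGS r;

    /* int 1Ah, AH=02h: read the real-time clock.
       CH = hours, CL = minutes, DH = seconds (all BCD);
       the carry flag is set if the clock is not running. */
    r.h.ah = 0x02;
    int86(0x1A, &r, &r);
    if (r.x.cflag) {
        printf("RTC not running\n");
        return 1;
    }
    printf("RTC time: %02u:%02u:%02u\n",
           bcd(r.h.ch), bcd(r.h.cl), bcd(r.h.dh));

    /* int 1Ah, AH=04h: read the real-time clock date.
       CH = century, CL = year, DH = month, DL = day (all BCD). */
    r.h.ah = 0x04;
    int86(0x1A, &r, &r);
    printf("RTC date: %02u%02u-%02u-%02u\n",
           bcd(r.h.ch), bcd(r.h.cl), bcd(r.h.dh), bcd(r.h.dl));

    return 0;
}

Note that int-1Ah/02h only resolves to whole seconds; for anything finer
you are back to the tick count.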

Hope this helps,

--Eric

 
 
 

AT Real Time Clock -> accuracy of timer?

I have a question about the accuracy of the timer-chip
frequency and the DOS Real Time Clock derived from it.

I noticed that the times reported on different computers by
DOS (using gettime() or time) differ significantly during a day.
I do the following:
1) Start two computer platforms of the same brand (e.g.
   Dell 486DX2-66 tower)
2) Enter 'time<cr>' on both machines using DOS 5.00
3) Enter the same time on both machines, but don't press
   return yet.
4) Press 'return' on both machines at the same time
5) Start a program that continuously writes the date and time on
   the screen on both machines (a sketch of such a program follows
   this list)
6) Monitor the output on both screens during the day
7) See that at the end of the day one PC appears 'slower' at
   the end of a timing-interval.
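
A minimal sketch of the monitoring program mentioned in step 5 (not part
of the original post), assuming a Borland/Turbo-C-style DOS compiler with
getdate()/gettime() in <dos.h>:

#include <stdio.h>
#include <dos.h>

/* Continuously overwrite one screen line with the DOS date and time,
   so two machines placed side by side can be compared during the day.
   Stop it with Ctrl-Break. */
int main(void)
{
    struct date d;
    struct time t;

    for (;;) {
        getdate(&d);
        gettime(&t);
        printf("\r%04d-%02d-%02d %02d:%02d:%02d.%02d ",
               d.da_year, d.da_mon, d.da_day,
               t.ti_hour, t.ti_min, t.ti_sec, t.ti_hund);
        fflush(stdout);
    }
    return 0;
}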

I have done this experiment on a few types of Compaqs and on
our newest (Dell) machines. On the Compaqs I also noticed that
the difference increases faster if one PC is using disk I/O
extensively. The biggest difference I measured was several
(about six!) seconds in about half a day!

I wonder if this is the result of interrupts that are somehow
lost by the RTC interrupt handler, or whether the frequency at which
the RTC runs differs due to a slightly different crystal.

I eliminated the temperature effect, because both machines have been
located next to each other for about three weeks. The machines are
stand-alone and configured in the same way (same autoexec and config).

Even if one machine has several TSRs running and the other nothing,
it seems to me that this should not influence the long-term accuracy of
the clock, because a TSR should never cause clock interrupts to get lost!

Does anybody have any comments about this? Has anyone ever calibrated
their timing activities using a PC? Is there a difference in (long-term)
accuracy if I use the DOS clock (gettime() and such) or if I reprogram
the timer chip myself?

I am very curious what you guys have for explanations...

                Mark Hensbergen

--

X.400:  C=NL,ADMD=400NET,PRMD=PTT Research, SURNAME=Hensbergen
--------------------------------------------------------------
Is there any life after the terminal?
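
On the last question (DOS clock versus reprogramming the timer chip): one
way to tell lost ticks from a slightly-off crystal is to log the BIOS tick
count next to the CMOS RTC, since the RTC counts on its own 32.768 kHz
crystal and keeps running even while int-8 is masked. A rough sketch, again
assuming a Borland/Turbo-C-style DOS compiler with int86() and union REGS
in <dos.h>:

#include <stdio.h>
#include <dos.h>

/* Packed BCD (as returned by int 1Ah) to binary. */
static unsigned bcd(unsigned char b)
{
    return (b >> 4) * 10 + (b & 0x0F);
}

int main(void)
{
    union REGS r;
    unsigned long ticks;
    double secs;

    for (;;) {
        /* int 1Ah, AH=00h: read the tick count kept at 40:6Ch,
           which int-8 increments about 18.2065 times per second. */
        r.h.ah = 0x00;
        int86(0x1A, &r, &r);
        ticks = ((unsigned long)r.x.cx << 16) | r.x.dx;
        secs  = ticks / 18.2065;

        /* int 1Ah, AH=02h: read the CMOS RTC time (BCD). */
        r.h.ah = 0x02;
        int86(0x1A, &r, &r);

        printf("\rticks %02d:%02d:%02d   RTC %02u:%02u:%02u ",
               (int)(secs / 3600) % 24,
               (int)(secs / 60)   % 60,
               (int) secs         % 60,
               bcd(r.h.ch), bcd(r.h.cl), bcd(r.h.dh));
        fflush(stdout);
    }
    return 0;
}

If the two columns drift apart on one machine during the day, that machine
is losing timer ticks; if they track each other but the two PCs still
disagree, the difference comes from the crystals themselves.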
