AT Real Time Clock -> accuracy of timer?

AT Real Time Clock -> accuracy of timer?

Post by Mark Hensberg » Wed, 11 Aug 1993 23:25:52



I have a question about the accuracy of the timer-chip
frequency and the DOS Real Time Clock derived from it.

I noticed that the times reported on different computers by
DOS (using gettime() or TIME) drift apart significantly over the course
of a day.
I do the following:
1) Start two computer platforms of the same brand (e.g.
   Dell 486DX2-66 tower)
2) Enter 'time<cr>' on both machines using DOS 5.00
3) Enter the same time on both machines, but don't press
   return yet.
4) Press 'return' on both machines at the same time
5) Start a program that continuously writes the date and time to
   the screen on both machines (a rough sketch of such a program
   appears after this list)
6) Monitor the output on both screens during the day
7) Observe that by the end of the day one PC appears 'slower'
   than the other at the end of the timing interval.
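
For concreteness, this is roughly the kind of program I mean in step 5
(a minimal sketch assuming a Borland-style compiler with dos.h; it is not
the exact program I used):

/* Continuously print the DOS date and time (Borland-style dos.h assumed). */
#include <stdio.h>
#include <dos.h>

int main(void)
{
    struct date d;
    struct time t;

    for (;;) {
        getdate(&d);                 /* DOS date (int 21h/2Ah underneath) */
        gettime(&t);                 /* DOS time (int 21h/2Ch underneath) */
        printf("\r%02d-%02d-%04d %02d:%02d:%02d.%02d",
               d.da_day, d.da_mon, d.da_year,
               t.ti_hour, t.ti_min, t.ti_sec, t.ti_hund);
        fflush(stdout);
    }
    return 0;
}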

I have done this experiment on a few types of Compaqs and on
our newest (Dell) machines. On the Compaqs I also noticed that
the difference grows faster if one PC is using disk I/O
extensively. The biggest difference I measured was several
(about six!) seconds in about half a day!

I wonder whether this is the result of interrupts that are somehow
lost by the RTC interrupt handler, or whether the frequency at which
the RTC runs differs due to a slightly different crystal.

I have eliminated the temperature effect, because both machines have
been sitting next to each other for about three weeks. The machines are
stand-alone and configured in the same way (same AUTOEXEC.BAT and CONFIG.SYS).

Even if one machine has several TSRs running and the other none,
it seems to me that this should not influence the long-term accuracy of
the clock, because a TSR should never cause clock interrupts to be lost!

Does anybody have any comments about this? Has anyone ever calibrated
their timing activities using a PC? Is there a difference in (long-term)
accuracy if I use the DOS clock (gettime() and such) or if
I reprogram the timer chip myself?

I am very curious what you guys have for explanations...

                Mark Hensbergen

--

X.400:  C=NL,ADMD=400NET,PRMD=PTT Research, SURNAME=Hensbergen
--------------------------------------------------------------
Is there any life after the terminal?

 
 
 

AT Real Time Clock -> accuracy of timer?

Post by Andreas Hel » Thu, 12 Aug 1993 06:14:23



Quote:>I have a question about the accuracy of the timer-chip
>frequency and the DOS Real Time Clock derived from it.

>I noticed that the times reported on different computers by
>DOS (using gettime() or TIME) drift apart significantly over the course
>of a day. I do the following:
>1) Start two computer platforms of the same brand (e.g.
>   Dell 486DX2-66 tower)
>2) Enter 'time<cr>' on both machines using DOS 5.00
>3) Enter the same time on both machines, but don't press
>   return yet.
>4) Press 'return' on both machines at the same time
>5) Start a program that continuously writes the date and time to
>   the screen (both machines)
>6) Monitor the output on both screens during the day
>7) Observe that by the end of the day one PC appears 'slower'
>   than the other at the end of the timing interval.

>I have done this experiment on a few types of Compaqs and on
>our newest (Dell) machines. On the Compaqs I also noticed that
>the difference grows faster if one PC is using disk I/O
>extensively. The biggest difference I measured was several
>(about six!) seconds in about half a day!

>I wonder whether this is the result of interrupts that are somehow
>lost by the RTC interrupt handler, or whether the frequency at which
>the RTC runs differs due to a slightly different crystal.

>I have eliminated the temperature effect, because both machines have
>been sitting next to each other for about three weeks. The machines are
>stand-alone and configured in the same way (same AUTOEXEC.BAT and CONFIG.SYS).

>Even if one machine has several TSRs running and the other none,
>it seems to me that this should not influence the long-term accuracy of
>the clock, because a TSR should never cause clock interrupts to be lost!

>Does anybody have any comments about this? Has anyone ever calibrated
>their timing activities using a PC? Is there a difference in (long-term)
>accuracy if I use the DOS clock (gettime() and such) or if
>I reprogram the timer chip myself?

>I am very curious what you guys have for explanations...

>            Mark Hensbergen

Did you measure the accuracy of the hardware clock or of the software clock
which DOS was using? The hardware clock runs independently of this software
clock, and depending on the system call you use, you get the time of either
the hardware or the software clock. (Our Novell file server was one minute
off per day on its software clock, half an hour per month, and every other
computer on the net was in sync with it.) The hardware clock is roughly
comparable to a quartz clock with a badly selected crystal. If you use a
program which calculates the error and corrects the hardware clock, you can
have a relatively accurate clock. We now get the time from a nearby Unix
computer, which most of the time has the official time to within 1 second,
but is occasionally 1 or 2 hours off. As long as I was not on the network,
I used a time-correcting program (written by me but not yet bug-free).
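
The correction itself is only arithmetic. A rough sketch of the idea (this
is not my actual program, and the numbers below are invented): compare the
PC clock against a reference twice, compute the drift rate, and map later
readings back to reference time.

/* Rough sketch of software drift correction (invented numbers). */
#include <stdio.h>

int main(void)
{
    /* Two comparisons against a trusted reference, in seconds:
       (reference time, local clock time) at each sync point.   */
    double ref0 = 0.0,     loc0 = 0.0;            /* first sync point         */
    double ref1 = 86400.0, loc1 = 86400.0 - 6.0;  /* one day later, 6 s slow  */

    /* Drift rate: local seconds elapsed per reference second. */
    double rate = (loc1 - loc0) / (ref1 - ref0);

    /* Any later local reading can be mapped back to reference time. */
    double local_now = 172800.0 - 12.0;           /* two days in, 12 s slow   */
    double corrected = ref0 + (local_now - loc0) / rate;

    printf("drift: %.1f seconds per day\n", (1.0 - rate) * 86400.0);
    printf("corrected reading: %.1f s after the first sync\n", corrected);
    return 0;
}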
Andreas
-

Andreas Helke, Molekulare Genetik, Universitaet Heidelberg, Germany



      has its main duty as DOS/Windows computer and accepts mail only in
      its Unix incarnation.

 
 
 

AT Real Time Clock -> accuracy of timer?

Post by m hua » Thu, 12 Aug 1993 07:44:04


: I have a question about the accuracy of the timer-chip

just FYI, you can get standard time by

telnet india.colorado.edu 13

--mh

 
 
 

AT Real Time Clock -> accuracy of timer?

Post by Fuzzy F » Thu, 12 Aug 1993 07:47:58


It used to be that automobile clocks were always wildly inaccurate.
Nowadays they are generally right on, and it's the PC clocks that are
never right.  :)

--

       "Statistics show that most of the people are in the
         majority, while only a few are in the minority."

 
 
 

AT Real Time Clock -> accuracy of timer?

Post by Alan Barre » Thu, 12 Aug 1993 22:59:43


[Followup out of comp.realtime.  I suggest comp.os.msdos.misc.]



> I have a question about the accuracy of the timer-chip
> frequency and the DOS Real Time Clock derived from it.

On typical MS-DOS systems, the DOS clock is set from the battery-backed
clock chip as part of the boot sequence, but is subsequently allowed to
drift independently of the clock chip.  As you noticed, disk I/O and other
activities can cause lost timer interrupts, which cause the DOS clock to
lose time.

I wrote an installable device driver, which I call CLOCKDEV, to keep
the DOS time synchronised to the chip time.  Actually, on typical
AT-class systems where the clock chip has a 1-second resolution and the
normal DOS time has a 55-millisecond resolution, CLOCKDEV lets the DOS
time free run as usual if application software checks the time
frequently, but it re-synchronises from the clock chip if more than 5
minutes go by without a time request.  This is intended to maintain the
normal 55ms resolution for events having a short duration, as well as
taking advantage of the longer term accuracy of the clock chip.
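
The actual CLOCKDEV code is assembly, but the re-sync step can be sketched
in C. This is only an illustration, not the real driver: read the clock
chip via INT 1Ah/02h, convert from BCD, and set the DOS time via
INT 21h/2Dh (Borland-style int86() assumed).

/* Illustration only -- not the actual CLOCKDEV source. */
#include <dos.h>

#define BCD2BIN(x) (((x) >> 4) * 10 + ((x) & 0x0F))

/* Re-read the CMOS clock chip and push its time into the DOS clock. */
int resync_dos_clock_from_rtc(void)
{
    union REGS r;
    unsigned char hour, min, sec;

    r.h.ah = 0x02;               /* INT 1Ah/02h: read real-time clock   */
    int86(0x1A, &r, &r);
    if (r.x.cflag)
        return -1;               /* clock chip not running              */

    hour = BCD2BIN(r.h.ch);
    min  = BCD2BIN(r.h.cl);
    sec  = BCD2BIN(r.h.dh);

    r.h.ah = 0x2D;               /* INT 21h/2Dh: set DOS system time    */
    r.h.ch = hour;
    r.h.cl = min;
    r.h.dh = sec;
    r.h.dl = 0;                  /* hundredths: RTC has 1 s resolution  */
    int86(0x21, &r, &r);
    return (r.h.al == 0) ? 0 : -1;
}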

Clockdev is available from
ftp://ftp.ee.und.ac.za/pub/msdos/clkdev14.zip, and from Simtel mirrors,
such as ftp://wuarchive.wustl.edu/mirrors/msdos/sysutl/clkdev14.zip.
You get full source code, in assembly language.

--apb
Alan Barrett, Dept. of Electronic Eng., Univ. of Natal, Durban, South Africa

 
 
 

AT Real Time Clock -> accuracy of timer?

Post by Alan Barre » Thu, 12 Aug 1993 23:10:40




> just FYI, you can get standard time by
> telnet india.colorado.edu 13

TCP port 13 is used for the daytime protocol defined in RFC 867.  Very
many systems (most systems?) on the Internet support tcp/daytime, but
it is not intended to be especially useful for machine use.
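
If you do want to fetch it programmatically anyway, a tcp/daytime query is
just "connect to port 13 and read one line". A minimal sketch, assuming a
BSD-style sockets environment (so a Unix host rather than plain DOS):

/* Minimal tcp/daytime (RFC 867) client sketch -- BSD-style sockets assumed. */
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <netdb.h>
#include <sys/socket.h>
#include <netinet/in.h>

int main(void)
{
    struct hostent *he = gethostbyname("india.colorado.edu");
    struct sockaddr_in sa;
    char buf[128];
    int s, n;

    if (!he) { fprintf(stderr, "cannot resolve host\n"); return 1; }

    s = socket(AF_INET, SOCK_STREAM, 0);
    if (s < 0) { perror("socket"); return 1; }

    memset(&sa, 0, sizeof sa);
    sa.sin_family = AF_INET;
    sa.sin_port   = htons(13);                    /* daytime service */
    memcpy(&sa.sin_addr, he->h_addr, he->h_length);

    if (connect(s, (struct sockaddr *)&sa, sizeof sa) < 0) {
        perror("connect");
        return 1;
    }
    /* The server just sends a human-readable time string and closes. */
    while ((n = read(s, buf, sizeof buf - 1)) > 0) {
        buf[n] = '\0';
        fputs(buf, stdout);
    }
    close(s);
    return 0;
}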

--apb (Alan Barrett)

 
 
 

AT Real Time Clock -> accuracy of timer?

Post by m hua » Sat, 14 Aug 1993 08:40:58




: > just FYI, you can get standard time by
: > telnet india.colorado.edu 13

: TCP port 13 is used for the daytime protocol defined in RFC 867.  Very
: many systems (most systems?) on the Internet support tcp/daytime, but
: it is not indended to be especially useful for machine use.

: --apb (Alan Barrett)

Right, and due to uncertain network transmission delay it can't guarantee
very accurate time ticks. But as far as I know, the site (india.colorado.edu)
is backed by the reference clock of the National Bureau of Standards in
Boulder, Colorado, so it is a very handy "ultimate standard clock" for
synchronizing the timing of simple systems. I don't use it here, though;
I have a GPS receiver hooked up to my machine. :-)

--mh

 
 
 

AT Real Time Clock -> accuracy of timer?

Post by Dave Ferovi » Thu, 12 Aug 1993 02:47:17



Quote:>I have a question about the accuracy of the timer-chip
>frequency and the DOS Real Time Clock derived from it.

>I noticed that the times reported on different computers by
>DOS (using gettime() or TIME) drift apart significantly over the course
>of a day. I do the following:
>1) Start two computer platforms of the same brand (e.g.
>   Dell 486DX2-66 tower)
>2) Enter 'time<cr>' on both machines using DOS 5.00
>3) Enter the same time on both machines, but don't press
>   return yet.
>4) Press 'return' on both machines at the same time
>5) Start a program that continuously writes the date and time to
>   the screen (both machines)
>6) Monitor the output on both screens during the day
>7) Observe that by the end of the day one PC appears 'slower'
>   than the other at the end of the timing interval.

>I have done this experiment on a few types of Compaqs and on
>our newest (Dell) machines. On the Compaqs I also noticed that
>the difference grows faster if one PC is using disk I/O
>extensively. The biggest difference I measured was several
>(about six!) seconds in about half a day!

>I wonder whether this is the result of interrupts that are somehow
>lost by the RTC interrupt handler, or whether the frequency at which
>the RTC runs differs due to a slightly different crystal.

This is because DOS and the BIOS on each system disable interrupts at
certain points.  What happens is that when you call an interrupt, the
handler can suspend interrupts while it does its work.  This is commonly
done during I/O operations (including the screen).  If you happen to be
at a point where interrupts are disabled when the clock generator sends
an interrupt to the processor, you will miss it and will lose one of the
18.2 Hz updates to the clock.  Since the PC has no hardware to detect how
many interrupts you have missed, it cannot correct the clock, and you
rapidly lose time.  You can fix this on ATs by updating the time from the
real-time clock every so often, instead of relying on the in-memory count.

--
Dave Ferovick                    | Why run one operating system    

(author of ZipUnDel)             | loyal DOS, OS/2, AIX supporter
==Try Wixer--A Multi-line Unix with full Internet Access (512)459-4391==

 
 
 

AT Real Time Clock -> accuracy of timer?

Post by thomas.j.rober » Sat, 14 Aug 1993 22:58:09



Quote:>[....]
> This is because DOS and the BIOS on each system disable interrupts at
> certain points.  What happens is that when you call an interrupt, the
> handler can suspend interrupts while it does its work.  This is commonly
> done during I/O operations (including the screen).  If you happen to be
> at a point where interrupts are disabled when the clock generator sends
> an interrupt to the processor, you will miss it and will lose one of the
> 18.2 Hz updates to the clock.  [...]

Not quite true. The offending interrupt handler has to run so long that TWO
OR MORE 18.2 Hz timer interrupts are delayed. Any interrupt which arrives
during some other interrupt will be serviced when the initial interrupt
handler returns. If two or more arrive, only one is serviced.

        [The PC/AT changed from level-triggered interrupts to
         edge-triggered interrupts. This is a subtle change
         which makes catching such nested timer interrupts more robust.
         It also makes it possible to "wire-OR" interrupts among
         cooperating boards.]

Some PCs use the real-time clock/calendar chip to determine the date
and time; these don't have this problem. Some (most) read it only
during bootstrap, and rely on a counter incremented by the 18.2Hz
timer interrupt for the time of day. Many of these latter PCs won't
increment the date at Midnight (:-(. Some seem to use the calendar chip
for the date, but the timer/counter for the time.


 
 
 

AT Real Time Clock -> accuracy of timer?

Post by Hamish Glen Colem » Sun, 15 Aug 1993 01:49:13






>: > just FYI, you can get standard time by
>: > telnet india.colorado.edu 13
>: it is not intended to be especially useful for machine use.
>Right, and due to uncertain network transmission delay it can't guarantee
>very accurate time ticks. But as far as I know, the site (india.colorado.edu)
>is backed by the reference clock of the National Bureau of Standards in
>Boulder, Colorado, so it is a very handy "ultimate standard clock" for
>synchronizing the timing of simple systems. I don't use it here, though;
>I have a GPS receiver hooked up to my machine. :-)

Ah, I see... You want to know if anyone moves your computer?

(Or perhaps you have a laptop with a cellular modem, and it calls
you and tells you where it is, just in case it gets stolen?)

:-)

Hamish

 
 
 

AT Real Time Clock -> accuracy of timer?

Post by Ralf Brow » Mon, 16 Aug 1993 09:14:18



}at a point where interrupts are disabled when the clock generator sends
}an interrupt to the processor, you will miss it and will lose one of the
}18.2 Hz updates to the clock.  Since

No you won't, unless the interrupts are disabled for more than 55
milliseconds (VERY bad practice, since at those time scales you start
losing other things such as rapid sequences of keypresses).  The
interrupt controller remembers that there is a pending interrupt on the
line until all higher-priority interrupts have been processed.  However,
it doesn't have a counter, so if the same interrupt is triggered two or
more times before being processed, all but one of the triggerings is
lost.

--

Disclaimer?    |   Gilb's First Law of Computer Unreliability: Computers
What's that?   |   are unreliable, but humans are even more unreliable.

 
 
 

AT Real Time Clock -> accuracy of timer?

Post by Tom Grie » Tue, 17 Aug 1993 13:24:30




>}at a point where interrupts are disabled when the clock generator sends
>}an interrupt to the processor, you will miss it and will lose one of the
>}18.2 Hz updates to the clock.  Since

>No you won't, unless the interrupts are disabled for more than 55
>milliseconds (VERY bad practice, since at those time scales you start
>losing other things such as rapid sequences of keypresses).  The
>interrupt controller remembers that there is a pending interrupt on the
>line until all higher-priority interrupts have been processed.  However,
>it doesn't have a counter, so if the same interrupt is triggered two or
>more times before being processed, all but one of the triggerings is
>lost.

Even in edge-triggered mode, I believe the 8259 (interrupt controller)
requires that the interrupt remain asserted until it is acknowledged
by the processor.  My experience is that the 8254 timer chip is programmed
in a square-wave mode, with the output active for 55ms/2 and inactive
for 55ms/2.  (I believe this is because the pulsed mode has active
periods that are far too short.)  This means you could lose a timer
interrupt if interrupts are deferred for more than about 27.5 ms.  This is
still VERY UNLIKELY and indicates that you're running poor software.
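
For what it's worth, reprogramming channel 0 of the 8254 yourself (which
the original poster asked about) takes only a few port writes. A rough
sketch, assuming a Borland-style compiler with outportb(); a divisor of 0
counts as 65536 and gives the usual 1193182 / 65536 = 18.2 Hz tick:

/* Sketch: program 8253/8254 channel 0 for square-wave mode (mode 3). */
#include <dos.h>

void set_timer0_square_wave(unsigned divisor)
{
    disable();                                    /* no interrupts while reprogramming */
    outportb(0x43, 0x36);                         /* channel 0, lobyte/hibyte, mode 3  */
    outportb(0x40, (unsigned char)(divisor & 0xFF));   /* low byte of divisor          */
    outportb(0x40, (unsigned char)(divisor >> 8));     /* high byte of divisor         */
    enable();
}

/* set_timer0_square_wave(0);  restores the standard ~55 ms / 18.2 Hz tick */

Note that changing the divisor changes how often INT 8 fires, so anything
that speeds up the tick must also scale the count it keeps, or the DOS
clock will run fast.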

To keep our real-time network of distributed PCs synchronized to
within 100 microseconds, we send messages back and forth.  One of
the processors acts as the time standard and the others follow.
During testing, we noticed that some of the processor clocks were
not very accurate... easily enough to account for the 12-seconds-a-day
loss noted earlier.

-Tom

 
 
 

AT Real Time Clock -> accuracy of timer?

Post by Robert Kenne » Tue, 17 Aug 1993 14:42:53


There is a program called (I think; I haven't used it lately) WTime.  It
automatically uses your modem to dial up a government atomic clock and
set your internal clock.  From the west coast the cost of the call is
something like 42 cents... cheap...
I think that PC Magazine (or a similar publication) put out this piece of
software...

--

###########################################################################
        Seattle, Wa. USA                #       Smells like Bourbon
################################################################################

 
 
 

AT Real Time Clock -> accuracy of timer?

:I have a question about the accuracy of the timer-chip
:frequency and the DOS Real Time Clock derived from it.

:I noticed that the times reported on different computers by
:DOS (using gettime() or TIME) drift apart significantly over the course
:of a day. I do the following:
:1) Start two computer platforms of the same brand (e.g.
:   Dell 486DX2-66 tower)
:2) Enter 'time<cr>' on both machines using DOS 5.00
:3) Enter the same time on both machines, but don't press
:   return yet.
:4) Press 'return' on both machines at the same time
:5) Start a program that continuously writes the date and time to
:   the screen (both machines)
:6) Monitor the output on both screens during the day
:7) Observe that by the end of the day one PC appears 'slower'
:   than the other at the end of the timing interval.

:I have done this experiment on a few types of Compaqs and on
:our newest (Dell) machines. On the Compaqs I also noticed that
:the difference grows faster if one PC is using disk I/O
:extensively. The biggest difference I measured was several
:(about six!) seconds in about half a day!

:I wonder whether this is the result of interrupts that are somehow
:lost by the RTC interrupt handler, or whether the frequency at which
:the RTC runs differs due to a slightly different crystal.

:I have eliminated the temperature effect, because both machines have
:been sitting next to each other for about three weeks. The machines are
:stand-alone and configured in the same way (same AUTOEXEC.BAT and CONFIG.SYS).

:Even if one machine has several TSRs running and the other none,
:it seems to me that this should not influence the long-term accuracy of
:the clock, because a TSR should never cause clock interrupts to be lost!

:Does anybody have any comments about this? Has anyone ever calibrated
:their timing activities using a PC? Is there a difference in (long-term)
:accuracy if I use the DOS clock (gettime() and such) or if
:I reprogram the timer chip myself?

:I am very curious what you guys have for explanations...

When a PC with a real-time clock boots up, DOS reads the real-time clock
value and uses it to set the time-of-day count stored in the BIOS data area
at 40:6Ch.  From then on, the INT 8 timer-tick handler increments this
four-byte count.  Typing the DOS commands TIME and DATE lets you change the
time-of-day value.  Any program that suspends interrupts (the CLI assembly
instruction) or masks IRQ 0 (the timer interrupt) by setting the appropriate
bit in the PIC's IMR (interrupt mask register) will prevent the time-of-day
value from being updated.  I suspect this is most noticeable during disk
operations, because data needs to be written without being interrupted.  If
your program needs the correct time, get it from the real-time clock via
INT 1Ah/02h, or INT 1Ah/04h for the date.
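
To see the drift for yourself, you can peek at the in-memory count directly
and compare it with what the real-time clock reports. A rough sketch,
assuming a Borland-style compiler with MK_FP() (INT 1Ah/00h returns the same
count through the BIOS, if you prefer a call):

/* Sketch: read the BIOS time-of-day tick count at 0040:006Ch. */
#include <stdio.h>
#include <dos.h>

int main(void)
{
    /* 32-bit count of 18.2065 Hz timer ticks since midnight. */
    unsigned long far *ticks = (unsigned long far *)MK_FP(0x0040, 0x006C);
    unsigned long t    = *ticks;
    unsigned long secs = (unsigned long)(t / 18.2065);

    printf("ticks since midnight: %lu (%02lu:%02lu:%02lu)\n",
           t, secs / 3600, (secs / 60) % 60, secs % 60);
    return 0;
}

Convert the same moment as read via INT 1Ah/02h and the difference between
the two is exactly the accumulated loss from missed timer ticks.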

Hope this helps,

--Eric
