Resetting system clock from driver module

Post by Richard Matley » Tue, 20 Oct 1998 04:00:00



Hi

I wonder if anybody could please help me with the following problem.

I am attempting to write a device driver for a device (a CCD camera)
which connects to the parallel port. I have the code of the original
MS-DOS driver but there is one slight difficulty. It is necessary, I am
told by the manufacturer, to disable interrupts while reading from the
device. This takes around 20 seconds, during which time the processor
isn't getting the timer interrupt signal, leaving the system clock slow
afterwards. I would like to be able to set the camera to take images at
predetermined times, probably every few minutes. Unless I do something
to prevent it, the system clock will soon become many minutes (hours
even, after a long run) out of step with the real time.

What would be the best way to reset the system clock after interrupts
have been re-enabled? I know that hwclock --hctosys will do it from the
command line, but is there some function I can call from within my
driver module?

Thanks in advance

Richard Matley
Condensed Matter and Nonlinear Dynamics
University of Manchester, U.K.


Resetting system clock from driver module

Post by Przemek Klosowski » Tue, 20 Oct 1998 04:00:00



> I am attempting to write a device driver for a device (a CCD camera) which connects
> to the parallel port. I have the code of the original MS-DOS driver but there is one slight
> difficulty. It is necessary, I am told by the manufacturer to disable interrupts while
> reading from the device. This takes around 20 seconds during which time the processor isn't
> getting the timer interrupt signal leaving the system clock slow afterwards. I would like to

Forgive me, but a slow clock would be the least of your problems after
20 seconds with interrupts disabled. For one thing, the memory is
refreshed from a periodic interrupt that runs, I think, every 18 ms.
If you miss too many refresh cycles, your RAM will start losing its
contents.

I think that you will have to read the image in chunks lasting a few
milliseconds, and only disable interrupts for those small chunks.

Please also reveal the manufacturer, so that we can stay away from them.
They must be incompetent if they really recommend disabling interrupts
for 20 seconds.

--

                        NIST Center for Neutron Research (bldg. 235), E111
                        National Institute of Standards and Technology
                        Gaithersburg, MD 20899,      USA
                        .. and for spam extractors, FCC Commissioners' email is:

Resetting system clock from driver module

Post by Robert Hyatt » Wed, 21 Oct 1998 04:00:00



:> I am attempting to write a device driver for a device (a CCD camera) which connects
:> to the parallel port. I have the code of the original MS-DOS driver but there is one slight
:> difficulty. It is necessary, I am told by the manufacturer to disable interrupts while
:> reading from the device. This takes around 20 seconds during which time the processor isn't
:> getting the timer interrupt signal leaving the system clock slow afterwards. I would like to

: Forgive me, but slow clock would be the least of your problems after
: 20 seconds with interrupts disabled. For one thing, the memory is
: refreshed from a periodic interrupt that runs I think every 18 ms.
: If you miss too many refresh cycles, your RAM will start losing its
: contents.

RAM refresh doesn't involve the CPU or interrupts.  That's done in between
cycles by the memory controller itself.  You can halt the CPU completely
with interrupts disabled and memory won't vanish...

: I think that you will have to read the image in chunks lasting a few
: milliseconds, and only disable interrupts for those small chunks.

: Please also reveal the manufacturer, so that we can stay away from them.
: They must be incompetent if they really recommend disabling interrupts
: for 20 seconds.

It seems like a silly requirement.  #1, why would it take 20 seconds to
read the data?  Parallel ports are pretty quick nowadays.  The resolution of
the CCD device is not that high...  And I don't see how the camera would know
interrupts are disabled anyway...  nor why it would care...

--
Robert Hyatt                    Computer and Information Sciences

(205) 934-2213                  115A Campbell Hall, UAB Station
(205) 934-5473 FAX              Birmingham, AL 35294-1170


Resetting system clock from driver module

Post by Richard Matley » Wed, 21 Oct 1998 04:00:00



> It seems like a silly requirement.  #1, why would it take 20 seconds to
> read the data?  Parallel ports are pretty quick nowadays.  The resolution of
> the CCD device is not that high...  And I don't see how the camera would know
> interrupts are disabled anyway...  nor why it would care...

Thanks for your comments.

I must confess that I don't know much about the internal
workings of the CCD electronics except that they rely on the
computer to time the process of reading out from the CCD.

I have spoken to the man who wrote the original DOS driver
and he has stressed that interrupts really must be disabled
to ensure that the CPU isn't distracted from the task even
briefly. I suppose that I could try doing it with interrupts
on but I don't hold out much hope.

As for the twenty seconds, this seems to be a limitation due
to the electronics driving the CCD. The spec sheet for the
actual sensor suggests it could be read out in about five
seconds; however, I have timed the process running under
DOS and it is definitely slower.

So it seems I really am stuck with my original problem of
trying to reset the clock.

--
Richard Matley
Condensed Matter and Nonlinear Dynamics Group
Dept. of Physics and Astronomy
University of Manchester


Resetting system clock from driver module

Post by Paul Flinder » Wed, 21 Oct 1998 04:00:00




> > It seems like a silly requirement.  #1, why would it take 20 seconds to
> > read the data?  Parallel ports are pretty quick nowadays.  The resolution of
> > the CCD device is not that high...  And I don't see how the camera would know
> > interrupts are disabled anyway...  nor why it would care...

> Thanks for your comments.
> I must confess that I don't know much about the internal
> workings of the CCD electronics

That will make writing a device driver hard. You _do_ have documentation on
the interface?

> except that they rely on the computer to time the process of reading
> out from the CCD.

What do you mean by "rely on the computer to time the process"?

> I have spoken to the man who wrote the original DOS driver
> and he has stressed that interrupts really must be disabled
> to ensure that the CPU isn't distracted from the task even
> briefly. I suppose that I could try doing it with interrupts
> on but I don't hold out much hope.

Just because some DOS driver written some time ago for a 4.77 MHz 8088
needed interrupts off doesn't mean you will.

> As for the twenty seconds, this seems to be a limitation due
> to the electronics driving the CCD. The spec. sheet for the
> actual sensor suggests it could be read out in about five
> seconds, however I have timed the process running under
> DOS and it is definitely slower.

How, exactly, do you get data from the device? Until the comment above I
assumed that you probably said "give me your data" and it delivered it back
at a fixed rate, so if you missed reading a byte you lost that byte - but
being able to slow down the read process by a factor of four doesn't seem
to fit. Is there some handshaking or acknowledgement that you've read a
byte?

Although I'm guessing, I suspect that the basic problem with your approach
is that you're trying to poll for the data - you probably want to use an
interrupt to receive each byte. In fact you may be able to do the whole
thing in a user process using the existing parallel port driver (as long as
it's in interrupt mode).

If you mail me (or post) some details about the interface I'll try to make
some suggestions.


Resetting system clock from driver module

Post by Julie Haugh » Wed, 21 Oct 1998 04:00:00



> Hi

> I wonder could anybody please help me with the following problem.

> I am attempting to write a device driver for a device (a CCD camera)
> which connects to the parallel port. I have the code of the original
> MS-DOS driver but there is one slight difficulty. It is necessary, I
> am told by the manufacturer, to disable interrupts while reading from
> the device.

There's no need to disable interrupts if your driver is able to
read data from the parallel port faster than the camera is
providing it -- and the overhead from the rest of the system is
low enough that it doesn't prevent your interrupts from being
serviced in the appropriate time.
--
Julianne Frances Haugh             Life is either a daring adventure
Mail: jfh AT bga.com                   or nothing at all.
                                            -- Helen Keller

Resetting system clock from driver module

Post by Przemek Klosowski » Wed, 21 Oct 1998 04:00:00




> : 20 seconds with interrupts disabled. For one thing, the memory is
> : refreshed from a periodic interrupt that runs I think every 18 ms.
> : If you miss too many refresh cycles, your RAM will start losing its
> : contents.

> RAM refresh doesn't bother the CPU nor interrupts.  That's done in between
> cycles by the memory controller itself.  You can halt the CPU completely
> with interrupts disabled and memory won't vanish...

OK, my age shows.. Here is a comment from kernel/dma.c

 * DMA0 used to be reserved for DRAM refresh, but apparently not any more...

I remember that the original design used one onboard timer channel and
one DMA channel to perform the refresh. I am curious - would anyone
know when this was changed? 386? 486? Pentium?
--

                        NIST Center for Neutron Research (bldg. 235), E111
                        National Institute of Standards and Technology
                        Gaithersburg, MD 20899,      USA
                        .. and for spam extractors, FCC Commissioners' email is:

Resetting system clock from driver module

Post by David Wils » Thu, 22 Oct 1998 04:00:00




>> RAM refresh doesn't bother the CPU nor interrupts.  That's done in between
>> cycles by the memory controller itself.  You can halt the CPU completely
>> with interrupts disabled and memory won't vanish...
>OK, my age shows.. Here is a comment from kernel/dma.c
> * DMA0 used to be reserved for DRAM refresh, but apparently not any more...
>I remember that the original design used one onboard timer channel and
>one DMA channel to perform the refresh. I am curious---would anyone
>know when was this changed? 386? 486? Pentium?

I would assume that it happened when the PC-AT was released. After all, that
is when DACK0/DMA0 appeared on the ISA bus (on pins D8 and D9) and B19 was
renamed REFRESH.
--


Resetting system clock from driver module

Post by Remco Treffkorn » Thu, 22 Oct 1998 04:00:00



> I wonder could anybody please help me with the following problem.

What I know about astro CCD cameras:

  The computer performs the camera's internal timing by asserting
  certain i/o pins on the par port. This has to be very accurate.

I have not seen the DOS driver source, else I could tell you whether
Linux's real-time capabilities are sufficient.

Are you able to let us see the source? Or at least tell us the
manufacturer?

Cheers,
Remco

--
Remco Treffkorn (RT445)
HAM DC2XT




Resetting system clock from driver module

Post by Maciej Golebiewski » Fri, 23 Oct 1998 04:00:00




> >I remember that the original design used one onboard timer channel and
> >one DMA channel to perform the refresh. I am curious---would anyone
> >know when was this changed? 386? 486? Pentium?

> I would assume that it happened when the PC-AT was released. After all,
> that is when DACK0/DMA0 appeared on the ISA bus (on pins D8 and D9) and
> B19 was renamed REFRESH.

I think it must have happened later, because I remember using a small
program on my old AT which would decrease the RAM refresh rate in
order to improve the performance a little bit (and it was working).

Maciej Golebiewski


Resetting system clock from driver module

Post by Paul Flinder » Fri, 23 Oct 1998 04:00:00




> > I wonder could anybody please help me with the following problem.

> What I know about astro CCD cameras:

>   The computer performs the cameras internal timing by asserting
>   certain i/o pins on the par port. This has to be very accurate.

> I have not seen the dos driver source, else I could tell you if
> Linux real time capabilities are sufficient.

> Are you able to let us see the source? Or at least tell us the
> manufacturer?

Ick - no wonder the guy wants to turn interrupts off (Richard, if
you're still listening, I begin to see your problems).

How fast do you need to do things? (i.e. could you hang it off an RTC
interrupt?)


Resetting system clock from driver module

Post by Martin J. Man » Fri, 23 Oct 1998 04:00:00



> What do you mean by "rely on the computer to time the process.."

It's a WinCam.  They traded off the $5 microcontroller for absurdly
tight timing requirements imposed on the host CPU.  This is how you
sell product in the mass PC market.

WinWidgets suck.  :-(

Quote:> Just because some DOS driver written some time ago for a 4.77 Mhz 8088
> needed interrupts off doesn't mean you will.

This quite likely does.  For one thing, it probably uses the host CPU
as the source of its process timing - no interrupts 'cause there's
nothing smart enough in the WinCam to generate an interrupt, let alone
queue up a decent chunk of data.  And CCDs can make pretty strict
demands for _stable_ timing to keep the light response constant from
cell to cell, IIRC.

Resetting system clock from driver module

Post by Paul Flinder » Sat, 24 Oct 1998 04:00:00


(Posted & mailed, in case Richard isn't still reading this thread)



> > What do you mean by "rely on the computer to time the process.."

> It's a WinCam.  They traded off the $5 microcontroller for absurdly
> tight timing requirements imposed on the host CPU.  This is how you
> sell product in the mass PC market.

Close, I think, but Richard said he had an old DOS driver to port.

> WinWidgets suck.  :-(

> > Just because some DOS driver written some time ago for a 4.77 MHz 8088
> > needed interrupts off doesn't mean you will.

> This quite likely does.  For one thing, it probably uses the host CPU
> as the source of its process timing - no interrupts 'cause there's
> nothing smart enough in the WinCam to generate an interrupt, let alone
> queue up a decent chunk of data.  And CCDs can make pretty strict
> demands for _stable_ timing to keep the light response constant from
> cell to cell, IIRC.

Something Remco said (and another look at Richard's sig) made the penny
drop - it's probably one of the ones aimed at astronomers, so I dug around
the net and found some info (http://www.pcug.co.uk/~starlite for example).

You are right - they need the host to carefully time the A/D conversion and
also to drive the pixel-to-pixel clock - they really don't look like the
sort of thing you want to drive from any sort of multitasking system.

As far as I can tell, though, the really tight timing is whilst reading along
a line, so you should only need interrupts off for a line at a time - maybe
(guessing timings - I couldn't find any) 20 millisecs, but that's still a
bit long if you want to make sure you don't lose timer interrupts.

Normally when people ask questions like this it's because they are trying
to do things the "DOS way" in Linux, but I'm beginning to think that there
might not be another way to do it.

So the answer to the original question is probably to look at the source of
the hwclock program (or just run hwclock to reset the kernel clock from the
RTC) after you've done the conversion.


Resetting system clock from driver module

Post by Byron A Jeff » Mon, 02 Nov 1998 04:00:00





-> What do you mean by "rely on the computer to time the process.."
-
-It's a WinCam.  They traded off the $5 microcontroller for absurdly
-tight timing requirements imposed on the host CPU.  This is how you
-sell product in the mass PC market.
-
-WinWidgets suck.  :-(
-

Yes, they do. The bottom line, though, is that such constraints simply will
not work in Linux's multitasking environment. Turning off interrupts means
missing disk accesses and totally locking up the system while a picture is
scanned. It's flat out a bad idea.

I'd advise testing out a do-nothing driver that simply disables interrupts
and sleeps for 20 seconds at each access, and verifying that your system
continues to operate properly. I have a sinking suspicion that your Linux box
will lock up hard because some essential process is locked out (disk would
be my guess, followed closely by the network) while the interrupts are off.

Here are a couple of different ideas that may better suit the task at hand:

1) Add the $5 microcontroller. Use a cheap 8051 or a PIC 16CXX type part
as an interface to the CCD. Then the uC can (easily) maintain the tight
timing requirements, dump the image to a buffer, and then leisurely present
the image to the Linux box using a standard (and interruptible ;-) interface
like the serial port.

2) Dedicate a DOS box to the CCD, then interface the DOS box to the Linux box
via the LAN.

-> Just because some DOS driver written some time ago for a 4.77 MHz 8088
-> needed interrupts off doesn't mean you will.
-
-This quite likely does.  For one thing, it probably uses the host CPU
-as the source of its process timing - no interrupts 'cause there's
-nothing smart enough in the WinCam to generate an interrupt, let alone
-queue up a decent chunk of data.  And CCDs can make pretty strict
-demands for _stable_ timing to keep the light response constant from
-cell to cell, IIRC.

All true. But you've found the one sticking point where Linux boxes simply
cannot compete with DOS. Since DOS was single-tasking, locking it up for 20
seconds had no significant effects; essentially the application had all the
control. As a real OS, the Linux kernel has all the control and simply grants
applications limited rights to resources. But the price of that is that the
system must remain available to all applications.

Bottom line: it's very unlikely to work and you'll have to find another way
to do it.

Go back and ask the engineer if the device has an NT interface and, if so,
how it was done... This will give you more insight into how Linux would
have to do it.

Sorry for the doom and gloom...

BAJ

-

--
Another random extraction from the mental bit stream of...
Byron A. Jeff - PhD student operating in parallel - And Using Linux!

