Technical: "98"..."99"...."100" instead of "2000"?

Post by TimR » Sun, 04 Oct 1998 04:00:00



Since this year (1998) in effect is known as "98", and next
year is "99", will it help anyone to simply regard the year 2000
as "100", the next number in the sequence?

I'm a (former) programmer, been studying y2k for a while,
this idea just struck me, and I'm surprised I've never seen
it mentioned.

Perhaps there is no value.. but maybe someone out there
will find it useful to be able to expand to only 3 digits rather
than 4? Or create some other work-around?

Other ways to look at it: 1900 has become year "0"
(shouldn't cause any problems). Also consider the day
following 12/31/99 would be 1/1/100.

Perhaps on forms that are stuck with "19__" we can
write in or enter "100". (just ignore the "19")

Ah, it's a fun thing to think about anyway.

Tim Reynolds

 
 
 

Technical: "98"..."99"...."100" instead of "2000"?

Post by Stormhoun » Sun, 04 Oct 1998 04:00:00



> Since this year (1998) in effect is known as "98", and next
> year is "99", will it help anyone to simply regard the year 2000
> as "100", the next number in the sequence?

> I'm a (former) programmer, been studying y2k for a while,
> this idea just struck me, and I'm surprised I've never seen
> it mentioned.

> Perhaps there is no value.. but maybe someone out there
> will find it useful to be able to expand to only 3 digits rather
> than 4? Or create some other work-around?

    In cases where the year is stored as two digits in character format, the
technique of converting them into a three-digit packed field would do what you
suggest, and has been suggested before.  It has the advantage of not requiring
file size enlargements.
    But where you're already dealing with packed dates, you generally have to
expand, and if you're going to expand the field size, why not 4 instead of just
3?
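
For illustration, a minimal C sketch of the space argument (the real
technique uses mainframe packed decimal; the function name and the sign
nibble convention here are just illustrative):

        #include <stdio.h>

        /* A two-character year ("98") occupies two bytes; three BCD
           digits plus a sign nibble also fit in two bytes, so a
           years-since-1900 value of 0..999 can reuse the same space. */
        static void pack_year(int y, unsigned char out[2])
        {
            out[0] = (unsigned char)(((y / 100) << 4) | ((y / 10) % 10));
            out[1] = (unsigned char)(((y % 10) << 4) | 0x0F);
        }

        int main(void)
        {
            unsigned char b[2];
            pack_year(100, b);                 /* the year 2000 as "100" */
            printf("%02X %02X\n", b[0], b[1]); /* prints 10 0F */
            return 0;
        }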

--
Stormhound
DNRC Ombudsman for Induhvidual Affairs, Holder of Past Knowledge
Come visit my web page at http://www.sound.net/~stormhnd

 
 
 

Technical: "98"..."99"...."100" instead of "2000"?

Post by Coup d'eta » Sun, 04 Oct 1998 04:00:00



> Since this year (1998) in effect is known as "98", and next
> year is "99", will it help anyone to simply regard the year 2000
> as "100", the next number in the sequence?

> I'm a (former) programmer, been studying y2k for a while,
> this idea just struck me, and I'm surprised I've never seen
> it mentioned.

> Perhaps there is no value.. but maybe someone out there
> will find it useful to be able to expand to only 3 digits rather
> than 4? Or create some other work-around?

And what do you gain by doing 3 instead of 4 digits?  Or by
windowing?  You still have to comb every line of code looking for
references to the date, you still have to convert the data, and
you still have to time-machine test it ... and so on.
 
 
 

Technical: "98"..."99"...."100" instead of "2000"?

Post by Osmo Ronkan » Sun, 04 Oct 1998 04:00:00




>Since this year (1998) in effect is known as "98", and next
>year is "99", will it help anyone to simply regard the year 2000
>as "100", the next number in the sequence?

>I'm a (former) programmer, been studying y2k for a while,
>this idea just struck me, and I'm surprised I've never seen
>it mentioned.

>Perhaps there is no value.. but maybe someone out there
>will find it useful to be able to expand to only 3 digits rather
>than 4? Or create some other work-around?

I've also thought about that. If the number is stored in binary form, then
there probably is no limit at 100, and one need only change the I/O routines.
The I/O could well use 4 digits, but internally the year would be defined as
years since 1900. I think it would be good if one did not have to change
all the data as well. That would work up to 2100, or to 2079 on systems
that count the date as 16-bit days since 1900 (65535 days runs out in 2079).
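
A minimal C sketch of that idea (names assumed; the point is that only the
output routine changes, not the stored data):

        #include <stdio.h>

        /* Year kept internally as a binary count of years since 1900,
           exactly as before; only the formatting routine is touched. */
        static void print_year(int years_since_1900)
        {
            printf("%d\n", 1900 + years_since_1900); /* 98 -> 1998, 100 -> 2000 */
        }

        int main(void)
        {
            print_year(98);   /* 1998 */
            print_year(100);  /* 2000 */
            return 0;
        }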

Or do all the big computers work in decimal? Somehow, in popular reports
at least, the wrap to 1900 is taken for granted without any thought.

Of course, much of the problem comes from communication between different
systems, and if that is done in ASCII, or even worse in EBCDIC (where the
digits are at the end of the code table), with fixed-width fields, then
there is a problem.

I actually faintly recall writing something that used two-digit years, and
I decided not to check for >99, just to allow some form of compatibility
after 2000.

Osmo

 
 
 

Technical: "98"..."99"...."100" instead of "2000"?

Post by Dr John Stockto » Mon, 05 Oct 1998 04:00:00



At 21:03:42 in comp.software.year-2000, Osmo Ronkanen wrote:


>I've also thought about that. If the number is stored in binary form, then
>there probably is no limit at 100, and one need only change the I/O routines.
>The I/O could well use 4 digits, but internally the year would be defined as
>years since 1900.

If changing the year to binary, one might as well change the origin to
the Year Zero [proleptic astronomical Gregorian, of course] (providing,
except for astronomers, a convenient "impossible" entry, and keeping
the 400-year rule simple).  A 16-bit signed year will amply accommodate
all the dateable past and all the foreseeable future.
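
For illustration, one such layout in C (struct and field names are
hypothetical):

        #include <stdint.h>

        /* Year as a signed 16-bit count from the year 0 (proleptic
           astronomical Gregorian), range -32768..32767. */
        struct cal_date {
            int16_t year;   /* -44 = 45 BC; 0 can double as an "impossible" sentinel */
            uint8_t month;  /* 1..12 */
            uint8_t day;    /* 1..31 */
        };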

--

 Web <URL: http://www.merlyn.demon.co.uk/> -- includes FAQish topics and links:
 Dates - misctime.htm  Year 2000 - date2000.htm  Critical Dates - critdate.htm
 Y2k for beginners - year2000.txt  UK mini-FAQ - y2k-mfaq.txt  Don't Mail News.

 
 
 

Technical: "98"..."99"...."100" instead of "2000"?

Post by ibm7.. » Mon, 05 Oct 1998 04:00:00


This is a long-established feature in C code. Most people now
refer to it as a bug, and call it the tm_year problem.
Please bookmark our web page at http://www.sevenoaks.demon.co.uk/y2k.htm
which is where we will be placing key y2k links as we get to know about
them.
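
For reference, a minimal C demonstration of the tm_year convention from
the standard <time.h> (it counts years since 1900):

        #include <stdio.h>
        #include <time.h>

        int main(void)
        {
            time_t now = time(NULL);
            struct tm *t = localtime(&now);
            /* tm_year is 98 in 1998 and 100 in 2000; code that prints
               "19%d" therefore shows 19100 in the year 2000. */
            printf("tm_year = %d, year = %d\n", t->tm_year, 1900 + t->tm_year);
            return 0;
        }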
 
 
 

Technical: "98"..."99"...."100" instead of "2000"?

Post by Gilbert C Healto » Mon, 05 Oct 1998 04:00:00


: This is a long-established feature in C code. Most people now
: refer to it as a bug, and call it the tm_year problem.
: Please bookmark our web page at http://www.sevenoaks.demon.co.uk/y2k.htm
: which is where we will be placing key y2k links as we get to know about
: them.

And I've recently repaired some C programs that printed dates using

        printf( "%2d/%2d/%2d", ..., Time.tm_year );

changing them to use Time.tm_year % 100, so they keep printing a two-digit
year on the reports (along with adding pivot years, etc., to the thing).
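
A sketch of that kind of repair (the pivot value of 70 below is just an
assumed example, not the one used in those programs):

        #include <stdio.h>
        #include <time.h>

        /* Keep the two-digit report format: 1998 -> 98, 2000 -> 00. */
        static void print_report_date(const struct tm *t)
        {
            printf("%02d/%02d/%02d\n",
                   t->tm_mon + 1, t->tm_mday, t->tm_year % 100);
        }

        /* Window two-digit input years around a pivot: with a pivot of
           70, 00..69 mean 2000..2069 and 70..99 mean 1970..1999. */
        static int window_year(int yy)
        {
            return (yy < 70) ? 2000 + yy : 1900 + yy;
        }

        int main(void)
        {
            printf("%d %d\n", window_year(5), window_year(98)); /* 2005 1998 */
            return 0;
        }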

----------------------------------------------------------------------

        Beware the Calends of January 2000!
----------------------------------------------------------------------

 
 
 

Technical: "98"..."99"...."100" instead of "2000"?

Post by Jay » Tue, 06 Oct 1998 04:00:00



> Since this year (1998) in effect is known as "98", and next
> year is "99", will it help anyone to simply regard the year 2000
> as "100", the next number in the sequence?

> I'm a (former) programmer, been studying y2k for a while,
> this idea just struck me, and I'm surprised I've never seen
> it mentioned.

> Perhaps there is no value.. but maybe someone out there
> will find it useful to be able to expand to only 3 digits rather
> than 4? Or create some other work-around?

> Other ways to look at it: 1900 has become year "0"
> (shouldn't cause any problems). Also consider the day
> following 12/31/99 would be 1/1/100.

> Perhaps on forms that are stuck with "19__" we can
> write in or enter "100". (just ignore the "19")

> Ah, it's a fun thing to think about anyway.

> Tim Reynolds

Then you have to tell the program to "ignore the 19". Usually
they'll stick it on anyway, and you get the very interesting
date 01/01/19100.

Then, when the program takes the string apart again to store it in
the database, it ends up with the first four characters (0101)
and the seventh and eighth characters (10), for a result
of 010110. I recently read an article about a technical
problem with embedded systems that insisted on returning
a date of 010110.
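
A minimal C sketch of how the 19100 date and the 010110 truncation arise
(the buffer layout is assumed for illustration):

        #include <stdio.h>
        #include <string.h>

        int main(void)
        {
            int tm_year = 100;  /* what struct tm reports for the year 2000 */
            char shown[16], stored[7];

            /* Buggy display code glues a literal "19" onto tm_year... */
            sprintf(shown, "%02d%02d19%d", 1, 1, tm_year);
            puts(shown);                      /* prints 010119100 */

            /* ...and fixed-position re-parsing grabs MMDD plus the two
               characters where YY used to sit. */
            memcpy(stored, shown, 4);         /* "0101" */
            memcpy(stored + 4, shown + 6, 2); /* "10" */
            stored[6] = '\0';
            puts(stored);                     /* prints 010110 */
            return 0;
        }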

J.

 
 
 

Technical: "98"..."99"...."100" instead of "2000"?

Post by B'iche » Tue, 06 Oct 1998 04:00:00



>Since this year (1998) in effect is known as "98", and next
>year is "99", will it help anyone to simply regard the year 2000
>as "100", the next number in the sequence?

>Perhaps on forms that are stuck with "19__" we can
>write in or enter "100". (just ignore the "19")

        Actually, my Tandy Color Computer's clock uses byte values to
represent dates, i.e. PRINT ASC(Yr) to get the year, so 98 is 98 in
binary. When the battery in the RTC circuit went dead, it showed dates
like 19147! BE WARY! Some software will not flip to 1900 but to 19100!
Is it just OS-9 Level 2 that has this problem, or does other software
use one byte for the year?

--
                A pearl of wisdom from the y2K newsgroups:
-------------------------------------------------------------------------
Y2K appears to be the Baby Boomers mid-life crisis, and it has the
potential to be a dandy.
                        -- Anonymous --
--------------------------------------------------------------------------

                        B'ichela

 
 
 

Technical: "98"..."99"...."100" instead of "2000"?

Post by Osmo Ronkan » Tue, 06 Oct 1998 04:00:00





>At 21:03:42 in comp.software.year-2000, Osmo Ronkanen wrote:

>>I've also thought about that. If the number is stored in binary form, then
>>there probably is no limit at 100, and one need only change the I/O routines.
>>The I/O could well use 4 digits, but internally the year would be defined as
>>years since 1900.

>If changing the year to binary, one might as well change the origin to
>the Year Zero [proleptic astronomical Gregorian, of course] (providing,
>except for astronomers, a convenient "impossible" entry, and keeping
>the 400-year rule simple).  A 16-bit signed year will amply accommodate
>all the dateable past and all the foreseeable future.

I did not say anything about changing the year to binary.

Osmo

 
 
 

Technical: "98"..."99"...."100" instead of "2000"?

Post by Alistair Man » Tue, 06 Oct 1998 04:00:00







> >At 21:03:42 in comp.software.year-2000, Osmo Ronkanen wrote:

> >>I've also thought about that. If the number is stored in binary form, then
> >>there probably is no limit at 100, and one need only change the I/O routines.
> >>The I/O could well use 4 digits, but internally the year would be defined as
> >>years since 1900.

> >If changing the year to binary, one might as well change the origin to
> >the Year Zero [proleptic astronomical Gregorian, of course] (providing,
> >except for astronomers, a convenient "impossible" entry, and keeping
> >the 400-year rule simple).  A 16-bit signed year will amply accommodate
> >all the dateable past and all the foreseeable future.

16-bit signed years, Osmo? Are you nuts? How then do archeologists
accommodate hundreds of thousands of years ago? -32767? Come on! How then
should Stephen Hawking measure thousands of millions of years in the
future? Think about it :-)

Alistair Mann

 
 
 

Technical: "98"..."99"...."100" instead of "2000"?

Post by Dr John Stockto » Tue, 06 Oct 1998 04:00:00



At 17:36:25 in comp.software.year-2000, Alistair Mann wrote:







>> >At 21:03:42 in comp.software.year-2000, Osmo Ronkanen wrote:

>> >>I've also thought about that. If the number is stored in binary form, then
>> >>there probably is no limit at 100, and one need only change the I/O routines.
>> >>The I/O could well use 4 digits, but internally the year would be defined as
>> >>years since 1900.

>> >If changing the year to binary, one might as well change the origin to
>> >the Year Zero [proleptic astronomical Gregorian, of course] (providing,
>> >except for astronomers, a convenient "impossible" entry, and keeping
>> >the 400-year rule simple).  A 16-bit signed year will amply accommodate
>> >all the dateable past and all the foreseeable future.

>16-bit signed years, Osmo? Are you nuts? How then do archeologists
>accommodate hundreds of thousands of years ago? -32767? Come on! How then
>should Stephen Hawking measure thousands of millions of years in the
>future? Think about it :-)

I think that you are confused about who you are responding to.

16 bit signed years are adequate for all present purposes involving
"calendar" dates, with month and day.  I don't know what the earliest
historically-reported event that can be reliably, fully dated on the
present calendar is, but it will surely be much later than 10000 BC.

For SH-type astronomy, a "single" (more than 6 digits of accuracy, a range
of order 10^38) will do.

My programs/longcalc (.pas, .exe) will, however, accommodate up to
+- 10E1000 seconds, or more if you care to.

--

  Web <URL: http://www.merlyn.demon.co.uk/> - FAQqish topics, acronyms & links.
  PAS, EXE in <URL: http://www.merlyn.demon.co.uk/programs/> - see 00index.txt.
  Do not Mail News to me.    Before a reply, quote with ">" or "> " (SoRFC1036)

 
 
 

Technical: "98"..."99"...."100" instead of "2000"?

Post by Alistair Man » Thu, 08 Oct 1998 04:00:00


<snip>

>I think that you are confused about who you are responding to.

You're right. Sorry Osmo, you're blameless; it was John here who gave us
that 16 bits is adequate for present purposes.

>16 bit signed years are adequate for all present purposes involving
>"calendar" dates, with month and day.  I don't know what the earliest
>historically-reported event that can be reliably, fully dated on the
>present calendar is, but it will surely be much later than 10000 BC.

You're wrong, John; 16 bits is not enough, and your "present purposes"
argument is identical to the reasoning that produced Y2K, when 2 characters
were adequate for "present purposes" decades ago. As you may have read in a
previous post likening the world's current relationship with dates and time
to England's relationship with it in pre-railway times, we need to unify how
dates and times are represented in computers, and we need to unify date/time
in the macro fields (e.g., your astronomy) with the micro fields (e.g., your
physics).

>My programs/longcalc (.pas, .exe) will, however, accommodate up to
>+- 10E1000 seconds, or more if you care to.

I don't doubt it. I argue that if the basic architecture of date/time were
right, we wouldn't need it.

 
 
 

Technical: "98"..."99"...."100" instead of "2000"?

Post by Osmo Ronkan » Thu, 08 Oct 1998 04:00:00





><snip>

>>I think that you are confused about who you are responding to.

>You're right. Sorry Osmo, you're blameless; it was John here who gave us
>that 16 bits is adequate for present purposes.

>>16 bit signed years are adequate for all present purposes involving
>>"calendar" dates, with month and day.  I don't know what the earliest
>>historically-reported event that can be reliably, fully dated on the
>>present calendar is, but it will surely be much later than 10000 BC.

>You're wrong, John; 16 bits is not enough, and your "present purposes"
>argument is identical to the reasoning that produced Y2K, when 2 characters
>were adequate for "present purposes" decades ago.

You've got to be kidding. A decade ago one could clearly foresee that the
millennium change would soon arrive. To suggest that one has to prepare
30000 years in advance, however, is ludicrous. The most advanced tool 30000
years ago was a stone axe. Do we use them now? (Especially ones made 30000
years ago.)

Osmo