17:36:25 in comp.software.year-2000, Alistair Mann
>> >21:03:42 in comp.software.year-2000, Osmo Ronkanen
>> >>I've also thought that. If the number is stored in binary form then
>> >>there is probably no limit at 100, and one need only change the IO
>> >>routines. IO could well use 4 digits, but internally the year would
>> >>be defined as years since 1900.
>> >If changing the year to binary, one might as well change the origin to
>> >the Year Zero [proleptic astronomical gregorian, of course] (providing,
>> >except for astronomers, a convenient "impossible" entry; and keeping
>> >the 400-year rule simple). A 16-bit signed year will amply accommodate
>> >all the dateable past and all the foreseeable future.
>16 bit signed years, Osmo? Are you nuts? How then do archeologists
>accommodate hundreds of thousands of years ago? -32767? Come on! How then
>should Stephen Hawking measure thousands of millions of years in the
>future? Think about it :-)
I think that you are confused about who you are responding to.
16 bit signed years are adequate for all present purposes involving
"calendar" dates, with month and day. I don't know what the earliest
historically-reported event that can be reliably, fully dated on the
present calendar is, but it will surely be much later than 10000 BC.
For SH-type astronomy, a "single" (> 6 digit accuracy, 10E38 range) will
suffice.
My programs/longcalc.(pas.exe) will, however, accommodate up to +-
10E1000 seconds, or more if you care to.
Web <URL: http://www.merlyn.demon.co.uk/> - FAQqish topics, acronyms & links.
PAS, EXE in <URL: http://www.merlyn.demon.co.uk/programs/> - see 00index.txt.
Do not Mail News to me. Before a reply, quote with ">" or "> " (SoRFC1036)