> The algorithm provides accurate results over a period of ±1.89 Trillion years
i'm placing my bets that in a few thousand years we'll have changed calendar system entirely haha
but, really interesting to see the insane methods used to achieve this
Maybe not in a few thousand years, but given the deceleration of the Earth’s rotation about its axis, mostly due to tidal friction with the Moon, our leap-day count will stop making sense in a couple hundred thousand years. Within a few million years, the day will have lengthened enough that the year comes out close to 365.0 days.
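A back-of-envelope version of that, with the slowdown rate as an explicit assumption (the real rate varies over geological time, so treat the output as order-of-magnitude only):

    # Rough estimate: how long until the year is ~365.0 (longer) days?
    # Assumes a constant slowdown of ~2 ms per century -- an assumed average,
    # not a measured constant.
    SECONDS_PER_YEAR = 365.2422 * 86400    # tropical year, in today's seconds
    SLOWDOWN_PER_CENTURY = 0.002           # seconds added to the day per century (assumption)

    target_day = SECONDS_PER_YEAR / 365.0  # day length at which the year is 365.0 days
    extra_seconds = target_day - 86400     # how much longer the day must get (~57 s)
    centuries = extra_seconds / SLOWDOWN_PER_CENTURY

    print(f"day must lengthen by ~{extra_seconds:.0f} s")
    print(f"that takes ~{centuries * 100 / 1e6:.1f} million years at the assumed rate")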
I therefore agree that a trillion years of accuracy for broken-down date calculation has little practical relevance. The question is whether the calculation could be made even more efficient by reducing to 32 bits, or maybe even just 16 bits.
> The question is whether the calculation could be made even more efficient by reducing to 32 bits, or maybe even just 16 bits.
This is somewhat moot, considering that 64 bits is the native word width of most modern computers and signed 32-bit Unix time will overflow in just 12 years.
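For reference, the overflow date of a signed 32-bit Unix timestamp (the well-known Year 2038 problem), checked with Python:

    from datetime import datetime, timezone

    # Largest value a signed 32-bit time_t can hold, shown as a UTC date.
    overflow = datetime.fromtimestamp(2**31 - 1, tz=timezone.utc)
    print(overflow)  # 2038-01-19 03:14:07+00:00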
I meant reducing the representable year numbers, not the whole timestamp.
Shorter term, the Gregorian calendar’s leap-year ratio is just a tiny bit off, which adds up to a full day of drift after 3,000 years or so.
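The arithmetic behind that, assuming a mean tropical year of about 365.2422 days:

    # Gregorian rule: 97 leap days every 400 years.
    gregorian_year = 365 + 97 / 400   # 365.2425 days on average
    tropical_year = 365.2422          # approximate mean tropical year (assumption)

    drift_per_year = gregorian_year - tropical_year
    print(f"drift: {drift_per_year:.4f} days per year")
    print(f"a full day of drift after ~{1 / drift_per_year:.0f} years")  # ~3300 years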
> i'm placing my bets that in a few thousand years we'll have changed calendar system entirely haha
Given that the Chronostrife will occur in around 40_000 years (give or take 2_000), I somewhat doubt that </humor>
The calendar system already changed once (Julian to Gregorian), so this won’t give correct dates, meaning the dates that were actually in use, earlier than the changeover. Or rather changeovers, since different countries switched at different times.
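For illustration: most date libraries use the proleptic Gregorian calendar, i.e. they extend the current rules backwards and ignore the historical switchover. Python’s datetime, for example:

    from datetime import date, timedelta

    # Proleptic Gregorian: the 1582 switchover simply doesn't exist here.
    d = date(1582, 10, 4) + timedelta(days=1)
    print(d)  # 1582-10-05, but in Rome the day after 1582-10-04 was 1582-10-15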
Wouldn’t it be accurate for that as well? Unless we change to base 10 time units or something. Then we all have a lot of work to do.
But if it’s just about starting over from 0 being the AI apocalypse or something, I’m sure it’ll be more manageable, and the fix could hopefully be done on a cave wall using a flint spear tip.
Or set 0 to be the Big Bang and make the type unsigned. Do it the same time we convert all temperature readings to Kelvin.
And count Planck time instead of seconds. It’s not as impossible as it may sound: you’ll need more than 128 bits but fewer than 256, even if the epoch is the Big Bang (I can’t recall exactly how many bits are needed, but I did the math once, some years ago). And it would be compatible with alien or future time systems too, in case what we call a second (currently defined by caesium-133 periods) ever changes.
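For what it’s worth, the order-of-magnitude check is quick to redo (Planck time and the age of the universe are the only inputs, both approximate):

    import math

    PLANCK_TIME = 5.39e-44                      # seconds, approximate
    AGE_OF_UNIVERSE = 13.8e9 * 365.25 * 86400   # ~13.8 billion years, in seconds

    ticks = AGE_OF_UNIVERSE / PLANCK_TIME       # Planck times since the Big Bang
    print(f"~{ticks:.1e} Planck times")
    print(f"needs ~{math.ceil(math.log2(ticks))} bits")  # just over 200: more than 128, fewer than 256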