Maybe not in a few thousand years, but given the deceleration of the Earth’s rotation around its axis, mostly due to tidal friction with the moon, in a couple hundred thousand years our leap-day count will stop making sense. In roughly a million years, day length will have increased such that the year length will be close to 365.0 days.
I therefore agree that a trillion years of accuracy for broken-down date calculation has little practical relevance. The question is whether the calculation could be made even more efficient by reducing to 32 bits, or maybe even just 16 bits.
> The question is whether the calculation could be made even more efficient by reducing to 32 bits, or maybe even just 16 bits.
This is somewhat moot considering that 64 bits is the native word width of most modern computers, and Unix time will exceed the range of a signed 32-bit integer in just 12 years.
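For reference, a minimal C sketch (assuming a 64-bit time_t on the host) of where that signed 32-bit limit falls:

    /* The largest value a signed 32-bit time_t can hold maps to early 2038. */
    #include <stdio.h>
    #include <stdint.h>
    #include <time.h>

    int main(void) {
        time_t last = (time_t)INT32_MAX;   /* 2147483647 seconds after the epoch */
        char buf[64];
        strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S UTC", gmtime(&last));
        printf("Signed 32-bit time_t rolls over after %s\n", buf);
        return 0;
    }

This prints 2038-01-19 03:14:07 UTC, the usual "Y2038" cutoff.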
I meant reducing the representable year numbers, not the whole timestamp.
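A minimal sketch of that idea (the struct and field widths here are illustrative, not anything proposed upthread): a signed 16-bit year plus month and day fits a broken-down date in 32 bits while still covering a span of about 65,000 years.

    /* Hypothetical broken-down date packed into 32 bits by limiting the year range. */
    #include <stdint.h>
    #include <stdio.h>

    typedef struct {
        int16_t year;   /* -32768..32767, far more than the calendar needs */
        uint8_t month;  /* 1..12 */
        uint8_t day;    /* 1..31 */
    } packed_date;      /* 4 bytes total */

    int main(void) {
        packed_date d = { .year = 2024, .month = 2, .day = 29 };
        printf("%zu bytes -> %04d-%02d-%02d\n", sizeof d, d.year, d.month, d.day);
        return 0;
    }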
In the shorter term, the Gregorian calendar's leap-year ratio is a tiny bit off, which accumulates to about a day of error after 3,000 years or so.
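Back-of-the-envelope, assuming the usual value of roughly 365.2422 days for the mean tropical year:

    /* The Gregorian rule (leap every 4 years, except centuries, except every 400)
     * gives an average year of 365 + 1/4 - 1/100 + 1/400 = 365.2425 days. */
    #include <stdio.h>

    int main(void) {
        double gregorian = 365.0 + 1.0/4 - 1.0/100 + 1.0/400; /* 365.2425 */
        double tropical  = 365.2422;                          /* approximate */
        printf("~1 day of drift every %.0f years\n", 1.0 / (gregorian - tropical));
        return 0;
    }

That works out to roughly one day of drift per 3,000-odd years, consistent with the estimate above.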