[0]: https://github.com/kccqzy/smartcal/blob/9cfddf7e85c2c65aa6de...
edit: I just love that there are like 5 different comments pointing out this same thing
this is when time-travelling fugitives hide out
Everything now makes sense; I always wondered why September was the ninth month despite having a "seven" prefix.
146097 days = one full 400-year Gregorian cycle (400 × 365 + 97 leap days)
36524 days = one 100-year portion of that cycle (100 × 365 + 24 leap days)
1461 days = one 4-year cycle (4 × 365 + 1 leap day); a sketch of how a day count decomposes into these cycles follows below
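A minimal sketch of that decomposition, assuming the day count is non-negative and measured from 2000-03-01 (a March-based year keeps each leap day at the very end of its cycle); this is the textbook version, not the article's optimized routine:

    #include <cstdint>

    // Textbook decomposition of a day count into Gregorian cycles.
    // `days` is assumed >= 0 and counted from 2000-03-01.
    void year_from_days(std::int64_t days, std::int64_t& year, int& day_of_march_year) {
        std::int64_t n400 = days / 146097;   // whole 400-year cycles (146097 days each)
        std::int64_t r    = days % 146097;   // 0 .. 146096

        std::int64_t n100 = r / 36524;       // whole 100-year cycles (36524 days each)
        if (n100 == 4) n100 = 3;             // day 146096 is Feb 29 of the 400th year
        r -= n100 * 36524;

        std::int64_t n4 = r / 1461;          // whole 4-year cycles (1461 days each)
        r -= n4 * 1461;

        std::int64_t n1 = r / 365;           // whole years within the 4-year cycle
        if (n1 == 4) n1 = 3;                 // the cycle's last day is Feb 29
        r -= n1 * 365;

        year = 2000 + 400 * n400 + 100 * n100 + 4 * n4 + n1;  // March-based year
        day_of_march_year = static_cast<int>(r);               // 0 == March 1
    }

The two clamps cover the single day per cycle (a trailing Feb 29) that would otherwise spill into a nonexistent fifth sub-cycle.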
In case someone was wondering why in the world someone said we should add a day to the second month of the year...
The Roman calendar moved to January as the first month of the year in 153 BC, over a hundred years before the leap day was added. The 10-month calendar may not have even existed: we see no contemporary evidence of it, only reports written centuries later, and the change away from it is attributed to a mythical figure.
Caesar happened to be the Pontifex Maximus (an office held for life once elected), but he wasn't in Rome much to do that job. So after hanging out with Cleopatra in Egypt he came back and set the calendar on auto-pilot.
It's fair to say January was the first month of the Roman calendar, even though the first month had formerly been March.
> How would that have worked anyway, did they have more days per month?
The way I've heard it, they simply didn't track the date during the winter.
It achieves a 30–40% speed improvement on x86-64 and ARM64 (Apple M4 Pro) by reversing the direction of the year count and reducing the operation count (4 multiplications instead of the usual 7+).
Paper-style explanation, benchmarks on multiple architectures, and full open-source C++ implementation.
I was a bit confused initially about what your algorithm actually did, until I got to the pseudo-code. Ideally there would be a high-level description of what the algorithm is supposed to do before that.
Something as simple as: “a date algorithm converts a number of days elapsed since the UNIX epoch (1970-01-01) to a Gregorian calendar date consisting of day, month, and year” would help readers understand what they're about to read.
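To make that contract concrete: C++20's <chrono> already performs the same conversion out of the box, so a tiny reference example (standard library only, nothing to do with the article's optimized routine) looks like this:

    #include <chrono>
    #include <cstdio>

    // Days-since-epoch to calendar date via the standard library (compile with -std=c++20).
    int main() {
        using namespace std::chrono;
        year_month_day ymd{sys_days{days{20000}}};   // 20000 days after 1970-01-01
        std::printf("%d-%02u-%02u\n", int(ymd.year()),
                    unsigned(ymd.month()), unsigned(ymd.day()));  // prints 2024-10-04
    }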
> Years are calculated backwards
How did that insight come about?
So that a day number can be directly mapped to year, month, and day, and the calendar date can be mapped back with a year-month LUT.
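In case the LUT part is unfamiliar, here is a rough sketch of the general idea, assuming a plain non-leap-year table (the article's actual table and indexing will differ):

    #include <cstdint>

    // Day-of-year -> (month, day) via a cumulative-month-length lookup table.
    // Assumes a non-leap year for simplicity; real code handles leap years,
    // often by treating March as the first month of the computational year.
    constexpr std::uint16_t days_before_month[12] = {
        0, 31, 59, 90, 120, 151, 181, 212, 243, 273, 304, 334};

    // day_of_year is 0-based (0 == January 1).
    void month_day_from_yday(unsigned day_of_year, unsigned& month, unsigned& day) {
        unsigned m = 11;
        while (days_before_month[m] > day_of_year) --m;  // a real routine would replace
        month = m + 1;                                   // this scan with a division
        day = day_of_year - days_before_month[m] + 1;    // or a wider direct-index LUT
    }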
const C1 = 505054698555331 // floor(2^64*4/146097)
I'd write that as

    constexpr int C1 = floor(2^64*4/146097);

Neat code though!
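On GCC or Clang that constant can in fact be computed at compile time; a sketch using the non-standard unsigned __int128 to avoid the 64-bit overflow:

    #include <cstdint>

    // Compute C1 = floor(2^64 * 4 / 146097) at compile time instead of hard-coding it.
    // unsigned __int128 is a GCC/Clang extension, not standard C++.
    constexpr std::uint64_t C1 = static_cast<std::uint64_t>(
        ((static_cast<unsigned __int128>(1) << 64) * 4) / 146097);

    static_assert(C1 == 505054698555331ULL, "matches the hard-coded constant");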
A few notes for those not familiar with Lisp:
1. Common Lisp defines a time representation called "universal time" that is similar to Unix time, just with a different epoch (1900-01-01 instead of 1970-01-01).
2. A "fixnum" is a signed integer that is slightly (1-3 bits) smaller than the machine word size (32 bits at the time the article was written); the missing bits are used for run-time type tagging. Erik's math assumes 31 bits for a fixnum (2.9M years is approximately 2^30 days, and fixnums are signed); there's a quick check of that figure after these notes.
3. Anywhere he talks about "vectors of type (UNSIGNED-BYTE X)", this means a vector of X-bit unsigned values. Most Lisp implementations will allow vectors of unboxed integers for reasonable values of X (e.g. 1, 8, 16, 32, 64), and some will pack bits for arbitrary values of X, doing the shifting/masking for you.
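A quick back-of-the-envelope check of that 2.9M-year figure (my arithmetic, not Erik's):

    #include <cstdio>

    // 2^30 days expressed in years: roughly 2.94 million, so a signed 31-bit
    // fixnum holding a day count comfortably covers ~2.9M years.
    int main() {
        const double days = 1073741824.0;  // 2^30
        std::printf("%.2f million years\n", days / 365.2425 / 1e6);  // ~2.94
    }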
https://stackoverflow.com/questions/10849717/what-is-the-sig...
This focuses on string <-> timestamp conversion and a few other utilities that are super common in data processing and for which the native Java date functions are infamously slow.
I wrote it for some hot paths in some pipelines but was super pleased my employer let me share it. Hope it helps others.
Have a fallback with this algorithm for all other platforms.
I'm placing my bets that in a few thousand years we'll have changed calendar systems entirely haha
But it's really interesting to see the insane methods used to achieve this
I therefore agree that a trillion years of accuracy for broken-down date calculation has little practical relevance. The question is whether the calculation could be made even more efficient by reducing it to 32 bits, or maybe even just 16.
This is somewhat moot considering that 64 bits is the native width of most modern computers and that Unix time will exceed 32 bits in just 12 years.
Given that the chronostrife will occur in around 40_000 years (give or take 2_000), I somewhat doubt that </humor>
But if it's just about starting over, with 0 being the AI apocalypse or something, I'm sure it'll be more manageable, and the fix could hopefully be done on a cave wall with a flint spear tip.