Time is a Lie
In order to measure time, we need some periodic event to use as a time unit. Then, if we can order two events one after the other, we can count the elapsed time units between them. That's enough to measure relative time, but if two or more systems want to share timing information about events which were not observed by all participants, they also need a fixed frame of reference: an epoch. For example, take two arbitrary events A and B, and count N units between them; what time was it when A happened? With the information at hand, the best answer you can give is that A happened N units before B. Then, what time was it when B happened? N units after A, of course. If I observed neither A nor B, those answers might be completely useless to me.
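To make the distinction concrete, here is a minimal Java sketch (class and variable names are mine) contrasting a monotonic counter, which can only answer relative questions, with a clock anchored to a shared epoch:

import java.time.Instant;

public class RelativeTime {
    public static void main(String[] args) throws InterruptedException {
        // A monotonic counter is enough to order two locally observed
        // events and count the time units elapsed between them...
        long a = System.nanoTime(); // event A
        Thread.sleep(100);          // something happens in between
        long b = System.nanoTime(); // event B
        System.out.println("B happened " + (b - a) + " ns after A");

        // ...but answering "what time was it when B happened?" in a way
        // that is useful to someone who observed neither event requires
        // a clock anchored to a shared epoch:
        System.out.println("B happened at " + Instant.now());
    }
}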
If you think the previous example is unrealistic, you probably haven't tried to read Linux dmesg logs. By default, dmesg adds a timestamp to each kernel message, and this timestamp is measured in terms of uptime, i.e., seconds since kernel startup. If you don't know when the system was last booted, it is impossible to answer questions like "was this kernel warning issued in the last week?" unless you have also registered the start/end of the week using these timestamps, or you know for sure there was no reboot in the last week (to make things worse, uptime does not tick while the system is sleeping or hibernating, so you would also have to take standby time into account).
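As a sketch of what a user has to do instead (Linux-specific, and the message timestamp below is made up), one can estimate the boot instant from /proc/uptime and shift each dmesg timestamp by it:

import java.nio.file.Files;
import java.nio.file.Path;
import java.time.Instant;

public class DmesgWallClock {
    public static void main(String[] args) throws Exception {
        // The first field of /proc/uptime is the uptime in seconds.
        double uptime = Double.parseDouble(
                Files.readString(Path.of("/proc/uptime")).split(" ")[0]);

        // Naive boot-time estimate; as noted above, time spent sleeping
        // or hibernating will skew this kind of arithmetic.
        Instant bootTime = Instant.now().minusMillis((long) (uptime * 1000));

        double dmesgStamp = 12345.678901; // from a hypothetical "[12345.678901]" line
        Instant messageTime = bootTime.plusMillis((long) (dmesgStamp * 1000));
        System.out.println("Message was logged around " + messageTime);
    }
}

Modern dmesg implementations can do a similar conversion themselves, with the same caveats.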
We could even use these requirements to synthesize a formal definition of a calendar. I won’t bother, however, because as you’ll see, dealing with time and human calendars is incredibly complex. If you’ve ever seen (or maybe written) code like this:
// Naive duration arithmetic: fixed-length months, years and days
days = wholeMonths * 30 + wholeYears * 365
timeInSeconds = days * 24 * 60 * 60
createAlert(now() + timeInSeconds)
Then you probably knew, deep down, that it was imprecise. But how bad can it get, really? That calculation assumes:
- Every month has 30 days.
- Every year has 365 days.
- Every day has \(24 \times 60 \times 60\) seconds.
The truth is that every single one of these assumptions is wrong.
Ancient Lunisolar Calendars
One of the first periodic events we could observe was the cycle of day and night. So primitive man decided to use days as a base time unit, and to this day we're paying for that decision while trying to maintain backwards compatibility to some extent.
Of course, primitive man wasn't that good with big numbers, and eventually switched to coarser-grained time scales: eras following lunar and/or solar cycles. Meanwhile, time anchors were based on political events, such as the crowning of a new ruler. Using these systems, we can imagine someone referring to a specific day as "the 3rd day of the 200th crescent moon of the 11th Dynasty".
Unfortunately, the astronomical events which guide our calendars don't repeat after an integer number of days. Still, we kept the day as a base time unit and decided that months and years must contain a whole number of days. As a result, we have to make our calendars slightly irregular in order to compensate for the frequency mismatch between the astronomical cycles and the civil calendar. These drift-compensation mechanisms are called intercalary seconds/days/weeks/months.
For example: if the lunar month had a period of exactly 29.5 days, and we alternated between 29- and 30-day calendar months, the two would stay in sync with a maximum error of \(\pm 0.5\) days. Then, in order to compute how long to wait until "this day next month", we would need to take that alternation into account:
// Hypothetical alternating 29/30-day months tracking a 29.5-day lunar cycle
days = currentMonth.isEven ? 30 : 29;
timeInSeconds = days * 24 * 60 * 60;
But is the moon really that important, to the point where we need to change our calendar in order to keep them in sync? Probably not, but the sun apparently is.
Leap Days and the Gregorian Reform
We’re all familiar with leap (or bissextile) years: every four years or so, the year has 366 days instead of the usual 365, and February ends on the 29th instead of the 28th. This intercalary event happens because we want our civil calendar to have a whole number of days each year, while keeping it in sync with the solar calendar. Unfortunately, the cosmos doesn’t care about integers nearly as much as we do, and the period of Earth’s revolution around the sun is approximately 365.2422 days (365 full days plus ~5.8128 hours). This difference would cause the calendar year to drift over time with respect to certain astronomical events, notably seasons.
If we round the period difference between the solar and calendar years up from ~5.8128 to 6 hours, then adding an extra day to our calendars every four years would fix the drift exactly, and seasons would be off by at most 18 hours. Since the actual offset is smaller than 6 hours, adding a day every four years overcompensates. This is precisely how the Julian calendar worked (or rather, how it didn't).
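To quantify the overcompensation: the Julian year averages 365.25 days against a solar year of approximately 365.2422 days, so the calendar drifts with respect to the seasons by

\[ 365.25 - 365.2422 = 0.0078 \text{ days/year} \approx 3.12 \text{ days every } 400 \text{ years,} \]

which is the error the Gregorian reform corrects by dropping three leap days every 400 years.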
Currently, most of the world uses the Gregorian calendar, which was introduced in 1582 A.D. as a replacement for the Julian calendar. By then, the Julian calendar had been running since 45 B.C. with its excess of leap years, accumulating an offset (w.r.t. the solar year) of approximately 10 days. That offset was adjusted all at once: in 1582, the day after October 4 (a Thursday) was October 15 (a Friday). Since then, we have been skipping leap years (not adding the leap day) on years which are multiples of 100 but not multiples of 400. In the end, our maximum error with respect to seasonal events is approximately 2 days, and an era in the Gregorian calendar (a full period, taking leap days into account) is 400 years long.
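In code, the resulting rule is compact; a minimal sketch (for reference, java.time.Year.isLeap implements the same proleptic rule):

public class LeapYear {
    // Gregorian rule: every 4th year is a leap year, except multiples
    // of 100, unless they are also multiples of 400.
    static boolean isLeapYear(int year) {
        return year % 4 == 0 && (year % 100 != 0 || year % 400 == 0);
    }

    public static void main(String[] args) {
        System.out.println(isLeapYear(2024)); // true  (multiple of 4)
        System.out.println(isLeapYear(1900)); // false (multiple of 100, not of 400)
        System.out.println(isLeapYear(2000)); // true  (multiple of 400)
    }
}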
The Julian-Gregorian transition doesn't make time calculations any more complicated if you pretend the Gregorian calendar has always been in effect, in which case you are using the proleptic Gregorian calendar.
Unfortunately, you might be surprised by date and time libraries trying to be smart about it. Java is a notable example: converting a java.util.Date to a string uses a GregorianCalendar with default parameters. Despite the name, that implementation actually switches between the Gregorian and Julian calendars when crossing the transition, so these strings might lead to time calculations which are off by 10 days.
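Here is a small demonstration of the gotcha, using java.time (which is proleptic) as the reference:

import java.time.LocalDate;
import java.time.ZoneOffset;
import java.util.Date;

public class HybridCalendarGotcha {
    public static void main(String[] args) {
        // In the proleptic Gregorian calendar used by java.time,
        // 1582-10-10 is a perfectly valid date...
        LocalDate proleptic = LocalDate.of(1582, 10, 10);
        long millis = proleptic.atStartOfDay(ZoneOffset.UTC)
                               .toInstant().toEpochMilli();

        // ...but java.util.Date renders the same instant with the hybrid
        // Julian/Gregorian calendar, in which October 5-14, 1582 never
        // existed, so this prints a date roughly 10 days earlier
        // (in late September, modulo your default time zone).
        System.out.println(new Date(millis));
    }
}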
ISO 8601 and Leap Seconds
Nowadays, we have standard representations / interchange formats for time and dates. ISO 8601 was first introduced in 1988, and it has some nice features:
- Fixes the proleptic Gregorian calendar as an international standard for civil dates.
- Lexicographical order of the representation corresponds to chronological order (see the sketch after this list).
- Robust applications can easily parse dates with higher-than-needed precision.
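The second property means timestamps can be ordered with plain string comparison, as long as they share the same precision and UTC offset; a quick sketch with made-up values:

import java.util.Arrays;

public class IsoSort {
    public static void main(String[] args) {
        // Fixed-width ISO 8601 timestamps with a common offset (here, Z):
        // lexicographical order is chronological order.
        String[] stamps = {
            "2023-06-01T12:00:00Z",
            "1999-12-31T23:59:59Z",
            "2023-01-15T08:30:00Z",
        };
        Arrays.sort(stamps); // plain string sort, no date parsing needed
        System.out.println(Arrays.toString(stamps));
        // [1999-12-31T23:59:59Z, 2023-01-15T08:30:00Z, 2023-06-01T12:00:00Z]
    }
}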
Unfortunately, we’re still left with a couple of problems, mostly artifacts from the past.
When the day was the base time unit, only units larger than the day needed drift compensation, while anything smaller was simply defined as a fraction of a day.
So, in the past, the second was defined to be \(\frac{1}{24 \times 60 \times 60} = \frac{1}{86400}\) of a day.
Currently, in the S.I., seconds are the base time unit, where 1 second is defined as “the duration of 9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the caesium-133 atom”.
If we ignore relativistic frequency shifts, that's pretty much a constant, and TAI is a time standard which tracks time with a monotonic clock, ticking precisely once per SI second (TAI's own epoch is 1958-01-01, although systems often count TAI seconds relative to the Unix epoch 1970-01-01T00:00:00Z).
Unfortunately, now that our base unit is smaller than the day, solar time and civil time can drift apart, and calendar days need drift compensation as well.
UT1 is a time standard which tracks the Earth's rotation angle, also known as solar time. However, the Earth spins irregularly, and the length of the day has been slowly increasing for the past centuries (apparently caused in part by the moon). This means that UT1 (solar time) and TAI (atomic time) are drifting apart. What we actually use to coordinate time worldwide is the UTC standard: its seconds elapse at the same rate as TAI's, while its offset to UT1 is kept under a second. In this case, drift compensation is done by adding a leap second every now and then.
Unlike leap days, however, the addition of leap seconds to UTC does not follow a regular schedule. When one does happen, the seconds field of the last minute of the day runs from 00 up to 60 (a representation ISO 8601 supports), for a total of 61 seconds in that minute. As of 2023, UTC is 37 seconds behind TAI, and the next leap second will be announced in the IERS's "Bulletin C".
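Because of this irregularity, converting between UTC and TAI requires the full table of leap seconds; the following deliberately naive sketch (names are mine) hardcodes only the current offset:

import java.time.Duration;
import java.time.Instant;

public class UtcToTai {
    // UTC has lagged TAI by 37 seconds since the leap second at the end
    // of 2016; a real converter needs the whole leap-second table, and
    // must be updated whenever Bulletin C announces a new entry.
    static final Duration TAI_MINUS_UTC = Duration.ofSeconds(37);

    static Instant taiFromUtc(Instant utc) {
        // Only valid for instants after 2017-01-01.
        return utc.plus(TAI_MINUS_UTC);
    }

    public static void main(String[] args) {
        System.out.println(taiFromUtc(Instant.parse("2023-06-01T00:00:00Z")));
    }
}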
Conclusion
Remember our initial assumptions?
- Every month has 30 days.
- Every year has 365 days.
- Every day has \(24 \times 60 \times 60\) seconds.
I’m sure you knew the first one was wrong already, since you’re used to the Gregorian calendar.
The second is also common knowledge, but we often forget about leap days.
The third is false in both UT1 and UTC, since days are getting longer, and every now and then we have a leap second to account for.
Furthermore, we can’t even refer to specific times in the past or future:
- In ISO 8601, dates before 1582-10-15 can only be exchanged after mutual agreement of the parties sharing information; this is done in order to avoid ambiguities and confusion related to the Julian-Gregorian transition.
- The fact that we can’t predict the addition of leap seconds to UTC makes it impossible to reference precise date-times in the future.
In reality, the best we could do would probably be to forget about days, months and years altogether; use the Unix epoch as a reference, and atomic clocks to make sure frequency doesn’t drift away from the standard second. Although this is feasible when we’re talking about computer systems, I’m not sure it will ever catch on.
In the end, UTC is the best we currently have for civil time, even if it is a lie.