
The fractious leap second debate

You might not have heard about it, but there’s a debate going on which threatens to redefine time as we measure it. I’m something of a time nerd; all the computers in our house are synchronized to atomic clocks, as are several of our regular clocks, my wristwatch, and my phone. The debate concerns leap seconds, and to understand why it matters, you first need to know what a leap second is.

Recording time is made difficult for us by the fact that we live on a large rotating object with high mass, in orbit around a star. We like our time measurements to correspond to the apparent observed motion of the star in our sky; in short, we like day to be light, and night to be dark. We also like to set our calendar based on the earth’s orbit around the sun, so that winter is always cold and summer is always hot.

Inconveniently, the earth does not make an exact number of rotations per year. Hence every now and again it’s necessary to have a leap year, inserting an extra day into the calendar to bring it back into sync with the earth’s orbit, so that the months don’t gradually drift against the cycle of hot and cold weather.
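As an aside, the leap year correction is entirely deterministic. Here’s a minimal Java sketch of the Gregorian rule (the class and method names are my own, chosen for illustration):

public class LeapYear {
    // Gregorian rule: every 4th year is a leap year, except century years,
    // unless the century year is divisible by 400.
    static boolean isLeapYear(int year) {
        return (year % 4 == 0 && year % 100 != 0) || year % 400 == 0;
    }

    public static void main(String[] args) {
        System.out.println(isLeapYear(2000)); // true  (divisible by 400)
        System.out.println(isLeapYear(1900)); // false (century year, not divisible by 400)
        System.out.println(isLeapYear(2004)); // true
    }
}

Because the rule is fixed, leap years can be computed arbitrarily far into the future. Keep that in mind, because the same is not true of leap seconds.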

The problem of wanting day to be light is solved by having time zones, with different parts of the world choosing a different offset in hours so that noon is roughly when the sun is overhead.

Historically, the offsets were measured from GMT, time as measured at the Greenwich Observatory in England, calculated from the position of the sun. However, the development of atomic clocks of increasing accuracy, and telescopes of increasing power, made scientists aware of problems with this simple scheme.

The thing is, the rotational period of the earth varies on a day to day basis in an unpredictable way. When you’re looking at objects a billion light years away through a very narrow field of view, you soon notice if your telescope’s position is a few milliseconds behind where it should be. Hence astronomers work with Universal Time, a time scale set by observing distant stars and radio sources outside our galaxy as well as the objects in the solar system.

There are various flavors of Universal Time: UT0, UT1, UT1R, UT2, UT2R. They apply increasingly subtle correction factors to account for the wobble caused by the irregular movement of the earth.

Meanwhile, atomic clocks work by their own rules. The time scale we get from them is called TAI, and is based on averaging the time value from several hundred atomic clocks around the world. The averaging is important, because the rate at which time passes depends on where you are in earth’s gravitational field and how fast you are moving, as per Einstein’s general theory of relativity, and atomic clocks are accurate enough to detect the change. Corrections are applied to attempt to make TAI the atomic time as measured at mean sea level on the surface of the earth.

So we have two fundamental ways to measure time: astronomy and atomic clocks. Unfortunately, they measure time at slightly different rates, because of this pesky planet we’re on.

Atomic clocks are the most stable and convenient, so they are now used as the principal means of everyday time measurement. The old basis of our everyday time zones, GMT, was replaced with Coordinated Universal Time or UTC, measured by atomic clock.

However, because we like to keep noon during daylight, and because it’s measured using atomic clocks, UTC needs to be adjusted periodically to keep it in sync with UT1; that’s the “coordinated” part of its name. So every now and again, an extra second is inserted into UTC, to bring it back in sync with UT1 and the movement of the earth through the universe.

So while UTC was the same as TAI back in 1958, it has gradually drifted away from TAI in order to stay no more than a second away from UT1. UTC and TAI are now 33 seconds apart.

(I’m glossing over a few historical details here, like the fact that before 1972 UTC had fractional leaps, so clocks measuring TAI and UTC ticked at different times.)

Anyhow, with that background out of the way, we arrive at the fractious debate.

Computer programmers hate leap seconds. They have a hard enough time getting leap years right, and we know when those are going to occur hundreds of years in advance. Leap seconds are unpredictable, and have to be scheduled by a team of scientists after analyzing astronomical data.

The consequences can be problematic. For example, Java is defined to keep time in UTC, stored internally as a count of milliseconds since midnight UTC on January 1st 1970, known as the epoch. Skipping over the minor detail that UTC in its present form didn’t exist until 1972, how well does this work?

The answer is: not very well. For a start, it’s actually impossible to correctly represent future dates and times in Java. You can’t calculate the number of UTC milliseconds between the epoch and (say) 2097-10-01 19:57:22, because nobody knows how many leap seconds are going to be inserted into UTC between now and that date — and therefore, you don’t know how many UTC seconds there will be.
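To make that concrete, here’s a minimal sketch of my own (not from Sun’s documentation): Java will happily hand back a millisecond count for a date in 2097, but the value it returns silently assumes that no further leap seconds will ever be inserted.

import java.util.GregorianCalendar;
import java.util.TimeZone;

public class FutureDate {
    public static void main(String[] args) {
        GregorianCalendar cal = new GregorianCalendar(TimeZone.getTimeZone("UTC"));
        cal.clear();
        // 2097-10-01 19:57:22 UTC; months are zero-based, so 9 = October.
        cal.set(2097, 9, 1, 19, 57, 22);
        // Java answers without complaint, but the number is only correct if
        // zero leap seconds occur between now and 2097.
        System.out.println(cal.getTimeInMillis());
    }
}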

You can’t even return correct results after the fact, because if you ever insert an extra second into Java’s UTC timescale to match real UTC, every existing persisted Java date/time value after that event will change by a second. So in practice, Java represents time/date values as something that’s rather like a mutant version of TAI with seconds that are occasionally 2 seconds long to make it sync back up with UTC — although it claims to be reporting UTC.

For example:

import java.util.GregorianCalendar;
import java.util.TimeZone;

public class LeapSecondTest {
    public static void main(String[] args) {
        TimeZone utc = TimeZone.getTimeZone("UTC");
        GregorianCalendar cal = new GregorianCalendar(utc);
        // 2005-12-31 23:59:00 UTC (months are zero-based: 11 = December)
        cal.set(2005, 11, 31, 23, 59, 0);
        long start = cal.getTimeInMillis();
        // 2006-01-01 00:01:00 UTC
        cal.set(2006, 0, 1, 0, 1, 0);
        long end = cal.getTimeInMillis();
        long difference = (end - start) / 1000;
        System.out.print("Difference in seconds = ");
        System.out.println(difference);
    }
}

The output on my Linux system is 120, which is incorrect: there was a leap second at 2005-12-31 23:59:60, so 121 seconds of UTC actually elapsed. Sun’s API documentation notes:

Although the Date class is intended to reflect coordinated universal time (UTC), it may not do so exactly, depending on the host environment of the Java Virtual Machine.

That’s a bit of a cop-out, however, as my Linux box is configured with correct time zone files that include leap second information, rather than POSIX ones. So I’m skeptical of the implicit claim that Java ever produces the right answer, even when the OS allows it to.

I say “correct…rather than POSIX” because sadly, it’s not just Java that got things so disastrously wrong. Every Unix system has the same problem; it’s built into the POSIX standard, and Java simply copied Unix. When a leap second is inserted into UTC, Unix just fudges the clock by a second, has the same time of day occur twice, and carries on as if nothing happened. The extra second is effectively invisible to software.
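To illustrate the rule, here’s a sketch of my own (not code from any standard, and it uses the modern java.time classes merely to count calendar days): POSIX defines the time value by assuming every UTC day is exactly 86,400 seconds long, so a leap second has no timestamp of its own.

import java.time.LocalDate;
import java.time.temporal.ChronoUnit;

public class PosixSeconds {
    // POSIX-style "seconds since the epoch": whole days times 86400 plus the
    // time of day, with leap seconds ignored entirely.
    static long posixSeconds(int year, int month, int day, int hour, int minute, int second) {
        long days = ChronoUnit.DAYS.between(LocalDate.of(1970, 1, 1), LocalDate.of(year, month, day));
        return days * 86400L + hour * 3600L + minute * 60L + second;
    }

    public static void main(String[] args) {
        // 23:59:59 on 2005-12-31 and 00:00:00 on 2006-01-01 come out one POSIX
        // second apart, even though the leap second 23:59:60 fell between them.
        System.out.println(posixSeconds(2005, 12, 31, 23, 59, 59)); // 1136073599
        System.out.println(posixSeconds(2006, 1, 1, 0, 0, 0));      // 1136073600
    }
}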

Of course, these crude kludges can cause problems. What’s a second between friends? If you’re a big telecoms company like AT&T, it can be $4,000 in profits. If you were on the phone for 5 minutes on New Year’s Eve 2005 when a leap second was inserted, I bet AT&T would have liked to bill you for the entire 5 minutes, not for the 4:59 their software probably calculated.

Or suppose you’re the US Air Force. You might like to avoid having to shut down the entire US military satellite grid for two minutes the way you did last time there was a leap second.

So there’s a lot of broken software out there, and there are a lot of computer programmers who want to legislate away the problem by doing away with leap seconds. This would effectively do away with UTC, making it tick in lockstep with TAI (at a fixed offset) from now on.

It would also, of course, cause UTC to gradually drift away from UT1 and from the time kept by the rotating earth. Over the years the drift would get worse, until the divergence between clock noon and the sun being overhead became noticeable. Eventually, noon would fall at night.

Of course, chances are that at some point, it would be decided that the divergence between computer time and physical earth time was unacceptable, and we’d have to have a leap hour. There would be massive amounts of work involved. However — and this is the key point — nobody currently writing software would need to worry about that. We’ll all be dead by then. Nobody will be running any of our code that far in the future, right?

In other words, it’s like we’re setting up to have Y2K all over again.

Or rather, it’s like we have a small Y2K already happening, so people are proposing to push the problem off and make it worse by legislating that the year 2000 should actually be represented as 1900 again.

You can probably guess where I stand on this issue. I think the software should be fixed properly, including Unix. GPS has to keep accurate time, and it provides a good model of how to handle the problem: GPS time is a pure atomic timescale with no leap seconds (it runs a constant 19 seconds behind TAI), and the satellites broadcast the current offset from UTC so that receivers can calculate UTC for display purposes. The leap second tables required are far simpler than the time zone data updates already needed periodically.
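Here’s a minimal sketch of that approach (my own illustration, not GPS or any existing library): keep time on an atomic scale internally, and convert to UTC on output using a small table of offsets. Only the two most recent real TAI−UTC values appear below; a complete table would list all of them.

import java.util.NavigableMap;
import java.util.TreeMap;

public class AtomicToUtc {
    // TAI-UTC offsets in seconds, keyed by the UTC date (yyyymmdd) on which
    // each took effect. Real values: 32 s from 1999-01-01, 33 s from 2006-01-01.
    private static final NavigableMap<Integer, Integer> OFFSETS = new TreeMap<Integer, Integer>();
    static {
        OFFSETS.put(19990101, 32);
        OFFSETS.put(20060101, 33);
    }

    // Number of seconds to subtract from TAI to get UTC on the given date.
    static int taiMinusUtc(int yyyymmdd) {
        return OFFSETS.floorEntry(yyyymmdd).getValue();
    }

    public static void main(String[] args) {
        System.out.println("TAI-UTC on 2007-06-15: " + taiMinusUtc(20070615) + " seconds");
    }
}

The table only ever changes when a leap second is announced, which is exactly the sort of update systems already handle for time zone rule changes.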

In the case of Java, the existing Date and Calendar classes are such a mess that it’s time they were deprecated and replaced anyhow. So how about it, Sun? Let’s have a new Java DateTime class, perhaps modeled on Joda Time, that makes explicit the fact that Java internal time is TAI and must be converted to UTC.

For Linux, it seems that someone has the right idea: my Ubuntu systems have /usr/share/zoneinfo/posix, plus a separate /usr/share/zoneinfo/right containing zoneinfo files with leap second information. There’s no insurmountable technical difficulty in switching Unix systems to use a monotonic atomic time standard for the internal clock and calculating UTC from it. POSIX, however, actually requires that leap seconds be ignored, so fixing things is a political problem of amending POSIX (and Java), not a technical one.

The US Naval Observatory is now gathering feedback on the stupid proposal to redefine UTC. Sadly, it seems that they plan to make the decision based on what’s cheapest to do, rather than what’s actually correct. That’s the same logic that gave us the 2 minute Air Force shutdown and the Y2K bug. Here we go again…

In the meantime, the next UTC leap second will occur at the end of this year. Do please pause to enjoy it, as it may be your last.
