
Leap Second Madness


Time. It's one of the very few unwavering dimensions of the existential experience. "What time is it?" Such a simple question to ask. Such a simple concept to grasp. It moves in one direction, steadily progressing. While not everyone is aware of it, there are few topics more entangled than time. It can often be the bane of a software developer's existence. Make no mistake, this is a deep, dark hole of knowledge that can waste more time and money than most uninitiated watch-wearers appreciate. From Time Zones and Daylight Saving Time to Leap Years to Leap Seconds, there's plenty of complexity nested here to cause problems. Here's my best shot at explaining some of it.


The problem is, we need some way to measure this thing we call time. On Earth, at least in modern Western civilization, we measure it by correlating it to the Earth's rotation about its axis and its revolution around the Sun. No big deal: one day per rotation, one year per revolution, 24 hours per day, 60 minutes per hour, 60 seconds per minute.
Actually, that notion of time, where one day == one rotation and one year == one revolution, is really what's known as Astronomical Time. To keep everyone as confused as possible, it's called UT1. However, UT1 is not truly what we humans go by when measuring elapsed time from second to second.

What humans go by when measuring elapsed time is something called Universal Time Coordinated (how I think of it) or UTC (and here again, as a grand example of global group-think, it's officially known as Coordinated Universal Time but still abbreviated UTC - go figure). This is a time system that defines each elapsed second as a consistent, constant duration, based on observable state transitions in atoms. Though the approach has evolved over time, the current atomic time scale defines one second as 9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the caesium 133 atom. Here, you have to take a leap of faith that this is a constant. Actually, even that has been challenged due to environmental effects on the observation, including temperature, and the method of precise measurement was revised as recently as 1999! Nevertheless, for our purposes this is a constant for the existential duration that elapses in one second. Every atomic second is, according to this definition, exactly the same. It's precise to at least 1/10,000,000th of a second.
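To put that definition in perspective, here's a tiny sketch (Java, purely illustrative) of how short one of those caesium oscillations actually is:

    public class CaesiumPeriod {
        public static void main(String[] args) {
            // One SI second is defined as 9,192,631,770 oscillations,
            // so a single oscillation lasts roughly a tenth of a nanosecond.
            double periodSeconds = 1.0 / 9_192_631_770L;
            System.out.printf("one oscillation = %.3e s%n", periodSeconds); // about 1.088e-10 s
        }
    }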

Because Astronomical Time is based on the position of the planet and UTC is based on a constant, the two are never in absolute sync. It takes 365.242199 days, or approximately 365 days, 5 hours, 48 minutes and 46 seconds of atomic time, for the Earth to circle the Sun. Therefore, after every single Astronomical Year, we're out of sync, even if we started the prior year perfectly in sync.
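If you want to sanity-check that conversion, here's a small sketch (Java, nothing but arithmetic) that turns the fractional day into hours, minutes and seconds:

    public class TropicalYearFraction {
        public static void main(String[] args) {
            double fractionalDay = 0.242199;              // the leftover beyond 365 whole days
            long secondsPerDay = 24L * 60 * 60;           // 86,400
            long extra = Math.round(fractionalDay * secondsPerDay); // about 20,926 seconds

            long hours = extra / 3600;                    // 5
            long minutes = (extra % 3600) / 60;           // 48
            long seconds = extra % 60;                    // 46
            System.out.printf("365 days + %dh %dm %ds%n", hours, minutes, seconds);
        }
    }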

We know, all too well, about this variance. That's why, every four years, we have a "leap day," so that we stay relatively synchronized with the number of revolutions around the Sun. If we didn't do that, the human calendar would drift by almost 6 hours per year. So what? So, after a number of years, the seasons would fall on different parts of the year! Yikes! We can't have that! Apparently. Leap Day keeps us very closely in sync over time, but not perfectly in sync. Note that since the drift isn't exactly 6 hours per year, determining whether a year is a leap year is actually more complicated than simply one every four years (every year evenly divisible by 4). There are two more rules: years evenly divisible by 100 are not leap years, unless they are also evenly divisible by 400. Good grief! But that keeps things very close. Not perfect, but very close.
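Expressed as code, the full rule set looks something like this sketch (Java; the standard library's java.time.Year.isLeap applies the same Gregorian rules):

    import java.time.Year;

    public class LeapYearRules {
        // Divisible by 4, except centuries, unless also divisible by 400.
        static boolean isLeapYear(int year) {
            return (year % 4 == 0 && year % 100 != 0) || year % 400 == 0;
        }

        public static void main(String[] args) {
            System.out.println(isLeapYear(2016));  // true  (divisible by 4)
            System.out.println(isLeapYear(1900));  // false (century not divisible by 400)
            System.out.println(isLeapYear(2000));  // true  (divisible by 400)
            System.out.println(Year.isLeap(2000)); // the standard library agrees
        }
    }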

To make things worse, from an Earthly, atomic perspective, it doesn't take EXACTLY the same amount of time for the Earth to rotate on its axis each time. Likewise, each revolution around the Sun doesn't take the EXACT same amount of atomic time. Since the rate of rotation and revolution does vary (microscopically), the number of seconds (in precise real numbers, not integers) in a day and/or year varies.

Since things are still not perfect, every so often we need to inject a "Leap Second" to correct for a much smaller, more microscopic drift. In fact, we have one such Leap Second coming up at the end of June, 2015. In the Gregorian, human representation of time, the UTC clock will elapse as:

  • 2015 June 30, 23h 59m 59s
  • 2015 June 30, 23h 59m 60s
  • 2015 July 1, 0h 0m 0s

Note the 61st second (60s) there. That's clearly not normal.
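In fact, many mainstream date-time libraries can't even represent that 61st second. As a small illustration, Java's java.time API (which deliberately ignores leap seconds) rejects it outright:

    import java.time.DateTimeException;
    import java.time.LocalDateTime;

    public class LeapSecondRepresentation {
        public static void main(String[] args) {
            try {
                // Second-of-minute must be 0-59; java.time has no 23:59:60.
                LocalDateTime.of(2015, 6, 30, 23, 59, 60);
            } catch (DateTimeException e) {
                System.out.println("Rejected: " + e.getMessage());
            }
        }
    }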

To the observer who isn't oriented toward software development or IT systems, the response is likely, "So what? Big deal. So, you're saying my analog watch might be off by a second?" But, as Lee Corso might interrupt, "Not so fast, my friend!"

This can be a big deal to some software systems. Why? It goes back to the complexity of all this and the shortcuts that sometimes occur. As developers implement logic - at times with limited knowledge about time, at other times with limited tools, and at still other times with limited existential time (deadlines) - they sometimes calculate elapsed time using logic that's too simple about how to advance from one minute to the next, one hour to the next, one day to the next, and so on.

For example, it can seem very safe to say there are 60 seconds in a minute; so, if I need to calculate when the next minute will occur, I can add 60 seconds to some reference minute, and voilà, I have the next minute. This can be repeated for other increments (with decreasing precision) to calculate the next hour, day, week, month, year, etc. But due to the complexities cited above, this is not a reliable way to do so. There actually aren't 60 seconds in every minute.
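In code, that shortcut looks roughly like this sketch (Java; the values are just for illustration):

    public class NaiveNextMinute {
        // The shortcut described above: treat "one minute later" as "60 seconds later".
        static long naiveNextMinute(long epochSeconds) {
            return epochSeconds + 60; // silently assumes every minute has exactly 60 seconds
        }

        public static void main(String[] args) {
            // 2015-06-30 23:59:00 UTC as Unix time (Unix time itself ignores leap seconds).
            long reference = 1_435_708_740L;

            // In real, elapsed (atomic) time that particular minute lasts 61 seconds,
            // so "reference + 60" comes up one elapsed second short of midnight.
            System.out.println(naiveNextMinute(reference));
        }
    }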

Legacy computing languages and environments were very limited in their sophistication for dealing with time. In fact, we had a whole tech bubble around the approaching year 2000 because years had long been stored as two-digit numbers, which limited how software could store, increment and compute time differences.

Modern languages and libraries have very sophisticated support for time calculations. Normally, there are libraries of code that understand astronomical time (in the form of a Calendar-type abstraction). With these libraries, to advance time one minute, for example, one asks the Calendar to do so with a primitive or method that explicitly adds one minute to a given time. This lets the library absorb the atomic/astronomical clock complexities, so it can roll or add a given number of minutes, hours, days, weeks, months or years to a reference time and arrive at what a human, who operates largely on the astronomical clock, would expect.
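Here's a sketch of that style using Java's java.util.Calendar. To be clear, whether the result reflects leap seconds still depends on the underlying clock and the platform's time data, so this illustrates the pattern rather than a guaranteed leap-second fix:

    import java.util.Calendar;
    import java.util.TimeZone;

    public class CalendarAdd {
        public static void main(String[] args) {
            Calendar cal = Calendar.getInstance(TimeZone.getTimeZone("UTC"));
            cal.clear();
            cal.set(2015, Calendar.JUNE, 30, 23, 59, 0);

            // Ask the calendar abstraction to advance one minute,
            // rather than adding 60 raw seconds ourselves.
            cal.add(Calendar.MINUTE, 1);

            System.out.printf("%04d-%02d-%02d %02d:%02d:%02d UTC%n",
                    cal.get(Calendar.YEAR), cal.get(Calendar.MONTH) + 1,
                    cal.get(Calendar.DAY_OF_MONTH), cal.get(Calendar.HOUR_OF_DAY),
                    cal.get(Calendar.MINUTE), cal.get(Calendar.SECOND)); // 2015-07-01 00:00:00 UTC
        }
    }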

Some have been making the case that the upcoming Leap Second is no big deal. And they are right, if they are talking about its effects on modern operating systems, because the sophistication I just mentioned is built into those systems. However, they are wrong in general. There are too many software systems that do date and time math by simply adding atomic seconds when they really mean to roll or add to an astronomical clock.

So, what's the expected consequence? Well, unless someone corrects schedules that were created by faulty time calculations, one can wind up with scheduling systems that are forever off by one second. So, for example, where one system thinks a given time should be 01:00:00, it may actually show up as 00:59:59. Until and unless someone corrects it, the error is perpetual. And systems tend to exchange these interpretations of time in order to match or reconcile with each other. Digital logic is precise: 01:00:00 will never match 00:59:59, regardless of whether most humans would agree, "yeah, that's good enough."

Will this cause widespread failure? Probably not. Real-time scheduling systems will be updated. Problems will be found and corrected quickly.

But to say that this is absolutely no big deal is, in my opinion, whistling past the graveyard.

Quick chronological bio: Panther, Gopher, Blazer, Husband, Father, Doozer.