Science

A Giant Leap for the Leap Second (nytimes.com) 53

A top scientist has proposed a new way to reconcile the two different ways that our clocks keep time. Meet -- wait for it -- the leap minute. From a report: Later this month, delegations from around the world will head to a conference in Dubai to discuss international treaties involving radio frequencies, satellite coordination and other tricky technical issues. These include the nagging problem of the clocks. For 50 years, the international community has carefully and precariously balanced two different ways of keeping time. One method, based on Earth's rotation, is as old as human timekeeping itself, an ancient and common-sense reliance on the position of the sun and stars. The other, more precise method coaxes a steady, reliable frequency from the changing state of cesium atoms and provides essential regularity for the digital devices that dominate our lives.

The trouble is that the times on these clocks diverge. The astronomical time, called Universal Time, or UT1, has tended to fall a few clicks behind the atomic one, called International Atomic Time, or TAI. So every few years since 1972, the two times have been synced by the insertion of leap seconds -- pausing the atomic clocks briefly to let the astronomical one catch up. This creates UTC, Coordinated Universal Time. But it's hard to forecast precisely when the leap second will be required, and this has created an intensifying headache for technology companies, countries and the world's timekeepers.

"Having to deal with leap seconds drives me crazy," said Judah Levine, head of the Network Synchronization Project in the Time and Frequency Division at the National Institute of Standards and Technology in Boulder, Colo., where he is a leading thinker on coordinating the world's clocks. He is constantly badgered for updates and better solutions, he said: "I get a bazillion emails." On the eve of the next international discussion, Dr. Levine has written a paper that proposes a new solution: the leap minute. The idea is to sync the clocks less frequently, perhaps every half-century, essentially letting atomic time diverge from cosmos-based time for 60 seconds or even a tad longer, and basically forgetting about it in the meantime.


A Giant Leap for the Leap Second

  • by Tim the Gecko ( 745081 ) on Friday November 03, 2023 @04:09PM (#63977758)

    Once it gets to 60 seconds you can imagine people saying "Is 120 seconds really that bad? Let's leave it for someone else."

    The cartoon [npr.org].

  • Everybody put in their product support wikis, "Do not open until half a century less one day from (epoch)", so you can make sure to deal with the missing/extra minute.

    • They'd better document it well, as almost nobody with any experience of leap seconds will be around to offer advice. Maybe it can be done at 00:00 on 2100-01-01: "Yo dawg, I heard you like Y2k..."
      • That's okay. The folks who are in the field will probably read about it on slashdot in the year 2074.

        Of course the article might have unsupported unicode characters, and people will skip past the gibberish...

  • by RitchCraft ( 6454710 ) on Friday November 03, 2023 @04:15PM (#63977784)

    Just ditch the less accurate method and be done with it.

    • by Calydor ( 739835 )

      The problem is the eventual drift a few centuries down the line when the day starts at midnight and ends somewhere around 5 pm.

      • by Alsn ( 911813 ) on Friday November 03, 2023 @04:29PM (#63977840)
        If the solution of a leap minute would require us to update every half century, that would mean that "a few centuries down the line" would be out of sync by something like 6 minutes. Or 20 minutes for a millennium. In three millenniums we would only have diverged by about an hour, or daylight savings time, which means we could scrap that too! I have a feeling that in 3000 years we won't really care that midnight is at 1 am (or 11 pm, I don't know which way the error goes), but that's just me.
        • by JeffSh ( 71237 )

          if the change is over generations, no one would notice and adaptation would be automatic or irrelevant.

        • by jaa101 ( 627731 )

          You're assuming the rate of drift is constant, but it's actually constantly increasing. The second we have matches the Earth's rotation as of around 1900, but the Earth continues to slow down, so leap seconds, minutes, or hours will become more frequent as the years go by. This article [ucolick.org] estimates a 1 minute difference (relative to now) by 2076, an hour by 3023, and a full day by 7633.

          Weirdly, we're currently going through a fast patch, with one estimate that "we are almost as likely as not to experience a negative leap second" [insidegnss.com].
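          To see why the intervals shrink, here is a back-of-the-envelope model (an assumption on my part, not taken from the linked article): the excess length of day grows by roughly 1.8 ms/day per century on long-term average, so the accumulated divergence between atomic time and Earth-rotation time grows roughly quadratically. Decadal wobbles like the current speed-up are ignored.

```python
# Back-of-the-envelope model of why the leaps speed up (assumption, not from
# the linked article): the excess length of day grows by roughly 1.8 ms/day
# per century on long-term average, so the accumulated divergence between
# atomic time and Earth-rotation time grows roughly quadratically.

RATE_S_PER_DAY_PER_CENTURY = 1.8e-3
DAYS_PER_CENTURY = 36525

def divergence_seconds(year: float, reference_year: float = 1900.0) -> float:
    t = (year - reference_year) / 100.0            # centuries since reference
    return 0.5 * RATE_S_PER_DAY_PER_CENTURY * DAYS_PER_CENTURY * t * t

for year in (2076, 3023):
    extra = divergence_seconds(year) - divergence_seconds(2023)
    print(f"{year}: roughly {extra / 60:.0f} more minutes than in 2023")
# -> about 1 minute by 2076 and about an hour by 3023, the same ballpark as
#    the estimates in the linked article.
```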

        • I have a feeling that in 3000 years we won't really care that midnight is at 1 am

          If you don't care, there is already a timekeeping method for you to use. Just use TAI and leave the people who rely on astronomically accurate time to do their thing.

          In reality, if we update something every half a century, we're going to have a Y2K-style event every half a century.

        • Midnight is no problem, as my solar cells produce no power then.
          Noon is.

          I want my two solar panels adjusted so that they start pumping in the morning around 6:30 and stop around 10:30.
          For my evening panels I want 16:00; they stop automatically when it is too dark.

          Ofc: with clever automation, you do not need any time info ...

          But the assumption of /. experts that humans don't really need a kind of "reliable" time referenced to the sun makes no sense.

          Or do you prefer to wake up for a stupid 9:00 - 17:00 job?

  • UTCS (Score:3, Funny)

    by camliner ( 685937 ) on Friday November 03, 2023 @04:18PM (#63977794)
    Welcome UTCS: Sorta Coordinated Universal Time
  • by crunchygranola ( 1954152 ) on Friday November 03, 2023 @04:21PM (#63977810)

    One method, based on Earth's rotation, is as old as human timekeeping itself, an ancient and common-sense reliance on the position of the sun and stars.

    If you have been paying close attention to where the center line of the narrow total-eclipse path falls for the various total eclipses in recent years -- perhaps making plans, several years in advance, for exactly where you will observe -- you will have noticed that the projected center-line path shifts significantly on the projection maps (if they are kept up to date). The shift is on the order of hundreds of meters over several years. It isn't that we don't know the time of the eclipse precisely (we do); it's that the variability in the slowing of the Earth's rotation makes the exact position of the Earth's surface under the eclipse uncertain.
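    A quick back-of-the-envelope (illustrative dt values, not measured UT1 prediction errors) shows how a fraction of a second of rotational uncertainty moves the ground track by that much:

```python
import math

# An error of dt seconds in predicting Earth's rotation angle shifts the
# eclipse ground track by about dt * (equatorial surface speed) * cos(latitude).
# 465 m/s is the standard equatorial rotation speed; the dt values below are
# illustrative, not measured UT1 prediction errors.
EQUATORIAL_SPEED_M_PER_S = 465.1

def ground_shift_m(dt_seconds: float, latitude_deg: float) -> float:
    return dt_seconds * EQUATORIAL_SPEED_M_PER_S * math.cos(math.radians(latitude_deg))

for dt in (0.5, 1.0):
    print(f"dt = {dt} s at 40 deg latitude: ~{ground_shift_m(dt, 40.0):.0f} m")
# -> roughly 180 m and 360 m: "hundreds of meters"
```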

    • by jaa101 ( 627731 )

      We have also used this method in reverse, going from the reported locations of ancient solar eclipses to the length of the day almost 3000 years ago. Back then, days (from one noon to the next) averaged around 50 ms shorter than today.

  • It would be unrelated to any timezone but would just be a counter that is never adjusted. Perhaps it would start out matched to UTC, perhaps not, but it would be consistent. You'd need a mapping to UTC, but machines need the actual time far less often than they need relative times and time periods.

    • There is TAI (International Atomic Time), which just increments a second counter. UTC is defined as a clock that ticks in sync with TAI, except that a leap second is added once every few years. UTC is currently offset 37 seconds from TAI.

    • by ceoyoyo ( 59147 )

      Congratulations, you've described TAI.

      Yes, it's stupid that computers generally use UTC, or worse, local time. I think it's because people like saying things like "leap seconds drive me crazy." Keeps them employed.

  • by MobileTatsu-NJG ( 946591 ) on Friday November 03, 2023 @04:25PM (#63977824)

    Once we start getting people off this rock we're gonna have to move to something like Star Trek's 'stardate' anyway. Daylight cannot be synced across spinning astral bodies.

  • by swillden ( 191260 ) <shawn-ds@willden.org> on Friday November 03, 2023 @04:28PM (#63977838) Journal

    Assuming I understand the proposal correctly, the idea is to keep all of our electronics and other timekeeping instruments synchronized to the atomic clocks and just allow them all to slowly drift out of sync with the sun. Then, when the difference gets to be around a minute, we'd make an adjustment similar to what we do now -- and, as servers all over the world already do, the adjustment probably wouldn't be instantaneous but would be smeared over some period (a rough sketch of such smearing follows at the end of this comment).

    I don't see any issue with allowing noon to be one minute off. For most people it's already off by far more than that, because our time zones are (mostly) one hour wide, and there's only a very narrow strip in each where noon is exactly astronomical noon.

    The only potential issue, IMO, is whether the one-minute adjustment is actually less problematic than 60 one-second adjustments, and I think the answer is yes. Even a one-second adjustment is already so large that it's a problem for many of our systems to make it in an actual leap, hence the smearing. Given that we're already having to deal with an uncomfortably-large adjustment, I don't see any reason not to make it bigger in exchange for making it less frequent.

    Or we could go the other direction and start inserting leap-milliseconds, or leap-nanoseconds, or whatever interval is small enough that we can actually "leap" without concern about breaking things. Then we could make these tiny leaps all the time, on some schedule defined a year or so in advance.

    But, all in all, having a leap minute every 50 years or so sounds easier.
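    As a rough illustration of the smearing idea, here is a sketch of a linear smear (the 24-hour window mirrors what some public NTP services do for leap seconds; the names and defaults are made up):

```python
# Sketch of "smearing" a leap adjustment: instead of stepping the clock by the
# whole leap at once, run it fractionally slow across a fixed window so it ends
# the window in agreement with the stepped clock. A 24-hour window is what some
# public NTP services use for leap seconds; names and defaults are illustrative.

def smeared_clock(no_leap_seconds: float, window_start: float,
                  window_len: float = 86400.0, leap: float = 1.0) -> float:
    """Clock reading during the smear window, given a clock that ignores the leap.

    For a positive leap the smeared clock runs slightly slow, so by the end of
    the window it reads `leap` seconds less than the no-leap clock, matching
    the stepped (post-leap) timescale.
    """
    elapsed = no_leap_seconds - window_start
    frac = min(max(elapsed / window_len, 0.0), 1.0)   # 0 before, 1 after the window
    return no_leap_seconds - frac * leap

# Halfway through the window, half the adjustment has been absorbed:
t0 = 0.0
print(t0 + 43200.0 - smeared_clock(t0 + 43200.0, t0))            # 0.5 for a 1 s leap
print(t0 + 43200.0 - smeared_clock(t0 + 43200.0, t0, leap=60.0))  # 30.0 for a leap minute
```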

    • The problem with leap milliseconds or leap nanoseconds would be the size of the conversion file required to hold all the changes. As it is, a leap second every year or so keeps that table pretty small. It also seems completely pointless to go from a leap second to a leap minute: the skewing likely wouldn't be uniform across devices, which would cause problems. Smearing a single second, even if it isn't done completely in sync everywhere, seems less error-prone than smearing a whole minute.

    • by uncqual ( 836337 ) on Friday November 03, 2023 @05:47PM (#63978034)

      One reason to keep the leap second rather than transition to a leap minute is that we "practice" the leap second transition every few years, understand it well, and have ongoing institutional knowledge of it and its consequences.

      Switching to a leap minute a couple times a century means that each generation of developers, IT staff, etc will experience it on average less than once in their lifetime. And when it happens almost no one on staff would ever have experienced it previously in their professional career. Programmers will either be blissfully unaware of the problem or will, as generations before did with Y2K, figure "the leap minute won't occur for 40 years, this code won't be running then" and not bother to deal with it (let alone in a way consistent with the way others have). (Yes, some code I wrote over 40 years ago is still running in production in many large enterprises with minimal modification - I wouldn't have expected that when I was "young and dumb" and writing the code at that startup - I wasn't sure it would ever even find a customer, let alone not be rewritten/replaced within a decade!).

      As well, if systems "get it wrong", it will mostly be in systems introduced since the last leap event so fewer systems will fail at once with a leap second vs. leap minute and the original development teams are more likely to be around to fix them rather than the failures occurring in ancient code that is no longer well understood.

      Finally, a full minute handled incorrectly is more likely to create a problem than a single second would. For example, in a train system, if someone screwed up during a leap second it's unlikely that two trains would collide (yes, I know, sensors and monitoring should prevent an actual collision, but the "black swan" triggering of such sensors and monitors can have a much larger ripple effect, shutting down many trains), whereas with a full-minute discrepancy two trains could try to occupy the same piece of track at the same time.

    • by Entrope ( 68843 )

      Quite a few systems have more trouble with a smeared leap second than with a single stepped one. If they care that a "second" is actually a second long, or especially if they need to communicate the current time to other computers, smearing a leap second across a day or two is bad. Handling a discrete step is even easier if they don't need to operate across UTC midnight.

  • by RightwingNutjob ( 1302813 ) on Friday November 03, 2023 @04:39PM (#63977868)

    The problem is that the Earth's rotation isn't uniform. Even if you ignore leap seconds, it is still necessary for the purpose of satellite navigation (for example) to keep track of the offset between the true angle of the Earth and the value predicted by a constant rotation rate.

    Bodies like ESA and the USNO do this based on VLBI, GNSS, or SLR measurements, and there's a file that gets posted on the Naval Observatory's website called "finals.data" or some such that expresses, in SI seconds, the offset between where the Earth actually points and where a uniform rotation rate says it should point.

    The way leap seconds were conceived, this number always stays below 1 (0.9 seconds, strictly), since that's the maximum allowable offset between UT1 and UTC.

    Get rid of leap seconds and ... wait for it... precision stuff will *still* need to account for this difference, except the maximum offset between Earth rotation inferred from wall clock time versus reality will need to be expressed in a number that might be bigger than 1.

    This might break some things.

    It might also break some things that ignored this offset entirely and just used the wall-clock time, on the understanding that the error would always be under 1 second.

    Off the top of my head, things that might break because the numerical value of this offset is larger than 1 might be some GPS receivers that assumed UTC and UT1 would always be within 1 second. And things that might break because they ignore the offset and rely only on the clock time are some amateur (or professional) telescopes or satellite tracking antennas.

    There might be others.
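    For a sense of scale, a toy calculation (purely illustrative; only the sidereal rotation rate is a real constant) of how an uncorrected offset between Earth-rotation time and the wall clock turns into a pointing error:

```python
# An uncorrected UT1-UTC offset of dut1 seconds makes any Earth-orientation
# calculation wrong by dut1 * (Earth's rotation rate). The rate below is the
# real sidereal figure; the dut1 values are made up for illustration, not
# read from the actual finals.data file.

SIDEREAL_DEG_PER_SECOND = 360.0 / 86164.0905   # Earth's rotation rate

def rotation_angle_error_deg(dut1_seconds: float) -> float:
    """Angular error you'd make by using UTC where UT1 is required."""
    return dut1_seconds * SIDEREAL_DEG_PER_SECOND

# Today |UT1 - UTC| stays under 0.9 s thanks to leap seconds:
print(rotation_angle_error_deg(0.9) * 3600)   # ~13.5 arcseconds of pointing error

# Without leap seconds the same code must tolerate offsets well above 1 s:
print(rotation_angle_error_deg(60.0))         # ~0.25 degrees after a minute of drift
```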

    • The problem is in human nature. The reason leap seconds may be a problem is that devs are lazy and don't write proper software. So once a year or so this may cause a problem, and you (should) fix it at that time.

      If you go to leap minutes, EVERY piece of software will be "won't happen before someone else looks at it", and nobody will fix anything until we get a Y2K-level problem.

  • by geekmux ( 1040042 ) on Friday November 03, 2023 @04:40PM (#63977870)

    "Having to deal with leap seconds drives me crazy,"

    Good. Perhaps we should be thankful that none of the 'crazy' leap second ideas are moving forward, and instead we're considering leap minutes for a race that hardly gives a shit about being on time anyway.

    Who cares? Yes, I'm asking seriously here. The machines care. Networking cares about timing (a lot), so I get the worry about accuracy for some things where it is critically necessary, but humans? Please.

    We sell 'automatic' winding watches, with names like 'Rolex' on them, for tens of thousands of dollars, even though they still succumb to Earth's gravitational pull. All for the luxury of wearing a timekeeping device that is less accurate than damn near every $30 quartz-powered watch. The rest use a smartphone synced to NIST, so they know exactly how late they are. Every time.

    • > Who cares?  Yes, I'm asking seriously here.  The machines care.  Networking cares about timing (a lot), ...

      People and machines in space will certainly care. Imagine docking with a space station where the capsule / shuttle / etc. launched from Earth uses planetary time while the station uses TAI. There is going to be much unhappiness in that scenario.
  • And add 30 seconds to trigger it, i.e. a delta of 90 seconds before anything happens. That should bring the issue down enough to not matter much anymore.

    What, your software cannot deal with time adjustments? Go home, amateur! Seriously: these can happen without any leap seconds as well, so everything needs to be able to handle them anyway.

  • Find someone who knows how to do something that was last done 50 years ago. Everyone who knows first hand will be retired or dead. Historians will have hilariously wrong ideas about the reasons it needs doing and of course will have different opinions on how it was actually done. It won't work. You might as well give up on the concept if you remove it from public attention for so long. Software and devices which need to implement the leap minute won't get tested until it's too late. The leap minute will be scrubbed to avoid Y2K-like scenarios.

    If you want a time that is synchronized to Earth's rotation, you have to do the work relatively frequently or it won't get done. And if you don't care about Earth's rotation, we already have a name for that: TAI [wikipedia.org]. Use that if you really don't care about UTC.

    • No, the standards bodies aren't so forgetful.

      It's a non-issue and won't be a problem.

    • You mean long gaps like daylight savings? Sorry if I just triggered a bunch of people. I would say a minute qualifies for just jumping instead of smearing or stretching or shrinking like they do for a leap second.
    • Hmm... a simple adjustment hack that needs to be updated every half-century, or a complex formula and set of lookup tables that needs to be maintained every year?

      If you're not rooting for the former option, you probably haven't spent a lot of time refactoring other peoples' software.

      • Except it is not "a simple adjustment hack". You would still need a lookup table for leap minutes. Wait, you'd need two tables now, because history would contain both leap seconds AND leap minutes.

        Computer timekeeping sidesteps time zones and all the DST stuff by recording time in UTC. Leap minutes can't be sidestepped the way time zones and DST are.

  • The proper solution for timestamps is to use a counter that only counts forwards and never skips.

    We could define this counter as the number of actual seconds that have passed since some arbitrary fixed point in time (akin to Unix counting seconds passed since 1/1/1970, but without adjusting it for leap seconds), where "actual seconds" means the number of SI seconds that really elapsed since that predefined point. Converting from such a number to a date and time is then performed by taking into account the full history of leap seconds. This would fix the ambiguity of timestamps.
    • by vux984 ( 928602 )

      Ok... I'd argue it's cleaner to account for leap years with
      (365 * 10 + 2) instead of (365 * 8 + 366 * 2)

      But that aside,

      "This would fix the ambiguity of timestamps."

      What ambiguity do you refer to? We already define the time during a leap second as the 61st second of the minute, displayed as 23:59:60.

      This is unambiguous. It's a little odd to a lot of people, and not supported by all software, but it's a clean solution.
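      On the "not supported by all software" point: Python's datetime constructor, for one, rejects second=60 outright, so anything that might see 23:59:60 has to special-case it. One hedged way to normalize such a timestamp:

```python
# Python's datetime refuses second=60, so a leap-second timestamp has to be
# handled explicitly. This is just one illustrative way to normalize it,
# mapping 23:59:60 to the following 00:00:00 while remembering it was a leap.
from datetime import datetime, timedelta

def parse_with_leap(stamp: str):
    """Parse 'YYYY-MM-DD HH:MM:SS', tolerating SS == 60."""
    is_leap = stamp.endswith(":60")
    if is_leap:
        stamp = stamp[:-3] + ":59"   # parse the previous labelled second...
    dt = datetime.strptime(stamp, "%Y-%m-%d %H:%M:%S")
    if is_leap:
        dt += timedelta(seconds=1)   # ...then roll forward to the next minute
    return dt, is_leap

print(parse_with_leap("2016-12-31 23:59:60"))   # -> 2017-01-01 00:00:00, True
```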

    • by ceoyoyo ( 59147 )

      If only someone had thought of something like that. It could have been mentioned in the summary even.

      • by kipsate ( 314423 )
        Not sure if you're implying this was mentioned in the summary, because it wasn't.
        • by ceoyoyo ( 59147 )

          One method, based on Earth's rotation, is as old as human timekeeping itself, an ancient and common-sense reliance on the position of the sun and stars. The other, more precise method coaxes a steady, reliable frequency from the changing state of cesium atoms and provides essential regularity for the digital devices that dominate our lives.

          • This highlights the shift from deriving a date and time from astronomical yardsticks, such as the Earth's rotation and its position relative to the sun, to using a measure that doesn't depend on any of these.

            I argued that timestamp logic would be more robust if it used a monotonically increasing counter instead of a date and time. The summary does not mention any of that.
            • by ceoyoyo ( 59147 )

              "Steady reliable frequency" is "a monotonically increasing counter." TAI has counted off a tick every tenth of a nanosecond since the fifties. You can format that number however you like.

              This is the basic problem. People like to argue about time standards without actually knowing anything about them. Everyone who has ever programmed something to do with time on a computer seems to think that local time is the one true reality. Maybe they eventually discover UTC and go from complaining about timezones to complaining about leap seconds.

    • by jaa101 ( 627731 )

      There are already continuous timescales like you describe, notably TAI [wikipedia.org]. Leap seconds just increase the difference between TAI and UTC by 1. Computers should just keep TAI internally and maintain a list of leap seconds. When displaying something like a file modification date, stored as TAI, the computer can then use the list of leap seconds to convert to UTC. The computer then has to consult a historical list of "daylight savings time" rules to convert UTC to the local time of the relevant time zone.
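      A minimal sketch of that scheme (the two table entries reflect the real TAI-UTC steps of mid-2015 and the start of 2017; the epoch arithmetic and the fallback are simplified for illustration):

```python
import bisect

# Keep TAI internally; convert to UTC only for display, using a leap-second
# table. Only the two most recent (real) steps are listed: TAI-UTC became
# 36 s on 2015-07-01 and 37 s on 2017-01-01. The epoch arithmetic below is
# simplified and the pre-table fallback is illustrative.
LEAP_TABLE = [
    (1435708836, 36),   # TAI seconds-since-1970 at which TAI-UTC became 36
    (1483228837, 37),   # ... and 37
]
_BOUNDARIES = [t for t, _ in LEAP_TABLE]

def tai_to_utc_unix(tai_seconds: int) -> int:
    """Map a TAI-based seconds count to a POSIX-style UTC seconds count."""
    idx = bisect.bisect_right(_BOUNDARIES, tai_seconds) - 1
    offset = LEAP_TABLE[idx][1] if idx >= 0 else 35   # fallback for older times
    return tai_seconds - offset

# Display would then go through the usual UTC -> local time-zone rules:
print(tai_to_utc_unix(1483228837))   # 1483228800, i.e. 2017-01-01 00:00:00 UTC
```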

      • by kipsate ( 314423 )
        Thank you! Didn't know TAI existed. Yes, that'd do the job.

        Surprisingly, UNIX time originally behaved exactly like TAI, causing localtime() to run one second fast after each leap second. Sadly, with the advent of NTP, instead of adapting localtime() to account correctly for leap seconds when converting to local time, the xntp daemon was adapted to repeat the same second twice.

        Story here: https://cr.yp.to/proto/utctai.... [cr.yp.to]
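        A tiny illustration of the quirk being described: POSIX time pretends leap seconds don't exist, so subtracting two Unix timestamps undercounts the real elapsed SI seconds by one for every leap second in between (the two timestamps and the single table entry below are real; the helper is made up):

```python
# POSIX time ignores leap seconds, so naive subtraction of Unix timestamps
# misses one SI second per intervening leap second. Table and timestamps are
# real; the helper name is made up for illustration.

LEAP_INSERTIONS_UNIX = [1483228800]   # 2017-01-01, the most recent leap second

def elapsed_si_seconds(start_unix: int, end_unix: int) -> int:
    leaps = sum(1 for t in LEAP_INSERTIONS_UNIX if start_unix < t <= end_unix)
    return (end_unix - start_unix) + leaps

# One calendar year spanning the 2016-12-31 leap second:
print(elapsed_si_seconds(1467331200, 1498867200))   # 31536001, not 31536000
```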
    • IIRC, That's more or less what *n*x does, and works fine, IF you only care about a single frame of reference.

      Things still get hairy if we have to extend this to airplanes, satellites, or even different latitudes, each of which is moving at a slightly different speed relative to you.

      Time starts getting to be a pretty gnarly thing once you have to start taking relativity into account. Yes, I know that's a different problem than the main one being discussed, but it does make your proposed solution a little harder than it first looks.

  • This is something an Omnipotent God should straighten out! Make everything move in simple ratios. He/she/it/they fell down on the job. Maybe they are too busy teaching evangelicals how to give LGBTQ+ a shitty time; while leaving tax cheats, adulterers, coup'ers, and pussy grabbers alone.

  • I understand the desire to not have to go through this shit so often, but the less often you do it, the harder it becomes. Leap seconds are bad enough now when they only show up irregularly, but if they only happen every 50 years? And it's a whole minute? That's gonna be WAY worse to deal with because everyone who was there last time has retired.
    Heck, I'd advocate for going MORE frequently, and leaping milliseconds at a time here and there - it'll happen way more often, and thus the software involved will be exercised and kept working.

  • Just have more than one time standard. UTC just needs to line up with the day and night cycle. Publish tables every year detailing the drift.
    If you need accuracy and want to coordinate then you use both times. If you need even more accuracy with regard to the solar system then use even more.
    It really is relative and depends on your use.

  • Screw internal use of UTC in machines. Just use TAI for everything in the network. All internal timestamps. Everything. And as for UTC, just restrict it to human readable displays. And of course, never do a bidirectional conversion between TAI & UTC for data exchange between computers.

  • My idea was to leap 2-5 seconds. 1 minute seems problematic for things like GPS.
