No Leap Second To Be Added To Universal Time in 2024, IERS Says (datacenterdynamics.com)

No leap second will be added to universal time in 2024, the International Earth Rotation and Reference Systems Service (IERS) has announced. From a report: Additional seconds have previously been added to Coordinated Universal Time (UTC), the time kept by atomic clocks, whenever it drifted out of sync with time measured by the Earth's rotation (UT1).

But in a statement released on Thursday, the IERS, which enacts changes to UTC on behalf of the International Telecommunication Union (ITU), said the difference between UTC and UT1 is not great enough to warrant a change. Changes in the relationship between UTC and UT1 sometimes occur because the Earth does not always spin at the same speed, with natural events such as earthquakes often causing small changes.


Comments:
  • by Joce640k ( 829181 ) on Friday July 05, 2024 @10:21AM (#64602699) Homepage

    Damn, I was looking forward to an extra second in bed this year.

  • Hopefully we don't get 2 seconds next year, risking global panic.

    • that is some optimism... I mean, drivers (apparently) seem to freak out on the interstates frequently when they see a leaf fall from a tree on the opposite side of the highway from them, soooooooo... And heaven forbid a paper cup blow across the highway! Pandemonium!
  • by gavron ( 1300111 ) on Friday July 05, 2024 @10:45AM (#64602765)

    Hi fellow Slashdot readers. Let me share with you the fun of living in Arizona, USA.
    There are more than 24 time zones, and some are offset by 15 or 45 minutes... so none of this makes sense.
    But hey, follow along with me:

    In Arizona we celebrate Mountain Standard Time (MST) year-round. We don't do daylight saving time (DST).

    But before you say "so what," you should know that the northeastern corner of the state shares borders with Colorado (MST/MDT), New Mexico (MST/MDT), and Utah (MST/MDT) at what we call "Four Corners." The Navajo Nation lives there, it's their land, and they also celebrate DST. So Arizona is MOSTLY MST, but the Four Corners area does DST, so it gets to be an hour off from the rest of the state for many moons.

    But wait, there's more!

    Landlocked inside the middle of the Navajo Nation is the Hopi Nation, and they DO NOT celebrate DST at all. That means they are always on the same time as the rest of Arizona (MST).

    Before telephone calls and Zoom meetings this was of no consequence. Now people miss calls because they assume a meeting time is in THEIR timezone, not the listed timezone.

    People have now started "inventing" time zones: "Phoenix time," "Arizona time," "I-40 time," etc. I get that "Phoenix time" is America/Phoenix and is always MST, but that still isn't the real timezone name - MST. "Arizona time" would have to cover Phoenix (UTC-07:00), the Navajo Nation (UTC-06:00 during DST), Hopi (UTC-07:00), the Navajo land you cross again heading east (UTC-06:00 during DST), and then New Mexico (UTC-06:00 during DST). All of this changes with DST, whose dates used to be fixed until the George W. Bush administration changed them (the Energy Policy Act of 2005).
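
    Here's a quick sketch of that mess in Python (3.9+ for zoneinfo; on some platforms you also need the tzdata package installed). Note the tz database has no separate zone for the Navajo Nation; it's conventionally modeled with America/Denver (the old America/Shiprock name is just an alias for it):

      from datetime import datetime
      from zoneinfo import ZoneInfo

      # One mid-summer instant, rendered in both zones.
      instant = datetime(2024, 7, 5, 12, 0, tzinfo=ZoneInfo("UTC"))
      for name in ("America/Phoenix", "America/Denver"):
          local = instant.astimezone(ZoneInfo(name))
          print(name, local.strftime("%H:%M %Z %z"))
      # America/Phoenix 05:00 MST -0700  (no DST, ever)
      # America/Denver 06:00 MDT -0600   (DST in effect; the Navajo Nation follows this)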

    UTC - The standard by which we set time. This is real.
    UT1 - Tied to the Earth's actual daily rotation. We nudge UTC toward it with "leap seconds" but don't use it directly.
    TAI - Atomic clock time, which is really great and monotonic, but we don't use it directly. We add/subtract seconds to get to UTC.
    Sidereal Time - More on the Earth's rotation... just used to generate data that helps relate TAI to UT1.
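
    If you want to relate those scales in code, here's a minimal Python sketch with the offsets hard-coded as of 2024 (an assumption for illustration; real code should read the IERS/tzdata leap-second table rather than trusting constants):

      # TAI-UTC has been 37 s since 2017-01-01, and no leap second is
      # scheduled for 2024. GPS time is TAI - 19 s, so GPS-UTC is 18 s.
      TAI_MINUS_UTC = 37
      GPS_MINUS_UTC = TAI_MINUS_UTC - 19

      def utc_to_tai(utc_unix: float) -> float:
          """Approximate TAI on the Unix scale; valid only for recent dates."""
          return utc_unix + TAI_MINUS_UTC

      def utc_to_gps(utc_unix: float) -> float:
          return utc_unix + GPS_MINUS_UTC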

    Please don't make up time zones. Those of you in non-hysterical areas, just use the assigned timezone code. In the US lower 48 states those are PST/PDT, MST/MDT, CST/CDT, and EST/EDT. Hawaii, welcome to the no-DST club. Indiana - you lost by adding DST back in. The US military Alpha-Zulu timezones don't even begin to describe all the timezones, as there are MORE than 24 and some are offset by plus or minus 15 minutes. Europe, you're on the right track.

    So no more "Pacific Time" (inexact, and unclear whether it's PST or PDT), and no more GMT (Greenwich is some irrelevant little place not even in Europe). It's either UTC plus or minus the offset, or an actual timezone name.

    https://en.wikipedia.org/wiki/... [wikipedia.org]

    The concept of measured time was created by man for our convenience. Please don't toiletize it by failing to use it properly.
    See you at 7.75 (Swahili time) for dinner.

    • I don't have time to read all that shit.

    • by cusco ( 717999 )

      Oh, dear gods, you've just described the hell that people working on worldwide security networks deal with every day. What you described is quite bad enough, but then you get to Brasil, which nominally uses DST but some years doesn't. Oh, and some Brasilian cities use the timezone of the Argentinian city on the other side of the river. Or, may the gods help us, India, which has a time zone 30 minutes offset from the zones to its east and west, plus one area which for unknown reasons is (IIRC) 15 minutes

    • Greenwich is not in the EU, but it is most definitely in Europe. The time in the UK is GMT in winter and British Summer Time in summer.
  • There is very little benefit to keeping clocks in sync (to the second) with fluctuations in the Earth's rotation. There have been about 27 leap seconds since the 1970s, when they were first introduced. Over the course of 100 years, clocks would drift by about one minute relative to the Earth's rotation. Perhaps in 1,000 years, clocks could be adjusted by 10 minutes, all at once, if the people of that era feel it necessary.

    There are a LOT of reasons NOT to add leap seconds. Leap seconds cause a LOT of problems.

    • by ceoyoyo ( 59147 )

      There's a very good reason to add leap seconds. If they cause software problems, you know that the developers of that software were pretty crap and used the wrong time standard.

      • by Entrope ( 68843 )

        "Wrong" by what definition?

        Most of the world runs on civil time and doesn't really care if we have leap seconds or not. It will take decades or centuries for UT1 to slip enough relative to TAI that people will notice the rotational differences. But if you care about precise sequences of events, you want a continuous time scale without repeated seconds. If you care about precise durations, you also don't want skipped seconds. If you are doing astronomy with small angular resolution, or you're trying to g

        • by ceoyoyo ( 59147 )

          "Wrong" by the definition "causes an error."

          UTC doesn't really have repeated seconds. It has minutes of varying length. That could be a problem if you represent a time as something like Y:M:D:H:M:S, because some minutes (could) have 59 seconds and some have 61. That means your Y:M:D:H:M:S timestamps won't all mark off equal intervals. Of course, the months also have varying lengths, and so do years (leap years), so even if we didn't add leap seconds those timestamps wouldn't be uniform.
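
          You can watch POSIX time flatten the 61-second minute with nothing but the Python standard library (calendar.timegm does plain arithmetic, so it happily accepts tm_sec=60):

            import calendar, time

            # The real 2016 leap second, and the second that followed it:
            leap = time.strptime("2016-12-31 23:59:60", "%Y-%m-%d %H:%M:%S")
            after = time.strptime("2017-01-01 00:00:00", "%Y-%m-%d %H:%M:%S")

            # Two distinct UTC labels collapse to one Unix timestamp.
            print(calendar.timegm(leap), calendar.timegm(after))  # 1483228800 1483228800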

          NTP tra

          • by Entrope ( 68843 )

            UTC's idea that some minutes are 61 seconds long is mostly theory rather than practice. What's the most widely deployed computer protocol that does that right? For example, how many email systems will generate a timestamp that says an email was sent at 23:59:60?
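
            For what it's worth, here's how far down the stack the :60 label gets rejected in one mainstream runtime, Python (the exact error wording may vary by version):

              import time
              from datetime import datetime

              # The C-era struct tm range tolerates a leap second...
              print(time.strptime("23:59:60", "%H:%M:%S").tm_sec)  # 60

              # ...but the higher-level datetime type cannot represent it at all.
              try:
                  datetime.strptime("23:59:60", "%H:%M:%S")
              except ValueError as err:
                  print(err)  # second must be in 0..59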

            NTP is a good example of this, and its actual implementation is not what your second paragraph says. The NTP documentation [ntp.org] makes essentially the same error as the Unix specification, assuming that UTC seconds are continuously counted:

            the correspondence between the NTP and Unix timescales is determined only by the constant 2,208,988,800.

            But the actual im

            • by ceoyoyo ( 59147 )

              NTP is a good example of this, and its actual implementation is not what your second paragraph says.

              You're right, I shouldn't have said that NTP is linear; it's almost linear, except right after leap seconds. It is continuous and monotonic, which is important. It also *correctly* recognizes that UTC counts seconds continuously and varies the length of minutes to match the varying length of days. "The correspondence between the NTP and Unix timescales is determined only by the constant 2,208,988,800" is true mos

      • This is an example of "perfect is the enemy of good." For 99% of all software, there is no reason for developers to care about how time is calculated. They just want to do things like create timestamps for logs, or measure the duration between two timestamps. Spending time coding and testing for a circumstance that happens maybe once every two years is not a good return on investment. Eliminating leap seconds would eliminate an entire class of rare, but potentially harmful, bugs that programmers shouldn't have to deal with.

        • by ceoyoyo ( 59147 )

          I didn't say developers who don't worry about it are shit. I said developers who need to worry about it and don't are shit.

          If you require timestamps that are guaranteed unique at sub-second granularity, or you need to measure time intervals to a precision of less than a second, then you need to know why using Unix time, or anything encoded as H:M:S, is a bad idea. I agree with you that the former shouldn't really be necessary: whoever designed Unix time made a mistake.
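
          The practical rule, sketched in Python (monotonic clocks are immune to NTP steps and leap-second handling; wall-clock time is not; the order_key helper is a hypothetical illustration, not a standard API):

            import itertools
            import time

            # Durations: use the monotonic clock, never wall-clock subtraction.
            t0 = time.monotonic_ns()
            time.sleep(0.25)
            print((time.monotonic_ns() - t0) / 1e9, "seconds elapsed")

            # Unique ordered keys: don't trust sub-second wall-clock uniqueness;
            # pair wall time with a local sequence number as a tiebreaker.
            _seq = itertools.count()
            def order_key() -> tuple[int, int]:
                return (time.time_ns(), next(_seq))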

          • Well, unfortunately, most of us are driven, for reasons beyond our control, to use ISO 8601 datetime formats.

            The set of developers who *should* have to worry about leap seconds is infinitesimally small. There are very few applications that care about wobbles of the Earth's rotation, outside of astronomy itself.

            • by ceoyoyo ( 59147 )

              If you care about sub-second precision and you're forced to use a datetime format, then you've got incompatible requirements. Someone in charge of you is dumb, and you will have to perform some addition as a consequence. Welcome to engineering.

              • Well, "incompatible" requirements sometimes happen in the real world.

                Usually, these situations happen as a result of needing to sort events in order, where the events happen on dispirit systems, some of which you might not control. Sort order can matter a lot, when it comes to things such as financial transactions or update sequence. And sometimes the difference between two events is indeed, less than a second.

        • And more to the point, application programmers shouldn't need to concern themselves with leap seconds anyway. That's something for the system programmers to deal with. There should be a cron job that checks a flag and either adjusts the clock up or down one second at a specified time or does nothing, and another job that checks whatever time server the box uses and sets that flag if and when needed. Then, any programs you're running that actually care about leap seconds can be kept current, and all the r
          • System clock adjustments also cause problems. For example, if you are logging timestamps, and the system adjusts the time behind your back to account for a leap second, you might end up with log entries out of order, when sorted by timestamp. This is precisely the kind of thing that brings servers down. https://www.wired.com/2012/07/... [wired.com]

        • Leap smear is a good middle ground, balancing most software's routine needs with the leap-second standard.
          • Sure, I agree it's a clever way to deal with leap seconds. But it's far from straightforward: it has to be done at the OS level, or through some long-running service process with very high priority (to ensure that the adjustments are both even and regular). This software would have to be tested like any other. It's a problem that _shouldn't_ have to be solved.

            It's also not practical for SaaS in the cloud, where you don't have direct access to the hardware your software is running on.
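
            To be fair, the smear arithmetic itself is tiny; the deployment is the hard part. A minimal sketch of a linear 24-hour smear in the style Google has described, centered on the real 2016 leap second (in production this lives in the NTP daemon, not in application code):

              LEAP = 1483228800           # 2017-01-01T00:00:00Z as a Unix timestamp
              WINDOW = 86400              # absorb the extra second across 24 hours
              START = LEAP - WINDOW // 2  # noon-to-noon UTC, centered on the leap

              def smear_offset(unix_t: float) -> float:
                  """Fraction of the leap second already absorbed at unix_t (0.0-1.0)."""
                  if unix_t <= START:
                      return 0.0
                  if unix_t >= START + WINDOW:
                      return 1.0
                  return (unix_t - START) / WINDOW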

            • Time synchronization is always done at the OS level. Without internet time synchronization, the quartz clocks in computing devices drift so rapidly that a day with a leap second and a day without may be indistinguishable. Most quartz watches can't sustain sub-second accuracy for even a day by themselves. Most container guests don't even need to know that leap seconds or leap smears exist; they only need to know that the clock is re-synced with the internet with a short enough polling interval.

              • P.S. I envision that the ideal computer system will let applications access both leap-smeared and leap-seconded time. Software can then choose according to its needs.
              • Yes, leap smear requires special handling and testing for some software. But what software is that?

                This is the question, isn't it! You listed some things, but the reality is that no one knows until the software is tested. There are many mundane applications that rely on matching timestamps across networks. Most just *assume* that the time on all systems will be correct and will match. Many make decisions based on the sequence of events, which can be skewed if the time on one system doesn't match the time on another. Yes, even by one second.

                Your alternative of abandoning leap seconds totally means the civil clock is going to drift and mismatch the sky gradually

                Yes, this is true, by about 10 minutes over 1,000 years. In

                • Your mundane requirement of "matching timestamps across networks" doesn't care whether the timestamps are leap-smeared. It would run fine even if atomic clocks had never been invented.

                  Your "leap hours twice a year" is done by timezone changes; it doesn't affect UTC timestamps. And no: frequent events are what's truly easy to compensate for. A once-in-100-years or once-in-1,000-years event is dangerous and will cause computer crashes at a huge scale. Our computer systems are NOT designed for UTC leaps beyond 1 second. Those stupid "da

                  • Software that wants the leap-seconded version of time is software that cares about the length of the SI second and uses it to do math or physics. It may as well just use TAI, the time without leap seconds that you love so much. It doesn't need to wait; it can use TAI now.

                    Applications that care about time synchronization with each other across the world work more merrily with the leap-smeared version of time. The world would run fine with only 2 universal times: UTC with smear, and TAI, which has never been

                  • Here's one "mundane" application that cares about time synchronization: OneDrive.

                    If two people make a change to a file from different computers at almost the same time, which change wins? Microsoft relies on the clocks being both accurate and in sync with each other. Now, if the two computers are on different networks, and one uses unsmeared time while the other smears, there is a chance that the wrong update will "win."

                    This pattern is applied in many contexts that are automated and high-speed and depend on t
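
                    A toy illustration of that failure mode, with made-up numbers (last-writer-wins judged by clocks that disagree by half a second because one host smears and the other steps):

                      # Host B really writes 300 ms AFTER host A, but B's clock is
                      # mid-smear and running half a second behind A's stepped clock.
                      stamp_a = 1000.000 + 0.0  # A: unsmeared clock
                      stamp_b = 1000.300 - 0.5  # B: smeared clock -> 999.800

                      winner = "A" if stamp_a > stamp_b else "B"
                      print(winner)  # A -- the older edit silently wins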

                    • Your example is meaningless, as you are describing a scenario with one computer using unsmeared time while the other smears. Leap smear, if it becomes standard, will be applied by everyone. Those who don't do leap smear will be the "wrong" computers.

                      If you insist that some computers will refuse to do leap smear, I can claim the same for your proposal of abandoning leap seconds: some computers will refuse to abandon leap seconds. "Oh, synchronization will fail miserably." You see? Judging a scheme by assuming there

                    • There is zero chance that leap smear will become universal. Even if Microsoft made it standard on Windows servers, it won't become standard for every Linux distro, or for other systems Microsoft doesn't control. There will always be a group of sysadmins out there who swear by "official" time as provided by NIST; it won't matter to them that their time is "wrong."

                      If leap seconds are removed, then no, some computers will not refuse to abandon leap seconds. That is because these systems rely on NIST or some ot

                    • So magically, leap-second abandonment is assumed to be endorsed by NIST, while leap-smear adoption is assumed NOT to be endorsed by NIST. All of your "leap-second abandonment is better than leap smear" is then conveniently deduced from there. Wow.

                      No, this is not a fair comparison. It is not even a valid comparison.

                    • NIST's job is to report the precise, official time. If "official" means leap seconds, that is precisely what they will do. They do not "smear" for the convenience of computers; that would no longer be official. On what grounds do you suppose they would make such a departure from their mandate? NIST doesn't make the rules, they just follow them.

                      Thankfully, leap seconds WILL be going away, for at least 100 years, but not until 2035. https://www.theverge.com/2022/... [theverge.com]

                    • So you are basing the superiority of abandoning leap seconds over adopting leap smear on its current higher support among international official bodies, not on true technical merit. Appeal to authority is NOT a valid argument, sorry. It only shows your camp is better at lobbying.
                    • I have no idea what you're talking about, "authority" has nothing to do with my argument. Practicality does.

  • They should let it drift a little and do leap minutes. You'd have a leap minute perhaps once every few decades, and people would see it coming years in advance. The committee could simply announce how far UTC and UT1 are out of sync each year, and if the gap gets close to a minute, add a leap minute. One disruption every few decades is less of a problem.

    • With frequent leap seconds, systems get tested against them and the problems get solved. With rare leap minutes, or even rarer leap hours, a lot of horrible bugs will surface when the time comes, much like the Y2K bug.

      So, no. The leap second is okay and fine. Don't be stupid and switch to leap minutes or leap hours.

      • by Entrope ( 68843 )

        Empirically, you are wrong. Many cloud computing providers chose "leap second smearing" precisely because application developers couldn't find and fix leap-second bugs, and the cloud developers decided that being wrong by a fraction of a second during a day or a few days was better than mishandling events when a leap second got inserted. However, leap-second smearing causes problems in cases where distributed clocks must agree, such as stock market orders and GNSS positioning. The trade-offs are

        • That is due to the lack of a standardized leap smear that everyone agrees upon. I support a standardized leap-smear scheme under which all distributed clocks agree with each other.

          One can't do a leap smear if it is a "leap minute."

      • With frequent leap seconds, systems get tested against them and the problems get solved. With rare leap minutes, or even rarer leap hours, a lot of horrible bugs will surface when the time comes, much like the Y2K bug.

        So, no. The leap second is okay and fine. Don't be stupid and switch to leap minutes or leap hours.

        This! I use numerous pieces of software that automatically adjust whenever a second is added or subtracted. What I am doing needs that level of accuracy, and if someone's software cannot handle the adjustments, it isn't a problem with leap seconds; it's software that needs to be tweaked.

  • by crunchygranola ( 1954152 ) on Friday July 05, 2024 @02:30PM (#64603455)

    Changes in the relationship between UTC and UT1 sometimes occur because the Earth does not always spin at the same speed, with natural events such as earthquakes often causing small changes.

    But the relationship between UTC and UT1 also changes systematically, because the Earth is always slowing down as rotational momentum is transferred to the Moon -- and the Earth now spins more slowly than the unchanging international time standard assumes. That is why all 27 leap-second adjustments made so far have been additions. The fluctuations due to "natural events such as earthquakes often causing small changes" are why leap-second addition is not predictable: they induce a quasi-random walk around the long-term deceleration rate [wikipedia.org]. They are not the underlying reason for leap-second addition. In principle a combination of really large fluctuations might rise to the level of calling for a negative leap second, but this has never happened, and isn't expected to.

    Now there is a technical detail about this that led the Wikipedia page on leap seconds to misleadingly state: "It is a mistake, however, to consider leap seconds as indicators of a slowing of Earth's rotation rate; they are indicators of the accumulated difference between atomic time and time measured by Earth rotation." When international standard timekeeping was set up, the standard second was defined as 1/31,556,925.9747 of the tropical year 1900, and this does not change. The reason we add leap seconds is that the Earth has slowed down a fair bit over the last 124 years, and even if the Earth were to stop slowing down entirely (magic), the mismatch between its current slower rate and the 1900 rate would still require leap-second additions. So yes, leap seconds are absolutely an indicator of the slowing of Earth's rotation rate, just not on a year-to-year basis.

    When leap seconds were instituted they were added annually, or even twice a year. But with global warming, momentum is being transferred from the atmosphere to the solid Earth, stabilizing and even slightly accelerating its rotation, so there was a seven-year gap after 1998 before another leap second was added, and the last leap second was added seven and a half years ago. The spin difference has fluctuated between about +0.7 and -0.65 seconds (ideally it would stay between +0.5 and -0.5, but adjustments are only made on fixed dates). On the last adjustment it jumped from -0.4 to +0.6, then drifted down to -0.25, but in mid-2020 it reversed and has climbed back up to 0. It will likely be some years before this trend reverses itself and the difference moves back down to -0.5 or thereabouts.
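
    The cadence arithmetic is easy to check yourself (illustrative numbers: an excess length-of-day near 3 ms/day, roughly typical of the early 1970s, buys about one leap second a year):

      # Days for |UT1 - UTC| to accumulate a full second at a given daily excess.
      excess_s_per_day = 2.7e-3      # Earth's day running ~2.7 ms longer than 86,400 SI s
      print(1.0 / excess_s_per_day)  # ~370 days: about one leap second per year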

    • by Entrope ( 68843 )

      The Wikipedia bit you quote is not wrong at all. Leap seconds are not needed because the Earth's rotation is slowing; they are needed because the mean solar day is not exactly 86,400 SI seconds long. A change in rotation rate (the length of day) is only one possible cause of that difference; an initial offset and sudden jumps are other causes.

      IERS publishes data products for this "length of day" (LOD) offset, for example at https://datacenter.iers.org/pl... [iers.org] .

  • The article is a bit confused. There is TAI (atomic time), which moves at exactly 1 SI second per second. And there is UT1 (solar time), which assumes that the time from noon to noon is exactly 86,400 seconds (with some corrections, because the Earth moves on an elliptical path around the Sun, not a circle). And solar seconds are not exactly equal to SI seconds.

    UTC was invented as a compromise. It would have been equal to TAI in 1958, when the first atomic clocks started running. By 1970, TAI and UT1 had drifted about 10 seconds apart.

"my terminal is a lethal teaspoon." -- Patricia O Tuama

Working...