No Leap Second To Be Added To Universal Time in 2024, IERS Says (datacenterdynamics.com)
No leap second will be added to universal time in 2024, the International Earth Rotation and Reference Systems Service (IERS) has announced. From a report: An additional second has previously been added to Coordinated Universal Time (UTC), the time kept by atomic clocks, when it has drifted out of sync with time based on the rotation of the Earth (UT1).
But in a statement released on Thursday, the IERS, which enacts changes to UTC on behalf of the International Telecommunication Union (ITU), said the difference between UTC and UT1 is not great enough to warrant a change. Changes in the relationship between UTC and UT1 sometimes occur because the Earth does not always spin at the same speed, with natural events such as earthquakes often causing small changes.
Who are these people? (Score:4, Funny)
Damn, I was looking forward to an extra second in bed this year.
Re: (Score:2)
Yeah, but you know she'd charge you for it.
Whew! I was so worried! (Score:2)
Hopefully we don't get 2 seconds next year risking global panic.
Re: (Score:2)
Re: (Score:3)
I wonder how many systems could even cope with 2 leap seconds in a year.
For most of the systems I have built in the past I just used UTC and ignored leap seconds entirely, but a few years back they became important for a particular project and I switched to using TAI internally. Of course I had to rely on a stored database of leap seconds, since there is no way to predict them.
Ironically the system hasn't actually needed to add any leap seconds yet, because it used 2020-01-01 as the epoch, and there haven't been any since.
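For the curious, "TAI internally, with a stored leap-second table" is about as simple as it sounds. A minimal sketch in Python (illustrative only, not the parent's actual code; the table is deliberately truncated and the pre-table offset is an assumption):

    # Convert a UTC timestamp to TAI using a stored table of leap seconds.
    # Table entries: (UTC instant the new offset took effect, cumulative TAI-UTC seconds).
    # Only the two most recent entries are shown; a real system loads the full IERS list.
    from datetime import datetime, timedelta, timezone

    LEAP_TABLE = [
        (datetime(2015, 7, 1, tzinfo=timezone.utc), 36),
        (datetime(2017, 1, 1, tzinfo=timezone.utc), 37),
    ]

    def utc_to_tai(t_utc: datetime) -> datetime:
        offset = 35  # assumed TAI-UTC before the first table entry
        for effective, cumulative in LEAP_TABLE:
            if t_utc >= effective:
                offset = cumulative
        return t_utc + timedelta(seconds=offset)

    print(utc_to_tai(datetime(2024, 1, 1, tzinfo=timezone.utc)))  # 2024-01-01 00:00:37+00:00

With a 2020-01-01 epoch and no leap seconds announced since 2016, the table never even changes after deployment, which matches the parent's experience.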
Re: (Score:3)
I'd advocate not doing any more until they build up to a whole minute. On that timescale most software will never need to be updated, so the issue of having a database of leap seconds/minutes isn't much of a problem. With leap seconds we need to patch every few years as they come along.
A lot of things aren't all that concerned with accuracy, so it probably wouldn't bother them if they were off by a minute. I deal with worldwide communication systems that need one-second accuracy; a minute of inaccuracy would break them completely. When a second gets added or subtracted, everyone adds or subtracts it. The system works pretty well, we're all synchronized worldwide to UTC. And some systems need even more.
Re: Whew! I was so worried! (Score:2)
Re: (Score:2)
Fact is: you are not using UTC, you are using POSIX time. (Or could you ever display a time of 23:59:60? Or 05:29:60 in India? If not, you are using POSIX time.)
We use NTP, not UNIX Epoch time.
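To illustrate the POSIX-time point above, a rough sketch (illustrative only; what the kernel does during the leap second itself depends on the kernel and the NTP daemon):

    # POSIX time has exactly 86400 seconds per day, so the leap second 23:59:60
    # simply does not exist in it. Two UTC seconds elapsed between these instants
    # around the 2016-12-31 leap second, but the POSIX count only moves by one.
    from datetime import datetime, timezone

    t_before = datetime(2016, 12, 31, 23, 59, 59, tzinfo=timezone.utc).timestamp()
    t_after = datetime(2017, 1, 1, 0, 0, 0, tzinfo=timezone.utc).timestamp()
    print(t_after - t_before)  # 1.0, not 2.0: the leap second is invisible to POSIX time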
Re: (Score:2)
As far as I understood, a bank in New York is not allowed to communicate with a bank in San Francisco if there is 1 second of time difference between these banks. Any successful bank transaction needs to be completed in 200 milliseconds or less. It can take a lot longer for a transaction to be approved, but at the moment such a transaction (B2B) needs to take place, 200 milliseconds is the maximum set as a rule. Whether B2C transactions are bound by the same regulations, I don't remember anymore.
There are many ind
UTC, UT1, GMT, and people who make up time zones. (Score:3, Informative)
Hi fellow Slashdot readers. Let me share with you the fun of living in Arizona, USA.
There are more than 24 time zones, and some are offset by 30 or 45 minutes... so none of this makes sense.
But hey, follow along with me:
In Arizona we celebrate mountain standard time (MST) the year round. We don't do that daylight saving time (DST).
But before you say "so what" you should know that our northeastern corner of the state shares borders with Colorado (MST/MDT), New Mexico (MST/MDT), and Utah (MST/MDT), an area we call "Four Corners". The Navajo Nation lives there, and it's their land, and they also celebrate DST. So Arizona is MOSTLY MST, but the Navajo Nation in the Four Corners area does DST, so they get to be an hour off from the rest of the state for many moons.
But wait, there's more!
Landlocked inside the middle of the Navajo Nation is the Hopi Nation, and they DO NOT celebrate DST at all. That means they are always in the same timezone as the rest of Arizona (MST).
Before telephone calls and Zoom meetings this was of no consequence. Now people miss calls because they assume a meeting time is in THEIR timezone, not the listed timezone.
People have now started "inventing" time zones: "Phoenix time," "Arizona time," "I-40 time," etc. I get that "Phoenix time" is US/Phoenix and is always MST, but it still isn't the real timezone, which is MST. "Arizona time" would include Phoenix (UTC-7 year round), the Navajo Nation (UTC-6 in summer), the Hopi Nation (UTC-7 year round), the Navajo Nation again as you head east (UTC-6 in summer), and then New Mexico (UTC-6 in summer); in winter everyone drops back to UTC-7. The DST dates themselves used to be fixed, but the GWB administration in the US changed that.
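For what it's worth, the IANA tz database already encodes this mess, so nobody has to invent a zone. A sketch (Python 3.9+, illustrative only):

    # "America/Phoenix" covers most of Arizona plus the Hopi Nation (no DST);
    # the Navajo Nation and the neighboring states follow "America/Denver" (DST).
    from datetime import datetime
    from zoneinfo import ZoneInfo

    july_noon = datetime(2024, 7, 1, 12, 0)
    print(july_noon.replace(tzinfo=ZoneInfo("America/Phoenix")).tzname())  # MST
    print(july_noon.replace(tzinfo=ZoneInfo("America/Denver")).tzname())   # MDT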
UTC - That standard by which we set time. This is real.
UT1 - Time based on the Earth's actual daily rotation. We adjust UTC toward it with "leap seconds" but don't use it directly.
TAI - Atomic clock time, which is really great and monotonic, but we don't use it directly. We add/subtract seconds to get from it to UTC.
Sidereal Time - More on the Earth's rotation... just used to generate data that helps relate TAI to UT1.
Please don't make up time zones. Those of you in non-hysterical areas, just use the assigned time zone code. In the US lower 48 states those are PST/PDT, MST/MDT, CST/CDT, and EST/EDT. Hawaii, welcome to the no-DST club. Indiana, you lost by adding DST back in. The US military Alpha-Zulu timezones don't even begin to describe all the timezones, as there are MORE than 24 and some are offset by 30 or 45 minutes. Europe, you're on the right track.
So no more of "Pacific Time" (inexact and unclear whether it's PST or PDT), and no more GMT (Greenwich is some irrelevant spot in a small place not even in Europe). It's either UTC plus or minus the offset, or an actual timezone name.
https://en.wikipedia.org/wiki/... [wikipedia.org]
The concept of measured time was created by man for our convenience. Please don't toiletize it by failing to use it properly.
See you at 7.75 (Swahili time) for dinner.
Re: (Score:1)
I don't have time to read all that shit.
Re: (Score:2)
Huh?
Re: (Score:2)
I tried to do that with ABC, but my Australian colleagues kept referencing rugby matches that don't air on my country's realization of "ABC".
Re: (Score:3)
Oh, dear gods, you've just described the hell that people working on worldwide security networks deal with every day. What you described is quite bad enough, but then you get to Brasil, which nominally uses DST, but some years doesn't. Oh, and some Brasilian cities use the timezone of the Argentinian city on the other side of the river. Or may the gods help us, India, which has a time zone 30 minutes offset from the zone to the east or west of it, one area which for unknown reasons is (IIRC) 15 minutes
Re: UTC, UT1, GMT, and people who make up time zon (Score:2)
Re: (Score:2)
It's built into the almanac. If you don't use the annual almanacs then you just insert the adjustment yourself; there's already a time conversion step from whatever your watch says to whatever standard your tables or formula are using. You probably wouldn't bother with a second though.
Re: (Score:2)
It's built into the almanac. If you don't use the annual almanacs then you just insert the adjustment yourself; there's already a time conversion step from whatever your watch says to whatever standard your tables or formula are using. You probably wouldn't bother with a second though.
And that's why I use UTC. The only time I shift to local time is when communicating with the rest of the world. Some folks don't like being woken in the middle of the night: while it's 2:00 PM for civilians here (6:00 PM UTC at my location), it's 2:00 AM in Singapore and 4:00 AM in Sydney.
Re: (Score:2)
And some people don't seem to grasp the concept of time zones at all. I was working in security and had to test a new installation with the electricians at a site in Atlanta. We had scheduled the testing for 8:00 a.m., but I got a phone call just before 3:00 from the techs saying, "I know that we were scheduled for 8, but it's going to be hot today so we wanted to start at 6:00." I told him that I lived in Seattle, and he said, "So? What does that have to do with anything?" It's 3:00 in the morning her
Re: (Score:2)
And some people don't seem to grasp the concept of time zones at all. I was working in security and had to test a new installation with the electricians at a site in Atlanta. We had scheduled the testing for 8:00 a.m., but I got a phone call just before 3:00 from the techs saying, "I know that we were scheduled for 8, but it's going to be hot today so we wanted to start at 6:00." I told him that I lived in Seattle, and he said, "So? What does that have to do with anything?" It's 3:00 in the morning here. The response was a puzzled, "Why?" Just call me back in five hours, idiot.
Similar incident. I was out on the west coast for a month. My assistant would call me as soon as he got into work at 7:30 AM Eastern time. Usually with dumb questions he could have figured out himself. I kept reminding him it was 4:30 AM where I was, so only call if you have to.
After a week of this, I was pretty pissed, so I told him that I was going to call him for a daily report at 9 in the evening my time - that was when I'd get off work there.
He said "9:00? that's four hours after the workday's ov
Time to get rid of leap seconds completely (Score:2)
There is very little benefit to keeping clocks in sync--to the second--with fluctuations in the earth's rotation. There have been 27 leap seconds since they were first introduced in 1972. Over the course of 100 years, clocks would drift by about one minute relative to the earth's rotation. Perhaps in 1,000 years, clocks could be adjusted by 10 minutes, all at once, if the people in that time period feel it necessary.
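A rough sanity check on that drift estimate, with approximate numbers (Python, illustrative only):

    leap_seconds = 27                  # inserted since 1972
    years = 2024 - 1972                # 52
    print(100 * leap_seconds / years)  # ~52 seconds per century, i.e. about a minute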
There are a LOT of reasons NOT to add leap seconds. Leap seconds cause a LOT
Re: (Score:2)
There's a very good reason to add leap seconds. If they cause software problems you know that the developers of that software were pretty crap and used the wrong time standard.
Re: (Score:2)
"Wrong" by what definition?
Most of the world runs on civil time and doesn't really care if we have leap seconds or not. It will take decades or centuries for UT1 to slip enough relative to TAI that people will notice the rotational differences. But if you care about precise sequences of events, you want a continuous time scale without repeated seconds. If you care about precise durations, you also don't want skipped seconds. If you are doing astronomy with small angular resolution, or you're trying to g
Re: (Score:2)
"Wrong" by the definition "causes an error."
UTC doesn't really have repeated seconds. It has minutes of varying length. That could be a problem if you take a time represented as something like Y:M:D:H:M:S, because some of the minutes (could) have 59 seconds and some have 61 seconds. That means your Y:M:D:H:M:S timestamps won't all be uniformly spaced. Of course, months and years also have varying lengths (the longer years we call leap years), so even if we didn't add leap seconds those timestamps wouldn't map uniformly onto elapsed time.
NTP tra
Re: (Score:2)
UTC's idea that some minutes are 61 seconds long is mostly theory rather than practice. What's the most widely deployed computer protocol that does that right? For example, how many email systems will generate a timestamp that says an email was sent at 23:59:60?
NTP is a good example of this, and its actual implementation is not what your second paragraph says. NTP documentation [ntp.org] makes essentially the same error as the Unix specification, assuming that UTC seconds are continuously counted:
the correspondence between the NTP and Unix timescales is determined only by the constant 2,208,988,800.
But the actual im
Re: (Score:2)
You're right, I shouldn't have said that NTP is linear; it's almost linear, except right after leap seconds. It is continuous and monotonic, which is important. It also *correctly* recognizes that UTC counts seconds continuously and varies the length of minutes to match the varying length of days. "the correspondence between the NTP and Unix timescales is determined only by the constant 2,208,988,800" is true mos
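For anyone wondering where that constant comes from, a quick check (Python; datetime here counts calendar seconds and ignores leap seconds, which is exactly the point being argued about):

    # 2,208,988,800 is the number of calendar seconds between the NTP epoch
    # (1900-01-01) and the Unix epoch (1970-01-01): 70 years plus 17 leap days.
    from datetime import datetime, timezone

    ntp_epoch = datetime(1900, 1, 1, tzinfo=timezone.utc)
    unix_epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
    print(int((unix_epoch - ntp_epoch).total_seconds()))  # 2208988800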
Re: (Score:3)
This is an example of "perfect is the enemy of good." For 99% of all software, there is no reason for developers to care about how time is calculated. They just want to do things like create timestamps for logs, or measure the duration between two timestamps. Spending time coding and testing for a circumstance that happens maybe once every two years is not a good return on investment. Eliminating leap seconds will eliminate an entire class of rare, but potentially harmful, bugs that programmers shouldn
Re: (Score:2)
I didn't say developers who don't worry about it are shit. I said developers who need to worry about it and don't are shit.
If you require timestamps for intervals that are less than a second and are guaranteed unique, or you need to measure time intervals to a precision of less than a second, then you need to know why using UNIX time or anything encoded as H:M:S is a bad idea. I agree with you that the former shouldn't really be necessary: whoever designed UNIX time made a mistake.
Re: (Score:2)
Well unfortunately, most of us are driven for various reasons beyond our control, to use ISO 8601 datetime formats.
The set of developers who *should* have to worry about leap seconds, is infinitesimally small. There are very few applications that care about wobbles of the earth's rotation, outside of astronomy itself.
Re: (Score:2)
If you care about less than a second precision and you're forced to use datetime then you've got incompatible requirements. Someone in charge of you is dumb and you will have to perform some addition as a consequence. Welcome to engineering.
Re: (Score:2)
Well, "incompatible" requirements sometimes happen in the real world.
Usually, these situations happen as a result of needing to sort events in order, where the events happen on disparate systems, some of which you might not control. Sort order can matter a lot when it comes to things such as financial transactions or update sequences. And sometimes the difference between two events is, indeed, less than a second.
Re: (Score:2)
Re: (Score:2)
System clock adjustments also cause problems. For example, if you are logging timestamps, and the system adjusts the time behind your back to account for a leap second, you might end up with log entries out of order, when sorted by timestamp. This is precisely the kind of thing that brings servers down. https://www.wired.com/2012/07/... [wired.com]
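The usual defensive move here (a sketch, assuming CPython 3.7+; it sidesteps the problem rather than solving it) is to order and measure with a monotonic clock and keep the wall clock only for display:

    import time

    start_wall = time.time()          # wall clock, for human-readable logs only
    start_mono = time.monotonic_ns()  # monotonic, unaffected by leap seconds and NTP steps

    # ... do the work being timed ...

    elapsed_ns = time.monotonic_ns() - start_mono  # never negative, never jumps backwards
    print(f"started {time.ctime(start_wall)}, took {elapsed_ns / 1e9:.3f}s")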
Re: (Score:2)
Re: (Score:2)
Sure, I agree it's a clever way to deal with the leap seconds. But it's far from straightforward, and has to be done at the OS level, or through some long-running service process with very high priority (so as to ensure that the adjustments are both even and regular). This software would have to be tested like any other. It's a problem that _shouldn't_ have to be solved.
It's also not practical for SAAS in the cloud, where you don't have direct access to the hardware your software is running on.
Re: (Score:2)
Time synchronization is always done at the "OS level". Without internet time synchronization, the quartz clock in computing devices will drift so rapidly that a day with leap second and a day without may look indistinguishable. Most quartz watches can't sustain sub-second accuracy for over a day just by themselves. For most container guests, they don't even need to know the existence of leap second or leap smear. They only need to know if the clock is re-synced with the internet with a short enough poll in
Re: (Score:2)
Re: (Score:2)
Yes, leap smear requires special handling and testing for some software. But what are these software?
This is the question, isn't it! You listed some things, but the reality is, no one knows until the software is tested. There are many mundane applications that rely on matching timestamps across networks. Most just *assume* that time on all systems will be correct, and match each other. Many make decisions based on the sequence of events, which can be skewed if the time on one system doesn't match the time on another system. Yes, even by one second.
Your alternative of abandoning leap seconds totally means the civil clock is going to drift and mismatch the sky gradually
Yes, this is true, by about 10 minutes over 1,000 years. In
Re: (Score:2)
Your mundane requirement of "matching timestamps across networks" doesn't care whether the timestamps are done with leap smear. It would run fine even if atomic clocks had never been invented.
Your "leap hours twice a year" is done by timezone change. It doesn't affect UTC timestamps. And no, a frequent event is what's truly easy to compensate for. A once-in-100-years or once-in-1,000-years event is dangerous and will cause huge-scale computer crashes. Our computer systems are NOT designed for a UTC leap beyond 1 second. Those stupid "da
Re: (Score:2)
Software that wants to see a leap-seconded version of time is software that cares about the length of the SI second and uses it to do maths or physics. It may as well just use TAI, the time without leap seconds that you love so much. It doesn't need to wait. It can use TAI now.
Applications that care about time synchronization with each other across the world work more merrily with a leap-smeared version of time. The world will run fine with only 2 universal times, one being UTC with smear, one being TAI, which has never been
Re: (Score:2)
Here's one "mundane" application that cares about time synchronization: OneDrive.
If two people make a change to a file from different computers at almost the same time, which change wins? Microsoft relies on the clocks being both accurate and in sync with each other. Now, if the two computers are on different networks, and one uses unsmeared time while the other smears, there is a chance that the wrong update will "win."
This pattern is applied in many contexts that are automated and high-speed and depend on t
Re: (Score:2)
Your example is meaningless, as you are describing a scenario with one computer using unsmeared time while the other smears. Leap smear, if it becomes standard, will be applied for everyone. Those who don't do leap smear will be the "wrong" computers.
If you insist that some computers will refuse to do leap smear, I can claim the same for your proposal of abandoning leap seconds. Some computers will refuse to abandon leap seconds. "Oh, synchronization will fail miserably." You see? Judging a scheme by assuming there
Re: (Score:2)
There is zero chance that leap smear will become universal. Even if Microsoft made it standard on Windows servers, it won't become standard for every Linux distro, or for other systems Microsoft doesn't control. There will always be a group of sysadmins out there who swear by "official" time as provided by NIST; it won't matter to them that their time is "wrong."
If leap seconds are removed, then no, some computers will not refuse to abandon leap seconds. This is because these systems rely on NIST or some ot
Re: (Score:2)
So magically leap second abandonment is assumed to be endorsed by NIST, while leap smear adoption is assumed to be NOT endorsed by NIST. All of your "leap second abandonment is better than leap smear" conclusions are then conveniently deduced from there. Wow.
No, this is not a fair comparison. This is not even a valid comparison.
Re: (Score:2)
NIST's job is to report the precise, official time. If "official" means leap seconds, that is precisely what they will do. They do not "smear" for the convenience of computers, that would no longer be official. On what grounds do you suppose they would make such a departure from their mandate? NIST doesn't make the rules, they just follow them.
Thankfully, the leap seconds WILL be going away, for at least 100 years, but not until 2035. https://www.theverge.com/2022/... [theverge.com]
Re: (Score:2)
Re: (Score:2)
I have no idea what you're talking about, "authority" has nothing to do with my argument. Practicality does.
Leap Minutes (Score:2)
They should let it drift a little and do leap minutes. You'd have a leap minute perhaps once in a few decades, and people would have years in advance to see it coming. The committee could then simply announce how far UTC and UT1 are out of sync each year, and if it gets close to a minute, add a leap minute. One disruption per few decades is less of a problem.
Re: (Score:2)
With frequent leap seconds, systems will get tested against it and problem solved. With rare leap minutes or even rarer leap hours, when the time comes, a lot of horrible bugs will occur similar to the Y2K bug.
So, no. Leap second is okay and fine. Don't be stupid and switch into leap minutes or leap hours.
Re: (Score:2)
Empirically, you are wrong. Many cloud computing providers choose to use "leap second smearing" precisely because application developers couldn't find and fix leap-second bugs and the cloud developers decided that being wrong by a fraction of a second during a day or a few days was better than mishandling events when a leap second got inserted. However, leap second smearing causes problems for cases where distributed clocks must agree, such as stock market orders and GNSS positioning. The trade-offs are
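For reference, the smear itself is trivial arithmetic; the hard part is getting everyone to agree on the window. A sketch of a linear 24-hour smear (illustrative only; real deployments differ in window length and shape):

    # Spread one inserted leap second evenly over a 24-hour window, so every
    # smeared second is stretched by 1/86400 and no clock ever reads 23:59:60.
    def smear_offset(seconds_into_window: float, window: float = 86400.0) -> float:
        return min(max(seconds_into_window, 0.0), window) / window

    print(smear_offset(0))      # 0.0  start of the window
    print(smear_offset(43200))  # 0.5  halfway: the smeared clock is half a second behind
    print(smear_offset(86400))  # 1.0  the whole leap second has been absorbed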
Re: (Score:2)
This is due to the lack of a standardized leap smear that everyone agrees upon. I support a standardized leap smear scheme under which all distributed clocks agree with each other.
One can't do a leap smear if it is a "leap minute".
Re: (Score:2)
With frequent leap seconds, systems will get tested against it and problem solved. With rare leap minutes or even rarer leap hours, when the time comes, a lot of horrible bugs will occur similar to the Y2K bug.
So, no. Leap second is okay and fine. Don't be stupid and switch into leap minutes or leap hours.
This! I use numerous pieces of software that automatically adjust whenever a second is added or subtracted. What I am doing needs that level of accuracy, and if someone's software cannot handle the adjustments, it isn't a problem with the leap seconds, it's software that needs to be tweaked.
TFS Ignores the Important Case (Score:3)
Changes in the relationship between UTC and UT1 sometimes occur because the Earth does not always spin at the same speed, with natural events such as earthquakes often causing small changes.
But the relationship between UTC and UT1 also changes systematically, because the Earth is always slowing down as rotational momentum is transferred to the Moon, and the Earth is now spinning more slowly than the fixed international time standard assumes. That is why all 27 leap second adjustments that have been made are additions. The fluctuations due to "natural events such as earthquakes often causing small changes" are why leap second addition is not predictable: they induce a quasi-random walk around the long-term deceleration rate [wikipedia.org]. They are not the underlying reason for leap second addition. In principle a combination of really large fluctuations might rise to the level of calling for a negative leap second, but this has never happened, and isn't expected to.
Now there is a technical detail about this that led the Wikipedia page on Leap Seconds to misleadingly state "It is a mistake, however, to consider leap seconds as indicators of a slowing of Earth's rotation rate; they are indicators of the accumulated difference between atomic time and time measured by Earth rotation." When international standard timekeeping was set up, the standard second was effectively pegged to the length of the mean solar day as it was in the nineteenth century (formally, a fraction of the tropical year 1900), and this does not change. The reason we add leap seconds is that the Earth has slowed down a fair bit over the last 124 years, and even if the Earth were to stop slowing down entirely (magic), the mismatch between its current slower rate and the old rate would still require leap second additions. So yes, leap seconds are absolutely an indicator of the slowing of Earth's rotation rate, just not on a year-to-year basis.
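A back-of-envelope version of that argument (Python; the numbers are approximate and purely illustrative):

    # If the mean solar day is currently about 1 ms longer than 86400 SI seconds,
    # the mismatch keeps accumulating even if the Earth stops slowing down entirely.
    excess_per_day = 0.001              # seconds of excess day length (illustrative)
    per_year = excess_per_day * 365.25
    print(per_year)                     # ~0.37 s/year, i.e. a leap second every few years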
When leap seconds were instituted they were added annually, or even twice a year. But with global warming, momentum is being transferred from the atmosphere to the solid Earth, stabilizing, even slightly accelerating, its rotation, so there was a seven-year gap after 1998 before another leap second was added, and the last leap second was added seven and a half years ago. The spin difference has fluctuated between about +0.7 and -0.65 seconds (ideally it would stay between +0.5 and -0.5, but the adjustments are only made on fixed dates); on the last adjustment it jumped from -0.4 to +0.6, then moved down to -0.25, but in mid-2020 it reversed and has climbed back up to 0. It will likely be some years before this trend reverses itself and the difference moves back down to -0.5 or thereabouts.
Re: (Score:2)
The Wikipedia bit you quote is not wrong at all. Leap seconds are not needed because the Earth's rotation is slowing; they are needed because the average solar day is not exactly 86400 SI seconds long. A change in rotation rate (length of day) is only one possible cause of that difference; an initial offset and sudden jumps are other causes.
IERS publishes data products for this "length of day" (LOD) offset, for example at https://datacenter.iers.org/pl... [iers.org] .
I thought it was until 2035 at least (Score:2)
UTC was invented as a compromise. It would have been equal to TAI in 1958 when the first atomic clocks started running. In 1970 TAI and UT1 had drifted 10 se