New Atomic Clock Pushes Boundaries of Accuracy

Neophytus writes "An interesting story on the BBC reports on how a new type of atomic clock is near completion that would only loose about a second in every 100 million years. Within ten years they hope to have a clock with billion year accuracy which would potentially bring advances in disease research by watching timing genes. More reports from this year's AAAS Annual meeting can be found on the BBC, and information about the event on the AAAS Annual Meeting website."
  • by Anonymous Coward on Sunday February 16, 2003 @01:39PM (#5314428)
    "...that would only loose about a second in every 100 million years"

• Every 100 million years or so a second gets loose out of the clock and raises all sorts of havoc. Using less accurate clocks allows this to happen with correspondingly greater frequency. What's worse, only part of the loose second needs to be consumed at a single locale in order to cause a problem. The other day one hit here and my boot sector got over-written. That really hurt. I can only wonder what other damage it caused elsewhere -- I only burned up a millisecond.

      Surely more accurate clocks will make the world a more functional place. I am trying to devise a program that will work with the Network Time Protocol so that I and my equipment can leap over any more of these headed our way. I always wondered why NIST was talking about leap seconds. Yes, I know that it will just hit someone else, but I feel that I have stopped my share of bullets. If the NIST clocks are leaping seconds, don't blame me for doing the same.

      (Congrats on being the first to spot it. My more serious question is after your post. I totally read past the typo until re-reading to see if anyone else had raised my issue.)

      • This should've been blindingly obvious - I read it and did not understand wtf it was on about, then realised it was a typo and should've been "lose".

        Talking about things not making sense, did ne1 else understand the second (no pun intended) paragraph of the parent post, or maybe the poster could shed light there? The first paragraph was funny, then the second just started going a bit crazy.

      • Using less accurate clocks allows this to happen with correspondingly greater frequency.

        I second the notion that this is a minute problem facing our day, and before the issue at hand goes decadent, we must work around the clock to fix it.
        • My clock is one of those WWV (broadcast radio time standard in the U.S.) synchronized LCD wall clocks. It is rectangular, almost square.

          But not to worry, I have a jigsaw and lathe. I will make it round in a few minutes.

    • BTW, I just noticed that LoseNotLooseGuy is back after a long absence.

      Save us, LoseNotLooseGuy! Your assistance is desperately needed!

  • When I started to read the article, I was thinking to myself,
    "Who cares?!?!?!"
    But as I continued, this article kind of answered my questioning for having a clock that accurate, besides the "No, it really is 12:43.35PM" coolness factor.

    Does anyone know any other good uses for such an exact timing system?
    • Re:Who Cares? (Score:3, Insightful)

      by C21 ( 643569 )
      Quantum physics: when you're "looking" at particles that are so small you cannot even see them, but have to examine how they're interacting with larger particles, that .001+% accuracy with timing helps, a lot.
    • > Does anyone know any other good uses for such and
      > exact timing system?

    • But as I continued, this article kind of answered my questioning for having a clock that accurate, besides the "No, it really is 12:43.35PM" coolness factor

      Yeah, but 12:34:56.78 PM is cooler.
    • The offshoots of this technology will eventually allow us to more accurately measure signal timing in PCs and other electronic equipment. Currently it is very expensive to get accurate time measurements with electronic systems. Optical systems are already used in the lab for femtosecond measurements. In commercial applications, we must be satisfied with a $20-30K system to measure path lengths in time with sub-100 ps accuracy. If you have a PC with a >2 GHz internal CPU clock frequency, you are looking at internal signals that switch every 500 ps. What I want is to see more optical stuff become cheap enough for mass usage... this was another brick in the wall. -- Ross
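As a quick sanity check of the 500 ps figure above (a sketch, not from the article): a clock at frequency f completes one cycle every 1/f seconds.

```python
# Sanity check of the comment's arithmetic: a clock at f Hz has a
# period of 1/f seconds, so 2 GHz -> 500 ps per cycle.
def clock_period_ps(freq_hz: float) -> float:
    """Clock period in picoseconds for a given frequency in hertz."""
    return 1e12 / freq_hz

print(clock_period_ps(2e9))  # → 500.0
```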
  • Disease research (Score:4, Insightful)

    by dmanny ( 573844 ) on Sunday February 16, 2003 @01:48PM (#5314475)
    First, I did RTFA. Second, I should say that I admire such work on time precision and think it should be supported.

    However, I must say that I am puzzled how any new higher-precision timing source will directly help biological research in the area of genes. I did follow the recent reports of genetic timing mechanisms being discovered, but how does adding another step of resolution to the best available time source have anything to do with this research? Likely the new clock will be far removed from any lab doing work with genetic material in many ways -- geographic, propagation and subject matter. The currently available clocks are certainly no slouches. Are they not sufficient for biological work? How is an improved one going to help?

    In part, I ask about this particular point because, while somewhat weakly addressed in the article, it was repeated on /. I am seriously hoping a little light could be shed -- preferably based on knowledge not speculation.

    • I agree, the disease angle is purely gratuitous and silly. The journalist (and/or the editor) is probably trying to fulfill a stupid editorial edict to punch up "science" articles with "human interest" elements. CNN is notoriously bad and sloppy in this regard. Perhaps the BBC is getting that way too. And Slashdot.
      • That it was journalistic fluff was my fear. Worse still, it may have its seed in what the media was fed.

        Still, I would welcome a real explanation illustrating how it might apply, but I can't see it. It is difficult to say that a more accurate clock wouldn't help, but I would give a lot of credence to someone in the field who said that they saw no connection either. Do you work in any area of genetic biology research?

        I do not hold and have never held /. up to the same journalistic standards as other media for hopefully obvious reasons.

        • Re:Disease research (Score:3, Informative)

          by baz00f ( 520771 )
          I am a biochemist/molecular biologist and I can't imagine what the author has in mind. I'm sure it was just a giddy deadline thing. He probably wrote something earlier on gene transcription timing or chronobiology and made a goofy link to this new "cutting edge research pushing back the foreskin of science..."

          Accuracy on the order of 1 second in 1 billion years is about 1 part in 3x10^16. I see no way that is important to have for measurements of any observable biological process.
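The 1-part-in-3x10^16 figure above is easy to verify with back-of-the-envelope arithmetic (a sketch):

```python
# Losing 1 second over 1 billion years, expressed as a fractional error.
SECONDS_PER_YEAR = 365.25 * 86400        # Julian year, ~3.156e7 s
total_seconds = 1e9 * SECONDS_PER_YEAR   # ~3.16e16 s in a billion years
fractional_error = 1.0 / total_seconds   # ~3.2e-17, i.e. 1 part in ~3e16
print(f"1 part in {total_seconds:.2e}")
```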
          • Chemical reactions happen on timescales on the order of a femtosecond. That's about 10^-15 second. Protein folding would probably take a bit longer than that. Maybe these are the chemical reactions that they were talking about?

            It might not seem like much, but didn't someone win a Nobel Prize for directly observing a single chemical reaction with femtosecond timing? Someone out there must get a hardon for accurate clocks, then.

            • Re:Disease research (Score:2, Informative)

              by baz00f ( 520771 )
              Yes, but we are talking accuracy here, not absolute values, so using a more accurate timebase to measure a femtosecond process still begs the question of why you would need that extreme "1 part in 10^16 of a femtosecond" accuracy. Measuring a nominally 1 fs reaction to 1 part in 100 (1% accuracy) is no doubt good enough.

              Protein folding is on the order of milliseconds, but "something" is always occurring at all time scales. The Music of the Universe covers all frequencies.
  • by one9nine ( 526521 ) on Sunday February 16, 2003 @01:51PM (#5314484) Journal
    Scientists have now discovered that it's more accurate to count "1 Alabama, 2 Alabama" instead of "1 Mississippi, 2 Mississippi". Scientists predict quarterback sacks during backyard football games will increase 27% over the next four years.
  • Sounds Good... (Score:2, Informative)

    by sepluv ( 641107 )
    Well...I guess the laser won't make any sound. Seriously though, why do they need things this accurate? I don't know, but I'm sure there are some scientific experiments (atomic &c) which require extremely accurate timing. It is amazing that, for instance, radio telescopes need at least picosecond accuracy (so that the computers can line up signals from different telescopes in an array), so they all have atomic clocks on site.

    It appears that these clocks are still in the early conceptual stages, but they sound helluva accurate (doubt they'll need more accuracy, but you never know).

    There are more prosaic applications as well, speeding up telecommunications and making them more secure from hackers.
    Why does that require 1 second in billions of years accuracy?

    Also, shouldn't these clocks use the measurement system detailed in the official CGPM SI definition of the second to be used as scientific master clocks?

    Official Système International d'Unités definition:
    "The second is the duration of 9 192 631 770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the caesium 133 atom."

    Can they be sure that what they are measuring does not change (especially if it involves light)? Although I think scientists have now decided to just assume c is constant, even if it is not, and base other measures (e.g. the metre) on the value of c.
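Numerically, the quoted SI definition pins down the second like this (a sketch of the arithmetic, not an implementation):

```python
# One SI second = 9 192 631 770 periods of the Cs-133 hyperfine transition,
# so a single period lasts roughly 108.78 picoseconds.
CS_HYPERFINE_HZ = 9_192_631_770
period_s = 1.0 / CS_HYPERFINE_HZ
one_second = CS_HYPERFINE_HZ * period_s  # 1.0 by construction
print(period_s * 1e12)                   # ~108.78 ps per period
```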

    • I do not work in this area but do maintain a passing acquaintance with some relatively layperson media coverage. I believe you are correct in stating the cesium definition of a second. AFAIK it has not been superseded. However, stability between multiple clock sources is fairly easily detected (although I will grant you 9+ gigahertz is fairly fast).

      Given a collection of clocks whose average is being taken, one could compute an error from that average for any individual. Further, any other clock could be compared to that average and shown to have better, or worse, stability than an individual member of the collection. The new clock type would not have to be based on the canonical mechanism used in the definition of a second. If the new clock turned out to be more stable and the technology was sufficiently accessible, the definition might be updated. The definition of a second is only a convention anyway.
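The ensemble idea described above can be sketched with made-up numbers (the readings and the candidate clock are entirely hypothetical, in nanoseconds):

```python
# Each clock's error is its deviation from the ensemble average; a
# candidate clock is judged by comparing its deviation with the members'.
readings_ns = [12.1, 11.8, 12.3, 11.9, 12.4]   # simultaneous readings (made up)
ensemble_avg = sum(readings_ns) / len(readings_ns)
member_errors = [r - ensemble_avg for r in readings_ns]
candidate_error = 12.05 - ensemble_avg          # hypothetical new clock
print(ensemble_avg, candidate_error)
```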

    • Re:Sounds Good... (Score:2, Informative)

      by Anonymous Coward
      The applications cited are stupid. Measuring the relativistic effects of reasonable masses is a better application for a hyper-accurate clock.

      And, as you mentioned, VLBI, where you aim two radio-telescopes far away (like opposite sides of the planet) at the same object and combine the signals to get higher resolution, requires time sync to within a fraction of a cycle of the frequency being observed. This can always use more accurate clocks to make longer observations at higher frequencies.
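The coherence requirement just described sets a concrete number (a sketch; the 10% cycle fraction is an assumed rule of thumb, not from the comment):

```python
# To combine two telescopes' signals coherently, their clocks must agree
# to a small fraction of one cycle of the observed frequency.
def max_clock_offset_s(obs_freq_hz: float, cycle_fraction: float = 0.1) -> float:
    """Allowed timing offset for coherent combination."""
    return cycle_fraction / obs_freq_hz

print(max_clock_offset_s(10e9) * 1e12)  # 10 GHz observation -> 10 ps allowed
```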

      As for the definition, it is defined that way because that is the most accurate way to measure the second currently known. If somebody finds a better way (like the trapped mercury ion system discussed here), then the second will be redefined in terms of the new, more accurate reference. Just like how the metre was changed in 1983 from a multiple of the wavelength of a certain atom's radiation to 1/299,792,458 of a light-second. This new standard is equal, to the limits of measurement, to the old one, but the limits are those of the old standard; the new one can be measured more precisely.

      (For a description of how it's done: it involves building a highly stable laser, measuring its frequency against the second, using the constant 299,792,458 to compute the wavelength, and then counting wavelengths to get the distance. This gives you the metre to 7.2 parts in 10^12, compared to 2.5 parts in 10^11 for an iodine-stabilized HeNe laser or 4 parts in 10^9 for the old krypton standard.)
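The chain in the parenthetical above can be sketched numerically (the laser frequency is an illustrative value, roughly in the 633 nm HeNe region):

```python
# Measure a stable laser's frequency against the second, then use the
# exact defined value of c to obtain its wavelength for length metrology.
C = 299_792_458            # m/s, exact by definition since 1983
laser_freq_hz = 473.6e12   # assumed frequency, ~633 nm HeNe region
wavelength_m = C / laser_freq_hz
print(wavelength_m * 1e9)  # ~633 nm; counting wavelengths yields lengths
```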

      As for it not changing... nothing can ever be proved absolutely, but many people have measured it very carefully and have never observed any variation.
  • Speed of light (Score:3, Interesting)

    by Oriumpor ( 446718 ) on Sunday February 16, 2003 @02:32PM (#5314716) Homepage Journal
    Am I mistaken, or will this clock (or the technology therein) help NASA and the relativity theorists? They already have "precise" clocks.

    It was my understanding that the more precise the clock the easier it would be to test the speed of light.

    • Re:Speed of light (Score:4, Insightful)

      by PaddyM ( 45763 ) on Sunday February 16, 2003 @03:15PM (#5314905) Homepage
      Actually, I was going to say, that I bet it only loses 1 second every 100 million years, except if you take the clock around real fast. Then it will seem to lose all kinds of seconds.

      But everyone will just say, "proves special relativity again" instead of "proves that moving fast messes up the timing of atomic clocks".
      • But everyone will just say, "proves special relativity again" instead of "proves that moving fast messes up the timing of atomic clocks".
        Could you please explain this statement? According to special relativity, one observer will see another fast-moving observer's clock ticking slowly. This is what is observed. Am I understanding you incorrectly, or are you saying that you think there is some inherent property of atomic clocks that make them run slowly when they are moving quickly relative to some arbitrary observer, and that special relativity is incorrect?
      • ...or if you just raise or lower the clock.

        I believe the clocks on satellites in orbit run at different rates from clocks at the earth's surface.
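This can be checked with standard numbers (a rough sketch; the orbital radius is an assumed GPS-like value and higher-order terms are ignored):

```python
# Net fractional rate difference for an orbiting clock vs. a ground clock:
# gravitational blueshift (a clock higher in the potential runs fast) minus
# special-relativistic time dilation (a moving clock runs slow).
GM = 3.986004418e14      # Earth's gravitational parameter, m^3/s^2
C = 299_792_458          # speed of light, m/s
R_EARTH = 6.371e6        # mean Earth radius, m
R_ORBIT = 2.656e7        # assumed GPS-like orbital radius, m

grav_rate = GM * (1 / R_EARTH - 1 / R_ORBIT) / C**2   # fractional gain
v_squared = GM / R_ORBIT                              # circular-orbit speed^2
sr_rate = v_squared / (2 * C**2)                      # fractional loss
net_per_day = (grav_rate - sr_rate) * 86400
print(net_per_day * 1e6)  # ~38 microseconds/day, the well-known GPS correction
```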
  • Additional info (Score:3, Informative)

    by bardencj ( 122074 ) on Sunday February 16, 2003 @04:11PM (#5315163)

    You can read a little more about the background of this new clock at NIST's archive of a paper in IEEE T. Instrum. Meas., for those of us who foolishly let our subscription lapse...

    It would appear the chief technological development that made this clock possible was the femtosecond laser. The paper also suggests that the average error could be reduced even further than the article suggests (down to attoseconds, perhaps) if higher-order Stark and Zeeman shifts are properly treated. As for practical uses, I personally can't think of any, except to finally answer the question "Does anybody really know what time it is?" But elimination of uncertainties is laudable anyway.

  • Brownouts (Score:2, Funny)

    by Anonymous Coward
    Experiments suggest this clock may lose only one second in 100 million years.

    And the power will never go out, not in 100 million years.

  • loose = lose;

  • ..that mysterious extra "o" in "loose"...

  • by Omkar ( 618823 )
    What about relativistic effects? At that level of accuracy, wouldn't small perturbations in the earth's rotation and disturbances affect the clock? Also, does anyone know the quantum limit of time measurement? Is it around 10^-24 s or what?
