Science

Physicists May Have Found a Hard Limit on The Performance of Large Quantum Computers (sciencealert.com)

For circuit-based quantum computations, the achievable circuit complexity is limited by the quality of timekeeping. That's according to a new analysis published in the journal Physical Review Letters exploring "the effect of imperfect timekeeping on controlled quantum dynamics."

An announcement from the Vienna University of Technology explains its significance. "The research team was able to show that since no clock has an infinite amount of energy available (or generates an infinite amount of entropy), it can never have perfect resolution and perfect precision at the same time. This sets fundamental limits to the possibilities of quantum computers."

ScienceAlert writes: While the issue isn't exactly pressing, our ability to grow systems based on quantum operations from backroom prototypes into practical number-crunching behemoths will depend on how well we can reliably dissect the days into ever finer portions. This is a feat the researchers say will become increasingly challenging...

"Time measurement always has to do with entropy," says senior author Marcus Huber, a systems engineer who leads a research group in the intersection of Quantum Information and Quantum Thermodynamics at the Vienna University of Technology. In their recently published theorem, Huber and his team lay out the logic that connects entropy as a thermodynamic phenomenon with resolution, demonstrating that unless you've got infinite energy at your fingertips, your fast-ticking clock will eventually run into precision problems. Or as the study's first author, theoretical physicist Florian Meier puts it, "That means: Either the clock works quickly or it works precisely — both are not possible at the same time...."

[F]or technologies like quantum computing, which rely on the temperamental nature of particles hovering on the edge of existence, timing is everything. This isn't a big problem when the number of particles is small. As they increase in number, the risk any one of them could be knocked out of their quantum critical state rises, leaving less and less time to carry out the necessary computations... This appears to be the first time researchers have looked at the physics of timekeeping itself as a potential obstacle. "Currently, the accuracy of quantum computers is still limited by other factors, for example the precision of the components used or electromagnetic fields," says Huber. "But our calculations also show that today we are not far from the regime in which the fundamental limits of time measurement play the decisive role."
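To make the summary's trade-off concrete, here is a minimal back-of-envelope sketch in Python. None of the numbers come from the paper; the coherence time, gate duration, and clock resolution below are placeholder assumptions chosen only to illustrate how a quantum circuit's time budget is squeezed from both ends: decoherence caps the total runtime, while the clock's finite resolution caps how accurately that runtime can be sliced into gates.

    # Placeholder figures, for illustration only (not taken from the paper).
    coherence_time   = 100e-6   # assumed qubit coherence time: 100 microseconds
    gate_duration    = 10e-9    # assumed duration of one gate: 10 nanoseconds
    clock_resolution = 1e-12    # assumed timing resolution of the control clock: 1 picosecond

    max_gates = coherence_time / gate_duration                 # gates that fit before decoherence dominates
    timing_error_per_gate = clock_resolution / gate_duration   # fractional uncertainty in each gate's duration

    print(f"gates that fit in the coherence window: {max_gates:.0f}")
    print(f"relative timing error per gate:         {timing_error_per_gate:.0e}")

With these made-up numbers, about ten thousand gates fit in the coherence window, and each gate carries a fractional timing uncertainty of roughly one part in ten thousand.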

  • Accurate timekeeping is essential to proper performance. All those ones & zeros can't be read & written in the right order without it.

    fuzzy digits too
    • I once explained (in broad terms) to someone who had brought up the subject of the clock on their machine being off why having correct time is important and why there are time servers. I even mentioned the timing required to communicate with the probes around Mars, Jupiter, and Saturn, as well as the Voyager probes.

      This then led into people who don't log into the network on VPN at home for some time (because they're lazy or stupid, or both) and don't come into the office, and how that could prevent them...

      • Doesn't the real reason have to do primarily with arbitrary, fickle hu-man policies? Could you not let ppl log on and only warn them when there was a problem?

        • Yeah, that's just stupid policy. Sure, some specific authentication methods like TOTP might go out of whack, since they rely on the code for a precise 30-second interval, but existing VPNs with proper public-key crypto shouldn't be affected in any way. Maybe if the clock is out by years or decades they might not accept a certificate as being within its validity range, but that's a different issue from the local clock drifting a little every day, week, and month.

  • Does this suggest that for cryptographic purposes quantum computing can only go so far?
    In other words, does this mean that a future-proof cryptographic algorithm is really possible, rather than just a race with hardware progress?

    • by gweihir ( 88907 )

      Symmetric crypto is not even threatened. And for, say, RSA, you need so many qubits that the race is leisurely winnable by the defenders.

    • After 20-odd years of work on quantum cryptanalysis, the current record, from memory, is factoring the number 21. That's not a 21-digit or even 21-bit number; it's the product of 3 and 7. However, this hasn't stopped an entire snake-oil industry from emerging and promoting quantum-resistant everything everywhere they can. I doubt this latest result will change that; there's too much money (commercial) and academic publication credit (academia) involved for either side to call it quits.
      • Well, as of three years ago, it was 1099551473989, not 21. Please update your memory.

        • by arglebargle_xiv ( 2212710 ) on Monday December 04, 2023 @12:02AM (#64052143)
          That used classical preprocessing to reduce the problem to one amenable to a three-qubit circuit. It's one of many stunt factorisations that allow someone to artificially claim a bigger number than before, a bit like the quantum-supremacy claims that are announced every few months. The largest number factored by Shor's algorithm remains 21.
      • The 21 result was done with Shor's algorithm, and the team reporting it realized they'd factored larger numbers in the process. Pre-processing is arguably a bit of a "cheat", and of course there are groups which take credit for some snake-oily results, but the fact remains that prime factorization is coming along. You imply that it's only because of the money and academic credit involved, which is really a silly claim to make.

      • it's the product of 3 and 7

        SPOILER ALERT!
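    As an aside on the factoring discussion above: the sketch below is a purely classical rendering of the number-theoretic reduction Shor's algorithm is built on, applied to N = 21. The brute-force order() function stands in for quantum period finding, which is the only step a quantum computer actually accelerates; the rest is ordinary classical pre- and post-processing. This illustrates the reduction, not any of the experiments discussed above.

        from math import gcd

        def order(a: int, n: int) -> int:
            """Smallest r > 0 with a**r % n == 1 (brute-force stand-in for the quantum step)."""
            r, x = 1, a % n
            while x != 1:
                x = (x * a) % n
                r += 1
            return r

        def shor_reduction(n: int, a: int = 2) -> tuple[int, int]:
            assert gcd(a, n) == 1, "pick a base coprime to n"
            r = order(a, n)              # a quantum computer would find this period instead
            assert r % 2 == 0, "odd period: retry with another base"
            y = pow(a, r // 2, n)
            return gcd(y - 1, n), gcd(y + 1, n)

        print(shor_reduction(21))        # -> (7, 3)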

  • by greytree ( 7124971 ) on Sunday December 03, 2023 @06:25PM (#64051491)
    You give us a story that they found the limit.

    What is it then?
    • Nobody's looked at it yet, so it's still indeterminate.
    • by rossdee ( 243626 ) on Sunday December 03, 2023 @07:08PM (#64051557)

      42

    • by jovius ( 974690 )

      Our quantitative estimates indicate that, for experimentally relevant gate counts and durations, imperfect timekeeping may become a significant error source when the timing uncertainty is on the picosecond scale or greater. It is crucial to emphasise that this is the uncertainty in timing the duration of each quantum gate, which must necessarily be very fast (e.g. nanoseconds) in order to counteract environment-induced decoherence. Precise timing of such short time intervals remains an outstanding technical challenge. This should be contrasted with the sub-femtosecond uncertainty of atomic clocks, which is only achieved after integration times of seconds, hours or several days [47] depending on the system.

      From the paper (https://arxiv.org/pdf/2301.10767.pdf [arxiv.org]). I don't think they directly discuss how large a system could be. Their takeaway is that new timing strategies need to be developed. I wonder if the quantum set could be immersed in some sort of coherent field. Quantum computing is extremely fascinating - it deals with the core of the universe.
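      To put the picosecond-versus-nanosecond figures quoted above into context, here is a hedged Monte Carlo sketch (not from the paper): each gate is modelled as a pulse whose duration jitters by about a picosecond around a 10 ns nominal value, and the resulting over/under-rotations are accumulated over a long gate sequence. The pulse parameters and the Gaussian noise model are assumptions chosen for illustration; the paper's treatment is more general.

          import numpy as np

          rng = np.random.default_rng(0)

          omega   = np.pi / 10e-9   # assumed drive rate: a 10 ns pulse implements a pi rotation
          sigma_t = 1e-12           # assumed timing jitter: 1 ps standard deviation per gate
          n_gates, n_trials = 10_000, 2_000

          # Each gate picks up a rotation error of omega * dt; sum the errors over the sequence.
          angle_err = (omega * rng.normal(0.0, sigma_t, size=(n_trials, n_gates))).sum(axis=1)

          # For a pure over/under-rotation by angle err, the overlap between the jittered and
          # ideal final states is cos(err / 2), so the state fidelity is cos^2(err / 2).
          fidelity = np.cos(angle_err / 2) ** 2
          print(f"mean fidelity after {n_gates} gates: {fidelity.mean():.5f}")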

    • I think the point has been lost in translation from the paper to the reporting to the summary.

      Why would finite units of Planck time require an infinite amount of energy to measure precisely and accurately? Does measuring at a resolution of two units require half an infinite amount?
      • by jvkjvk ( 102057 )

        >Why would finite units of Planck time require an infinite amount of energy to measure precisely and accurately?

        You, like a lot of other people, apparently don't understand what Planck time is. It is not the smallest unit of time there is, just the natural unit of time: the time it takes light (oh oh, I hear some people say, that's not too good for a "smallest unit of time"!) to travel one Planck length.

        Things get smaller than that, both in time and in space. However, at this boundary (Planck length)...

        • So you're saying at some point in the future we'll break quantum uncertainty and will then measure more finely than that? Seems doubtful.

          I may not know a ton of cutting-edge theories in physics, but I do know when someone's blowing hot air. If you'd care to return to the actual question posed, rather than merely posit your superior understanding, perhaps you can actually illuminate us as to how an infinite amount of energy could be required to measure something finite.
          • by jvkjvk ( 102057 )

            >Perhaps you can actually illuminate us as to how an infinite amount of energy could be required to measure something finite.

            Because I'm telling you we don't know whether it's finite or not. Simple. As I explained. That we can't currently measure more finely than that is simply boo hoo - use more energy, as a theoretical solution.

            Is that simple enough for your small mind or do you need even smaller words?

            • by dpille ( 547949 )
              Hey, thanks for the disingenuous bullshit. Defensively lashing out doesn't actually make you look smart.

              Any measurement is finite even if the thing you're measuring is continuous. Yes, you can calculate the spot on the graph where 1/x = 1, but you can't measure it, because there is always a level of precision available greater than your measurement. If you want to measure where 1/x = 0, well, congratulations, you've found something that would reasonably take an infinite amount of energy to measure: infinity.
              • by jvkjvk ( 102057 )

                >Hey, thanks for the disingenuous bullshit. Defensively lashing out doesn't actually make you look smart.

                I only replied appropriately to YOU attacking me first. So suck it. YOU certainly aren't looking too smart.

                >How many words do you need, and how short need they be, to understand it doesn't actually matter whether spacetime is continuous or quantized?

                As many as you can afford, because you are flat wrong. If spacetime is actually quantized, then their argument doesn't work. If spacetime is not quantized...

                • by dpille ( 547949 )
                  Buddy, if spacetime is quantized, we can return to my original question: why would it take an infinite amount of energy to measure something finite? If spacetime is not quantized, it can only take an infinite amount of energy to measure to infinite precision. Is there a conceivable technical reason why infinite precision would ever be necessary? Are we trying to entangle an infinite number of qubits or something? If so, measuring time will be the least of our problems. Are they in fact arguing "we should...
      • Time and energy obey a Heisenberg uncertainty relation, just as position and momentum do. And while the math allows for arbitrarily large energies to make up for that, eventually your equipment starts turning into black holes. So we will never get better than Planck units unless we are wrong about black holes, but we're not even close to that yet.
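      For rough orders of magnitude along the lines of the parent comment: the time-energy relation ΔE·Δt ≳ ħ/2 is heuristic rather than a strict operator uncertainty relation, but it already shows that resolving ever shorter intervals demands an ever larger energy spread; at the Planck time the required spread is about half the Planck energy.

          # Back-of-envelope only: delta_E >= hbar / (2 * delta_t).
          hbar = 1.054571817e-34        # reduced Planck constant, J*s
          planck_time = 5.39e-44        # Planck time, s

          for label, dt in [("1 ns gate", 1e-9), ("1 ps jitter", 1e-12), ("Planck time", planck_time)]:
              dE = hbar / (2 * dt)      # minimum energy spread in joules
              print(f"{label:12s} dt = {dt:.2e} s  ->  dE >= {dE:.2e} J")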

    • You give us a story that they found the limit. What is it then?

      They ended the story so precisely we will never know thus proving the result in real time.

    • by HiThere ( 15173 )

      The summary clearly indicates that it's not any particular number, but rather depends on the amount of power you spend on the clock. They did say that infinite precision requires infinite power, but clearly you're going to have to stop somewhere well before that, if for no other reason than that the heat buildup melts the computer.

      AFAICT (without reading the actual article) there are a bunch of tradeoffs, and you can't be precise about what they are until you decide exactly which design you're using. No simple answer...

  • by 93 Escort Wagon ( 326346 ) on Sunday December 03, 2023 @06:36PM (#64051511)

    But it's also true that we've yet to see whether practical quantum computers are going to be a thing at all.

    • by jd ( 1658 )

      We're basically at the same point with quantum computers as we were with electronic computers in the 1930s. We've got toy systems but that's about it.

      With regular electronic computers, we didn't know they could be useful until 1948's SSEM at Manchester University. So, about 20 years down the road.

      Quantum computers seem to be evolving at about half the speed of classical computers, so it would seem to follow that we won't know until 40 years after the first quantum computer (which was 1998). So I wouldn't expect an answer before around 2038.

      • by godrik ( 1287354 )

        People have been trying to convince me that we should only focus on studying quantum computers because they are the future.
        That was 2005. Glad I took the stance "I'll care once I can ssh into a useful one."

      • by gweihir ( 88907 )

        Actually, no. In the 1930s we knew that digital computers could be scaled up with, at worst, not much more than linear effort, at least for a while. All we have for QCs is exponential effort, in several dimensions.

        • by HiThere ( 15173 )

          We didn't know how far they could be scaled up. Vacuum tubes were unstable enough that it wasn't clear how useful they'd be. That's why in the 1940s the chairman of IBM predicted a world market for fewer than a dozen computers. (He was thinking about a particular design. There were non-vacuum-tube designs that were a lot more stable, but less powerful.) Even around 1960 the IBM 7090 always had a site engineer on site, because it broke down so often it didn't make sense not to.

          • by vbdasc ( 146051 )

            The IBM 7090 was a transistorized computer, with much better reliability than its vacuum-tube predecessor, the IBM 709, and it no longer needed burned-out elements replaced every day.

          • by jd ( 1658 )

            1943, according to the Interwebs. At that time, computers were hard-wired (the world's first stored-program machine was built in 1948), and programming was therefore very slow. Those computers had no internal storage as such, nothing equivalent to RAM. Values were passed on "immediately", with timing handled by mercury delay lines, I believe.

            Ferranti turned the SSEM into a commercially saleable computer... and sold two of them: one to Manchester University, one to the University of Toronto. They then upgraded the...

            • by HiThere ( 15173 )

              There were actually some computers built out of electro-mechanical relays. They were pretty reliable, but they didn't scale very well. (They were slow and used lots of power, comparatively.) They stored memory as relay position. I don't know whether they were programmable, except via patch boards (or some earlier version of the same thing). I suppose you could in principle program them by setting the relays to a particular state before hitting start, but ugh!

        • by jd ( 1658 )

          We had electronic computers at that point, but no stored-program digital ones. The computers were hard-wired, data was fed in by tape, and there was no concept of storage at that point. So anything we knew was known in theory only.

          The SSEM (the Manchester Baby) changed that. It was the first truly digital stored-program computer. It actually had RAM (32 words of 32 bits each). It was only with the SSEM that we had any idea what a computer would look like or how it would behave. It was also the first time that any theory had...

  • by SoftwareArtist ( 1472499 ) on Sunday December 03, 2023 @06:49PM (#64051527)

    This is pretty much what I've suspected for a long time. Some calculations just require exponential scaling. You can push the exponentials down, but they keep popping up in new places. Proponents of quantum computing claim it isn't subject to them, but it's never been rigorously proven for anything that resembles a real physical computer. I suspected the exponential scaling would eventually reappear in a new place: either the energy needed, or the precision to which you needed to measure something, or something like that.

    We'll see how this result holds up once more people look at it. If they're right, it means the whole field is based on a wrong assumption. Quantum computers would be bound by the same scaling as classical ones.

    • by godrik ( 1287354 )

      AFAIU quantum machines don't do exponential calculation. They are operating along a different model than regular machines. And as such they can do things that the other model can't do. (And vice versa)

      It's a bit like saying that you can't go very far on scissors, but you can on a wheel.

      • The difficulty is that all quantum computing as understood today relies on isolating the system from the rest of the universe long enough for a specific system's probability amplitudes to interact with themselves in a predictable way. They only do the special sauce when no one is looking. The problem is that the larger the system, the exponentially more difficult it appears to be to ensure every last qubit, interaction, or entanglement goes off perfectly instead of something random happening in even a single place...
      • by jvkjvk ( 102057 )

        Yes, they do exponential calculation, if I understand what you mean by that. That's the "different model" than regular machines.

        As such, they have different behaviors. Such as calculating stuff exponentially more quickly.

        However, what the parent and the article are pointing out is that there may not be such a thing as a free lunch. As you scale up your problem with more of those magikal qubits, something ELSE rears its ugly head and becomes the new bottleneck, well before any theoretical limits are reached.
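        A quick numeric illustration of the scaling worry raised in this subthread, with a made-up per-operation success probability: if every one of n operations has to come off perfectly, the overall success probability decays exponentially in n, which is the kind of overhead that error correction (with its own costs) is meant to buy back.

            # Assumed per-operation success probability; only the exponential trend matters.
            p = 0.999
            for n in (10, 100, 1_000, 10_000):
                print(f"{n:6d} operations -> overall success probability {p ** n:.3g}")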

    • by gweihir ( 88907 )

      Exponential effort in computation length and number of bits. Oh, and error correction (needed to go anywhere at all) makes both factors worse.

  • This has me wondering, not sure how to put this but, does the universe compute?

    When I first came across the concept of quantum computing, I thought the notion was that very complicated things happen in physics that require a lot of math to model, so the universe must be doing some equivalent amount of computation to behave that way, and quantum computation was about finding a way to tap into that computational ability of the universe. It seems like maybe that was a naive assumption of mine, because this article suggests that no, the universe can't do that kind of computation. Stuff happens without computation somehow. Is that right?

    • This has me wondering, not sure how to put this but, does the universe compute?

      When I first came across the concept of quantum computing, I thought the notion was that very complicated things happen in physics that require a lot of math to model, so the universe must be doing some equivalent amount of computation to behave that way, and quantum computation was about finding a way to tap in to that computational ability of the universe. It seems like maybe that was a naive assumption of mine because this article suggests that no, the universe can't do that kind of computation. Stuff happens without computation somehow. Is that right?

      No, it means the universe works as a system, and the problem lies in isolating your computer from any other influence long enough to get a useful result. If it mixes back into the universe it’s no longer working. Also of note: reality really isn’t deterministic from inside itself, because to simulate it faster than itself at full fidelity you’d need a larger universe, or one with a higher computational density or faster passage of time. None of those exist. Same reason you can’t...

    • by gweihir ( 88907 )

      Nobody knows how the universe does it. QCs are not using the full range of things, though; they only work with entangled particles, and the universe usually does not do these.

      • It makes no difference how it’s done, only that it is mappable to computation. Thus the universe is explainable through computations.
        • by gweihir ( 88907 )

          That is some fine pseudo-profound bullshit you have there.

          Obviously, unless you know how it works, you do not know that it is mappable to anything. Very elementary.

          • You post this, on the internet, using a computer of some sort and have the audacity to say that physical reality is unmappable to basic laws of physics or subject to engineering? Your post disproves your own logic.
            • by gweihir ( 88907 )

              You seem to be on drugs, because that is not what I wrote. What I wrote is "Nobody knows how the universe does it." That happens to be the literal truth. Physical reality is very much _not_ subject to the "laws" of Physics. The "laws" of Physics are merely an, at this time, incomplete and inaccurate attempt at modelling physical reality. Anybody that thinks the universe is governed by these man-made "laws" needs to have their head examined.

              Of course there are always some idiots that do not understand things

              • Obviously, unless you know how it works, you do not know that it is mappable to anything. Very elementary.

                Is precisely what you wrote, and it is antithetical to the foundations of science.

                Physical reality is very much _not_ subject to the "laws" of Physics. The "laws" of Physics are merely an, at this time, incomplete and inaccurate attempt at modelling physical reality

                So you do agree, except by incomplete you mean only 99.99999999% accurate and we have yet to fill in the rest. The data we have, and the physical laws and principles that explain it, will never be undermined in the future, because they are valid for the scales of time, energy, and space for which they were conceived; it’s the "oh yeah, and that too" kind of incremental improvement. Newton was, is, and forever will be correct. General relativity...

                • >> Physical reality is very much _not_ subject to the "laws" of Physics. The "laws" of Physics are merely an, at this time, incomplete and inaccurate attempt at modelling physical reality.

                  > That you thought this was upside down reflects your lack of awareness of what is real and your post cemented it in a public forum.

                  Those statements, taken together, lead to a very interesting argument.

                  • Those statements, taken together, lead to a very interesting argument.

                    It really does. Somehow my saying “It makes no difference how it’s done, only that it is mappable to computation. Thus the universe is explainable through computations.” was turned into “Obviously, unless you know how it works, you do not know that it is mappable to anything. Very elementary.” Which was subsequently turned into the strawman above. The OP is incorrect in asserting that “QCs are not using the full range of things though and only work for entangled particles...”

    • If stuff happens without computation why should nature be subject to assumed laws of computation, such as the Law of Noncontradiction?

      • It does not matter what is actually happening; from within the system it’s likely to be fundamentally unknowable. However, it is mappable to computation to an absurd degree already, if not infinitely in principle. So it does not matter; it’s equivalent.
  • What quantum device can do without a flux capacitor and turbo flaps?
    Everyone who's been on the internet knows that.

    • No.

      Thomson had a lot of strong opinions, and they weren't always correct, but he knew about birds, he knew about energy densities, and he knew about the developments of 1896 and heavier-than-air flying machines.
      Perhaps you're thinking about https://zapatopi.net/kelvin/pa... [zapatopi.net]

      • So, yes?

        Isn't Kelvin quoted in your link as saying airplanes are impossible because, implicitly, the science of thermodynamics prohibits it?

        • So, yes?

          Isn't Kelvin quoted in your link as saying airplanes are impossible because, implicitly, the science of thermodynamics prohibits it?

          I don't see how that interview implies a thermodynamic limit of any sort to heavier-than-air flight. If you know of other writings or even tangential implications from Lord Kelvin's statements that talk about such things, please let me know!

  • It's called money.

  • we'll be fine. I guess...
  • Fast and Furious becomes Fast and Spurious.

