
Google's Quantum Computer Makes a Big Technical Leap (nytimes.com)

Google announced Wednesday that its quantum computer achieved the first verifiable quantum advantage, running a new algorithm 13,000 times faster than a top supercomputer. The algorithm, called Quantum Echoes, was published in the journal Nature. The results can be replicated on another quantum computer of similar quality, something Google had not demonstrated before. The quantum computer uses a chip called Willow, which was announced in December 2024. Hartmut Neven, head of Google's Quantum AI research lab, called the work a demonstration of the first algorithm with verifiable quantum advantage and a milestone on the software track.

Michel H. Devoret, who won this year's Nobel Prize in Physics and joined Google in 2023, said future quantum computers will run calculations impossible with classical algorithms. Google stopped short of claiming the work would have practical uses on its own. Instead, the company said Quantum Echoes demonstrated a technique that could be applied to other algorithms in drug discovery and materials science.

A second paper published Wednesday on arXiv showed how the method could be applied to nuclear magnetic resonance. The experiment involved a relatively small quantum system that fell short of full practical quantum advantage because it was not able to work faster than a traditional computer. Google exhaustively red-teamed the research, putting some researchers to work trying to disprove its own results.

Prineha Narang, a professor at UCLA, called the advance meaningful. The quantum computer tested two molecules, one with 15 atoms and another with 28 atoms. Results on the quantum computer matched traditional NMR and revealed information not usually available from NMR. Google's research competes against Microsoft, IBM, universities and efforts in China. The Chinese government has committed more than $15.2 billion to quantum research. Previous claims of quantum advantage have been met with skepticism.


Comments Filter:
  • Chat bots are the future, and this isn't a chat bot.

    • Everybody is so negative these days. Why not say, "I'm glad to hear google is doing real physics-based research with big potential for long-term impact instead of just chat bots!"
      • They've followed the same pattern for a quarter of a century... make something people love... get 99.9% of the way to a major breakthrough... force people at gunpoint to use something... and then kill the project. It's not going to be any different this time.

      • Haters are gonna hate. It's all they have. It's so sad too.

    • by gweihir ( 88907 )

      There is no product. Just a gigantic misdirection operation. QCs do not and cannot scale to anything useful in terms of computation capabilities. You can do Physics experiments with these things, but that is it.

      • by gweihir ( 88907 )

        The truth hurts if you live your life in delusions. Moderating things down does not change the truth.

    • It is if you are a cat. I don't know why, but quantum is just like that: you stop asking "why".

  • "Quantum Leap" was RIGHT THERE! How could you miss it?

  • 100 Big corporation publishes paper claiming to finally achieve quantum supremacy
    200 Wait a few days
    300 Some random math guy demonstrates how to calculate the same thing faster on a conventional consumer laptop
    400 GOTO 100

    • 100 Big corporation publishes paper claiming to finally achieve quantum supremacy
      200 Wait a few days
      300 Some random math guy demonstrates how to calculate the same thing faster on a conventional consumer laptop
      400 GOTO 100

      No wonder we were told to avoid goto loops.

  • The whole thing is a giant lie by misdirection. The actual factorization record, for example, after 50 years of research (!) is 35. Not 35 bits, 35. And that was not even done with Shor's algorithm but a special one that can only factor 35.

    The fact of the matter is that QC effort scales exponentially (!) in the number of bits and (!) in the length of the calculation. This tech will not ever go anywhere for computations. We may find out something about quantum effects, and that makes the research worthwhile, but not as a computing technology.

    • That's why they renamed the department the Quantum AI research lab. When you are going full bullshit, might as well go all in and let AI do the bullshitting for you.
      • by gweihir ( 88907 )

        Ha! That does make a lot of sense. Also, I see from being down-moderated that the scam apparently works nicely on a ton of people. Idiots in denial, the usual.

  • by michaelmalak ( 91262 ) <michael@michaelmalak.com> on Wednesday October 22, 2025 @01:00PM (#65743336) Homepage

    Willow can mutually entangle 105 qubits; this is not like IBM's Condor with 1121 qubits where it is suspected only a dozen or two qubits can be mutually entangled.

    The internal state of 105 mutually entangled qubits corresponds to 2^105 * 2 (for phase & amplitude) * 3 bits (per analog value, akin to SSD TLC) / 8 bits per byte = 3 * 10^31 bytes = 30,000,000 yottabytes.

    Notice that I said "internal state". Reading those values out is the challenge, usually involving re-running the same program over and over to get a probability distribution. Once you measure, all the quantum states collapse.
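
    A quick sanity check of the arithmetic above, using the same back-of-the-envelope assumptions from this comment (105 qubits, two analog values per amplitude, 3 bits per analog value):

```python
# Back-of-the-envelope check of the state-space estimate in the parent comment.
# Assumptions (from the comment, not established fact): 105 mutually entangled
# qubits, 2 analog values per amplitude (magnitude & phase), 3 bits per analog
# value (akin to SSD TLC).
qubits = 105
analog_values_per_amplitude = 2   # magnitude and phase
bits_per_analog_value = 3         # akin to SSD TLC

total_bits = 2**qubits * analog_values_per_amplitude * bits_per_analog_value
total_bytes = total_bits // 8
yottabytes = total_bytes / 1e24   # 1 yottabyte = 10^24 bytes

print(f"{total_bytes:.2e} bytes ~ {yottabytes:,.0f} yottabytes")
# prints: 3.04e+31 bytes ~ 30,423,614 yottabytes
```

    So "3 * 10^31 bytes = 30,000,000 yottabytes" is the right order of magnitude (more precisely, about 30.4 million yottabytes).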

    • by gweihir ( 88907 )

      105 qubits? And how long do they stay entangled when you actually do computations with them? Sounds pretty pathetic to me. And no, you cannot even factor a 32-bit number with that, as that is already wayyy too complex.

      Throwing around large numbers does not make this thing any less useless.

      • Beauregard [arxiv.org] can factor an n-bit number with 2n+3 qubits, so 105 qubits can handle n = 51 bits.
        • by gweihir ( 88907 )

          Which is probably much worse, because the increased depth leads to higher decoherence risk. But it hardly matters. This stuff will never scale to useful sizes with useful computation lengths. Nothing exponential ever does unless the sizes are tiny. For QC workloads, useful sizes are large. Factoring 35 does nothing. With the one exception of running Physics experiments, but those are not computations, no matter the lies.

  • If you can't make a better quantum computer to solve algorithms you are interested in, just make an algorithm that your hardware performs well on! Maybe they can get some tips on this from Oracle, VW and Samsung.
  • I'm waiting for the quantum LLM processing, where it makes its own algorithm and doesn't know how to say what it is. Once the processing is both quantum and AI, the hype level should tear a hole in reality, which will be momentarily fun to watch.
    • by gweihir ( 88907 )

      At current scaling progress, you probably need to wait a few billion years for a QC that can actually run a small LLM. Or maybe it is not even possible in this universe.
