

Google's Quantum Computer Makes a Big Technical Leap (nytimes.com)
Google announced Wednesday that its quantum computer achieved the first verifiable quantum advantage, running a new algorithm 13,000 times faster than a top supercomputer. The algorithm, called Quantum Echoes, was published in the journal Nature. The results can be replicated on another quantum computer of similar quality, something Google had not demonstrated before. The quantum computer uses a chip called Willow, which was announced in December 2024. Hartmut Neven, head of Google's Quantum AI research lab, called the work a demonstration of the first algorithm with verifiable quantum advantage and a milestone on the software track.
Michel H. Devoret, who won this year's Nobel Prize in Physics and joined Google in 2023, said future quantum computers will run calculations impossible with classical algorithms. Google stopped short of claiming the work would have practical uses on its own. Instead, the company said Quantum Echoes demonstrated a technique that could be applied to other algorithms in drug discovery and materials science.
A second paper published Wednesday on arXiv showed how the method could be applied to nuclear magnetic resonance. The experiment involved a relatively small quantum system that fell short of full practical quantum advantage because it was not able to work faster than a traditional computer. Google exhaustively red-teamed the research, putting some researchers to work trying to disprove its own results.
Prineha Narang, a professor at UCLA, called the advance meaningful. The quantum computer tested two molecules, one with 15 atoms and another with 28 atoms. Results on the quantum computer matched traditional NMR and revealed information not usually available from NMR. Google's research competes against Microsoft, IBM, universities and efforts in China. The Chinese government has committed more than $15.2 billion to quantum research. Previous claims of quantum advantage have been met with skepticism.
Product killed because it's not a chat bot (Score:2)
Chat bots are the future, and this isn't a chat bot.
It's google (Score:2)
They've followed the same pattern for a quarter of a century... make something people love... get 99.9% of the way to a major breakthrough... force people at gunpoint to use something... and then kill the project. It's not going to be any different this time.
Half [qbit] flaH (Score:1)
Re: (Score:1)
Haters are gonna hate. It's all they have. It's so sad too.
Re: (Score:1)
There is no product. Just a gigantic misdirection operation. QCs do not and cannot scale to anything useful in terms of computation capabilities. You can do Physics experiments with these things, but that is it.
Re: (Score:2)
The truth hurts if you live your life in delusions. Moderating things down does not change the truth.
Re:killed because it's not a chat bot (Score:1)
It is if you are a cat. I don't know why, but quantum is just like that: you stop asking "why".
WTF, headline writers? (Score:2)
"Quantum Leap" was RIGHT THERE! How could you miss it?
Re: (Score:2)
More like "tiny quantum hop".
Re: (Score:1)
> My prediction is quantum computing will never produce anything of value, same as AI, same as bitcoin, same as NFTs. Just a dumb grift.
After 50 years of research with no computing capabilities to speak of as of today, this prediction is easy to make. The only thing you need to do is look at actual facts instead of getting blinded by flowery language and empty promises.
Re: (Score:2)
And by the negative moderation, we can see nicely that the scam works well on the usual idiots.
Re: (Score:2)
> Google stopped short of claiming the work would have practical uses on its own

Just like AI slop, never any actual examples of it working as advertised. Just 'it's slop now, but in a few years it will take over the world.' My prediction is quantum computing will never produce anything of value, same as AI, same as bitcoin, same as NFTs. Just a dumb grift.
Just you wait. One day, the AI slopo-sphere will merge completely with quantum computing and then the whole world will be in utter AWE of the quantum-powered slopo-sphere!
Re: (Score:1)
Wait wait wait....the quantum-powered slopo-sphere doesn't have crypto. You meant to say "quantum-powered slopo-sphere...on the blockchain!"
Re: (Score:2)
> Wait wait wait....the quantum-powered slopo-sphere doesn't have crypto. You meant to say "quantum-powered slopo-sphere...on the blockchain!"
Somewhere a trend-hopper just passed out.
Re: (Score:3)
Actually, for simulating molecules, quantum computers OUGHT to have a big advantage. But it's not clear to me that for this purpose Google has an advantage over D-Wave. And that's a really special case (though sometimes important).
Quantum algorithm so far (Score:2)
Re: (Score:2)
No wonder we were told to avoid goto loops.
No, it does not (Score:1)
The whole thing is a giant lie by misdirection. The actual factorization record, for example, after 50 years of research (!) is 35. Not 35 bits, 35. And that was not even done with Shor's algorithm but a special one that can only factor 35.
The fact of the matter is that QC effort scales exponentially (!) in the number of bits and (!) in the length of the calculation. This tech will not ever go anywhere for computations. We may find out something about quantum effects, and that makes research worthwhile, bit
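For context on the factoring claim: in Shor's algorithm, only the order-finding step needs a quantum computer; the surrounding number theory is classical. A minimal classical Python sketch (brute-forcing the order, purely for illustration) shows why factoring 35 by itself proves little about quantum capability:

    from math import gcd

    def classical_order(a, n):
        # Brute-force the multiplicative order of a mod n.
        # This is the one step Shor's algorithm replaces with a quantum
        # period-finding circuit; done classically it is exponential in
        # the bit length of n.
        r, x = 1, a % n
        while x != 1:
            x = (x * a) % n
            r += 1
        return r

    def factor_via_order(n, a=2):
        # Classical post-processing as in Shor's algorithm: an even order r
        # with a^(r/2) != -1 (mod n) yields a nontrivial factor via gcd.
        g = gcd(a, n)
        if g != 1:
            return g          # the guess a already shares a factor with n
        r = classical_order(a, n)
        if r % 2:
            return None       # odd order: retry with a different a
        y = pow(a, r // 2, n)
        if y == n - 1:
            return None       # trivial square root: retry with a different a
        return gcd(y - 1, n)

    print(factor_via_order(35))   # prints 7, so 35 = 7 * 5

All of the quantum speedup would come from replacing classical_order with quantum period finding; everything else runs fine on a laptop.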
Re: (Score:2)
Ha! That does make a lot of sense. Also, you can see from my being down-moderated that the scam apparently works nicely on a ton of people. Idiots in denial, the usual.
30 million yottabytes (Score:3)
Willow can mutually entangle 105 qubits; this is not like IBM's Condor with 1121 qubits where it is suspected only a dozen or two qubits can be mutually entangled.
The internal state of 105 mutually entangled qubits corresponds to 2^105 amplitudes * 2 (for phase & amplitude) * 3 bits per analog value (akin to SSD TLC) ≈ 2.4 * 10^32 bits ≈ 3 * 10^31 bytes = 30,000,000 yottabytes.
Notice that I said "internal state". Reading those values out is the challenge, usually involving re-running the same program over and over to get a probability distribution. Once you measure, all the quantum states collapse.
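As a sanity check, here is the same back-of-envelope calculation in Python. The 2 analog values per amplitude and 3 bits of precision are the parent post's assumptions, not anything Google has specified:

    qubits = 105
    values_per_amplitude = 2      # phase & amplitude, per the parent post
    bits_per_value = 3            # TLC-like precision, per the parent post

    total_bits = 2**qubits * values_per_amplitude * bits_per_value
    total_bytes = total_bits / 8              # ~3.0e31 bytes
    print(total_bytes / 1e24)                 # ~3.0e7, i.e. about 30 million yottabytes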
Re: (Score:2)
105 qubits? And how long do they stay entangled when you actually do computations with them? Sounds pretty pathetic to me. And no, you cannot even factor a 32-bit number with that, as that is already wayyy too complex.
Throwing around large numbers does not make this thing any less useless.
Re: (Score:2)
Which probably is much worse, because the increased depth leads to higher decoherence risk. But it hardly matters. This stuff will never scale to useful sizes with useful computation lengths. Nothing exponential ever does unless the sizes are tiny, and for QC workloads the useful sizes are large. Factoring 35 does nothing. The one exception is running physics experiments, but those are not computations, no matter the lies.
Make a better algorithm (Score:2)
Algorithm? (Score:2)
Re: (Score:2)
At current scaling progress, you probably need to wait a few billion years for a QC that can actually run a small LLM. Or maybe it is not even possible in this universe.