Physicists Extend Qubit Lifespan In Pivotal Validation of Quantum Computing (sciencealert.com)
An anonymous reader quotes a report from ScienceAlert: Keeping enough qubits in their ideal state long enough for computations has so far proved a challenge. In a new experiment, scientists were able to keep a qubit in that state for twice as long as normal. Along the way, they demonstrated the practicality of quantum error correction (QEC), a process that keeps quantum information intact for longer by introducing room for redundancy and error removal. The idea of QEC has been around since the mid-90s, but it's now been shown to work in real time. Part of the reason for the experiment's success was the introduction of machine-learning algorithms to tweak the error correction routine.
"For the first time, we have shown that making the system more redundant and actively detecting and correcting quantum errors provided a gain in the resilience of quantum information," says physicist Michel Devoret, from Yale University in Connecticut. [...] Like many quantum physics experiments, this one was run at ultra-cold temperatures -- a hundred times colder than outer space, in this case. The setup has to be carefully controlled in order to protect the qubit as much as possible. The error-corrected qubit lasted for 1.8 milliseconds -- only a blink as we might experience it, but an impressive span for a qubit operating on the quantum level. Now the research team will be able to refine the process further. "Our experiment shows that quantum error correction is a real practical tool," says Devoret. "It's more than just a proof-of-principle demonstration." In this case the breakthrough was down to several different factors, rather than one change. The QEC code was actually one from 2001, but improvements to it as well as upgrades to the quantum circuit fabrication process made a difference.
"Our experiment validates a cornerstone assumption of quantum computing, and this makes me very excited about the future of this field," says Volodymyr Sivak, a research scientist at Google and formerly at Yale University.
The research has been published in Nature.
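To make the redundancy idea in the excerpt concrete, here is a minimal classical sketch in Python (my own illustration of the principle, not the bosonic code the Yale team actually used): one logical bit is spread across three physical bits, parity checks detect a single flip without reading the logical value, and a majority vote corrects it.

    import random

    def encode(bit):
        # Spread one logical bit redundantly across three physical bits.
        return [bit, bit, bit]

    def noisy_channel(bits, p_flip=0.1):
        # Independently flip each physical bit with probability p_flip.
        return [b ^ 1 if random.random() < p_flip else b for b in bits]

    def syndrome(bits):
        # Parity checks compare neighbouring bits without revealing the logical value.
        return (bits[0] ^ bits[1], bits[1] ^ bits[2])

    def correct(bits):
        # Each non-trivial syndrome points at the bit that flipped; undo the flip.
        flip_at = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(syndrome(bits))
        if flip_at is not None:
            bits[flip_at] ^= 1
        return bits

    def decode(bits):
        # Majority vote recovers the logical bit if at most one flip occurred.
        return int(sum(bits) >= 2)

    # Any single bit flip is detected and corrected; an unencoded bit would simply be lost.
    print(decode(correct(noisy_channel(encode(1)))))

Real quantum error correction has to measure such syndromes without collapsing the encoded quantum state, and guard against phase errors as well, but the redundancy-plus-syndrome-measurement structure is the same.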
"For the first time, we have shown that making the system more redundant and actively detecting and correcting quantum errors provided a gain in the resilience of quantum information," says physicist Michel Devoret, from Yale University in Connecticut. [...] Like many quantum physics experiments, this one was run at ultra-cold temperatures -- a hundred times colder than outer space, in this case. The setup has to be carefully controlled in order to protect the qubit as much as possible. The error-corrected qubit lasted for 1.8 milliseconds -- only a blink as we might experience it, but an impressive span for a qubit operating on the quantum level. Now the research team will be able to refine the process further. "Our experiment shows that quantum error correction is a real practical tool," says Devoret. "It's more than just a proof-of-principle demonstration." In this case the breakthrough was down to several different factors, rather than one change. The QEC code was actually one from 2001, but improvements to it as well as upgrades to the quantum circuit fabrication process made a difference.
"Our experiment validates a cornerstone assumption of quantum computing, and this makes me very excited about the future of this field," says Volodymyr Sivak, a research scientist at Google and formerly at Yale University.
The research has been published in Nature.
A hundred times colder than outer space? (Score:1)
And how exactly was that calculated? Just curious.
brb (Score:1)
changing my password to 2048 char
Re: (Score:2)
I recently changed mine to hunter2^11
50 years. (Score:3)
It seems like it will be 50 years before we get past the most rudimentary computations requiring quantum computing. The concept of quantum computing is fascinating, but there simply isn't a wide variety of applications for it. I have no doubt it will continue to be pushed because of its usefulness in breaking cryptography, but beyond that I don't see much promise outside of very specialized applications.
I hope that in 100 years I will look like a total idiot, akin to the critics talking about computers in the 1940s, but I fear that I will not.
Re: (Score:3)
The concept of quantum computing is fascinating, but there simply isn't a wide variety of applications for it.
That was also true of today's computers that were once described as a solution looking for a problem.
Share an independent source to support that assertion, please. The fact is that computers were very useful from day one, both before and after ENIAC. They could immediately tackle practical problems much faster than people could deal with by any other means at the time. Which is not at all the case with quantum computers, after more than a decade of intensive research and humongous funding.
Yea, figuring out... (Score:1)
...the highest factor of 2 to the 18th power in 52 minutes is pretty useful stuff for the Everyman.
Re: (Score:1)
Very snappy and condescending, but also retarded, because analog computers existed at the time and there was already a need for business machines, and because the highest prime factor of 2^18 is 2, you nutsack.
Re: (Score:2)
They could immediately tackle practical problems much faster than people could deal with by any other means at the time.
Re: (Score:2)
While current quantum computers are basically useless, there are many quantum algorithms that would be huge. For example, with Monte Carlo integration on a quantum machine the error in the mean goes as 1/N in the number of trials, compared to 1/sqrt(N) classically. Such an algorithm would upend engineering, chemistry, biology, finance, etc. (A. Montanaro, http://dx.doi.org/10.1098/rspa... [doi.org]).
The question is when quantum computers will have enough fidelity to run such an algorithm. This paper is a step. Or half a step.
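To see what that scaling difference looks like, here is a purely classical sketch in Python (nothing here comes from the paper, and the quantum counterpart relies on amplitude estimation, which won't run on a laptop): it estimates a simple integral by Monte Carlo and shows the error shrinking like 1/sqrt(N), whereas the quantum algorithm Montanaro analyses would shrink it roughly like 1/N.

    import math
    import random

    def mc_estimate(n_samples):
        # Classical Monte Carlo estimate of the integral of sin(x) over [0, pi]; exact value is 2.
        total = sum(math.sin(random.uniform(0.0, math.pi)) for _ in range(n_samples))
        return math.pi * total / n_samples

    exact = 2.0
    for n in (10**2, 10**4, 10**6):
        # Average the absolute error over a few runs to smooth out the noise.
        err = sum(abs(mc_estimate(n) - exact) for _ in range(10)) / 10
        # err * sqrt(N) stays roughly constant: the classical 1/sqrt(N) convergence rate.
        print(f"N={n:>8}  error~{err:.4f}  error*sqrt(N)~{err * math.sqrt(n):.2f}")

Halving the exponent on the error is exactly the kind of gain that only pays off once the hardware can run deep circuits faithfully, which is why fidelity is the open question.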
Validation? Bullshit! (Score:1)
It is still unclear whether this will ever be of any practical use for computations.
At what point will quantum computers be. (Score:1)
...used for improving future quantum computers? That will be the tipping point. And what about when AI goes quantum? It will be all over for biological beings.
Re: (Score:2)
Well, when I first looked into QCs, they could factor 12. So a factor of (less than) 2x in 35 years! And now maybe another factor of 2x. If they continue at this pace, the sun will explode before they catch up with a cheap PC.
Quantum Squantum (Score:2)
I suspect I've got a lot of good company.
Still haven't answered the critical question (Score:2)
Is the amount of energy required to keep coherence of N qubits exponential in N? If so, quantum computers are mere curiosities and won't affect anything.
In a way, it would make sense for that to be true: the laws of physics conspiring against "magic".
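To spell out why that would be fatal (my notation; none of this comes from the article), compare a hypothetical exponential energy cost with a polynomial one:

    % Hypothetical cost models for keeping N qubits coherent (illustrative only).
    %   exponential:  E_exp(N)  = E_0 e^{\alpha N}
    %   polynomial:   E_poly(N) = E_0 N^{k}
    \[
      \frac{E_{\mathrm{exp}}(N)}{E_{\mathrm{poly}}(N)}
      = \frac{e^{\alpha N}}{N^{k}}
      \longrightarrow \infty
      \quad \text{as } N \to \infty,
    \]

so even a small alpha would price out the thousands of logical qubits that useful algorithms need. The standard fault-tolerance threshold theorem predicts polynomial qubit overhead rather than exponential (assuming physical error rates below a threshold), though it speaks to qubit and gate counts rather than energy directly.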