Scientists Break Quantum Entanglement Record At 18 Qubits (zmescience.com)
hackingbear writes: Researchers at the University of Science and Technology of China have demonstrated stable quantum entanglement with 18 qubits, surpassing the previous world record of 10, also held by the same team. This represents a step toward realizing large-scale quantum computing, according to a recent study published in the journal Physical Review Letters. Physicist Pan Jianwei and his colleagues achieved the new record by simultaneously exploiting three different degrees of freedom (path, polarization, and orbital angular momentum) of six photons, the fundamental particles of light. The combination resulted in a stable 18-qubit state. Full control over the number of entangled particles is fundamental to quantum information processing, according to the study. There are early-stage quantum computers out there that boast more qubits -- such as IBM's 50-qubit machine and Google's 72-qubit Bristlecone -- but in those cases, the individual quantum states of the qubits aren't (fully) controllable. "The team's next step will be to realize a 50-qubit entanglement and manipulation," according to Wang Xilin, a member of the team. The same research team also holds the world record for quantum communication distance and operates the world's first quantum communication satellite.
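A quick sanity check of the numbers in the summary (a minimal sketch; the six-photon, three-degrees-of-freedom decomposition is from the article, the rest is standard qubit arithmetic):

    # Each of the 6 photons carries 3 two-level degrees of freedom
    # (path, polarization, orbital angular momentum): 6 * 3 = 18 qubits.
    photons = 6
    degrees_of_freedom = 3
    qubits = photons * degrees_of_freedom
    print(qubits)       # 18

    # An 18-qubit entangled state lives in a 2**18-dimensional state space.
    print(2 ** qubits)  # 262144 basis states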
Re:What is going on? (Score:5, Funny)
It means the writer believes that people who understand the phrase "simultaneously exploiting three different degrees of freedom (path, polarization, and orbital angular momentum)" might not know what photons are.
Re: (Score:2)
Imagine all the pretty messages we could make if we had access to all of Unicode!
Or maybe Apple could start using apostrophes.
Re: (Score:2)
Because they type on Apple keyboards and the artists who create those keyboards think that '’' is prettier than a plain old apostrophe.
Re: (Score:1)
Well, not really.
Classical processors will never be able to solve some problems efficiently (unless P = NP).
If QC doesn't deliver... it doesn't deliver. It's not going to become irrelevant due to classical computing. Either it works (and really is doing quantum computing), providing access to scaling that classical computers never will... or it doesn't.
Re: The problem with quantum computing (Score:1)
So your cat may or may not be alive?
Re:The problem with quantum computing (Score:4, Interesting)
I think you have no idea what QC actually is.
QC is limited only by the size of a stable system that you can build. But the stable systems you can build give you any/all/every answers IMMEDIATELY. That's unprecedented in standard computing. The bigger the system you can stabilise, the bigger the questions you can answer immediately (i.e. factorise this 2048-bit number), and you can answer ALL such questions in that same timeframe (so once you can break one 2048-bit key instantly, you can break them all instantly).
That's getting some SERIOUS funding from military sources, not to mention all the other questions that it can answer. Investors won't go away until the horse has been proven dead and flogged for decades, but we still just keep advancing (sure, it's not super fast, but when computing was brand new, one transistor was the size of an apple and very limited... 40 years later, we had GHz microprocessors on the head of a pin).
However, classical processors have hit size limits (go too large and the signals have to be asynchronous to stay consistent while propagating at the speed of light, and we haven't even got those kinds of chips yet), speed limits (the same speed-of-light problem), thermal limits, etc., and haven't significantly advanced in years. There's a reason that every desktop is still only "2 GHz" or so (maybe 3-4 GHz in short bursts, but not sustained performance), and the fastest mainframes in the world are still in that order of magnitude. I had a 2 GHz machine in 2000-something. 18 years hasn't made that much of a difference to classical processor speeds in consumer kit at room temperature! Sure, we have multiple cores, but that just makes the programming harder and the processor hotter, and we KNOW that not everything you want to do scales in parallel.
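The speed-of-light point is easy to sanity-check (a rough sketch; the 3 GHz clock is just an illustrative assumption):

    # Distance light travels in one clock cycle: d = c / f.
    c = 3.0e8  # speed of light in vacuum, m/s
    f = 3.0e9  # illustrative 3 GHz clock, Hz
    print(f"{(c / f) * 100:.0f} cm per cycle")  # ~10 cm

    # On-chip signals propagate well below c, so the region a single
    # clock can keep synchronous is only a few cm -- roughly die scale,
    # which is one reason clock speeds stalled while core counts grew.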
Classical will never make QC irrelevant, it's an entirely different kettle of fish. If anything QC will render traditional computing entirely insecure overnight. We are already having to make QC-safe algorithms now, we're so worried about what might happen if someone builds a decent-size one.
The first nation to perfect QC is going to be a world leader... it'll be bigger than the space race and the nuclear race combined. Not just from a military point of view, but also from the sheer number of problems that your physicists and mathematicians and engineers can just say "Solve this, I've reworded it in QC language" (which we're already preparing, and were before physical QC machines even existed!), and they literally get the answer as soon as they press Go. You will literally advance science overnight just by having a single QC machine large enough (enough qubits) to run a decent-size QC algorithm on... it will answer questions that we currently can't answer with a billion years of traditional computation, overnight. 1000 qubits is all you need to change the world, most likely. Given that the record is beaten every year, that's not far off.
Re: (Score:3)
Classical will never make QC irrelevant
That's like saying that chisels will never make hammers irrelevant.
"QC" will never make "classical" irrelevant either.
Your use of the words "classical" and "traditional" when referring to existing computers is very loaded and highly questionable. Why not just call them "digital" computers? Do you have a hidden agenda?
Re: (Score:1)
Saying "digital" is inaccurate. Both quantum computing and classic computing have digital components.
"classical" is the term used in scientific Quantum Computing literature to refer to the current way we do computing.
The people who write that literature have a hidden agenda*, yes. It's logical that they'll use language like that.
(*) ie. Getting more funding.
Re: (Score:2, Informative)
"Classical" is the term use by physicists to refer to pre-GR and pre-QM physical theories. That you don't know it it's not his fault.
Re: (Score:2)
"Classical" is the term use by physicists to refer to pre-GR and pre-QM physical theories.
Which is exactly why we shouldn't call modern digital computers "classical", since without an understanding of quantum mechanics, we would not be able to make the chips that they are built out of.
"Classical computers" may be then abacuses, Babbage's difference engine, and early electronic computers (e.g. those built out of vacuum tubes), but certainly not anyithing built with integrated circuits since the 1960s.
Re:The problem with quantum computing (Score:4, Informative)
The bigger the system you can stabilise, the bigger the questions you can answer immediately (i.e. factorise this 2048-bit number), and you can answer ALL such questions in that same timeframe (so once you can break one 2048-bit key instantly, you can break them all instantly).
That is simply not true.
http://www.quantumforquants.org/quantum-computing/limits-of-quantum-computing/
Re:The problem with quantum computing (Score:5, Interesting)
QC is limited only by the size of a stable system that you can build. But the stable systems you can build give you any/all/every answers IMMEDIATELY.
Looks like you are the clueless one here. No, it does not give you answers immediately at all. You still have to feed in the data and do computation steps, and you have to do this slowly and carefully to avoid decoherence. And if it decoheres, you have to do everything again from scratch. And due to noise, you either have to add a lot of error-correcting steps or run it many times. There is nothing "immediate" here.
Re: (Score:1)
Sigh.
That's the practicalities of the prototype systems.
The answer is immediate. O(1). Not O(n) or worse as in any classical system.
It might take a day to set up the machine. The answer is immediate. It might take a day to set up for the next calculation. But the answer is immediate.
The practicalities of feeding in the data and receiving the answer are irrelevant compared to a billion years of intensive conventional computation to factor a huge semiprime, for example.
Re: (Score:1)
Shor's algorithm doesn't run in O(1) time.
Re: (Score:1)
Neither does Grover's, which is O(√N).
Re:The problem with quantum computing (Score:5, Informative)
The answer is immediate. O(1). Not O(n) or worse as in any classical system.
I'm not sure how you could have come to believe this ... but it's incorrect. Quantum computers can use algorithms that scale better than classical ones - say, scaling as O(n^2) rather than O(e^n) - but they don't generate an answer *immediately*.
For example, Shor's algorithm [wikipedia.org] for factorisation runs in O((log N)^2 * log log N * log log log N) time, while the classical general number field sieve [wikipedia.org], which does the same thing, runs in (roughly) O(e^(1.9 * (log N)^0.33 * (log log N)^0.66)) time. That's a massive improvement -- going from subexponential to polynomial time -- but it's still not instant.
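To put rough numbers on that gap, here is a sketch (constant factors and per-operation costs are ignored; the 1.9 constant is the commonly quoted GNFS exponent):

    import math

    def shor_ops(n_bits: int) -> float:
        # Shor: O((log N)^2 * loglog N * logloglog N), with log N ~ n_bits.
        return n_bits ** 2 * math.log2(n_bits) * math.log2(math.log2(n_bits))

    def gnfs_ops(n_bits: int) -> float:
        # GNFS: exp(1.9 * (ln N)^(1/3) * (ln ln N)^(2/3)).
        ln_N = n_bits * math.log(2)
        return math.exp(1.9 * ln_N ** (1 / 3) * math.log(ln_N) ** (2 / 3))

    for bits in (512, 1024, 2048):
        print(f"{bits}-bit: Shor ~{shor_ops(bits):.1e} ops, GNFS ~{gnfs_ops(bits):.1e} ops")

Both counts grow with the input; the quantum one just grows polynomially instead of subexponentially.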
Re: (Score:1)
First off, O(1) isn't immediate. It could take 1,000 years. It just guarantees that the time is bounded by a constant, regardless of the input.
Quantum computing is not nearly the panacea you believe it to be. Your above posts show you have a very frail grasp of the capabilities of a large and reliable quantum computer. It cannot just take any problem reworded in quantum language and crunch an answer instantly. In fact, there are few problems which can be efficiently computed on a QC. All a QC will do is reduce (not eliminate) the time some of those problems take.
Re: (Score:2)
"Immediate" is O(0). O(1) is "you have to wait, but how long does not really depend on the input". Does nobody understand the fundamental definitions anymore?
Re: (Score:2)
Sigh.
That's the practicalities of the prototype systems.
The answer is immediate. O(1). Not O(n) or worse as in any classical system.
Sounds like someone who never built anything practical of scale (and also, as other /.ers have pointed out, doesn't know what O(1) means*).
"The practicalities" of real systems are often the things that matter most. An algorithm that solves a problem in O(n^2) will finish faster on a modern computer than an algorithm that solves the same problem in O(nlogn) running on a computer from 1975. Big-O notation was invented essentially as a way to "neutrally" evaluate algorithms without taking into account the actu
Re: (Score:2)
Not to mention that some problems sped up by quantum computation (like Grover's algorithm) still won't have an instantaneous answer, but instead can be solved a good amount faster.
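Concretely, Grover's algorithm needs on the order of (pi/4)*sqrt(N) oracle queries for an unstructured search over N items, versus ~N/2 expected for classical brute force -- a big speedup, but nowhere near instantaneous (a sketch; the key sizes are illustrative):

    import math

    # Unstructured search over N = 2**k items:
    # classical brute force ~N/2 expected queries, Grover ~(pi/4)*sqrt(N).
    for k in (32, 64, 128):
        N = 2 ** k
        print(f"{k}-bit space: classical ~{N / 2:.1e}, Grover ~{math.pi / 4 * math.sqrt(N):.1e}")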
Not true (Score:1)
A Feynman-style theoretical quantum computer would go through all states instantaneously and find a perfect solution. But nobody has such a thing.
What they have is analogue computers using various properties, e.g. superconducting magnetics, spinning calcium atoms, etc. These they try to configure into a circuit to solve a problem.
It's not instantaneous, these networks don't settle down immediately, and it doesn't always get an optimal result (in cases where alternate digital algos exist, those have found better solutions).
Re: (Score:3)
I think you have no idea what QC actually is
Great, I thought you were going to explain what it actually is, instead you explain what it might be able to do. This in a nutshell is why most people poopoo quantum computing, no one 'in the know' actually explains what it physically is. Instead we get to hear about theories and more rounds of taxpayer funding.
Re:The problem with quantum computing (Score:5, Interesting)
Probably. With the progress they are making, they will most assuredly not deliver in the next 50 years, and without a fundamental breakthrough (not on the horizon, and such things cannot be planned or forced) it may take 1000 years or longer for this to become useful at all. At the moment, they seem to be able to add about 1 Qbit/year for actual computations. And the possibility that this scales sub-linearly is not off the table at all.
Time for the hype to die down, there is nothing useful this technology can do.
Re: (Score:2)
You are assuming the Chinese are telling the truth about what they've done and not performing a tail wiggling exercise for their State and its funding.
Re the fundamentalist Christian state, let me remind you the fundamentalists have a direct line to G-d through the Archangels....Gomer and Goober.
A few more bits... (Score:2)
Re: (Score:1)
Re:A few more bits... (Score:4, Informative)
Actually quite a few more. Even to break an ECC modulus, they need about 300 more qubits. With the scaling of around 1Qbit/year since 2001 (when they factored 15 on a 7 Qbit machine), my guess would be that it will take a few centuries to get there.
Silicon scaled exponentially almost from the beginning and continued to do so for a long time. That is what makes it powerful today. QCs have never scaled better than linearly and are still at a ridiculously useless size as a consequence, after about 30 years of applied research. They may also well scale sub-linearly. The whole thing would have been dropped as a dead-end a while ago, except that many humans run after every hype that tickles their fantasy.
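For comparison, a sketch of what linear versus exponential scaling would mean for reaching the ~300-qubit figure mentioned above (the 7-qubits-in-2001 baseline and 1 qubit/year rate are the parent's numbers; the 2-year doubling is a hypothetical Moore's-law-style rate):

    # Parent's numbers: 7 qubits in 2001, ~1 qubit/year since.
    start_year, start_qubits, target = 2001, 7, 300

    # Linear: 1 qubit/year.
    print("linear:", start_year + (target - start_qubits))  # 2294

    # Hypothetical exponential: doubling every 2 years.
    year, q = start_year, start_qubits
    while q < target:
        q *= 2
        year += 2
    print("doubling every 2 years:", year)  # 2013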
Re: (Score:2)
With the scaling of around 1Qbit/year since 2001 (when they factored 15 on a 7 Qbit machine), my guess would be that it will take a few centuries to get there.
Likely to become non-linear progress, though.
Re: (Score:2)
Indeed. Looks very much like it is actually sub-linear and will hit a wall pretty soon. The effort invested today is massively larger than back in 2001 and they still have only a pathetic linear increase to show for it.
Re: (Score:2)
Couldn't they just link multiple of these machines together to reach that magic 300 Qbit number?
No. To be useful, multiple qubits have to be entangled, and share a single superposition.
Re: (Score:3)
That is the big problem with QCs: computations on them are not divisible into smaller sub-tasks. If you have an arbitrarily large number of 299-Qbit QCs, they are completely worthless for solving even a single 300-Qbit problem.
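A small numpy illustration of that point (a sketch; it just checks that a two-qubit Bell state cannot be written as a tensor product of two single-qubit states, which is what "sharing a single superposition" rules out):

    import numpy as np

    # Bell state (|00> + |11>) / sqrt(2) as a length-4 amplitude vector.
    bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)

    # Two independent qubits always form a product state a (x) b, whose
    # 2x2 amplitude matrix is an outer product and hence has rank 1.
    rank = np.linalg.matrix_rank(bell.reshape(2, 2))
    print(rank)  # 2 -> entangled: no two separate machines can hold it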
18 or 20 qbits? (Score:2)
One of the linked articles [sciencealert.com] claims that another team set the 'real' record at 20 qbits...
Re: (Score:2)
Read that linked article again, carefully.
Broken (Score:2)
Wow... they broke quantum entanglement? Nice!
Re: (Score:2)
Due to the nature (and implementation) of Shor's Algorithm, which is used for factoring large integers on a quantum computer, the number of qubits needed to factor an n-bit integer is actually 2n+3. So you can snooze a bit longer... ;)
Source: https://arxiv.org/abs/quant-ph/0205095
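Plugging common RSA key sizes into that 2n+3 formula (a sketch; these are logical qubits only, ignoring error-correction overhead):

    # The circuit from the arXiv paper above needs 2n + 3 logical
    # qubits for an n-bit modulus.
    for n in (1024, 2048, 4096):
        print(f"RSA-{n}: {2 * n + 3} logical qubits")
    # RSA-2048 -> 4099 logical qubits, versus the 18 entangled here.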
Re: (Score:2)
I know. But due to the general cluelessness observable here on this topic, I used 1 Qbit/1 bit as an obvious lower bound.
Re: (Score:2)
You're assuming factoring the number is required. There are other ways to crack RSA. These may be stupider on a regular computer, but work well on a quantum one.
Re:Wake me up when they can do 2048 qbits (Score:4, Insightful)
I think it's almost always a mistake to assume linear progress on R&D -- a single breakthrough can drastically alter time-lines.
Re: (Score:2)
Observable facts say linear or worse, and there have been quite a few breakthroughs to allow even that.
My guess is that the idea of the QC will end up on the trash-heap of science, like converting lead to gold. Possible today, but completely useless.
epistemological doubts (Score:1)
The whole quantum computing concept is based on the presumption that quantum entanglement is an ontological phenomenon, a "resource". However, if entanglement turns out to be a mere epistemological effect (telling us more about what we know or cannot know than about what is really there), then quantum computers will never outperform the scaling of classical computing.
Bell's inequality and the related experiments have ruled out naive hidden variables IF quantum interactions are local and causal. But
Correction ... (Score:2)
... photons, the fundamental particle of light.
The photon is the fundamental quantum of light.
Almost time to build The Ark (Score:2)
When they get to 300, sh*t's gonna get real.