Australian Researchers Devise Fault-Tolerant Quantum Computer
schliz writes "Researchers have devised a theoretical quantum computer that could function even if one in four qubits were missing. The design is claimed to be the first that tolerates both qubit loss and decoherence to this extent. It performs calculations by measuring, rather than manipulating, qubits, so there are fewer points of failure."
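Loss-tolerant measurement-based schemes like this are often analyzed as a percolation problem: the computation can proceed as long as the surviving qubits still form a connected path across the cluster state. The toy simulation below is only an illustration of that idea, not the paper's actual construction -- it deletes each site of an L x L grid with some loss probability and checks for a left-to-right crossing.

```python
import random

def crosses(L, loss, rng):
    """Delete each site with probability `loss`; test for a
    left-to-right crossing among the surviving sites (flood fill)."""
    alive = [[rng.random() >= loss for _ in range(L)] for _ in range(L)]
    frontier = [(r, 0) for r in range(L) if alive[r][0]]  # alive sites, left column
    seen = set(frontier)
    while frontier:
        r, c = frontier.pop()
        if c == L - 1:          # reached the right edge: crossing exists
            return True
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < L and 0 <= nc < L and alive[nr][nc] \
                    and (nr, nc) not in seen:
                seen.add((nr, nc))
                frontier.append((nr, nc))
    return False

def crossing_rate(L, loss, trials=200, seed=1):
    """Fraction of random lattices that still have a crossing."""
    rng = random.Random(seed)
    return sum(crosses(L, loss, rng) for _ in range(trials)) / trials
```

On the square lattice the site-percolation threshold is about 59% occupancy, so with 25% loss (75% of qubits surviving) a crossing almost always exists, while at 50% loss it almost never does -- which is the intuition behind a loss threshold sitting somewhere in that range.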
Is this news? (Score:4, Interesting)
The problem with this (Score:4, Interesting)
The problem here is the one fault it isn't tolerant of: it isn't even close to being a practical quantum computer, and so it lands squarely in that magic world of high-efficiency solar cells, nanotube-based ultracaps, and the myriad medical discoveries, of which only a very, very few ever make it to market -- the rest are dead ends, for whatever reason. I am actually beginning to find these announcements a little depressing. Either there's something really wrong with our "get it to market" system, or there's an awful lot of bullcrap out there. Neither answer is good.
Re:The problem with this (Score:3, Interesting)
Re:Is this news? (Score:3, Interesting)
Most likely that depends on the error rate of the physical hardware. The more errors the fault tolerance has to deal with, the more overhead there will be.
Latency is not really what matters to quantum computers; the typical use case is long-running computations. The more interesting questions are by how large a factor the number of qubits increases, and whether the achievable number of qubits can grow fast enough, without much increase in error rate, to make such techniques useful. The next question is by what factor the computation is slowed down. But if the technique is feasible, the slowdown probably doesn't matter, since the quantum computer would still be much faster than all the alternatives anyway.
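The overhead-versus-error-rate tradeoff the two comments above are circling can be made concrete with a standard back-of-the-envelope model (not from the article): a concatenated code where each level of encoding multiplies the qubit count by 7 (Steane-like) and squares the rescaled error rate, with an assumed threshold of 1%.

```python
# Illustrative assumptions, not the scheme from the article:
# after k levels of concatenation, the logical error rate behaves as
#   p_L ~ P_TH * (p_phys / P_TH) ** (2 ** k)
# and each level costs a factor of BRANCH physical qubits per logical qubit.
P_TH = 1e-2    # assumed threshold error rate
BRANCH = 7     # physical qubits per logical qubit per level (Steane-like)

def levels_needed(p_phys, p_target):
    """Smallest concatenation depth k that pushes p_L below p_target."""
    k = 0
    while P_TH * (p_phys / P_TH) ** (2 ** k) > p_target:
        k += 1
    return k

def overhead(p_phys, p_target):
    """Physical qubits per logical qubit at that depth."""
    return BRANCH ** levels_needed(p_phys, p_target)

for p in (1e-3, 3e-3, 8e-3):
    print(f"p_phys={p:.0e}: {overhead(p, 1e-15)} physical qubits per logical qubit")
```

The point of the exercise: well below threshold the overhead is a few thousand physical qubits per logical qubit, but as the physical error rate approaches the threshold it blows up by orders of magnitude -- which is why the hardware error rate, not latency, dominates the feasibility question.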