Science

Is Quantum Computing Impossible? (ieee.org) 222

"Quantum computing is complex and it's not all it's cracked up to be," writes Slashdot reader nickwinlund77, pointing to this new article from IEEE Spectrum arguing it's "not in our foreseeable future": Having spent decades conducting research in quantum and condensed-matter physics, I've developed my very pessimistic view. It's based on an understanding of the gargantuan technical challenges that would have to be overcome to ever make quantum computing work.... Experts estimate that the number of qubits needed for a useful quantum computer, one that could compete with your laptop in solving certain kinds of interesting problems, is between 1,000 and 100,000. So the number of continuous parameters describing the state of such a useful quantum computer at any given moment must be at least 2**1,000, which is to say about 10**300. That's a very big number indeed. How big? It is much, much greater than the number of subatomic particles in the observable universe. To repeat: A useful quantum computer needs to process a set of continuous parameters that is larger than the number of subatomic particles in the observable universe. At this point in a description of a possible future technology, a hardheaded engineer loses interest....

[I]t's absolutely unimaginable how to keep errors under control for the 10**300 continuous parameters that must be processed by a useful quantum computer. Yet quantum-computing theorists have succeeded in convincing the general public that this is feasible.... Even without considering these impossibly large numbers, it's sobering that no one has yet figured out how to combine many physical qubits into a smaller number of logical qubits that can compute something useful. And it's not like this hasn't long been a key goal.... On the hardware front, advanced research is under way, with a 49-qubit chip (Intel), a 50-qubit chip (IBM), and a 72-qubit chip (Google) having recently been fabricated and studied. The eventual outcome of this activity is not entirely clear, especially because these companies have not revealed the details of their work...

I believe that, appearances to the contrary, the quantum computing fervor is nearing its end. That's because a few decades is the maximum lifetime of any big bubble in technology or science. After a certain period, too many unfulfilled promises have been made, and anyone who has been following the topic starts to get annoyed by further announcements of impending breakthroughs. What's more, by that time all the tenured faculty positions in the field are already occupied. The proponents have grown older and less zealous, while the younger generation seeks something completely new and more likely to succeed.

He advises quantum computing researchers to follow the advice of IBM physicist Rolf Landauer. Decades ago Landauer warned quantum computing's proponents that they needed a disclaimer in all of their publications.

"This scheme, like all other schemes for quantum computation, relies on speculative technology, does not in its current form take into account all possible sources of noise, unreliability and manufacturing error, and probably will not work."
  • by Ukab the Great ( 87152 ) on Sunday November 18, 2018 @07:16PM (#57664698)

    Quantum computing is simultaneously both possible and impossible.

    • Quantum computing is simultaneously both possible and impossible.

      It was possible, the last time I looked. Then I looked again, and it wasn't.

Yep. You can use it to crack certain cryptography problems faster; problem though, the algorithm scales differently and doubling the key size makes it much harder to crack. Whereas, using traditional brute force on regular computers, doubling the key size only helps a little bit. So 32 bit encryption will eventually fall to quantum computers.

        But even 128 bit keys, it doesn't look promising to crack more than a tiny volume of stuff, even in hundreds of years.

        It will be valuable to certain areas of scientific re

You can use it to crack certain cryptography problems faster;

One in particular: That maths wonks are running out of excuses to design new algorithms. There's only so many zero-knowledge group key management IND-CCA blind signcryption schemes you can publish before people fall asleep. By coming up with this unicorn-magic break-all-existing-algorithms space-alien wish-fulfilment technology, said maths wonks get another ten to twenty years of publishing papers on algorithms resistant to unicorns, magic, sharks with lasers, and so on. That's why there's so much concern

        • Re:Simple answer (Score:4, Insightful)

          by glenebob ( 414078 ) on Monday November 19, 2018 @03:32AM (#57666120)

Yep. You can use it to crack certain cryptography problems faster; problem though, the algorithm scales differently and doubling the key size makes it much harder to crack. Whereas, using traditional brute force on regular computers, doubling the key size only helps a little bit.

          Is it opposite day? I must have missed the tweet.

    • by Presence Eternal ( 56763 ) on Sunday November 18, 2018 @07:52PM (#57664816)

      It is good to see people thinking outside of the box.

      The cat keeps distracting them.

    • by msauve ( 701917 ) on Sunday November 18, 2018 @08:09PM (#57664870)
      Wave if you're a particle!
    • by Trogre ( 513942 )

      Quick, somebody measure it!

Yes, but the more precisely the possibilitness is determined, the less precisely the impossibilitness can be known, and vice versa.
  • by SuperKendall ( 25149 ) on Sunday November 18, 2018 @07:21PM (#57664714)

I can't be the only one here who goes to look for a bug that vanishes when I am doing any kind of problem.

    Now THAT is Quantum Computing.

  • Huh? (Score:5, Insightful)

    by JaredOfEuropa ( 526365 ) on Sunday November 18, 2018 @07:24PM (#57664720) Journal

    A useful quantum computer needs to process a set of continuous parameters that is larger than the number of subatomic particles in the observable universe

I thought that the whole point of quantum computers was that there's no need to describe or process all possible states. And that the difficulty of practical quantum computers is that the qubits need to "work together": you can't just make 1 qubit, then make 1023 more and build yourself a 1024-qubit computer.

    The guy obviously knows way more about quantum computers than I do. But I've never seen the difficulties of quantum computing described in this manner.

    • Re:Huh? (Score:4, Interesting)

      by Ramze ( 640788 ) on Sunday November 18, 2018 @08:19PM (#57664912)

      I think the key problem is theory vs physical reality. In theory, if you have a set of qubits entangled with zero noise at near absolute zero, you can send a quantum program to the qubits & have them process your data without you worrying about what their individual states are & then capture their completed output.

In reality, how do you entangle enough qubits to be useful? How do you prevent noise or correct for the errors of noise? How do you ensure your qubits are properly entangled? How do you accurately send your quantum program to the qubits for processing? How do you aid in processing the qubits accurately without generating more noise? How do you extract the output without generating more noise? And ultimately, how are you going to ensure that you are entangling 10^300 attributes of your qubits perfectly in the first place, much less correcting for errors in processing them?

I think the TL;DR is that this quantum physicist sees all the places errors can creep in and how difficult it can be to correct for them. The answers he sees coming from the community seem to be to just add more qubits for error correction - or even process the same data with multiple quantum computers or with multiple paths through the same qubits.

      I understand his/her frustration. It seems a difficult task to precisely manipulate qubits using modern technology, and an impossible task to know and/or set the states of everything to ultimately know for certain whether an error has been generated.

      • Most of those problems are just engineering issues which are being constantly improved: larger, lower noise systems of qubits with longer and longer coherence times are being made every single day. The issue with noise and error is, as it turns out, already a basically solved problem: quantum error correction [wikipedia.org] exists. As long as your qubits are good enough, you can devise systems that are error-free. Without error correction quantum computers would almost certainly never work. With it, creating one is just a
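The quantum error correction idea linked above is easier to grasp through its classical ancestor. This toy repetition code is only an analogue (a real quantum code must also handle phase errors, and must do so without measuring the data), but it shows how redundancy turns a physical error rate p into a smaller logical error rate:

```python
import random

def encode(bit):
    # Repetition code: one logical bit -> three physical bits
    return [bit] * 3

def apply_noise(codeword, p, rng):
    # Flip each physical bit independently with probability p
    return [b ^ (rng.random() < p) for b in codeword]

def decode(codeword):
    # Majority vote: fails only when 2 or more bits flipped
    return int(sum(codeword) >= 2)

rng = random.Random(42)
p, trials = 0.05, 100_000
errors = sum(decode(apply_noise(encode(0), p, rng)) != 0 for _ in range(trials))
# Logical error rate is 3p^2 - 2p^3 ~ 0.0073, well below the physical p = 0.05
print(errors / trials)
```

Whether real hardware qubits ever get "good enough" for the quantum analogue of this trade to pay off is exactly what the thread is arguing about.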

        • Re:Huh? (Score:4, Insightful)

          by Ramze ( 640788 ) on Sunday November 18, 2018 @09:55PM (#57665344)

          I tend to agree, and apparently so do IBM, Google, et al. Still, the larger the system, the more error prone it becomes. Obviously, we have quantum computers (or at least functioning parts of ones) working today and can entangle up to 50 qubits or more with relative stability... but, the question is whether we can do it at the scale needed to be "useful" (according to this individual) without losing the signal for all the noise.

          This person's perspective is that what we naively see as an engineering problem to be resolved with future refinements is actually an issue that can't be resolved because nature at a fundamental particle physics level can't be controlled or tuned to the degree necessary to get one working, nor reasonably checked for accuracy because the states to be checked are beyond astronomical.

        • Re:Huh? (Score:4, Insightful)

          by sjames ( 1099 ) on Monday November 19, 2018 @01:36AM (#57665908) Homepage Journal

          Hot fusion is also "just an engineering problem".

          • Re: (Score:2, Insightful)

            by Anonymous Coward

            Hot fusion is also "just an engineering problem".

            Using the word "also" makes it look like you are grouping fusion and quantum computing into the same level of possible, which is both not true and possibly showing a deep misunderstanding of the phrase "just an engineering problem"

            It comes down to how different people use the word "impossible"

            To some, impossible means the laws of physics explicitly do not allow it.
            To others, impossible means the laws of physics may not yet exclude it but there are no examples to demonstrate it could happen.

            "Just an engineer

        • by SirSlud ( 67381 )

          Most of those problems are just engineering issues

Yes, that's the point: engineering issues so challenging that they will not be solved in any 'for practical purposes' future.

          The issue with noise and error is, as it turns out, already a basically solved problem

          I assume Mikhail Dyakonov knows more about this stuff than you do and thus that you know not the devil in the details.

        • by jythie ( 914043 )
The thing about engineering problems is they still have to contend with the question of what is and is not possible, but they are even more constrained in their options than purely theoretical work. The limits of engineering hit LONG before the limits of ideal physics.
      • by Anonymous Coward

        For all problems that can be meaningfully sped up in a quantum computer, it's easy to check the answer using a classical computer. So as long as there's a good probability of extracting the right answer we're good.
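The asymmetry the parent describes is exactly the factoring case that Shor's algorithm targets: finding a factor is expensive, but checking a proposed one is a single division. A minimal sketch (the numbers are arbitrary):

```python
def find_factor(n):
    # Finding a factor classically: trial division, up to ~sqrt(n) steps
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return n  # n is prime

def verify_factor(n, p):
    # Checking a claimed factor: one modulo operation
    return 1 < p < n and n % p == 0

n = 3 * 1_000_003
assert verify_factor(n, find_factor(n))
```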

      • Re:Huh? (Score:5, Interesting)

        by epine ( 68316 ) on Sunday November 18, 2018 @09:55PM (#57665340)

In reality, how do you entangle enough qubits to be useful? How do you prevent noise or correct for the errors of noise? How do you ensure your qubits are properly entangled? How do you accurately send your quantum program to the qubits for processing? How do you aid in processing the qubits accurately without generating more noise?

        I've never believed in quantum computing, because I've never seen a lay publication that does half of these questions justice.

Until you've seen the ceiling properly described, a technology simply doesn't exist.

        No one in this field ever bothers to describe the ceiling.

In CMOS, you always had "when does the transistor become too small?" Some of the early answers were wrong (100 nm was once mooted as a frightening bogeyman), but at least you would read sensible speculation.

        At what point, in a practical sense, does the quantum entanglistor become inseparable from local environmental noise?

        Silence. Crickets. Crickets on top of crickets. Crickets inside of crickets. Crickets alive and dead at the same time. All kinds of crickets. But never any sensible speculation.

        • by ceoyoyo ( 59147 )

          I'm not sure I follow. You don't believe quantum computing is possible because the people you've asked about it are hesitant to provide incorrect guesses?

In theory, if you have a set of transistors connected with no-resistance wires, you can send a program to the bits & have the CPU process your data without you worrying about what their individual states are & then capture their completed output.

        In reality, how do you connect enough transistors to be useful? How do you prevent EMI noise or correct errors from the noise in wires? How do you ensure your transistors are correctly connected? How do you accurately write the program to the memory? How do y

    • by quax ( 19371 )

Actually you are pretty much on point.

      Frankly this article is pretty pathetic and embarrassing.

Both of the issues he raises (error handling while scaling up qubits and proof of outperforming classical computing) have been addressed (in fact, they've been posted to /. before), and are no longer considered issues.
      Chances are this guy is a cryptocoin shill, realized they will be worthless by 2023 when quantum computers can run Shor's algorithm, and is campaigning against them in the hopes of making people lose interest. It's sad and pathetic, really, trying to stop technological development for memebi
I think his point roughly boils down to: We are counting on this 1000-qubit computer being a superposition of all 2^1000 states in the first place. How do you know you have achieved that status? Are you going to measure those 2^1000 states? No, of course not.

      The initial state is something that is easy to describe in the abstract on a chalkboard, but what it really means in the physical world is extremely problematic.

      There is a lot of handwaving around this point from the quantum computing enthusiasts. "Oh

    • by Anonymous Coward

      >But I've never seen the difficulties of quantum computing described in this manner.

      That's because you've been reading hype by people who have an economic interest in keeping the hype train rolling...

I thought that the whole point of quantum computers was that there's no need to describe or process all possible states. And that the difficulty of practical quantum computers is that the qubits need to "work together": you can't just make 1 qubit, then make 1023 more and build yourself a 1024-qubit computer.

      The guy obviously knows way more about quantum computers than I do. But I've never seen the difficulties of quantum computing described in this manner.

I think it's important to express measures in this way because it keeps everyone honest. People are cheating, at least in marketing jargon. Simply belching out the number of qubits in something is like belching out the number of transistors in a flash drive and using that to draw conclusions about its processing performance relative to other components.

      Given we have people building "topological" computers with a whole lot of qubits that don't map to anything resembling exponential performance curve I think it

And has been for about 2 decades or so. Even if the physical universe supports it (and that is a big if, given the exactness required and the problem of noise), it may well be impossible to build a QC of meaningful size. It now looks very much like it is either infeasible or far, far in the future (i.e. >100 years, and possibly much more).

    And to all you attack dogs that cannot bear having your dreams criticized: I am not opposed to QC in any way. I just do not see it happening.

    • by ganv ( 881057 )
It seems odd that these pessimistic voices are speaking up just as real calculations are beginning to be done on quantum computers. Google is making claims that their 72-qubit system might achieve quantum supremacy in the coming year, meaning it would outperform a classical computer on a certain problem. Most think this claim is a bit exaggerated, but it seems it is going to happen in the next decade. The trapped ion computers are just being scaled to more than a few qubits. These are hard projects. I
      • by ceoyoyo ( 59147 )

        That's not quite what quantum supremacy means. It's not their quantum computer doing a computation faster than a conventional computer. That would be a very slippery benchmark. First question... what conventional computer?

        Quantum supremacy means demonstrating that your quantum computer can complete certain computations with less computational complexity than a classical computer. In the typical examples, the classical complexity is exponential and the quantum complexity is theorized to be subexponential.

        Th
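The point about complexity classes rather than raw speed can be made concrete with made-up cost functions (both curves below are purely illustrative): any exponential eventually overtakes any polynomial, no matter how the constants favor the classical side at small sizes.

```python
def classical_cost(n):
    return 2 ** n          # exponential scaling (illustrative)

def quantum_cost(n):
    return 1000 * n ** 3   # polynomial scaling with a big constant (illustrative)

# Small instances favor the "classical" curve; past a crossover the
# exponential always loses, which is what supremacy claims formalize.
crossover = next(n for n in range(1, 100) if classical_cost(n) > quantum_cost(n))
print(crossover)  # 24 for these particular toy curves
```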

        • by gweihir ( 88907 )

          There is a difference between "nice effect" and "actually useful"....

          • by ceoyoyo ( 59147 )

            Yes. Quantum supremacy doesn't mean you've got a computer that does something that wasn't possible before. It means you've got a computer that, scaled up, could do something that wasn't possible before.

            Properly scaled up is a major caveat, particularly in quantum computing.

            • by gweihir ( 88907 )

              Scaling will be the killer. So far, QCs seem to be scaling extremely badly, and I do not really see that changing. If it remains like this, useful sizes will not happen, and the whole idea will go to the (pretty large) heap of alternate computing hardware that did not pan out.

              • by ceoyoyo ( 59147 )

                Yes. I was involved in teaching a month-long quantum computing course, and the experts in the field were pretty skeptical. Approaches like D-Wave's might turn out to be the winner: build some specialized hardware that works with the limitations and then try to find problems that work on it. The other approach, trying to make a general purpose quantum computer, is a much dicier proposition. Even then, Shor's algorithm isn't quite what the pop science articles paint it as.

      • by jythie ( 914043 )
Well, yeah. As hype increases people talk more about the subject, thus one hears dissenting opinions more often than when it is out of the news cycle.
  • by igor.sfiligoi ( 1517549 ) on Sunday November 18, 2018 @07:33PM (#57664744)

    The author makes a great point about the near impossibility of perfect, error-free quantum computation.
    But this has been realized a few years back by most quantum algorithm developers, too.

    Many recent algorithms assume that the quantum computation will be partially faulty.
    And they work around it.

    Yes, that makes these algorithms harder to design and they are less efficient compared to the ones assuming no errors, but they still seem to provide a way forward.
    I would definitely not write off quantum computing yet.

    • You needn't write off quantum computing. However, others are.

      There are ways to null the noise, but such methods and algorithms need to be repetitious, while errors are minimized. At some point, a linear method of error reduction becomes possible at a plausible size/cost/effort. That revolution is not now, as described.

      Many millions, perhaps billions of $currency have been spent so far, with results that are realistically described by the poster. This is not like the olden days, when people started integrati

    • The author makes a great point about the near impossibility of perfect, error-free quantum computation. But this has been realized a few years back by most quantum algorithm developers, too.

      Many recent algorithms assume that the quantum computation will be partially faulty. And they work around it.

      Yes, that makes these algorithms harder to design and they are less efficient compared to the ones assuming no errors, but they still seem to provide a way forward. I would definitely not write off quantum computing yet.

      If whatever augmentation you can dream up doesn't follow an exponential growth curve (NONE OF THEM DO) then it's not worth thinking about on these scales.

Let's say you're able to do quantum error correcting using fan-outs of supporting qubits. None of the imagined schemes to achieve this come close to exponential scaling.

      Likewise no kind of oversampling or related scheme anyone has been able to dream up to account for noise allows exponential scaling.

      There becomes a decidedly non-exponential curve after which

We'll know when the wave function collapses.
  • ... because of it being such a subjective topic. Hint: you have one built in.

  • Makes no sense (Score:5, Insightful)

    by cryptizard ( 2629853 ) on Sunday November 18, 2018 @07:53PM (#57664822)

    So the number of continuous parameters describing the state of such a useful quantum computer at any given moment must be at least 2**1,000, which is to say about 10**300. That's a very big number indeed. How big? It is much, much greater than the number of subatomic particles in the observable universe.

I am struggling to come up with some way that this part makes any sense at all. It sounds like the kind of thing someone who is definitely not an expert in the area would say. He is expressing the number of possible configurations of 1,000 qubits, but that is only something you care about if you are simulating a quantum computer with a classical one. The whole point of quantum computers is that you don't have to do that.

Also, a simple counterexample to this sentiment is given later on, when mentioning that Google already has a 72-qubit computer. Just storing the state of a 72-qubit machine classically would take substantially more than the entire capacity of the internet, implying that, since we somehow built it anyway, enumerating all the states is not necessary.
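That storage claim can be ballparked (assuming double-precision complex amplitudes at 16 bytes each, which is my assumption, not a figure from the post): simulating n qubits classically means storing 2**n amplitudes, while the physical device just has n qubits.

```python
def sim_memory_bytes(n_qubits):
    # A full classical state vector: 2**n complex amplitudes,
    # 16 bytes each (two float64s per amplitude)
    return (2 ** n_qubits) * 16

zettabytes = sim_memory_bytes(72) / 10 ** 21
print(f"{zettabytes:.1f} ZB")  # ~75.6 ZB for a 72-qubit state vector
```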

  • by Anonymous Coward

    No

  • I'm pretty sure that the CIA and friends would pay all of the money (*all of it*) to have a box that could crack public key encryption. How feasible is this? Is it on the horizon, or one of those things (like practical fusion) that always will be?

I think it is a waste of time and money; with the errors in quantum computing, this is like trying to make an analog computer work.
We should focus on using light as a signal, with proper switches, and keep computers digital/binary.

I tend to be skeptical about things like QC because we have heard so many hyped up stories over the years. Cold fusion, fusion generators, room-temperature super-conduction, 100 mpg engines, etc. are just a few examples. That doesn't even begin to address things like WinFS, the decentralized web, mainstream crypto-currencies, or 100 TB hard drives that were supposed to be here by now. But plenty of breakthrough technologies have come about in spite of all the skepticism around them (I am working on one of my o
my clients/employers would ever have any kind of access to/make use of. I don't know why any employee/contractor would accept that as a term of employment.
As for implants for my defective eyes and/or other senses, computer interfacing, nervous system interfacing, I would definitely consider it when it looks advantageous and useful.
But in reality I am probably too old (63) to get there.

    Just my 2 cents ;)
Perhaps writing the code in this way would help with the problem suggested in this article. What's the name of that language again? lol
  • by g.random ( 724823 ) on Sunday November 18, 2018 @08:53PM (#57665068)
If an elderly but distinguished scientist says that something is possible, he is almost certainly right; but if he says that it is impossible, he is very probably wrong. -- Arthur C. Clarke
    • by Anonymous Coward

A fiction writer's opinion on cutting-edge science is as good as a scientist's fiction-writing skills.

      Expertise is non-transferable.

    • by McWilde ( 643703 )
      The ultimate consequence of this quote always seems to me that everything is very probably possible.
  • by Anonymous Coward

    1000 qubits have 2^1000 possible states, yes. That does not mean you need 2^1000 parameters to describe them. 1000 will do.

    By the argument in this article, electronic computers with 1Kb of RAM are impossible.

Makes sense: lots of money at stake, and they only have ~5 years before quantum computers destroy their pump & dump campaign. Gotta keep the suckers buying shitcoins.
It just comes down to cost vs. benefit vs. bragging rights. Take fusion power: while there are cheaper energy sources, there is no rush to build it except for bragging rights at the moment. The same goes for quantum computing: if the benefit could be defined as giving a nation a really good advantage, it would have lots of money thrown at it, which would then attract more people to work on it.
  • ... i want to know when the Positronic Brain will be perfected!

The author makes a significant error which falsifies his entire line of reasoning. The number of continuous variables in a 1000-qubit register is 2000, not 2^1000. Furthermore, the least technically difficult application of a qubit is to create an ALU operating on two qubit-based registers. In that system the only entanglement is between the two electrons in a qubit, something that has been accomplished. The number of discrete states which can be held by each qubit depends on how noise free the system is, whi
  • by Anonymous Coward

    WT actual F is going on in the comments section right now?! In the last few days I've had to raise my filter from -1 up to 1, and the quality of discussion is still basically trash. The apk impersonator spam in first doesn't help, but even that's just the tip of the iceberg.

If ever there were a potential application for quantum computing, I'd say it's in /. moderation!

    Maybe I should actually log in for a change and accrue some mod points to tackle this.

  • is another AI winter? https://en.wikipedia.org/wiki/... [wikipedia.org]
    Great for getting all possible mil and gov funding.
  • Then a Quantum Blockchain Coin. Or Quantum VR. Or Quantum NOSQL databases. C'mon people, think out of your comfy Einstein inspired box! OK, not really. Fusion reactors. Self driving cars. Quantum computing. Sometimes the last 10% or 5% or 1% of development is where the rubber doesn't always meet the road and the whole thing, no matter how promising/life changing/world saving (pick any two) finally just doesn't work in the real world with real world requirements and expectations.
    • by Tablizer ( 95088 )

      Quantum AI blockchain running node.js microservices via VR running NOSQL databases in the IOT edge cloud.

      Oh, I just had a synergygasm!

  • by sgunhouse ( 1050564 ) on Monday November 19, 2018 @12:47AM (#57665820)
By his logic... my very first computer was an RCA VIP; it came with a whopping 2K of RAM. That's a measly 16384 bits - not counting internal registers, flags, etc. So to actually model all the possible internal states of just the RAM is 2^16384, which is roughly 10^4900. I'm sure you know how the rest of the argument goes.

    A thousand qubits is simply 1000 mutually interacting particles. You're not trying to represent every possible state (and as the possible states are infinite, you couldn't). His argument is complete nonsense and tells you nothing at all.
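For the record, the RAM arithmetic above comes out even larger than you might guess: 16384 bits give 2^16384 states, whose decimal exponent is easy to compute:

```python
from math import log10

bits = 16384                 # 2K bytes of RAM
exponent = bits * log10(2)
print(f"2**{bits} is about 10**{exponent:.0f}")  # about 10**4932
```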
  • by Tablizer ( 95088 ) on Monday November 19, 2018 @02:24AM (#57665986) Journal

    Quantum physics is always teasing us with almosts: almost instantaneous communication, almost energy out of nowhere, almost backward time travel, etc.

    After all these teases, I'd bet on quantum computing having an inherent flaw nobody has discovered yet.

    Schrodinger Lucy is holding the football again...

  • by aleksander suur ( 4765615 ) on Monday November 19, 2018 @04:30AM (#57666226)
First of all, a quantum computer is not a regular computer with added magical pixie dust; it's not a "better" computer, it's a very different type of computer. Generally a much more limited computer at that, but it can solve a certain subset of problems that a conventional computer practically can't. All a quantum computer needs to do in order to be a roaring success is to solve one such impossible problem. I suspect we are pretty close to that.

A quantum computer is more like a test tube than a computer, in the sense that the best way to find out how a chemical reaction will run is to do it in a test tube instead of trying to simulate it on a classical computer. A quantum computer is just more generic than that, and you can reduce a wider range of problems down to quantum algorithms.

  • They gave up trying to make the Field Effect Transistor in the '30s until the right technology came along...

I think this article really overestimates the drive for quantum computing on a grand scale, and I don't think it was ever sold to us by the experts as something that we would actually see in "the foreseeable future". As such, that makes the author's premise disingenuous.

I always assumed we'd have optical computers long, LONG before a general-purpose quantum computer, and I don't think it's unreasonable to stand by that statement. That said, I don't think that warrants slowing down any research towards
  • by gotan ( 60103 ) on Monday November 19, 2018 @11:50AM (#57667902) Homepage

    To my understanding these are the core arguments of the article:

1) The feasibility of quantum computing is based on the assumption that the effort (e.g. for error correction) scales with the number of qubits (in the example, 1000), not with the dimension of the superimposable state vector (2^1000). According to the author it is not yet proven that that is the case.

2) For a useful quantum computer it must be possible to manipulate qubits (with quantum gates) at will, i.e. move them around and "process" them like we do with classical bits in a classical computer nowadays.

3) In theoretical concepts of quantum computers perfect quantum gates are assumed, but quantum gates are physical devices. Rotating a spin by 90 deg might be achieved by applying a magnetic field of a given strength for a precise length of time. But in the physical world the precision of such manipulations is always finite, so maybe the result is somewhere between an 89 and 91 deg rotation, and the axis might be slightly off too. Such imprecision might even occur when storing or transferring qubits (the information) in/between their physical storage. In lengthier calculations such errors add up, a bit like in analog computers. That would (severely?) limit the usefulness of quantum computers.

    This is very unlike classical logical gates where anything above a certain voltage is interpreted as "1", anything below as "0" and logical gates consist of voltage controlled switches (transistors) in either "on" or "off" state that is clearly defined and leaves a wide error margin in terms of voltage.

    To summarize: The physical world is far messier than the theoretical concepts of quantum computing and it has yet to be shown, that error correction mechanisms to control that "messiness" are feasible.

    These problems are not new, and AFAIK there are theoretical as well as experimental efforts made to counter them. The article presents a very disillusioned view of the advances in that respect and suggests that it might be even impossible to overcome the problems. Sadly, instead of making the points by giving examples of the efforts and the advances or non-advances that were made, a lot of space in the article is simply wasted by pointless comparisons of the number of superimposable quantum states to the number of particles in the universe and the like. The question is not how big that number is but if it really represents the size of the obstacle/necessary effort on the way to quantum computing.

OTOH it should be noted that even the theoretical concepts of quantum computing, i.e. quantum information theory, broadened our understanding of quantum mechanics. E.g. experiments on entangled states like EPR, the delayed-choice quantum eraser, or "quantum teleportation" (which should really be named "quantum state teleportation") can be viewed from a new perspective.
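Point 3 above, finite gate precision accumulating like analog error, can be illustrated with a toy classical stand-in: a 2-D rotation playing the role of a single-qubit gate, with a made-up ±1° control error per gate. Unlike digital logic, nothing here re-discretizes the signal between steps, so the error random-walks with circuit depth.

```python
import math
import random

def rotate(v, angle):
    # Plane rotation, standing in for an imperfect single-qubit gate
    c, s = math.cos(angle), math.sin(angle)
    return (c * v[0] - s * v[1], s * v[0] + c * v[1])

rng = random.Random(0)
eps = math.radians(1.0)            # assumed +/- 1 degree control error per gate
noisy = ideal = (1.0, 0.0)
for _ in range(100):               # 100 slightly-off 90-degree rotations
    noisy = rotate(noisy, math.pi / 2 + rng.uniform(-eps, eps))
    ideal = rotate(ideal, math.pi / 2)

drift = math.dist(noisy, ideal)    # grows with circuit depth, analog-style
print(drift)
```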

About time we heard some sense instead of constant cheerleading. Just because scientists and engineers say something is doable and should be done, it doesn't mean there is any reality to the thing. Let's hear more actual opposition, based on real science and math, to easy plans and projects. I'm sick of hearing breathless pie-in-the-sky schemes that are given the imprimatur of science and tech but that are just manipulations for money, position, or fame.
So it is not as easy as those ass hats in Personnel implied. Get back to work.
  • QC may be impossible (we won't know if we don't try), but for sure not for the misconceived reasons stated in the OP.

    * The point in QC is that to control 2^300 states you need to control 300 qubits, and for Quantum error correcting sequences

* People started to think about quantum error correction about 2 decades ago, and have come a long way in reducing the overhead since then

    * The big question is not if it is technologically feasible (would be in latest 20 years from now), but if highly entangled Quant
