A Pair Of Quantum Computing Articles

Will G writes: "3DRage has posted an article entitled "Quantum Computers: How they work and How they will effect us" by Alan Cline. Not only can quantum computers run one billion times faster than typical silicon-based computers, but, theoretically, they can also run while consuming no energy. If that holds, quantum computers could make the silicon chip obsolete much as the transistor did the vacuum tube. The paper is intended for the general reader; it explains basic quantum computer features and the paradoxical effects quantum theory produces in a practical world, and discusses how quantum computers originated, the inevitability of their use, and how they differ from classical computers." An interesting nugget to add to this comes from leelaw2000, who writes: "New Scientist have published this little news story about the development of a kind of quantum shielding that might help the development of real quantum computers. Now if they can just get Quake on it ..."
This discussion has been archived. No new comments can be posted.

  • by volsung ( 378 )
    I shall be forced to say "Nee!" at you.

    Nee!

    Nee!

    Nee!

    Now go fetch me a shrubbery!

  • Any disc using GMR (giant magnetoresistance) uses principles of quantum mechanics. This is very different from quantum computing, and it is in use today, in any hard drive larger than 1.8 GB.

    (GMR was discovered in 1988 by Mário Baibich and colleagues)
  • I wonder if this would increase the amount of functional code being written. The lisp guys finally win after all?
  • Sounds like an interesting story, but I'm getting a 'Forbidden' message on the 3drage link...anyone got another link handy?
  • With computing power like that, how many of these would we actually need?

    We would just need to figure out how to use quantum entanglement to communicate with the server, and we could all use the same computer.

    Instantaneous communication... would that mean no more lag in XPilot?
  • a working (simple) quantum computer has already been built

    I don't think so. There has not, to my knowledge, been a single experiment proving that quantum entanglement exists that didn't rely on fudged data. (Usually, they extrapolate based on probabilities of photon detection). All observed phenomena (we're talking about entanglement here) can be easily explained by local processes - instantaneous 'action at a distance' has never been shown definitively.
  • The amount of misconception and hype surrounding quantum computing is absolutely staggering. There is absolutely no experimental evidence or rational justification that QC will require zero energy, nor is there any evidence that QC will EVER be feasible for computation.

    What does happen, in many of these cases (as cold fusion was to some degree) is that a LOT of physicists survive on grants gleefully awarded by unknowing foundations and governmental agencies (in other words, by money from you and me folks). Any questioning of QC is instantly stamped 'troll' and dismissed from journals - too many people rely on it for their livelihood. When one does manage to pin them down, even the physicists will argue for 'quantum magic' and mysticism! Puh-lease!

    This came up a couple of times before in Slashdot, and there was a really interesting commentary by someone (Nightlight3) who seemed to know what they were talking about, and had some very rational reasons for why quantum computing won't happen. Read it here [slashdot.org]. It also contains links to other informative articles.

    But, these fantasy stories certainly do draw readership.

    Hype hype hype hype hype hype hype hype hype hype hype hype hype hype hype hype!
  • Obviously the reboot will require somewhat ER type operations. A computer that runs on power is rebooted by removing power. A computer that doesn't use power is rebooted by adding power.

    1. Squirt conductive gel on your paddles
    2. Rub together
    3. Yell "Clear!"
    4. Put paddles on the 'Reboot Terminals' (tm)
    5. Shock that computer into the boot loading state...

    Note: Multiple shocks may be required.

  • There is a glaring inaccuracy in the 3Drage article which I feel compelled to correct. The article states that a quantum computer has no net energy consumption. Of course, this isn't true. While the quantum computer is inherently reversible, this only means that it will consume the minimum amount of energy needed. You have to erase information in the quantum computer eventually, or else junk bits accumulate, and erasing each bit dissipates a quantity of heat equal to k_B T ln 2 (where k_B is Boltzmann's constant and T is the temperature). The reversibility likewise means only that an adiabatic cycle applies to the thermodynamic system of a quantum computer, i.e. that the minimum amount of energy is consumed, and that minimum is not zero. And in practice, you'll wind up using things like lasers and RF transmitters to control your quantum computer from a classical machine, so you will necessarily dissipate far more heat than this minimum. To say you can do work (such as computation) without consuming energy is in direct violation of the laws of thermodynamics. TANSTAAFL.

    To say that a quantum computer is a billion times faster than a classical machine is also an egregious overstatement. In all likelihood, quantum computers can't perform basic arithmetic operations any faster than a classical computer; in fact, because of the way they are constructed, quantum computers are likely to be considerably slower at these things! You can't push a qubit to switch faster than the underlying quantum technology will allow, and the switching times of most quantum systems are slow by the standards of today's computers.

    The most promising approach so far, based on nuclear magnetic resonance and described in the 3drage article, has switching times measured in milliseconds, which makes it even slower than the Intel 4004 in this respect, because the rate at which you can reliably send RF pulses to alter the state of the qubits is so low. The only reason a quantum computer can nonetheless beat today's supercomputers is that, with the right algorithms, you can take advantage of quantum parallelism: you can factor and extract discrete logarithms in polynomial time instead of (sub-)exponential time, and do unstructured searches in O(sqrt(N)) time instead of O(N). If you can find a quantum algorithm for 3D rendering (which is not obviously impossible), then you may see such dramatic speedups there as well.
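The k_B T ln 2 figure above (Landauer's limit) is easy to put a number on; a quick sketch at room temperature, using the standard value of Boltzmann's constant:

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, J/K
T = 300.0           # room temperature, K

# Minimum heat dissipated per bit erased (Landauer's limit)
e_per_bit = K_B * T * math.log(2)
print(f"Landauer limit at {T:.0f} K: {e_per_bit:.3e} J per bit")

# For scale: erasing a gigabyte's worth of bits
print(f"Erasing 8e9 bits: {8e9 * e_per_bit:.3e} J")
```

The per-bit figure is tiny (a few zeptojoules), which is exactly the parent's point: it's the classical control hardware, not the erasure floor, that will dominate a real machine's power budget.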
  • I find it particularly amusing that the most powerful computing science the world can conceive of is based on apparently illogical theories and imaginary numbers. Sure, they work, but that's gotta keep a few scientists awake and staring at the walls at night. Good for them :)

    Imagine what the tiniest error could do in a logic matrix this complicated... how susceptible to interference is the quantum bit?

    Also imagine the possibilities for networking... if you were to take into consideration the "action at a distance" effect to transmit data somehow you could transmit a great deal of data in zero time. You'd still have to send that first bit, but after that, you could (possibly?) maintain an instantaneous communication state regardless of distance. Guess we won't have to build fiber lines to Mars anymore :)

    Great, the site has gone down now before I finally got to the G article... either an elite hack, slashdot effect or certain parties don't want us to know that information :) Wonder if there are any spanners out there keeping tabs on this stuff... if you take time travel into consideration you can build a computer that will provide you with the answer instantly before it does the actual calculations... just have it send the answer information back in time. Put that in your pipe and smoke it :)
  • (-1, Offtopic ;-)

    ...or in rather old-fashioned physics terminology, as quoted in Flanders & Swann's [tripod.com] excellent song "The First and Second Law" [tripod.com]

    The First Law of Thermodynamics:-
    Heat is work and work is heat
    ...
    The Second Law of Thermodynamics:-
    Heat cannot of itself pass from one body to a hotter body

    sorry - no explicit mention of the third law. (But some gems such as

    Heat is work and work's a curse
    And all the heat in the universe
    Is gonna cool down,
    'Cos it can't increase
    Then there'll be no more work
    And there'll be perfect peace
    )

    At the risk of drifting even further off-topic, they also have a great Gnu [tripod.com] song

    I'm a Gnu
    I'm a Gnu
    The g-nicest work of g-nature in the zoo
    Maybe we could persuade RMS to sing it for us?
    --
  • Bonker writes: Early experiments where researchers shot electrons through tiny holes in a lead shield and onto film created similar diffraction patterns because, although electrons are indeed particles, they are also waves.

    Actually, those are the later experiments :-) The early experiments came before film existed, in 1801. Thomas Young, that Georgian-era nerd, did simple experiments with light coming through two slits. Later, watching ducks at the park, he saw the constructive interference of their wakes on the water, and realized that light had a wave nature.

    Not impressive enough for you? Well, a decade later Young studied the newly-discovered Rosetta Stone, and through it made some of the earliest modern progress toward deciphering Egyptian hieroglyphics!
  • Foul language, a sign of hostility.

    That Monty Python guy, *he* (That's right, "him") is one funny guy. Jackass.

  • You're completely right and probably a physicist. What made me stop reading this article was the example the authors give of reversible processes: the diffusion of molecules from a perfume bottle. This is one of the canonical examples of a thermodynamically irreversible process.

    Jan-Pascal
  • The first thing to come to mind after reading the article was Jane from Orson Scott Card's Ender series (Ender's Game, Speaker for the Dead, Xenocide, Children of the Mind). The circumstances of her creation were a little different than just flipping on a computer, but she was still a quantum computer for most of her life.

    We've got a ways to go before reality is stranger than fiction.
  • Agreed, Tubes are not obsolete... Like you said, in high power applications you can only use tubes because transistors cannot control the power necessary for things such as radar and high-power broadcasting. Imagine a vacuum tube the size of a person... I'm sceptical of QC in general, mostly cuz im a sceptical kinda person ;-) But from what I understand of QC, its mostly special purpose, so this wont exactly help your game of quake3....
  • Odd, since I don't have tubes in my speakers...
    --
  • Because with a machine this powerful built by Intel, you just know they'll cause a rip in the space/time continuum and cause the end of all reality as we know it.

    Or, it'll just create BSOD's billions of times faster than any current silicon-based PC.

  • Why is it that whenever anything interesting is posted on Slashdot, people CANNOT RESIST throwing in the obligatory inane "Just wait until it runs Linux" / "Hope someone's porting Quake" / "Imagine that as your firewall" / "blah blah blah Descent" / "blah blah blah Descent" comments?

    It's a running gag. Just like with various friends and family I cannot resist throwing in references to The Great Race ("Push the button, Max!"), Monty Python and the Quest for the Holy Grail ("So, logically, if she weighs the same as a duck..."), The Princess Bride ("Inconceivable!"), Pinky and the Brain (Pinky singing "Brainstem! Brainstem!"), Bill Murray's Star Wars lounge singer (don't ask)...each reference carrying not only its literal meaning but the accumulated context of previous uses.

    It's just one of those bizarre aspects of human behavior that certain phrases are often repeated...some sort of culture-binding mechanism, I suppose.

    So, can you imagine a Beowulf cluster of these? B-)

    Tom Swiss | the infamous tms | http://www.infamous.net/

  • If you don't want a faster computer, you're not an imaginative person. Either that, or you can't program.

    I didn't say I didn't want a faster computer...I said such people exist. (Though with what I'm playing with these days, I'm satisfied with hardware that some would consider outdated; my fastest is a 500MHz K6, and I'm typing this on a P-90 I use mostly as an X terminal.)

    Just like cars - some people find that a basic transportation vehicle meets all their needs, and innovations that allow more speed and faster acceleration just don't excite them much.

    Tom Swiss | the infamous tms | http://www.infamous.net/

  • Didn't ocelotbob say we're at the stage digital computers were at in the 1930's?
    i.e. we have them, but it's all very limited and 'ivory tower' at the moment
  • Have a look on Freshmeat for QCL and QDD, these are efforts to get the software ball rolling (somewhat) while we still don't have the hardware.

    Of course this isn't practically useful yet, but we can foresee many software/language changes that are in store. These are theoretical exercises that let you sniff around and prototype, but not test.

    The first company that puts a quantum register on a PCI card will make a pretty penny :)

  • Not actually dead; it says:
    "Forbidden

    You don't have permission to access /guides/quantum/ on this server.

    Apache/1.3.3 Server at www.3drage.com Port 80"
  • >Insert amusing quip about favorite operating system/game/programming language running on quantum computer

    Mod this guy up. He said it all.

  • Could somebody put a mirror of the 3DRage article up somewhere? I get 403 errors..

    /alex
  • by rkent ( 73434 )
    Heh... okay, adequately addressed in other posts. But they weren't there when I started writing, I promise!
  • You point a flashlight at it, thus bombarding it with photons, until the BSOD goes away!
  • when a quantum computer hangs, exactly how do you reboot a machine which consumes no power?

    Hold it upside-down and shake vigorously.

  • Enough with the Quantum computing articles. Sure it kicks ass, and we all want them, but can't we just let this one rest until we can at least have one of the damned things?
  • Um, no. A one time pad is this: deliver a key that is as long as your plaintext, composed of completely random numbers. Add each number in the key to the corresponding number in the plaintext to produce your encrypted content. This is completely unbreakable, assuming that you have a source of truly random numbers. The reason it is called a one time pad is that governments used to give their agents pads of numbers to encode messages; as they used up pages, they would tear them off and destroy them. Hence the name one time pad.
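The scheme the parent describes can be sketched in a few lines. This toy version uses XOR (addition mod 2) rather than decimal addition, and `os.urandom` stands in for a genuinely random source, which is an approximation:

```python
import os

def otp_encrypt(plaintext: bytes, key: bytes) -> bytes:
    # Key must be at least as long as the message, and never reused
    assert len(key) >= len(plaintext)
    return bytes(p ^ k for p, k in zip(plaintext, key))

# XOR is its own inverse, so decryption is the same operation
message = b"attack at dawn"
key = os.urandom(len(message))
ciphertext = otp_encrypt(message, key)
assert otp_encrypt(ciphertext, key) == message
```

And that's why quantum computing can't touch it: under a uniformly random key, every plaintext of the same length is equally consistent with the ciphertext, so there is nothing to search for.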
  • In quantum terms, while the boss in the lab observing the programmers, they are in particle state, doing one thing at a time. But when the boss is out of the room, no longer able to observe them, they change to a wave state, and can be said to be doing everything at once. The boss returns, and they collapse back into particle state. Therefore, to increase productivity, the boss should stay out of the lab.

    I'm not really sure I get the "wave state". Is that in contrast to being separate units, so they become one big heterogeneous wave with all their particles mixed together?

    Maybe it's that I just don't really know anything about "quantum wave properties". I took Chemistry for my science sequence in college, and Physics in high school (which was some time ago). Never really got into anything cool like this in my studies (although my Calculus professor did explain chaos theory, which was cool).

  • I'm trying to read about Quantum and I keep getting an Apache error message about unauthorized access.

    If someone has a dumbed-down explanation of Quantum computing that would be nice, 'cause I still have no idea what they are. It seems like they're trying to represent bits at the molecular level, but why that's better is beyond me.
  • Dude, I would own if I could be in all possible places and yet none of them at once! Thresh is going down!

    psxndc

  • Tech Companies Prepare for Quantum Computing

    San Jose, CA - In the wake of recent advances in quantum computing research, the world's hi-tech companies are gearing up for what is sure to be the single most significant breakthrough in IT since the integrated circuit. "We have committed our most valued resources to proactive quantum computing initiatives," said Steve Ballmer of Microsoft. "In fact, a year from now, we'll be known as Miqrosoft!" Jenny Tright, PR director for Oracle-soon-to-be-Oraqle, makes similar claims: "Transitioning to a new paradigm is never easy. We worked long and hard with our high-paid marketing consultants before deciding to capitalize on this new technology. After countless bagels, pizza lunches, and French dinners, we arrived at the ultimate strategy: a name change!"

    Considerable re-directions seem to be in store for many technology firms, yet some are content with their current strategy. Compaq, in a recent press release, states, "We already have what it will take to compete in the new quantum computing marketplace. Don't be fooled by the imitators--we had a 'q' in our name first!"

    Apple could not be reached for comment.

  • Well, I THINK the link is broken. It doesn't work, and it hasn't since the article appeared on slashdot. Anyone got an alternate or something?
  • From the very last line of the article:

    Computer science is still immature for its barely 80 years, [...]. Who knows what the next 870 years will bring?

    870 years eh? My guess is that the 2.6 kernel might be out by then...

  • I'm kind of curious. One of the things that is always mentioned when quantum computing is mentioned is that factoring two large numbers becomes feasible. Is that based upon the numbers we use today or does this go for any factorisation? Does this apply to any other problems that are presumed to be "difficult"?
  • I'm sure quantum computing is going to be a wonderful thing when we have developed techniques to make effective use of the principles.

    However I fear that this document has sacrificed technical accuracy for hype, and the author perhaps does not understand the topic as thoroughly as the article implies.

    For example:

    • Hype -
      quantum computers [can] run one billion times faster than typical silicon-based computers
      quantum computers [can] run one billion times faster than typical silicon-based computers
      If we had built a quantum computer and benchmarked it, then I'd accept such claims. Since quantum effects occur at the atomic level, they're sure to be fast, but how fast? It looks like the author simply chose a billion as a "really impressive number".

    • Technical inaccuracy - The article calls the XOR gate a "universal gate" (i.e. a gate which can be combined to form any logical function). Now my recollection of boolean logic may be faulty but I do not believe that XOR is a universal gate. NAND and NOR are, however, but the article does not describe these as they apparently do not apply to quantum computing. Perhaps in quantum computing XOR is the universal gate, but it isn't that way in classical computing! The author makes this claim twice, and includes a truth table which is incorrect (it is the AND gate truth table).

    Richard Feynman wrote in his popular exposition of Quantum Electrodynamics that he had taken particular care in ensuring that his statements, although simplified, were still technically accurate. In other words he did not lie to the audience by introducing concepts as truth which have been proven to be false.

    Although my understanding of quantum mechanics, quantum computing and quantum electrodynamics is low, what I do understand tends to make me distrust the details of this article.
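For what it's worth, the parent's recollection checks out: NAND alone can build the other gates, while XOR cannot (XOR circuits only compute parity-like functions). A quick brute-force check using the standard NAND constructions:

```python
def nand(a: int, b: int) -> int:
    return 1 - (a & b)

# Standard constructions of other gates from NAND alone
def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor_(a, b):
    t = nand(a, b)
    return nand(nand(a, t), nand(b, t))

# Verify against the built-in bitwise operators over all inputs
for a in (0, 1):
    for b in (0, 1):
        assert and_(a, b) == (a & b)
        assert or_(a, b) == (a | b)
        assert xor_(a, b) == (a ^ b)
```

A likely source of the article's confusion: in quantum computing the controlled-NOT gate is often called the quantum XOR, and controlled-NOT together with single-qubit gates is universal for quantum circuits. That does not make classical XOR universal.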

  • as for the comment about factoring, factoring is an NP-complete problem. go review your theory books.

    I am pretty up-to-date on that, and I think you are mistaken. Factoring is in NP, of course, but whether it is NP-complete is not known.

    Give a reference to prove me wrong

  • It is known that quantum computers (potentially) can factor an n-digit number in time polynomial in n (classical trial division takes roughly sqrt(N) steps in the value N, which is exponential in the number of digits; even the best known classical algorithms are sub-exponential).

    However it is not known whether quantum computers can solve NP-complete problems efficiently, as factorisation is not known to be NP-complete.

    Also there is no reason to think that quantum computers will be faster or indeed more suitable for most problems than ordinary computers.

  • How would quantum computing nullify a One Time Pad?


    ----------
    No army can withstand the strength of an idea whose time has come.
  • That looks like RSA could soon be bolloxed then, let's hope they get quantum crypto up and running soon...

    The Chicken of Uber

  • Not only that, but they state that Chip speed is rising exponentially.

    It seems to me that doubling every 18 months is a mathematical, not exponential, progression.

    I think they dumbed it down purposefully for us non-quantum physicists.

  • Hmm, actually I meant geometrically. But now that the crack is wearing off, I'm thinking that geometrical and exponential are not mutually exclusive.
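Indeed they aren't mutually exclusive: a quantity that doubles every fixed interval is, by definition, growing exponentially. A sketch of the point, with t in months:

```python
import math

def speedup(months: float, doubling_period: float = 18.0) -> float:
    # Doubling every fixed interval *is* exponential growth: 2^(t/T)
    return 2.0 ** (months / doubling_period)

assert speedup(18) == 2.0   # one doubling period
assert speedup(36) == 4.0   # two doubling periods

# Equivalent exponential form: exp(t * ln 2 / T)
assert abs(speedup(27) - math.exp(27 * math.log(2) / 18.0)) < 1e-12
```

A geometric progression is just this exponential sampled at regular intervals, which is why the two terms get used interchangeably for Moore's law.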
  • Articles on quantum computing focus on processing speed.

    But what about memory? If processing is fast but memory is low, the constraints on quantum computing will be different than if memory is high.

    Consider Quake, for example. A fast processor isn't exciting unless it has the memory to handle sophisticated (i.e. high memory) maps.

    How does quantum memory work? What are the constraints on it?




    Check out Java Drivers for CueCat at: http://www.popbeads.org/Software [popbeads.org]
  • "HE not only produces horrible humor???" HE!? Monty Python is not a person, it is a comedy troupe! Obviously you know absolutely NOTHING about them, what makes you think you are a qualified judge of their humor?

    As long as I'm here...

    Obligatory (or at LEAST gratuitous) Spaceballs reference:

    Quantum computers are reversible...
    Like my raincoat!

  • As a game programmer, I wouldn't mind having a quantum CPU to do visibility testing, raytracing/collision testing, and per-pixel calculations. There are a lot of applications for games (and even just graphics) that use a large "problem space".
  • By the way, please don't follow that "q-bits" convention in the New Scientist article. All of the people in the field refer to them as "qubits". Try reading http://arxiv.org/archive/quant-ph (AKA www.lanl.gov). There's no sense in needless term duplication. Thanks.
  • The "billion times faster" statement was a bit of a stretch. Yeah, it may be able to solve some problems a billion times faster, but comparing the speed of a quantum computer and a conventional computer is a bit tricky (it's not just a constant factor or even close).

    You will probably never be able to just drop a quantum CPU into your computer and be off and running, since they're a lot more difficult to work with and not necessarily any better than conventional computers for some pretty common problems. For example, Shor's factorization algorithm includes a step that's done on a conventional computer, since it's a short step that isn't worth coming up with a quantum algorithm for.

    Quantum computers are best at problems which require searching a large problem space, not just crunching through numbers to get a bunch of values you'll all end up using. In a problem like factoring, you search through many numbers but only end up using maybe two of them. With graphics you actually use all of the values computed (pixel values, etc.), so a quantum computer would not be so good.
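The asymmetry the parent describes can be made concrete. For an unstructured search with one marked item among N, a classical computer needs on the order of N lookups, while Grover's algorithm needs about (pi/4)*sqrt(N) oracle queries:

```python
import math

def grover_queries(n: int) -> int:
    # Optimal number of Grover iterations for one marked item out of n
    return round(math.pi / 4 * math.sqrt(n))

for n in (10**3, 10**6, 10**9):
    print(f"N = {n:>12}: classical ~{n} lookups, Grover ~{grover_queries(n)} queries")
```

The gap grows with N, but note it's only quadratic; the spectacular speedups (factoring) come from problems with more structure than blind search.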
  • If you don't want a faster computer, you're not an imaginative person. Either that, or you can't program.

    I can use any amount of speed I can get. Easily.

    Fractals are cool, but you can't zoom around any deeper than the very surface in realtime. I'd love to be able to view fractals much faster. I started on an Apple II that took eight hours to render a shallow Julia set; my current P3 800 does that in seconds, and my Athlon 900 is even faster.

    Then there are my experiments in modelling. I wrote a simple program for viewing the output of an equation on x and y in 3d. When I wrote it in the late 80s it took about a minute to draw a screen. Now, unaccelerated (no 3d card) it runs fast enough for a realtime display.

    Give me a more powerful CPU and I'll model more complex equations, or in more detail. Or I'll view deeper fractals, or do one of a million interesting computation problems that are currently out of my reach due to CPU speed.

    If the only thing you can think of that requires a fast CPU is Quake, then you'd probably be happy with a PS2 and WebTV.
  • Besides, when is the gap the widest? When only a few people/agencies have one. If only the NSA and a few top universities have quantum computers, then the gap is huge, because only the richest of the rich can get time on one. But if the technology is developed into a cheap commodity, it might still only be directly accessible by the rich at first, but it'll trickle down eventually.

    In a world of ten supercomputers, there's no way a poor African tribesman would *ever* get near one, let alone get to run a job on one. But with the current computer situation he could get a C64 or such fairly easily. I myself learned most of my programming foundation on an Apple II, and the concepts still apply directly. This way the poor have-not could train himself in the new technology, eventually become a have, and in the process directly help many have-nots.

    The cheaper we can make technology, the more likely it is that someone poor will have access to it. If only universities (and three-letter agencies) ever develop quantum computing then it'll never reach the less fortunate.
  • I was wondering much the same thing. Also, isn't it true that quantum computers are only better than typical silicon computers in just a few areas (like factoring and encryption)?

    People throw the word quantum around, and it must mean something great and excellent, right? Truth be told, I think this is just another logical progression in technology.

    So from my side: ho hum... Just like the discovery of new laser types that were supposedly going to make DVD obsolete last year.

  • This article (the one on 3drage) might not win, but I would bet it would place fairly well in the 'least-content-per-page-to-increase-banner-ads-revenue' contest.

    Seriously, on my screen, I'd say that at least 3/4 of the page is composed of sidebars, banners, ads, table of contents etc. and at 1600x1200 I can see no more than two paragraphs of content: this is totally ridiculous.

    Even the NYT switched to multi-page format as a default, but at least their chunks are page-length, and one can easily see the article on a single page via the handy link at the bottom (which I usually use when the article is more than two pages).

    Anybody have a link to a similar article in a more reasonable format? I refuse to give money to a site that cares (much) more about banner revenue than reader comfort.
  • One thing from the first article made my brain tick... 'Completely reversible'. This means you run a program to get its output. But it also means you can run the program backwards to get the original input.

    Now, in light of the idea of a database, this almost sounds like a built-in, real-time transaction log. I hadn't even heard of this effect before in relation to quantum processing; can anyone back it up with any more facts?

    This level of reliability and recoverability is amazing (if true)... I seriously think this idea has more potential than the 'no energy used' idea because, after all, entropy must increase in a forward-time universe.

    --
    Gonzo Granzeau
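As an aside, the reversibility idea can be illustrated with a classical reversible gate. This toy sketch uses the Toffoli gate (controlled-controlled-NOT, which is universal for reversible classical logic); it is not an actual quantum simulation:

```python
def toffoli(a: int, b: int, c: int) -> tuple:
    # Flips the target c only when both controls a and b are 1;
    # this makes AND reversible (the inputs survive in the output)
    return (a, b, c ^ (a & b))

# The gate is its own inverse: applying it twice restores the input,
# so a circuit built from such gates can be run backwards exactly.
for a in (0, 1):
    for b in (0, 1):
        for c in (0, 1):
            assert toffoli(*toffoli(a, b, c)) == (a, b, c)
```

Whether that buys you a free transaction log is another matter: keeping enough state around to reverse every step is exactly the "junk bits" problem, and eventually erasing them is where heat gets dissipated.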

  • hey, the side effect of quantum computing being able to run a computation in reverse was brought up by the article, i didn't make it up! heh This is why I thought it was a valid discussion point.

    Besides, I kind of wanted the magic mind reader... To be honest, a quantum computer might be able to tell you things like 'The password for this computer is [a,b,c,d,e...]' and test every answer at the same time, depending on the number of qubits the quantum computer has.

    --
    Gonzo Granzeau

  • Very valid comments about entropy... and because they would be reversible, there's a chance. But because of the low efficiency of heat transfer devices, I suspect there will be quite a bit of energy lost.

    And I agree with you again, the first article was very 'pie in the sky' with very little actual data. Though they say a computer in 2020 will have 160 gigs of RAM, with a revolution such as quantum computers there most likely will not be 'RAM' per se. A revolution of this magnitude completely changes things; it's not a 'slight CPU modification'. Otherwise we'd see questions like 'Can I still use my TNT2 with a quantum computer?'

    --
    Gonzo Granzeau

  • What you say is most likely true (I am not an expert, and therefore cannot render a judgement of fact).

    You also raise another good question. Will quantum computers replace the current style of computer? Will we reach a point where the CPU's are so cheap and powerful (and RAM is plentiful) where everything can be done by the CPU and RAM? Will quantum computing even be applicable to applications such as web browsing and gaming?

  • Without delving too far down into flame, are you suggesting that I put a quantum computer on the 3D card of my 700Mhz Athlon system?

    To answer your second point, the article said billion, not million. As the late Carl Sagan would have you know, there is a big difference between a billion and a million...say, three orders of magnitude.

    It certainly seems that you understand the value of using snippets of a conversation to make a point, so I don't understand the reason for your reply.

  • I think that quantum technology can be used for the purpose that you discuss, it will just come in a completely different form than what you are accustomed to.

    For instance, routing could be an application where a quantum computer would be beneficial. Building immense and extremely complicated routing tables would be suited to quantum computing. Having a router CPU that could literally analyze ALL of the routes a packet could take from point A to point B could be very beneficial.

    Quantum computing, if successfully introduced to gaming, I believe would eliminate the 3D chipset completely. Currently, we have 3D cards in order to take that processor intensive load off of the CPU. Having a quantum CPU would effectively eliminate the need for a second CPU to munge graphics.

  • Okay, let's throw out some business jargon and change "paradigms" here. Quantum computers do not have 3D cards; they have no CPU, no RAM, none of that nonsense. Not as I can explain it in 500 words or less, anyhoo.

    A quantum computer is literally going to be a new type of computing. Not just as different from an integrated circuit as a vacuum tube is, but as different as an IC is from fire or the wheel. There will be no quantum "chips", no system bus, no SDRAM, no nothing. You will literally have a thing you plug into an interface, probably not even that. Why the hell would any single individual own one of these? If quantum computing is a billion times faster, then one "quantum computer" would take care of the gaming needs of China or India.

    Whoa...think of 1 billion people playing Quake all at once...

  • Disclaimer: I'm a computer science and physics student (combined honours). This means I know more than nothing about this, but not a whole lot more.

    Most of the claims in the article are exaggerated. The "consumes no energy" thing is really just theoretical. There are hard minimum limits on how much energy classical computations consume, but no such limits on quantum computations, creating the theoretical possibility of "free" (from the energy point of view) computations. Of course, you do have to expend energy to read the answer, as someone else pointed out...

    The "obviates all encryption" claim has some validity. Quantum computing reduces the complexity of certain computations. For instance, a linear search that is O(n) on a classical computer becomes O(sqrt[n]) on a Q.C. Likewise, cracking RSA-style public key encryption changes from a sub-exponential-time problem (the best classical factoring algorithms) to one that Shor's algorithm solves in polynomial time. That's not to say it would be trivial to crack a 4096-bit key, but it would be possible to do so within some non-insane timespan.
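    Grover's speedup is easy to sketch classically on a toy scale. This is just a simulation of the amplitude bookkeeping, not the quantum algorithm itself (the classical simulation still does O(n) work per step), but it shows why ~sqrt[n] iterations suffice:

```python
import math

# Toy classical simulation of Grover's search over N = 8 items.
# The marked item's amplitude is boosted by repeated
# oracle-flip + inversion-about-the-mean steps.
N, marked = 8, 5
amps = [1 / math.sqrt(N)] * N                        # uniform superposition
for _ in range(round(math.pi / 4 * math.sqrt(N))):   # ~sqrt(N) iterations (here: 2)
    amps[marked] = -amps[marked]                     # oracle: flip marked amplitude
    mean = sum(amps) / N
    amps = [2 * mean - a for a in amps]              # diffusion: invert about mean
probs = [a * a for a in amps]
print(probs.index(max(probs)))    # → 5 (the marked item)
print(round(probs[marked], 3))    # → 0.945
```

    After just two iterations the marked item is measured with ~95% probability; a classical search would need four probes on average.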

    As for quantum computing doing infinite computations in a second, this is also a misinterpretation. A slightly better (but still not perfect) way to think of things is that quantum computers do things in a massively parallel way. Maybe you want to think about them as non-deterministic finite automata. That's about the best I can come up with in terms of classical analogies.

    I might have mentioned cold fusion, except that I believe that cold fusion is more likely than quantum computing.

    Quantum computing is solidly based on widely-accepted theories. More importantly, a working (simple) quantum computer has already been built. With both strong experimental and theoretical support in place, I don't see why you have trouble believing in it. The only question is when it will become practical... As for AI, and natural language processing, QC may just be the technology that enables those things. Read Roger Penrose's "The Emperor's New Mind" for more info...
  • The tubes are in the AMPS, not the SPEAKERS.

    There are tube microphones, pre-amps, phono-stages, amps, etc., but I have yet to see a tube speaker...
    --
  • I apologize; I thought that the difference was pretty implicit in what I wrote. To clarify:

    Setting a bit: introducing information into a system.

    Clearing a bit: erasing information from a system.

    Introducing information to a system has no required energy, but the erasure of information does have a minimum energy.

    If a byte has the value 0xFF, then in order to change it to 0x00 I have to erase eight bits of information before I can put in my new eight bits. (Note that most computers do the erase-and-overwrite in one step, but thermodynamically, they're two steps.) In other words, I blow eight bits of information away (which requires energy) and put a new octet in (which does not require energy).

    Clearing the eight bits requires a minimum amount of energy given by kT ln 2 per bit (k = 1.38E-23 J/K; with T = 2.7 K, the ambient temp of the universe, that's about 2.6E-23 Joules per bit cleared).

    Setting the bits? 0 Joules.

    Again, this is all in the dimly-remembered past of my college physics. So take it with a grain of salt. :)
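    For concreteness, here's that arithmetic as a few lines of Python (assuming the usual kT ln 2 form of Landauer's bound):

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K

def landauer_limit(temp_kelvin, bits=1):
    """Minimum energy (Joules) to erase `bits` bits at temperature T."""
    return bits * k * temp_kelvin * math.log(2)

# Erasing one byte at the ~2.7 K cosmic background temperature:
print(landauer_limit(2.7, bits=8))   # ~2.07e-22 J
# Same byte-erase at room temperature (300 K) costs ~100x more per bit:
print(landauer_limit(300))           # ~2.87e-21 J per bit
```

    Either way, these energies are vanishingly small next to what real chips dissipate, which is why the bound is of theoretical rather than practical interest today.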
  • If I remember my college physics right, the laws of thermodynamics can be summed up as:
    1. You can't win.
    2. You can't break even.
    3. You can't quit the game.

    The First Law of Thermodynamics says that energy is conserved; the Second Law says that the entropy of a closed system never decreases; and the Third Law says that you can't reach absolute zero in a finite number of steps.

    But then again, it's been a long time since my college physics courses.
  • If nothing else, think of games: Quake and Unreal Tournament do huge amounts of number-crunching.
    I know it's hard to understand, but believe it or not, not everyone loves such games. There are people now - not many, but some - who find a P90 adequate for word processing and net access and have no strong desire for anything more powerful.

    Tom Swiss | the infamous tms | http://www.infamous.net/

  • Quantum computers do not have 3D cards

    What part of "quantum chip be put onto a 3d card" do you not understand? He didn't say put a 3d card into a quantum computer.

    If quantum computing is a billion times faster, than one "quantum computer" would take care of the gaming needs of China or India.

    And current computers are millions of times faster than the originals. It should then follow that the United States should have exactly 250 computers?

  • First, there is the QM way of representing information

    Oh, yeah, is that the way whereby when you inspect it it becomes destroyed? Really, does QM itself show any promise for data storage? Aren't you talking about molecular storage? (like in crystals or something). Seems to me QM is really good for processing. I wouldn't trust it to store the state of my cat ;)

  • Without delving too far down into flame, are you suggesting that I put a quantum computer on the 3D card of my 700Mhz Athlon system?

    Yes, that is what the original poster suggested, which I don't see as being that bizarre. You know "computer" doesn't have to mean the entire system including peripherals and monitor. It could mean just a quantum cpu.

    To answer your second point, the article said billion, not million.

    So? I was impeaching your logic. Current computers are already orders of magnitude faster than the first computers. Does that mean that it is insane to give each person their own computer, instead of sharing the equivalent processing power (~250 computers, given the population of the United States is ~250 million) amongst all? No. It means that with the new power we'll come up with new things to do.

    Putting a quantum cpu on a 3D chip, or imagining that we might actually have new uses for orders of magnitude more processing power just doesn't seem that bizarre to me.

  • This [carolla.com] seems to be the same article, pretty much.
    --
  • This is, of course, assuming that such powerful computers have applications beyond number-crunching and the military.

    Of course they would. Every time computer power goes up an order of magnitude, there are always pundits claiming no normal person could ever use that much speed, and they are always quickly proven wrong. If nothing else, think of games: Quake and Unreal Tournament do huge amounts of number-crunching. Until we get to the point where computers can render ray-traced scenes at 60 fps in 36000x24000 pixels (huge flat-panel displays at 300 dpi) there will always be a use for more CPU power.

  • While all the hype goes on for the next 20+ years until they make practical devices, I suggest that computing mesh implementations be explored. If you have a grid of 4096x4096 CMOS single-bit computers running at 100 MHz (all quite slow by today's standards), you can do amazing things, like...

    Pump anything you want in/out of the edges at 100Megasamples/second (at minimum), do as much hard math/matching as you need, and get the results out.

    It would be possible to use "defective" chips as long as the boundary cells were all good, much like we use LCD panels with bad pixels today.

    The major hurdle is weaning programmers off of the von Neumann architecture, and getting them into something that just seems like the biggest gate array in the universe.
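    As a toy illustration of the idea (a hypothetical sketch, nothing like a real 4096x4096 grid), here's a one-dimensional "systolic" chain of cells convolving a stream that's pumped in from one edge, one sample per clock tick:

```python
# Each cell holds fixed weight w and the sample currently passing through it.
# Data shifts one cell per tick; every cell multiplies locally, and the
# partial products are summed - no central CPU, just the mesh.
def systolic_convolve(stream, weights):
    n = len(weights)
    cells = [0] * n                          # per-cell latched input sample
    out = []
    for sample in stream + [0] * (n - 1):    # extra zeros flush the pipeline
        cells = [sample] + cells[:-1]        # shift data one cell per tick
        out.append(sum(w * c for w, c in zip(weights, cells)))
    return out

print(systolic_convolve([1, 2, 3], [1, 1]))  # → [1, 3, 5, 3]
```

    The output is the full convolution, produced at one result per tick once the pipeline fills - the same pattern scales to 2D grids for filtering, matching, and similar "hard math" at the edges.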

    --Mike--

    No patents were harmed in the creation of this posting

  • The first article very summarily states that the quantum computing process is "completely reversible" and so consumes "no net energy." Now that I think about it, I barely remember covering quantum computing in undergrad, and I wasn't a physics major, so I don't know all the terms. But it was my impression that the particles serving as "bits" in the quantum machine had to be suspended somehow, and that the mechanism for doing this is extremely power-intensive. And that this high rate of power consumption was in fact one of the things keeping quantum computers from using more than 4-6 bits at a time.

    So, basically what I mean is, I don't know what I'm talking about here, but the claim that "they don't consume energy" smells funny based on the little I did learn about quantum computing at one point. Like, even if it's theoretically true, it's deceptive to put it just that way. So, can someone more versed in physics enlighten me and the /. crowd at large?

  • First paragraph: correct, but obnoxious. There is indeed a set of consequences of quantum theory that includes wave-particle duality, although it is not a fundamental assumption of quantum theory.

    Second paragraph: correct, the original poster is being silly. But when you say spin can be either up or down and that the state space is 2D, you of course realize that information in a quantum computation can be represented in a superposition state, which can then be manipulated to result in a particular discrete observable given a particular result.

    Third paragraph: the original poster is talking out of his ass, true. The "speedup" is the result of the fact that a QC algorithm can effectively run a large number of "computations" in parallel, in Hilbert space, arriving at an answer in far fewer steps. Of course, only some problems are "quantum computationally feasible" in the sense of taking far fewer steps to solve in a QC process than in a conventional algorithm (Shor's and Grover's are a couple of algorithms that would be particularly useful and nifty if we had a big enough QC to run them in practical situations). Similar in nature to the various P*-complete etc. ways of describing problems that are polynomial time under parallel processing, etc.
  • ...are older than quantum computers. There are articles going back decades on how reversible computers could be made to consume arbitrarily low amounts of energy. This has nothing whatsoever to do with quantum computing (except that quantum computers are just one kind of (hard to make) reversible computer).

    Slashdot already has an article on the 'shielding' method. Search for 'decoherence free subspaces'.

    --
  • Great, now audiophiles will insist on buying transistor-based speakers, because "they sound better, really!"
  • I know there have been a lot of errors mentioned, but I didn't see this one yet. In 20 years, by Moore's law, processors won't be just 40 times faster (if that's what was implied by the 40GHz statement); they will be more like 10,000 times faster, because speed (or at least transistor count) will have doubled about 13.3 times. I am not sure if I did the math right, but only 40GHz? We should be there in about 6 1/2 years following Moore's law. I will finally be able to run Quake III at my desired 50,000 frames/second. I can tell the difference I swear!!
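    For anyone checking, the Moore's-law arithmetic looks like this (a rough sketch; the exact figures depend on the doubling period you assume - 18 months here):

```python
import math

# Moore's law as back-of-the-envelope arithmetic.
def speedup(years, doubling_months=18):
    """Speedup factor after `years`, assuming one doubling per period."""
    return 2 ** (years * 12 / doubling_months)

print(20 * 12 / 18)              # ~13.3 doublings in 20 years
print(round(speedup(20)))        # → 10321 (so ~10,000x, not 40x)
# Years until a mere 40x speedup:
print(math.log2(40) * 18 / 12)   # ~8.0 years
```

    So at an 18-month doubling, 40x arrives in roughly 8 years; quote 6 1/2 years and you're implicitly assuming a slightly faster pace or a higher starting clock.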
  • As the late Carl Sagan would have you know, there is a big difference between a million increased by four magnitudes and a billion... say nine billion.

    One million = 1,000,000 = 1 x 10^6
    One billion = 1,000,000,000 = 1 x 10^9
    9 - 6 = 3

    Where the hell did you get 4 from?? Figured the factor of 1,000 had four digits in it? Try and get your math right the next time you're being condescending about numbers.

  • Um, the transistor didn't wipe out the vacuum tube. Trust me, tube-based guitar amplifiers sound a million times better than anything based on transistors. And they're still being made.

    There's no such thing as an obsolete technology, merely one that's got a smaller application base than it used to have.

    Furthermore, with this billion-fold speed increase, what kind of peripherals are you going to have?


    ----------------------------------------
    Yo soy El Fontosaurus Grande!
  • Don't warning bells go off when you hear these phrases: "Consumes no power", "Runs infinite computations per second", "Obviates all encryption", etc.?

    Rather than blindly believing it, one should remember what came of the big hype surrounding: AI -"we're only a few years away ©1967", COBOL -"the programming language for managers and non-techies", 4GL -"Natural Language Processing", The "paperless office", National missile defense, 100% portable java code, MULTICS, the "New Economy", the "Information Super Highway", and just about every CASE tool ever.

    I might have mentioned cold fusion, except that I believe that cold fusion is more likely than quantum computing.

  • "Quantum Computers: How they work and How they will effect us"

    Although quantum mechanics leads to very strange and wonderful things I doubt anything they do will effect us. They might affect us. Next story, please! I don't read items that so blatantly butcher the language.
  • While setting a bit may not require any energy expenditure, reading it will, thus any computation based on a previously set bit will consume energy. Now, if you can prove the ability to set, read, and clear a bit with no energy being expended, you're either the harbinger of a new age of computing and life as we know it, or just insane ;-)
  • I seriously think this idea has more potential than the 'no energy used' idea because after all, entropy must increase in a forward-time universe.

    Actually, that is not the case: entropy must not DECREASE in physical processes, but there is no requirement that it INCREASE. In fact, reversible physical processes are those in which entropy remains the same...irreversible physical processes are those in which entropy increases.

    That being said, I would agree with you that this reliability and recovery would be the truly amazing part of these systems...although I submit that I don't understand what these statements really mean in terms of the physics (i.e. based on the first article, I'm a tad skeptical of the statements being made.....).

  • Indeed! In the real world, you can't win, can't break even, and can't quit the game....More technically (from Eric's Treasure Trove of Physics [treasure-troves.com]):

    • First Law: energy is conserved - change in internal energy come from changes in heat and work done on the system.
    • Second Law: entropy doesn't decrease
    • Third Law: at absolute zero, entropy tends to a constant; for crystalline solids, that constant is zero
  • If you can show me how in all conceivability that cat can be both dead and alive, then quantum theory is possible - otherwise it just won't work.

    This is the "Schroedinger's Cat" example of quantum mechanics, filtered through a particular "Many Worlds" interpretation of quantum mechanics...but a cat is not a quantum system, and the interpretation you choose to apply to describe your philosophical position does not affect the physical system one bit. The cat is an analogy, if you will, not to be taken literally (although that is one other philosophical interpretation of the theory). It is an analogy for the way that quantum states "superimpose" on other quantum states: an electron when observed has either spin up or down, but while it is evolving unobserved, it really CAN BE in a state which is both up and down at the same time....and it "picks" which state to be in when observed in certain percentages based on the evolution of the state (again, this description is colored by a particular interpretation...if you want, rather than "picking a state" think "choosing a universe where the observation is made"). The technical details can be found in any undergraduate quantum mechanics textbook.

    Quantum mechanics is the realization that, at small distance scales (atomic and smaller), systems have to be described in terms of different dynamics than they are at the macroscopic level. There is nothing strange about this...physics at larger scales is always a limiting case of the physics at smaller scales. And quantum mechanics itself is extremely well tested and understood (all of modern chemistry, semiconductor development, biochemistry, superconductors, particle physics, etc. are based on quantum theory). Quantum behavior is not only conceivable and possible, but it appears from experiments that it IS the way reality is constructed; it is far from busted, and we are a much happier world for discovering it.

  • One thing from the first article made my brain tick... 'Completely reversible'. This means you run a program to get its output. But it also means you can run the program backwards to get the original input.

    While I haven't been able to get to the first article (I'm getting a 403 error), I suspect that "completely reversible" doesn't quite mean what you think it means. As another poster pointed out, reversing a "multiply by zero" program would essentially create a magic mind-reader.

    What I suspect "completely reversible" means is that the machine can determine all possible input states that produce a given output state. So reversing a "multiply by zero" program would wind up producing all possible numbers that you could've input into the system. Which doesn't sound impressive; however if you consider reversing something else, such as (it's in BASIC, because this seemed like something best illustrated with GOTO's and GOTO's make my brain jump back to BASIC):

    10 INPUT A : REM OUR INPUT STATE
    20 IF A = 10 THEN GOTO 100
    30 A = A * 2
    40 GOTO 100
    100 PRINT A : REM OUR OUTPUT STATE

    So if we run this program backwards with an output of '10', the quantum computer (using the whole quantum non-deterministic magic) would be able to simultaneously step backwards from line 100 to both lines 20 and 40. From line 20, it would continue back to 10 and from line 40 it would continue back to 30, 20, and 10. All of this would occur in the same amount of time it would take to run forward through the code. Finally, you'd wind up with a set of valid input states, one where A = 5 and one where A = 10.

    However, standard IANAQuantumPhysicist disclaimers apply. I could be totally off-base with this explanation. But it seems to fit my understanding of the processing magic that quantum computers bring to the table (i.e. being able to do a bunch of simultaneous, parallel computations in linear time).
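    In classical terms, "running it backwards" is just preimage enumeration. Here's the same toy program in Python, brute-forcing the preimages one candidate at a time (a quantum computer would, loosely speaking, explore these branches in superposition instead of sequentially):

```python
# The BASIC program above: A stays 10 if it's 10, otherwise it's doubled.
def forward(a):
    return a if a == 10 else a * 2

# "Reversing" the program = finding every input that yields a given output.
# A classical machine must scan the domain; that scan is what a Q.C. skips.
def preimages(output, domain=range(100)):
    return [a for a in domain if forward(a) == output]

print(preimages(10))  # → [5, 10], the two valid input states
```

    Note the classical version only works because the domain is small; the whole point of the quantum magic is that it wouldn't have to enumerate.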

  • Actually, a one-time pad is the one form of encryption that can't be usefully brute-forced: every possible plaintext of the same length is an equally likely decryption, so unlimited computing power tells you nothing. For everything else, though, brute force works in principle, and with the supposedly unlimited power of a quantum computer you could brute-force it very quickly.

    CAP THAT KARMA!
    Moderators: -1, nested, oldest first!
  • One of the things about quantum theory that I've never been able to grasp is the parallel universe/unknown value part. You see, the idea behind quantum computing is sort of like this: you put a cat in a box with a dish of cyanide-poisoned water. The next time you check, 2 outcomes are possible: the cat is alive, or the cat is dead. To do this on an 8-bit scale, take 8 cats. Same for 16, 32, 64, 128 bit, etc.

    Now here's the zinger; in one universe, the cat is alive, and in another, the cat is dead. How? The cat is either dead or alive - there aren't 2 cats. Just because you can't tell whether the cat is alive until you look at it (another thing about quantum theory - if you can't observe it, it is in all states simultaneously) doesn't mean you have a cat that is both dead and alive.

    If you can show me how in all conceivability that cat can be both dead and alive, then quantum theory is possible - otherwise it just won't work. One busted theory, and a disappointed world without its computer.

    One more thing: quantum computing, if it exists/works, would effectively nullify encryption. Ouch.

    CAP THAT KARMA!
    Moderators: -1, nested, oldest first!

  • At this point, quantum technology is obviously in its early years, and applying quantum technology to computers would imho only be the start to many more advanced technologies.

    I have read that the applications of quantum tech are much more widespread than most people realize. For example, it has been proven that one can measure the width of a human hair using a laser with quantum enhancements without the laser touching the hair whatsoever. They have also successfully 'teleported' a single atom instantly from one place to another (about 100 feet from the origin).

    There are many more possibilities in the future of quantum technologies, and I think quantum computing is probably going to be the least of these achievements - even if it may be the first.

    A good read, though fiction, is Michael Crichton's 'Timeline', which covers some facts regarding quantum tech at this point, and also goes into some ideas of where things could progress to.

  • Damian Conway threw together a brilliant application for this new breed of computers. QSP [uwinnipeg.ca]

    Imagine not having to wait Log N for anything!

    Dancin Santa
  • ... those articles blew my mind. But one has to consider this: how practical is it to be salivating when this sort of technology is decades away? And while it does sound nice and eco-friendly, I think that there's potential to widen the "technology gap" between the haves and the have-nots even further. This is, of course, assuming that such powerful computers have applications beyond number-crunching and the military.
  • Ok, so the first thing everyone is talking about is this "completely reversible" computation. Well, that's wrong and taken out of context (trust me, I'm very adept at this theory and work with it on a daily basis). The nature of quantum gates, which comprise a quantum computer, is not reversible! Though there are a few quantum algorithms that can be reversed, it doesn't mean all can.

    Second - while a quantum computer may be able to do a ton of calculations at the same time, we will never know all the answers with the current theory of QM, and here is why. Think of a quantum bit, a QuBit (and we are not building arks here), as a unit vector in 3D space. It's possible for this vector to point in any direction originating at the origin, thus creating a "ball" in space. Now when 2 QuBits are put into the system (a Q-gate) you get 2 outputs (one from the first output = the first input unaltered; the second output, an altered form of the second input). So how do we know what we did? Well, you observe the system - take a measurement. To take a measurement in QM you can only measure orthogonal states, i.e. two possible outcomes in a QuBit system; by doing so you force the "ball" (which is all possible outcomes) into one of two vectors, thus reducing your infinite calculation. And after you take that measurement, it doesn't mean that the QuBit you measured will give you the same answer if you measure it again in a different basis!
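    That "forcing the ball onto one of two vectors" step follows the standard Born rule: for a Bloch-sphere unit vector at polar angle theta from the measurement axis, the two outcome probabilities are just a cosine (a sketch of the textbook formula, not anything specific to the gate described above):

```python
import math

# Born rule for a Bloch-sphere unit vector measured along the z axis:
# p(up) = (1 + cos(theta)) / 2, where theta is the angle from +z.
def p_up(theta):
    return (1 + math.cos(theta)) / 2

print(p_up(0))            # vector pointing up: 1.0
print(p_up(math.pi))      # vector pointing down: 0.0
print(p_up(math.pi / 2))  # vector on the equator: ~0.5
```

    The continuous "ball" of directions thus collapses to a single classical bit per measurement, which is exactly why you can't read out the whole infinite computation.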

    Well, I think that's enough for you to think on.
  • by MattJ ( 14813 ) on Monday January 08, 2001 @11:44AM (#522988) Homepage
    I understand it differently. AFAIK, Landauer showed that energy must be spent to erase information, not to clear a bit. You are erasing information if you use a gate with more input wires than it has output wires. For example, a classical NOT gate has one input, one output, and no info is erased because you can always tell what input caused a given output; it's reversible. But a classical AND gate could cause a 0 output in three possible ways (00, 01, 10), and you can't tell which; it's irreversible, because you've erased information. Read http://www.qubit.org/intros/compSteane/qcintro.html for some more info.

    Also, someone should note that the energy savings from reversible computation are real but very, very tiny. Chips would have to get 1 million times more efficient than they are now for the energy costs of (current) irreversibility to manifest themselves. And if you expect a quantum computer to operate without tons of expensive, high-powered supporting equipment around it (NMR machines, optical pumps, liquid helium-cooled ion traps), you'd better add a couple more decades onto your time estimate.
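    That input-wires-vs-output-wires point is easy to check mechanically: a gate erases information exactly when its truth table fails to be one-to-one (a toy sketch):

```python
from itertools import product

def is_reversible(gate, n_inputs):
    """A gate is reversible iff distinct inputs give distinct outputs."""
    outputs = [gate(*bits) for bits in product((0, 1), repeat=n_inputs)]
    return len(set(outputs)) == len(outputs)

print(is_reversible(lambda a: 1 - a, 1))     # NOT: True (reversible)
print(is_reversible(lambda a, b: a & b, 2))  # AND: False (00/01/10 all give 0)

# A Toffoli gate (3 wires in, 3 wires out) computes AND reversibly
# by carrying its inputs through to the output:
def toffoli(a, b, c):
    return (a, b, c ^ (a & b))

print(is_reversible(toffoli, 3))             # True
```

    The Toffoli gate is the classic trick for making any computation reversible: keep enough of the input around that nothing is ever erased.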
  • by rjh ( 40933 ) <rjh@sixdemonbag.org> on Monday January 08, 2001 @11:24AM (#522989)
    Zero-energy computation isn't anything new, in theory; we've known what must be done to achieve zero-energy computation for a long time. We just haven't quite been able to figure out how to do it.

    In principle, setting a bit requires no expenditure of energy; it's clearing the bits that requires an energy expenditure. So, provided you can figure out a memory design which permits that bits be set and never cleared, you can achieve zero-energy computation.

    Note that I'm using "requires" in a very narrow context here. Setting a bit requires no expenditure of energy, but all the computers we have right now expend energy to set bits. That's a limitation of design, not any thermodynamic limitation we're currently aware of.

    All of this comes to you courtesy of some long-ago college courses on the physics of computation. I may be misremembering quite a bit. :)
  • by TwP ( 149780 ) on Monday January 08, 2001 @11:28AM (#522990) Homepage
    To prepare a q-bit into a particular state, you hit it with one photon of the proper frequency and phase. To get the information out of the q-bit, it releases one photon, indistinguishable from the original photon used to prepare the q-bit. At least that is how it works in theory.

    In real life, we don't have systems accurate enough to deliver one photon to one atom (or nucleus). Instead, we play the odds and bombard the q-bits with a very large number of photons until it is in the proper state. All the other photons are lost.

    Technically, they could capture all the photons emitted by the q-bits and return them into the system at a later time. But I don't think that will be happening any time in my lifetime!


    -----------------

  • by Phronesis ( 175966 ) on Monday January 08, 2001 @11:20AM (#522991)
    This article completely misunderstands quantum computing. A key point is with respect to reversibility and energy consumption. The simplest picture of quantum computing, as proposed by Feynman and Deutsch, involves doing the processing by reversible steps, but absolutely requires making an irreversible quantum measurement at the end to discover the result.

    In other words, if there is no thermal dephasing, you can operate with no energy consumption so long as you never look at the output, but there is a rigorous minimum value of energy that it costs to look at the output. This limit is set by basic thermodynamics and is inescapable.

    In practical terms, cooling the computer to feasible cryogenic temperatures will consume lots of energy even when the qbits do not. Moreover, the fact that you will run the computer at finite temperature makes it necessary to apply error-correcting codes to compensate for thermal dephasing. Error-correcting steps are irreversible and thus consume energy during the calculations.

  • what I was always curious about was whether quantum computing could be put into non cpu environments, still processing, but not as general purpose. For example, could a quantum chip be put onto a 3d card, and make it work a million times faster? Or could it be put to use in network switches with regard to 100% optical switches pushing us into 1TBit networking?

    ---
  • by Bonker ( 243350 ) on Monday January 08, 2001 @12:02PM (#522993)
    for Quantum Computing Purposes...

    Skip this if you've had even Physics 101.

    First of all, Quantum Theory as we know it has been devised over the last century. I could name a lot of famous scientists' names like Heisenberg, Schroedinger, and Fermi, but you don't care, so I won't.

    The meat and potatoes of quantum theory is this: All particles, no matter what the size, act as both a wave and a particle. According to the uncertainty principle, either the location *or* the momentum of a particle may be known precisely at any time, but not both.

    Also, as we all know, waves interfere with each other. If the crests of two waves overlap, they grow. This is referred to as 'constructive' interference. 'Destructive' interference happens when a crest of one wave overlaps the trough of another wave. This gives rise to many observable phenomena, such as the diffraction lines you can see when you stare at a bright light through your eyelashes. This is what causes 'ice rings' around bright lights in cold weather and the occasional 'moon ring'. It's also why you have to have your surround sound speakers positioned just so, so that they don't interfere with each other.

    Early experiments where researchers shot electrons through tiny holes in a lead shield and onto film created similar diffraction patterns, because, since electrons are indeed particles, they are also waves. The real shock comes when you only shoot one electron (or other particle) at a time through a shield to create a pattern on film. Even though there was nothing for the particles to interfere with when shot one at a time, they *still* created a diffraction pattern.

    This gives rise to the thought that particles that store their energy in 'quanta' and are small enough not to interact instantly with their environment, exist in multiple probability states. The electrons that created the diffraction pattern were interfering with the possibility that they existed elsewhere in the experiment.

    In quantum computing, this is useful because electrons can be made to do different things at the same time, such as be in different places or absorb and release different amounts of energy. They can also simply stop existing at one place and start existing at another. They can also rock back and forth through time. Quantum computing, for the uninitiated, relies on harnessing these seemingly paradoxical phenomena. If the theories are all correct, this means that information storage will simply become infinite because there are an infinite number of states that any electron can occupy. Energy required to run a quantum process will be very little or zero, due to basic laws of thermodynamics and quantum physics. Speed of computations will be astronomical because quantum interactions take place on the pico-scale.

    Quite a nifty thing...

    Schroedinger's Cat says: It is not the world that must bend, but your mind. You must realize that there is no mouse.
  • by Karpe ( 1147 ) on Monday January 08, 2001 @11:13AM (#522994) Homepage
    I suggest any reader interested in getting a good introduction to QC take a look at a presentation given by Rob Pike at USENIX, available in MP3 audio here [technetcast.com]. It talks about the motivations for using information quantum mechanically (intrinsic parallelism, we are running out of atoms, etc.); some historic aspects (Feynman's question: Can a computer simulate a QM system?); the approximations that you eliminate when you use QM computing devices (bits are not independent, but entangled); some algorithms (factoring, searching), etc. Not only nice, but funny too. Don't forget to get the slides also.

    Just notice that there are two different aspects when we talk about QM systems, which most of the time are treated together: first, there is the QM way of representing information, which is to some extent a reality now (on modern, high-density hard disks, for instance); the other is QM computers, which is something for way into the future.
  • The simple answer is "possibly"

    For example, it is possible that quantum computing could greatly speed up 3D rendering. Basically, the main problem in ray tracing is finding the light paths that will lead a ray to the point the eye is looking at. There are stochastic methods, like Metropolis, that greatly speed up the process of determining these solutions, but like most stochastic methods when compared to quantum methods, they are unreliable and slow (although when compared to deterministic methods, they are unreliable and fast). In a quantum 3D chip, you could theoretically find all of the solutions in a very short time, and thus determine the light levels for the point. This would in effect give you a perfect ray trace in a few cycles/point.

    And even then, given enough qbits, you could be running those raytracing calculations on all of the points, oversampled by 256 to give a nice antialias.

    But this is all in theory, because there are severe limitations on the logic that one can do with quantum computers today. While the above could be modeled, I don't think we'll know for a while if it can be.
