A Pair Of Quantum Computing Articles 161
Will G writes: "3DRage has posted an article entitled "Quantum Computers: How they work and How they will effect us" by Alan Cline. Not only can quantum computers run one billion times faster than typical silicon-based computers, but theoretically they can also run while consuming no energy. If that holds true, quantum computers could make the silicon chip as obsolete as the transistor made the vacuum tube. The paper is intended for the general reader; it explains basic quantum computer features and the paradoxical effects quantum theory produces in a practical world, and discusses how quantum computers originated, the inevitability of their use, and how they differ from classical computers." An interesting nugget to add to this comes from leelaw2000, who writes: "New Scientist have published this little news story about the development of a kind of quantum shielding that might help the development of real quantum computers. Now if they can just get Quake on it ..."
Nee! (Score:1)
Nee!
Nee!
Nee!
Now go fetch me a shrubbery!
Quantum Mechanics and Hard Disks (Score:1)
(GMR was discovered in '88 by Mário Baibich)
can't read output 'till end? (Score:1)
dead link? (Score:1)
real wireless (Score:1)
We would just need to figure out how to use quantum entanglement to communicate with the server and we could all use the same computer.
instantaneous communication would that mean no more lag in xpilot?
Re:Never gonna happen (Score:1)
I don't think so. There has not, to my knowledge, been a single experiment proving that quantum entanglement exists that didn't rely on fudged data. (Usually, they extrapolate based on probabilities of photon detection). All observed phenomena (we're talking about entanglement here) can be easily explained by local processes - instantaneous 'action at a distance' has never been shown definitively.
Re:Never gonna happen (absolutely right) (Score:1)
What does happen, in many of these cases (as cold fusion was to some degree) is that a LOT of physicists survive on grants gleefully awarded by unknowing foundations and governmental agencies (in other words, by money from you and me folks). Any questioning of QC is instantly stamped 'troll' and dismissed from journals - too many people rely on it for their livelihood. When one does manage to pin them down, even the physicists will argue for 'quantum magic' and mysticism! Puh-lease!
This came up a couple of times before in Slashdot, and there was a really interesting commentary by someone (Nightlight3) who seemed to know what they were talking about, and had some very rational reasons for why quantum computing won't happen. Read it here [slashdot.org]. It also contains links to other informative articles.
But, these fantasy stories certainly do draw readership.
Hype hype hype hype hype hype hype hype hype hype hype hype hype hype hype hype!
Re:Power requirements ;) (Score:1)
Obviously the reboot will require somewhat ER type operations. A computer that runs on power is rebooted by removing power. A computer that doesn't use power is rebooted by adding power.
Note: Multiple shocks may be required.
Of course you consume energy... (Score:1)
Damn... that's powerful science. (Score:1)
Imagine what the tiniest error could do in a logic matrix this complicated... how susceptible to interference is the quantum bit?
Also imagine the possibilities for networking... if you could somehow use the "action at a distance" effect to transmit data, you could transmit a great deal of data in zero time. You'd still have to send that first bit, but after that you could (possibly?) maintain an instantaneous communication state regardless of distance. Guess we won't have to build fiber lines to Mars anymore
Great, the site has gone down now before I finally got to the G article... either an elite hack, the Slashdot effect, or certain parties don't want us to know that information
Re:Second Law violation (Score:1)
sorry - no explicit mention of the third law. (But some gems such as...)
At the risk of drifting even further off-topic, they also have a great Gnu [tripod.com] song
Maybe we could persuade RMS to sing it for us?--
Re:A breify Quantum Physics Thingy... (Score:1)
Actually, those are the later experiments
Not impressive enough for you? Well, a decade later Young studied the newly-discovered Rosetta Stone, and through it became the first modern person to translate Egyptian hieroglyphics!
Re:obligatory monty python reference (Score:1)
Foul language, a sign of hostility.
That Monty Python guy, *he* (That's right, "him") is one funny guy. Jackass.
Re:Actually, QC's will consume lots of energy. (Score:1)
Jan-Pascal
Anyone remember Jane? (Score:1)
We've got a ways to go before reality is stranger than fiction.
Re:Tubes Obsolete? (Score:1)
Re:Use of quantum computing in non cpu environment (Score:1)
Re:As obsolete as the vacuum tube? (Score:1)
--
Hope Intel never sees this (Score:1)
Or, it'll just create BSOD's billions of times faster than any current silicon-based PC.
Re:Score 5: Interesting,... WITH a doubt (Score:1)
It's a running gag. Just like with various friends and family I cannot resist throwing in references to The Great Race ("Push the button, Max!"), Monty Python and the Holy Grail ("So, logically, if she weighs the same as a duck..."), The Princess Bride ("Inconceivable!"), Pinky and the Brain (Pinky singing "Brainstem! Brainstem!"), Bill Murray's Star Wars lounge singer (don't ask)...each reference carrying not only its literal meaning but the accumulated context of previous uses.
It's just one of those bizarre aspects of human behavior that certain phrases are often repeated...some sort of culture-binding mechanism, I suppose.
So, can you imagine a Beowulf cluster of these? B-)
Tom Swiss | the infamous tms | http://www.infamous.net/
Re:This is amazing... (Score:1)
I didn't say I didn't want a faster computer...I said such people exist. (Though with what I'm playing with these days, I'm satisfied with hardware that some would consider outdated; my fastest is a 500MHz K6, and I'm typing this on a P-90 I use mostly as an X terminal.)
Just like cars - some people find that a basic transportation mobile meets all their needs, and innovations that allow more speed and higher acceleration just don't excite them much.
Tom Swiss | the infamous tms | http://www.infamous.net/
Re:Score 5: Interesting (Score:1)
ie we have them but its very limited and 'ivory tower' at the moment
Re:Score 5: Interesting (Score:1)
Have a look on Freshmeat for QCL and QDD, these are efforts to get the software ball rolling (somewhat) while we still don't have the hardware.
Of course this isn't practically useful, but we can foresee many software/language changes that are in store. These are theoretical exercises that let you sniff around and prototype, but not test.
The first company that puts a quantum register on a PCI card will make a pretty penny :)
Another dead (or deadly?) link? (Score:1)
"Forbidden
You don't have permission to access
Apache/1.3.3 Server at www.3drage.com Port 80"
Re:Insert Amusing Title (Score:1)
Mod this guy up. He said it all.
Re:Use of quantum computing in non cpu environment (Score:1)
oops. (Score:1)
Re:Power requirements ;) (Score:1)
Re:Power requirements ;) (Score:1)
Hold it upside-down and shake vigorously.
Ho-hum... (Score:1)
Re:Quantum theory (OTP) (Score:1)
Re:Can anyone get past page 4? (Score:1)
I'm not really sure I get the "wave state". Is that in contrast to being discrete units, they become one big heterogeneous wave, with all their particles mixed together?
Maybe it's that I just don't really know anything about "quantum wave properties"; I took Chemistry for my science sequence in college and Physics in high school (which was some time ago). Never really got into anything cool like this in my studies (although my Calculus professor did explain chaos theory, which was cool).
Can anyone get past page 4? (Score:1)
If someone has a dumbed-down explanation of Quantum computing that would be nice, 'cause I still have no idea what they are. It seems like they're trying to represent bits at the molecular level, but why that's better is beyond me.
Quake? (Score:1)
psxndc
News Story (Score:1)
San Jose, CA - In the wake of recent advances in quantum computing research, the world's hi-tech companies are gearing up for what is sure to be the single most significant breakthrough in IT since the integrated circuit. "We have committed our most valued resources to proactive quantum computing initiatives," said Steve Ballmer of Microsoft. "In fact, a year from now, we'll be known as Miqrosoft!" Jenny Tright, PR director for Oracle-soon-to-be-Oraqle, makes similar claims: "Transitioning to a new paradigm is never easy. We worked long and hard with our high-paid marketing consultants before deciding to capitalize on this new technology. After countless bagels, pizza lunches, and French dinners, we arrived at the ultimate strategy: a name change!"
Considerable re-directions seem to be in store for many technology firms, yet some are content with their current strategy. Compaq, in a recent press release, states, "We already have what it will take to compete in the new quantum computing marketplace. Don't be fooled by the imitators--we had a 'q' in our name first!"
Apple could not be reached for comment.
Link is broken! (Score:1)
A looooong wait... (Score:1)
Computer science is still immature for its barely 80 years, [...]. Who knows what the next 870 years will bring?
870 years eh? My guess is that the 2.6 kernel might be out by then...
Would there be NP problem ramifications? (Score:1)
Never mind the accuracy, feel the hype! (Score:1)
However I fear that this document has sacrificed technical accuracy for hype, and the author perhaps does not understand the topic as thoroughly as the article implies.
For example:
Richard Feynman wrote in his popular exposition of Quantum Electrodynamics that he had taken particular care in ensuring that his statements, although simplified, were still technically accurate. In other words he did not lie to the audience by introducing concepts as truth which have been proven to be false.
Although my understanding of quantum mechanics, quantum computing and quantum electrodynamics is low, what I do understand tends to make me distrust the details of this article.
Re:Would there be NP problem ramifications? (Score:1)
I am pretty up to date on this, and I think you are mistaken. Factoring is in NP, of course, but whether it is NP-complete is not known.
Give a reference to prove me wrong
Re:Would there be NP problem ramifications? (Score:1)
However, it is not known whether quantum computers can solve all of NP, as factorisation is not known to be NP-complete.
Also there is no reason to think that quantum computers will be faster or indeed more suitable for most problems than ordinary computers.
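As a hedged aside (a classical sketch of my own, not anything from the thread): the reason factoring sits in NP is that a claimed factorization can be verified quickly, even though finding one with naive trial division takes time exponential in the number of digits. The function names here are mine, purely for illustration.

```python
def verify_factorization(n, factors):
    """Check a claimed factorization; cheap, polynomial in the input size."""
    product = 1
    for f in factors:
        if f < 2:          # reject trivial "factors" like 1 or 0
            return False
        product *= f
    return product == n

def find_factor(n):
    """Naive trial division; exponential in the number of digits of n."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return n               # n itself is prime

n = 3127                   # = 53 * 59
p = find_factor(n)
print(p, n // p)                           # the slow direction: 53 59
print(verify_factorization(n, [53, 59]))   # the fast direction: True
```

The asymmetry between these two directions is exactly what makes factoring a good NP citizen without anyone knowing whether it is NP-complete.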
Re:Quantum theory (Score:1)
----------
No army can withstand the strength of an idea whose time has come.
Re:Side comment on the energy section... (Score:1)
The Chicken of Uber
Re:Silly error in article (Score:1)
It seems to me that doubling every 18 months is a geometric, not exponential, progression.
I think they dumbed it down purposefully for us non-quantum physicists.
Re:Silly error in article (Score:1)
Quantum RAM (Score:1)
But what about memory? If processing is fast but memory is low, the constraints on quantum computing will be different than if memory is high.
Consider Quake, for example. A fast processor isn't exciting unless it has the memory to handle sophisticated (i.e. high memory) maps.
How does quantum memory work? What are the constraints on it?
Check out Java Drivers for CueCat at: http://www.popbeads.org/Software [popbeads.org]
Re:obligatory monty python reference (Score:1)
As long as I'm here...
Obligatory (or at LEAST gratuitous) Spaceballs reference:
Quantum computers are reversible...
Like my raincoat!
Re:Use of quantum computing in non cpu environment (Score:1)
Re:Silly error in article (Score:2)
Re:Use of quantum computing in non cpu environment (Score:2)
You will probably never be able to just drop a quantum CPU into your computer and be off and running, since they're a lot more difficult to work with and not necessarily any better than conventional computers for some pretty common problems. For example, Shor's algorithm for factorization on a quantum computer involves a step that's done on conventional computers, since it's short step that isn't worth coming up with a quantum algorithm for.
Quantum computers are best at problems which require searching a large problem space, not just crunching through numbers to get a bunch of values that you'll all end up using. With a problem like factoring, you search many numbers but only end up using maybe two of them. With graphics you actually use all of the values computed (pixel values, etc.), so a quantum computer would not be so good.
Re:This is amazing... (Score:2)
I can use any amount of speed I get. Easily.
Fractals are cool, but you can't zoom around any deeper than the very surface in realtime. I'd love to be able to view fractals much faster. I started on an Apple 2 that took eight hours to render a shallow Julia set; my current P3 800 does that in seconds, and my Athlon 900 is even faster.
Then there are my experiments in modelling. I wrote a simple program for viewing the output of an equation on x and y in 3d. When I wrote it in the late 80s it took about a minute to draw a screen. Now, unaccelerated (no 3d card) it runs fast enough for a realtime display.
Give me a more powerful CPU and I'll model more complex equations, or in more detail. Or I'll view deeper fractals, or do one of a million interesting computation problems that are currently out of my reach due to CPU speed.
If the only thing you can think of that requires a fast CPU is Quake, then you'd probably be happy with a PS2 and WebTV.
Re:This is amazing... (Score:2)
In a world of ten supercomputers, there's no way a poor african tribesman would *ever* get near one, let alone get to run a job on one. But with the current computer situation he could get a c64 or such fairly easily. I myself learned most of my programming foundation on an Apple 2, the concepts still apply directly. This way the poor have-not could train himself in the new technology and eventually become a have and in the process directly help many have-nots.
The cheaper we can make technology, the more likely it is that someone poor will have access to it. If only universities (and three-letter agencies) ever develop quantum computing then it'll never reach the less fortunate.
Re:2 words - Beowulf Cluster ! (Score:2)
People throw the word quantum around, and it must mean something great and excellent, right? Truth be told, I think this is just another logical progression in technology.
So from my side, ho hum... Just like the discovery of new laser types that were supposedly going to make DVD obsolete last year.
Semi OT: per-page content on ad hungry sites (Score:2)
Seriously, on my screen, I'd say that at least 3/4 of the page is composed of sidebars, banners, ads, table of contents etc. and at 1600x1200 I can see no more than two paragraphs of content: this is totally ridiculous.
Even the NYT switched to multi-page format as a default, but at least their chunks are page-length, and one can easily see the article on a single page via the handy link at the bottom (which I usually use when the article is more than two pages).
Anybody have a link to a similar article in a more reasonable format? I refuse to give money to a site that cares (much) more about banner revenue than reader comfort.
Side comment on the energy section... (Score:2)
Now, when put into light with the idea of a database, this almost sounds like a built-in, real-time transaction log. I hadn't even heard of this effect before in relation to quantum processing, can anyone back it up with any more fact?
This level of reliability and recoverability is amazing (if true)... I seriously think this idea has more potential than the 'no energy used' idea because after all, entropy must increase in a forward-time universe.
--
Gonzo Granzeau
Re:Side comment on the energy section... (Score:2)
Besides, I kind of wanted the magic mind reader...To be honest, a quantum computer might be able to tell you things like 'The password for this computer is [a,b,c,d,e...]' and be able to test every answer, all the same time, depending on the number of qbits of the quantum computer.
--
Gonzo Granzeau
Re:Side comment on the energy section... (Score:2)
And I agree with you again, the first article was very 'pie in the sky' with very little actual data. They say a computer in 2020 will have 160 gigs of RAM, but with a revolution such as quantum computers there most likely will not be 'RAM' per se. A revolution of this magnitude completely changes things; it is not a 'slight CPU modification'. Otherwise we'd see questions like 'Can I still use my TNT2 with a quantum computer?'
--
Gonzo Granzeau
You raise a good point (Score:2)
You also raise another good question. Will quantum computers replace the current style of computer? Will we reach a point where the CPU's are so cheap and powerful (and RAM is plentiful) where everything can be done by the CPU and RAM? Will quantum computing even be applicable to applications such as web browsing and gaming?
Re:Use of quantum computing in non cpu environment (Score:2)
To answer your second point, the article said billion, not million. As the late Carl Sagan would have you know, there is a big difference between a billion and a million...say 4 orders of magnitude.
It certainly seems that you understand the value of using snippets of a conversation to make a point, so I don't understand the reason for your reply.
Re:Use of quantum computing in non cpu environment (Score:2)
For instance, routing could be an application where a quantum computer would be beneficial. Building immense and extremely complicated routing tables would be suited to quantum computing. Having a router CPU that could literally analyze ALL of the routes a packet could take from point A to point B could be very beneficial.
Quantum computing, if successfully introduced to gaming, I believe would eliminate the 3D chipset completely. Currently, we have 3D cards in order to take that processor intensive load off of the CPU. Having a quantum CPU would effectively eliminate the need for a second CPU to munge graphics.
Re:Use of quantum computing in non cpu environment (Score:2)
A quantum computer is literally going to be a new type of computing. Not just as different from an integrated circuit as a vacuum tube is, but as different as an IC is from fire or the wheel. There will be no quantum "chips", no system bus, no SDRAM, no nothing. You will literally have a thing you plug into an interface, probably not even that. Why the hell would any single individual own one of these? If quantum computing is a billion times faster, then one "quantum computer" would take care of the gaming needs of China or India.
Whoa...think of 1 billion people playing Quake all at once...
Re:Never gonna happen (Score:2)
Most of the claims in the article are exaggerated. The "consumes no energy" thing is really just theoretical. There are hard minimum limits on how much energy classical computations consume, but no such limits on quantum computations, creating the theoretical possibility of "free" (from the energy point of view) computations. Of course, you do have to expend energy to read the answer, as someone else pointed out...
The "obviates all encryption" claim has some validity. Quantum computing reduces the complexity of certain computations. For instance, a linear search that is O(n) on a classical computer becomes O(sqrt[n]) on a Q.C. Likewise, cracking RSA-style public-key encryption changes from a sub-exponential-time problem to one that can be solved in polynomial time. That's not to say it would be trivial to crack a 4096-bit key, but it would be possible to do so within some non-insane timespan.
As for quantum computing doing infinite computations in a second, this is also a misinterpretation. A slightly better (but still not perfect) way to think of things is that quantum computers do things in a massively parallel way. Maybe you want to think of them as non-deterministic finite automata. That's about the best I can come up with in terms of classical analogies.
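To put rough numbers on the O(n)-versus-O(sqrt[n]) comparison above, here is a small back-of-envelope sketch of my own (the ceil(pi/4 * sqrt(N)) iteration count is the standard textbook figure for Grover's algorithm, assumed here rather than taken from the article):

```python
import math

def classical_queries(n):
    """Expected oracle queries for classical unstructured search."""
    return n / 2

def grover_iterations(n):
    """Standard iteration count for Grover's search over n items."""
    return math.ceil(math.pi / 4 * math.sqrt(n))

# Compare the two for increasingly large search spaces.
for n in (10**3, 10**6, 10**9):
    print(n, classical_queries(n), grover_iterations(n))
```

The gap grows with the problem size, which is why the speedup matters for key search but does little for workloads (like rendering) where every computed value is needed.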
I might have mentioned cold fusion, except that I believe that cold fusion is more likely than quantum computing.
Quantum computing is solidly based on widely-accepted theories. More importantly, a working (simple) quantum computer has already been built. With both strong experimental and theoretical support in place, I don't see why you have trouble believing in it. The only question is when it will become practical... As for AI, and natural language processing, QC may just be the technology that enables those things. Read Roger Penrose's "The Emperor's New Mind" for more info...
Nope, *you* don't... (Score:2)
There are tube microphones, pre-amps, phono-stages, amps, etc., but I have yet to see a tube speaker...
--
Re:Zero-energy computation (Score:2)
Setting a bit: introducing information into a system.
Clearing a bit: erasing information from a system.
Introducing information to a system has no required energy, but the erasure of information does have a minimum energy.
If a byte has the value 0xFF, then in order to change it to 0x00 I have to erase eight bits of information before I can put in my new eight bits. (Note that most computers do the erase-and-overwrite in one step, but thermodynamically, they're two steps.) In other words, I blow eight bits of information away (which requires energy) and put a new octet in (which does not require energy).
Clearing the eight bits requires a minimum amount of energy per bit given by Landauer's limit, kT ln 2 (k = 1.38E-23 J/K, and T = 2.7K, the ambient temperature of the universe). That's about 2.6E-23 Joules per bit cleared.
Setting the bits? 0 Joules.
Again, this is all in the dimly-remembered past of my college physics. So take it with a grain of salt.
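For what it's worth, a quick back-of-envelope computation (my own, assuming Landauer's limit of kT ln 2 per bit erased and a temperature of roughly 2.7 K, the cosmic background value; grain of salt applies here too):

```python
import math

k = 1.380649e-23   # Boltzmann constant, J/K
T = 2.7            # approximate ambient temperature of the universe, K

# Landauer's limit: minimum energy to erase one bit of information.
per_bit = k * T * math.log(2)

print(per_bit)       # on the order of 2.6e-23 J per bit cleared
print(8 * per_bit)   # erasing a full byte
```

Either way, the numbers are so far below what real hardware dissipates that the "zero energy" claim is about thermodynamic limits, not practical power bills.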
Second Law violation (Score:2)
The First Law of Thermodynamics says that energy is conserved; the Second Law says that the entropy of a closed system never decreases; and the Third Law says that you can't reach absolute zero in a finite number of steps.
But then again, it's been a long time since my college physics courses.
Re:This is amazing... (Score:2)
Tom Swiss | the infamous tms | http://www.infamous.net/
Re:Use of quantum computing in non cpu environment (Score:2)
What part of "quantum chip be put onto a 3d card" do you not understand? He didn't say put a 3d card into a quantum computer.
If quantum computing is a billion times faster, than one "quantum computer" would take care of the gaming needs of China or India.
And current computers are millions of times faster than the originals. Should it then follow that the United States should have exactly 250 computers?
Re:Rob Pike's talk at technetcast (Score:2)
Oh, yeah, is that the kind where inspecting it destroys it? Really, does QM itself show any promise for data storage? Aren't you talking about molecular storage (like in crystals or something)? Seems to me QM is really good for processing. I wouldn't trust it to store the state of my cat ;)
Re:Use of quantum computing in non cpu environment (Score:2)
Yes, that is what the original poster suggested, which I don't see as being that bizarre. You know "computer" doesn't have to mean the entire system including peripherals and monitor. It could mean just a quantum cpu.
To answer your second point, the article said billion, not million.
So? I was impeaching your logic. Current computers are already orders of magnitude faster than the first computers. Does that mean that it is insane to give each person their own computer, instead of sharing the equivalent processing power (~250 computers, given the population of the United States is ~250 million) amongst all? No. It means that with the new power we'll come up with new things to do.
Putting a quantum cpu on a 3D chip, or imagining that we might actually have new uses for orders of magnitude more processing power just doesn't seem that bizarre to me.
new link (Score:2)
--
Re:This is amazing... (Score:2)
Of course they would. Every time computer power goes up an order of magnitude, there are always pundits claiming no normal person could ever use that much speed, and they are always quickly proven wrong. If nothing else, think of games: Quake and Unreal Tournament do huge amounts of number-crunching. Until we get to the point where computers can render ray-traced scenes at 60 fps in 36000x24000 pixels (huge flat-panel displays at 300 dpi) there will always be a use for more CPU power.
Meanwhile... why not explore computing meshes? (Score:2)
Pump anything you want in/out of the edges at 100Megasamples/second (at minimum), do as much hard math/matching as you need, and get the results out.
It would be possible to use "defective" chips as long as the boundary cells were all good, much like we use LCD panels with bad pixels today.
The major hurdle is weaning programmers off of the von Neumann architecture, and getting them into something that just seems like the biggest gate array in the universe.
--Mike--
No patents were harmed in the creation of this posting
Consumes no energy?! (Score:2)
So, basically what I mean is, I don't know what I'm talking about here, but the claim that "they don't consume energy" smells funny based on the little I did learn about quantum computing at one point. Like, even if it's theoretically true, it's deceptive to put it just that way. So, can someone more versed in physics enlighten me and the /. crowd at large?
Re:A breify [sic] Quantum Physics Thingy [sic]... (Score:2)
Reversible computers... (Score:2)
Slashdot already has an article on the 'shielding' method. Search for 'decoherence free subspaces'.
--As obsolete as the vacuum tube? (Score:2)
Another error not 40Ghz (Score:2)
Re:Use of quantum computing in non cpu environment (Score:2)
One million = 1,000,000 = 1 x 10^6
One billion = 1,000,000,000 = 1 x 10^9
9 - 6 = 3
Where the hell did you get 4 from?? Figured the factor of 1,000 had four digits in it? Try to get your math right the next time you're being condescending about numbers.
Obsolete? Huh? (Score:2)
Um, the transistor didn't wipe out the vacuum tube. Trust me, tube-based guitar amplifiers sound a million times better than anything based on transistors. And they're still being made.
There's no such thing as an obsolete technology, merely one that's got a smaller application base than it used to have.
Furthermore, with this billion-fold speed increase, what kind of peripherals are you going to have?
----------------------------------------
Yo soy El Fontosaurus Grande!
Never gonna happen (Score:2)
Rather than blindly believing it one should remember what came of the big hype surrounding: AI -"we're only a few years away, c. 1967", COBOL -"the programming language for managers and non-techies", 4GL -"Natural Language Processing", The "paperless office", National missile defense, 100% portable java code, MULTICS, the "New Economy", the "Information Super Highway", and just about every CASE tool ever.
I might have mentioned cold fusion, except that I believe that cold fusion is more likely than quantum computing.
Bzzzzt! Try again (Score:2)
Although quantum mechanics leads to very strange and wonderful things I doubt anything they do will effect us. They might affect us. Next story, please! I don't read items that so blatantly butcher the language.
Re:Zero-energy computation (Score:2)
Re:Side comment on the energy section... (Score:2)
I seriously think this idea has more potential than the 'no energy used' idea because after all, entropy must increase in a forward-time universe.
Actually, that is not the case: entropy must not DECREASE in physical processes, but there is no requirement that it INCREASE. In fact, reversible physical processes are those in which entropy remains the same...irreversible physical processes are those in which entropy increases.
That being said, I would agree with you that this reliability and recovery would be the truly amazing part of these systems...although I submit that I don't understand what these statements really mean in terms of the physics (i.e. based on the first article, I'm a tad skeptical of the statements being made.....).
Re:Second Law violation (Score:2)
Indeed! In the real world, you can't win, can't break even, and can't quit the game....More technically (from Eric's Treasure Trove of Physics [treasure-troves.com]):
Re:Quantum theory (Score:2)
If you can show me how in all conceivability that cat can be both dead and alive, then quantum theory is possible - otherwise it just won't work.
This is the "Schroedinger's Cat" example of quantum mechanics, filtered through a particular "Many Worlds" interpretation of quantum mechanics...but a cat is not a quantum system, and the interpretation you choose to apply to describe your philosophical position does not affect the physical system one bit. The cat is an analogy, if you will, not to be taken literally (although that is one other philosophical interpretation of the theory). It is an analogy for the way that quantum states "superimpose" on other quantum states: an electron when observed has either spin up or down, but while it is evolving unobserved, it really CAN BE in a state which is both up and down at the same time....and it "picks" which state to be in when observed in certain percentages based on the evolution of the state (again, this description is colored by a particular interpretation...if you want, rather than "picking a state" think "choosing a universe where the observation is made"). The technical details can be found in any undergraduate quantum mechanics textbook.
Quantum mechanics is the realization that, at small distance scales (atomic and smaller), systems have to be described in terms of different dynamics than they are at the macroscopic level. There is nothing strange about this...physics at larger scales is always a limiting case of the physics at smaller scales. And quantum mechanics itself is extremely well tested and understood (all of modern chemistry, semiconductor development, biochemistry, superconductors, particle physics, etc. are based on quantum theory). Quantum behavior is not only conceivable and possible, but it appears from experiments that it IS the way reality is constructed; it is far from busted, and we are a much happier world for discovering it.
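As a toy illustration of the spin-up/spin-down superposition described above (a sketch of my own, not the poster's; the amplitudes and Born-rule measurement are standard textbook quantum mechanics, here stripped down to a single qubit):

```python
import math
import random

# A single qubit as two complex amplitudes; an equal superposition
# of "up" and "down" gets amplitude 1/sqrt(2) on each.
up, down = 1 / math.sqrt(2), 1 / math.sqrt(2)

def measure(a_up, a_down, rng=random.random):
    """Collapse to 'up' or 'down' with probability |amplitude|^2 (Born rule)."""
    return "up" if rng() < abs(a_up) ** 2 else "down"

# Each outcome has probability ~0.5, and the probabilities sum to 1.
print(abs(up) ** 2, abs(down) ** 2)
print(measure(up, down))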
Re:Side comment on the energy section... (Score:2)
While I haven't been able to get to the first article (I'm getting a 403 error), I suspect that "completely reversible" doesn't quite mean what you think it means. As another poster pointed out, reversing a "multiply by zero" program would essentially create a magic mind-reader.
What I suspect "completely reversible" means is that the machine can determine all possible input states that produce a given output state. So reversing a "multiply by zero" program would wind up producing all possible numbers that you could've input into the system. That doesn't sound impressive; however, consider reversing something else, such as the following (it's in BASIC, because this seemed like something best illustrated with GOTOs, and GOTOs make my brain jump back to BASIC):
10 INPUT A : REM OUR INPUT STATE
20 IF A = 10 THEN GOTO 100
30 A = A * 2
40 GOTO 100
100 PRINT A : REM OUR OUTPUT STATE
So if we run this program backwards with an output of '10', the quantum computer (using the whole quantum non-deterministic magic) would be able to simultaneously step backwards from line 100 to both lines 20 and 40. From line 20, it would continue back to 10 and from line 40 it would continue back to 30, 20, and 10. All of this would occur in the same amount of time it would take to run forward through the code. Finally, you'd wind up with a set of valid input states, one where A = 5 and one where A = 10.
However, standard IANAQuantumPhysicist disclaimers apply. I could be totally off-base with this explanation. But it seems to fit my understanding of the processing magic that quantum computers bring to the table (i.e. being able to do a bunch of simultaneous, parallel computations in linear time).
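For what it's worth, the "step backwards through both branches" idea can be mimicked classically by brute force. A small sketch of reversing the BASIC program above (the search range is an arbitrary assumption):

```python
# A classical mirror of the BASIC program above (purely illustrative).
def forward(a):
    if a == 10:        # line 20: skip the doubling
        return a
    return a * 2       # line 30

# Brute-force "reversal": collect every input in a small range that
# produces the observed output 10 - a classical stand-in for the
# quantum machine stepping backwards through both branches at once.
preimages = [a for a in range(-100, 101) if forward(a) == 10]
print(preimages)       # [5, 10]
```

The quantum version would explore both branches in superposition rather than iterating; the set of valid input states is the same either way.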
Re:Quantum theory (OTP) (Score:2)
CAP THAT KARMA!
Moderators: -1, nested, oldest first!
Quantum theory (Score:2)
Now here's the zinger: in one universe, the cat is alive, and in another, the cat is dead. How? The cat is either dead or alive - there aren't 2 cats. Just because you can't tell whether the cat is alive until you observe it (another thing about quantum theory - if you can't observe it, it is in all states simultaneously) doesn't mean you have a cat that is both dead and alive.
If you can show me how in all conceivability that cat can be both dead and alive, then quantum theory is possible - otherwise it just won't work. One busted theory, and a disappointed world without its computer.
One more thing: quantum computing, if it exists/works, would effectively nullify encryption. Ouch.
Only the beginning of quantum technology (Score:2)
I have read that the applications of quantum tech are much more widespread than most people realize. For example, it has been demonstrated that one can measure the width of a human hair using a laser with quantum enhancements without the laser touching the hair whatsoever. They have also successfully 'teleported' a single atom instantly from one place to another (about 100 feet from the origin).
There are many more possibilities in the future of quantum technologies, and I think quantum computing is probably going to be the least of these achievements - even if it may be the first.
A good read, though fiction, is Michael Crichton's 'Timeline', which covers some facts regarding quantum tech at this point, and also goes into some ideas of where things could progress.
Did someone say quantum superpositions? (Score:2)
Imagine not having to wait Log N for anything!
Dancin Santa
This is amazing... (Score:2)
This is bull! (Score:2)
Second - while a quantum computer may be able to do a ton of calculations at the same time, we will never know all the answers under the current theory of QM, and here is why. Think of a quantum bit, a qubit (and we are not building Arks here), as a unit vector in 3D space. It's possible for this vector to point in any direction originating at the origin, thus creating a "ball" in space. Now when 2 qubits are put into the system (a Q-gate), you get 2 outputs (the first output = the first input unaltered, the second output an altered form of the second input). So how do we know what we did? Well, you observe the system - take a measurement. To take a measurement in QM you can only measure orthogonal states, i.e. two possible outcomes in a qubit system; by doing so you force the "ball" (which is all possible outcomes) into one of two vectors, thus collapsing your infinite calculation. And after you take that measurement, it doesn't mean that the qubit you measured will give you the same answer if you measure it again!
Well, I think that's enough for you to think on.
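A classical sketch of the measurement the parent describes, assuming a made-up single-qubit state with amplitudes 0.6 and 0.8 (illustrative only, not a real QM library):

```python
import numpy as np

# A single-qubit state |psi> = alpha|0> + beta|1>, as a unit vector.
# (The "ball" the parent describes is the Bloch sphere of such states.)
alpha, beta = 0.6, 0.8                  # amplitudes; 0.36 + 0.64 = 1
probs = [abs(alpha) ** 2, abs(beta) ** 2]

# Measurement forces the state into one of two orthogonal outcomes;
# you only ever read out 0 or 1, with the Born-rule probabilities.
rng = np.random.default_rng(seed=42)
samples = rng.choice([0, 1], size=10_000, p=probs)
print(samples.mean())                   # close to 0.64
```

The point of the sketch: however rich the continuous state is, a readout yields one of two discrete outcomes, and only their statistics are accessible.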
Re:Zero-energy computation (Score:3)
Also, someone should note that the energy savings from reversible computation are real but very, very tiny. Chips would have to get 1 million times more efficient than they are now for the energy costs of (current) irreversibility to manifest themselves. And if you expect a quantum computer to operate without tons of expensive, high-powered supporting equipment around it (NMR machines, optical pumps, liquid helium-cooled ion traps), you'd better add a couple more decades onto your time estimate.
Zero-energy computation (Score:3)
In principle, setting a bit requires no expenditure of energy; it's clearing bits that requires an energy expenditure. So, provided you can figure out a memory design which permits bits to be set and never cleared, you can achieve zero-energy computation.
Note that I'm using "requires" in a very narrow context here. Setting a bit requires no expenditure of energy, but all the computers we have right now expend energy to set bits. That's a limitation of design, not any thermodynamic limitation we're currently aware of.
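The asymmetry described above is Landauer's principle: erasing a bit at temperature T dissipates at least kT ln 2, while setting one has no such thermodynamic floor. A quick back-of-the-envelope at room temperature:

```python
import math

k_B = 1.380649e-23        # Boltzmann constant, J/K (exact SI value)
T = 300.0                 # room temperature, K

# Landauer's bound: clearing (erasing) one bit dissipates at least
# k_B * T * ln(2); setting a bit has no such lower limit.
e_per_bit = k_B * T * math.log(2)
print(f"{e_per_bit:.2e} J per erased bit")   # ~2.87e-21 J
```

That number is roughly a million times below what current chips dissipate per logic operation, which is why the savings are real but not yet practically relevant.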
All of this comes to you courtesy of some long-ago college courses on the physics of computation. I may be misremembering quite a bit.
Re:Silly error in article (Score:3)
In real life, we don't have systems accurate enough to deliver one photon to one atom (or nucleus). Instead, we play the odds and bombard the q-bits with a very large number of photons until it is in the proper state. All the other photons are lost.
Technically, they could capture all the photons emitted by the q-bits and return them into the system at a later time. But I don't think that will happen any time in my lifetime!
Actually, QC's will consume lots of energy. (Score:3)
In other words, if there is no thermal dephasing, you can operate with no energy consumption so long as you never look at the output, but there is a rigorous minimum value of energy that it costs to look at the output. This limit is set by basic thermodynamics and is inescapable.
In practical terms, cooling the computer to feasible cryogenic temperatures will consume lots of energy even when the qbits do not. Moreover, the fact that you will run the computer at finite temperature makes it necessary to apply error-correcting codes to compensate for thermal dephasing. Error-correcting steps are irreversible and thus consume energy during the calculations.
Use of quantum computing in non cpu environments (Score:3)
A Brief Quantum Physics Thingy... (Score:3)
Skip this if you've had even Physics 101.
First of all, quantum theory as we know it was devised over the last century. I could name a lot of famous scientists' names like Heisenberg, Schroedinger, and Fermi, but you don't care so I won't.
The meat and potatoes of quantum theory is this: All particles, no matter what the size, act as both a wave and a particle. According to the uncertainty principle, either the location *or* the momentum of a particle may be known precisely at any time, but not both.
Also, as we all know, waves interfere with each other. If the crests of two waves overlap, they grow; this is referred to as 'constructive' interference. 'Destructive' interference happens when the crest of one wave overlaps the trough of another. This gives rise to many observable phenomena, such as the diffraction lines you can see when you stare at a bright light through your eyelashes, or the colored corona you sometimes see around the moon (diffraction by droplets in thin cloud). It's also why you have to have your surround sound speakers positioned just so, so that they don't interfere with each other.
Early experiments where researchers shot electrons through tiny holes in a lead shield and onto film created similar diffraction patterns, because electrons, though particles, are also waves. The real shock comes when you only shoot one electron (or other particle) at a time through a shield to create a pattern on film. Even though there was nothing for the particles to interfere with when shot one at a time, they *still* created a diffraction pattern.
This gives rise to the thought that particles that store their energy in 'quanta' and are small enough not to interact instantly with their environment, exist in multiple probability states. The electrons that created the diffraction pattern were interfering with the possibility that they existed elsewhere in the experiment.
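The two-slit pattern described above can be sketched numerically; a minimal illustration of the intensity formula I(θ) ∝ cos²(πd·sinθ/λ), with made-up wavelength and slit spacing:

```python
import numpy as np

# Two-slit intensity on the film: I(theta) proportional to
# cos^2(pi * d * sin(theta) / wavelength). Numbers are made up.
wavelength = 500e-9       # 500 nm light
d = 2e-6                  # slit separation, 2 micrometres

theta = np.linspace(-0.2, 0.2, 401)          # viewing angles, radians
intensity = np.cos(np.pi * d * np.sin(theta) / wavelength) ** 2
print(intensity.max())                       # central maximum is 1.0
```

Plotting `intensity` against `theta` gives the familiar bright/dark fringes; single electrons fired one at a time build up this same envelope dot by dot.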
In quantum computing, this is useful because electrons can be made to do different things at the same time, such as be in different places or absorb and release different amounts of energy. They can also simply stop existing at one place and start existing at another. They can also rock back and forth through time. Quantum computing, for the uninitiated, relies on harnessing these seemingly paradoxical phenomena. If the theories are all correct, this means that information storage will simply become infinite because there are an infinite number of states that any electron can occupy. Energy required to run a quantum process will be very little or zero, due to basic laws of thermodynamics and quantum physics. Speed of computations will be astronomical because quantum interactions take place on the pico-scale.
Quite a nifty thing...
Schroedinger's Cat says: It is not the world that must bend, but your mind. You must realize that there is no mouse.
Rob Pike's talk at technetcast (Score:4)
Just notice that there are two different aspects when we talk about QM systems, which most of the time are treated together: first, there is the QM way of representing information, which is to some extent a reality now (in modern, high-density hard disks, for instance); the other is QM computers, which is something for way into the future.
Re:Use of quantum computing in non cpu environment (Score:5)
The simple answer is "possibly"
For example, it is possible that quantum computing could greatly speed up 3D rendering. Basically, the main problem in ray tracing is finding the correct set of paths that will lead a light ray to the point the eye is looking at. There are stochastic methods, like Metropolis, that greatly speed up the process of determining these solutions, but like most stochastic methods compared to quantum methods, they are unreliable and slow (although compared to deterministic methods, they are unreliable and fast). In a quantum 3D chip, you could theoretically find all of the solutions in a very short time, and thus determine the light levels for the point. This would in effect give you a perfect ray trace in a few cycles/point.
And even then, given enough qbits, you could be running those raytracing calculations on all of the points, oversampled by 256 to give a nice antialias.
But this is all in theory, because there are severe limitations on the logic that one can do with a quantum computer today. While the above could be modeled, I don't think we'll know for a while whether it can be.
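The "find all solutions quickly" hope above usually refers to Grover-style unstructured search. A toy classical simulation of Grover's algorithm over 8 states (purely illustrative; a real quantum chip would not be simulated this way):

```python
import numpy as np

# Toy simulation of Grover's search over N = 8 states with one marked
# item (index 5). The marked index stands in for "the ray path that
# actually reaches the eye"; all values here are illustrative.
N, marked = 8, 5
state = np.full(N, 1 / np.sqrt(N))               # uniform superposition

iterations = int(round(np.pi / 4 * np.sqrt(N)))  # ~sqrt(N) iterations
for _ in range(iterations):
    state[marked] *= -1                  # oracle: flip marked amplitude
    state = 2 * state.mean() - state     # diffusion: invert about the mean
print(int(np.argmax(state ** 2)))        # 5, with probability ~0.95
```

The quadratic speedup (sqrt(N) oracle calls instead of N) is what would make sampling many candidate light paths at once attractive, if the hardware ever supports it.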