
Further Advances In Quantum Computing 148

Porfiry writes: "Scientists at the U.S. Department of Energy's Los Alamos National Laboratory have taken another step forward in the quest for a quantum-based computer by demonstrating the existence of a physical state immune to certain types of information-corrupting "noise," which could otherwise disrupt computations based on quantum states. The essential phenomenon that the Los Alamos team demonstrated is a state in what is called a "decoherence-free subspace." The researchers showed this state's existence using entangled photons, paired particles of light whose conditions are intimately linked."
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward
    "decoherence-free subspace."

    This looks like an ideal holiday destination to me. It could even beat Hawaii if it would accept being part of the Union.

  • yes, somebody probably can.

    --

  • by Sanity ( 1431 ) on Tuesday October 31, 2000 @09:10PM (#659075) Homepage Journal
    Dr A Lectron, a noted quantum physicist, was recently asked whether he had been successful in building the first quantum computer.

    He responded "well, yes and no...".

    --

  • If someone came up with an equation you could plug an NP-complete problem into, you could plug any NP-complete problem into it, but a quantum computer isn't a 'mathematical breakthrough' in the sense people talk about solving NP-complete things. It's the same math, just doing a bunch of it at once. You probably could put in any NP-complete problem, but it's not 'solving' them in anything other than the standard way, it's just doing it really, really parallelized.

    -David T. C.
  • Actually, many things in computers work off probability. For example, when finding out whether a large number is prime, that's basically a crap shoot. The odds are only 99.9999999999% that it's prime, using the standard calculation methods. However, the odds that your computer will get hit with a cosmic ray or have some sort of internal failure are actually lower than this. It will be the same with quantum computers. You just repeat the calculation ten times or so, and everything's fine. Hey, they should be running at the speed of light anyway. And anyone using a quantum computer to add two numbers is crazy. If we ever get them, they will not be 'quantum computers', they will be 'normal computers with a quantum co-processor'.

    -David T. C.
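
    For what it's worth, "repeat it ten times and the error probability collapses" is exactly how probabilistic primality testing is done in practice. Here's a minimal Python sketch of the Miller-Rabin test (my own toy illustration, not anyone's production code); each extra round cuts the chance of a composite sneaking through by at least a factor of four.

    import random

    def is_probable_prime(n, rounds=10):
        """Miller-Rabin: False means definitely composite; True means prime
        with error probability at most 4**(-rounds)."""
        if n < 2:
            return False
        for p in (2, 3, 5, 7, 11, 13):
            if n % p == 0:
                return n == p
        # write n - 1 as d * 2**r with d odd
        d, r = n - 1, 0
        while d % 2 == 0:
            d //= 2
            r += 1
        for _ in range(rounds):
            a = random.randrange(2, n - 1)
            x = pow(a, d, n)
            if x in (1, n - 1):
                continue
            for _ in range(r - 1):
                x = pow(x, 2, n)
                if x == n - 1:
                    break
            else:
                return False           # found a witness: n is composite
        return True                    # probably prime

    print(is_probable_prime(2**127 - 1))   # a known Mersenne prime -> True
    print(is_probable_prime(2**128 + 1))   # a known composite -> False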
  • I might agree with you and the moderators that this post was funny, but I unfortunately actually have a clue about quantum computing.
  • Sorry, it's just that moderators here go bananas over +:Funny. Maybe it's me, but it seems like they're taking the attitude "it's a joke, so it needs to be moderated as funny" instead of "that's really funny, I should mod it up." But then I'm not fond of most of the moderation here. I'd love to see Slashdot with about 1/100th of the mod points given out. But whatever, any +mod is better than a -mod.
  • I've read the other replies to your message, and none of them seem to explicitly mention this:

    According to quantum theory, particles such as those in the "spooky" experiment do not have a defined state, until some event causes their wave function - in which all possible states are simultaneously superimposed - to collapse. An observation of the state of one of the particles would be such a collapsing event.

    One widely accepted current understanding of what can cause quantum wave function collapse is interaction with the "environment", meaning all the other objects with which it interacts. This phenomenon is known as decoherence, which is where the term "decoherence-free subspaces" comes from. For quantum computing, you want to remain decoherence-free, to be able to take advantage of state superposition.

    Regarding the spooky particles, it's not a question of us just not knowing what their state is; the particles don't have a defined state, and exist in a superposition of all possible states, until something forces their state to be "chosen".

    Assuming for the moment that this postulate somehow represents a form of reality that is meaningful to talk about, if you wait for the entangled particles to separate a bit - or a lot - and then measure the state of one of them, the model requires that the other particle instantaneously assumes the appropriate state dictated by the state "chosen" by the first particle, more or less. This appears to require "spooky" faster than light communication (or, according to string theory, requires 11 dimensions or so.)

    This all goes back to Heisenberg's famous/notorious Uncertainty Principle, which not only puts limits on our ability to measure the states of particles, but puts limits on a particle's ability to be in defined states under certain conditions - for example, if we measure one aspect of a particle's state very accurately, we force other aspects of its state to become undefined, or put another way, force those aspects of its state to exist in a superposition of all possible states for that aspect.

    Spooky enough for you yet?
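
    A crude way to see what "superposition until measurement" means numerically: the toy Python sketch below (my own illustration, not the experiment in the article) prepares an equal superposition of two states, samples a measurement outcome from the Born rule, and then collapses the state, so that repeated measurements all agree with the first one.

    import numpy as np

    rng = np.random.default_rng(0)

    # Equal superposition of |0> and |1>: amplitude 1/sqrt(2) on each
    state = np.array([1, 1], dtype=complex) / np.sqrt(2)

    def measure(state):
        """Born rule: sample an outcome, then collapse the state onto it."""
        probs = np.abs(state) ** 2
        outcome = rng.choice(len(state), p=probs)
        collapsed = np.zeros_like(state)
        collapsed[outcome] = 1.0
        return outcome, collapsed

    first, state = measure(state)
    repeats = [measure(state)[0] for _ in range(5)]
    print("first measurement:", first)          # 0 or 1, 50/50
    print("after collapse, repeats:", repeats)  # all equal to the first result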

  • The simpler theory you describe is known in quantum mechanics as a "hidden variable" theory: the idea that a quantum system has a predetermined state even before it has interacted with any other particles. This seems very attractive, and would certainly resolve many of the apparent philosophical mysteries raised by quantum mechanics. But precisely because of this, it would be amazing if physicists hadn't already tried very hard to make this work. They have, dating back to Einstein's famous resistance to this idea of non-locality (communication/action at a distance).

    But in 1964, John Bell proved a theorem which showed that local hidden variable theories make predictions inconsistent with those of quantum mechanics. Experimental results have since backed this up. Quantum theory doesn't work without Heisenberg's basic principle. If we throw out Heisenberg, we throw out QM, and along with it the most accurate way that anyone has ever come up with to calculate, predict, and even in a sense, explain the behavior of particles at that level.

    Given that the theory and the math works so well in practice, the implication is that quantum theory at least correlates quite closely to "real" phenomena in some way.

    The confusion and denial which these concepts tend to generate are quite understandable - physicists and philosophers have been arguing about it for decades. Personally, I have little doubt that we have not yet reached "the end of physics", and that there's a good chance that we will make discoveries in future which may help to place our current understanding of QM in a more comprehensible framework. String theory, for example, is an attempt to do that, although it has not yet been fully successful.

    However, it seems fairly clear from the experimental evidence and the available theories, that the universe does not behave in a way that is intuitive to us, with our experience being limited to macro-level phenomena. Even simple first-year physics experiments, such as the dual slit experiment, demonstrate this. Richard Feynman described this experiment as "a phenomenon which is impossible, absolutely impossible, to explain in any classical way, and which has in it the heart of quantum mechanics. In reality, it contains the only mystery ... the basic peculiarities of all quantum mechanics. "

    For a nicely done explanation of the dual slit experiment, try this page [colorado.edu] at U. Colorado. Once you've seen some of these experiments and thought about what they imply, you might just start to see why Einstein used the word spooky in this context.

  • Can anyone explain how quantum physics researchers create entangled photons and can track the members of the pair?
  • "In quantum physics, individual particles have no precise location and can coexist in more than one place at a time. Even distantly separated particles can share nonlocal correlations in a relationship known as "quantum entanglement." The entangled particles in the Los Alamos study are photons the basic particles of light engineered in such a way that they always have correlated polarizations. Polarization is the direction in which the photon's electric field vibrates."

    anyone else think this sounds a lot like philotics and ansibles from Enders Game? :)
  • ""News for Nerds" in this case apparently means a bunch of network admins spouting off about a subject they have no real understanding of. "decoherence-free subspace"? C'mon, explain what that means if you're all so goddamn smart!" It's Star Trekese. I'm sure they'd find that if they reinitialize the subspace manifolds to meet the resonance of hot spaghetti with a nice wine-based sauce with fresh mushrooms that they'd get more computing power. FREE BEOWULF! OPEN NATELIE PORTMAN! GRITS PUBLIC LICENSE!
  • Bell's inequality has been tested with entangled ions with nearly 99% detection efficiency (from the same group at Rice who demonstrated 4-particle entanglement in Science). Bell violations are real. Deal with it.
  • It is only the correlations that are non-local in measurements made on entangled states. As long as you cannot control what result you get when you make a measurement (e.g. photon polarization), which of course you can't according to quantum mechanics, there is no way to send a message using effects like this. Which is not to say they are useless. You *can* do things like establish a random list of measurements on one of an entangled pair of particles that you know your partner with the other particle will also measure. This is the basis of quantum cryptography and has been well-demonstrated experimentally.
  • The experiments I was referring to did not use entangled photon states, they used entangled ion states.

    And I did not make up the numbers (though I did get the reference wrong). See: C.A. Sackett, et al., "Experimental Entanglement of Four Particles," Nature 404 (2000).

    It is true that these entanglement measurements are not a perfect test of the Bell inequalities since, as Sackett pointed out when I heard him speak recently, they do not close the "locality" loophole. They do, however, close the important "uniform sampling" loophole that you are (rightly) critical of.

    Many other experiments (with photon entanglement) have closed the locality loophole. It is true, as far as I know, that no single experiment has gotten sufficient sensitivity using spacelike separated measurements... yet. It seems somehow perverse though to hang a defense of local realism on this fact. When the definitive test of Bell's inequalities arrives, I have no doubt that they will be violated. Again.

    Mike

  • Sackett's ion experiments are not analogous to polarizer/photon experiments. They do not use Stern-Gerlach magnets, but rather produce entangled states of ions in a trap by using lasers to drive certain transitions.

    Of course, they cannot produce entangled states with 100% efficiency. But, according to Sackett's data (which I have no reason to doubt), they see Bell violations when averaging over all the data, not just those they have deemed to be entangled states. Thus (barring explanations based on hidden, subluminal communication between the ions) they see violations in the true Bell sense (i.e. taking unbiased expectation values over all data points).

    I agree that we have not yet seen a perfect Bell inequality test, but the position you seem to be taking is that quantum mechanical non-locality cannot exist and that there must be a flaw in any experiment that purports to detect it. This is no more reasonable than insisting that quantum non-locality is real in the absence of any evidence.

    Mike

  • I completely agree with your remarks concerning the projection postulate, yet I somehow fail to see the relevance for tests of Bell's inequalities. It's true that measurement is ill-defined in quantum mechanics (as the standard interpretation goes) and I'm sure that it will eventually have to be explained in a genuine, physical way.

    Having said that, I don't think that the meaning of measurement and how it occurs is central to the Bell tests at all. Of course, it is central to the quantum mechanical interpretation of the tests, but (if you believe the results I mentioned earlier) Bell's inequality is violated experimentally. This does not prove quantum mechanics correct, it just proves that classical-like theories cannot be correct.

    Wavefunction collapse is an ad-hoc postulate, and it is right, in my opinion, to be very critical of it. Nevertheless, the collapse postulate is central to quantum mechanics (or something very like the collapse postulate). There are countless experiments that do not involve entanglement at all whose results the collapse postulate correctly predicts (the Stern-Gerlach experiment with multiple magnets in series, the quantum "watched pot" effect, etc.). While the collapse postulate is unpalatable, quantum mechanics in its current form cannot survive without it. The solution is not just to toss out the collapse postulate, but to understand the physical nature of measurement and reformulate quantum mechanics so that measurement is no longer a "deus ex machina" kind of thing.

    But I honestly believe the universe is non-classical. Maybe someone will actually be able to factor a small number in 10 years using Shor's algorithm. Would you believe then?

  • First of all:

    While the quantum mechanical explanation of Bell violations relies on the collapse postulate, of course, this has nothing to do with the empirical question of whether Bell violations occur in nature. You seem unconvinced that they have been measured, and I can respect this position to the extent that there are genuine flaws in the experiments. My point is that an empirical measurement of Bell violations has nothing to do with the collapse postulate or any other theoretical construction. It is simply a fact about the behavior of nature.

    Secondly, while I have no doubt that the physics orthodoxy has a lot at stake in protecting certain theories, I think you misrepresent the physics community. John Bell himself hated the Copenhagen interpretation and non-determinism in general. He was a strong supporter of Bohm's theory. It seems weird to me that you present Bohm's theory as a realist challenge to the more "mystical" aspects of quantum theory, since Bohm's theory is explicitly non-local (the pilot wave can change instantaneously throughout space), and Bohm himself was a pretty "mystical" character with his belief in the "implicate order" and whatnot.

    I used to think that Bohm's theory made a lot more sense than orthodox QM, and I thought that the only reason that people held to the orthodox view was inertia or a kind of philosophical malaise left over from Bohr's ideas. Bohm's theory has its own disturbing anti-commonsensical features, however. Look up articles on "surreal trajectories".

    The bottom line is that nature is non-classical, and we have not done very well at understanding that non-classical behavior. Every theory that I know of is somehow deeply unsatisfying, and yet quantum mechanics is able to predict things with incredible accuracy that classical physics completely failed to deal with.

  • I'm not saying that Bell inequality violations prove that orthodox quantum mechanics is correct.

    Bell's theorem says (and I'm not being precise here): take a certain class of theories which we can call local and real, in the sense that there can be no superluminal influences and a particle locally carries all the information necessary to determine the outcome of any measurement. (Notice that this doesn't say anything about which theories are "reasonable" or not.) It can be shown that certain inequalities must be satisfied by any physical theory of this type.

    I have not said anything about quantum mechanics or state vector collapse or anything. Nor do I have to. It is true that the predictions of quantum mechanics violate Bell's inequality (that's why Bell's inequality is interesting).

    It is a question open to experiment whether nature violates Bell's inequalities. We seem unable to agree on whether Bell violations have been measured. If they have been then locally real theories are wrong (not that quantum mechanics is right). If they weren't violated when they should have been, then quantum mechanics is wrong. There aren't any other choices.

  • Note: I am not the poster to whom the below was directed...

    Do you still keep your slide-rule around, because these pesky calculators of today just don't cut it?

    I took the SAT for years in high school, from before they allowed calculators to after. The first year they allowed calculators I brought a slide rule instead, just to make it interesting. I got a better score with it than I did with the calculator.

    Slide rules were laid out for calculation, not for arithmetic: you performed one operation, then flipped the rule over and performed the next, then flipped it over again and performed the next. Math tends to follow certain patterns in calculations, and the slide rule's design took advantage of it. Thus one was able to get the answer faster and more easily. Not to mention that the rule familiarised one with logarithms. Or that it was a Really Cool Thing.

    Still have it somewhere in its leather scabbard.

  • I thought that it was proven that if there is a polynomial time solution for one problem in NP then there is one for *all* problems in NP? Why should this be different for quantum computing based solutions?
  • Towards the end of the article, it mentioned how quantum cryptography generates unbreakable keys which can be used to unlock quantum encrypted data. Anyone who knows more than I on this subject care to comment on that? Is that possible, a non-crackable encryption? (Obviously, since this is all questions, I don't have any answers... ;-)

    ______
    everyone was born right-handed, only the greatest overcome it.
  • by Claudius ( 32768 ) on Wednesday November 01, 2000 @06:21AM (#659095)
    Extremely challenging, like in "it can't work and it won't ever work..."

    ...which makes for nice sounding rhetoric despite its being false. (Normally I hate being baited by trolls, but it's morning and I haven't finished my coffee...).

    A quick search of the Physical Review Letters [aps.org] web site shows 20+ letters in the last five years alone demonstrating the preparation of entangled quantum states in the laboratory. Furthermore, quantum computation (an application of Grover's algorithm--see, e.g., "Experimental Implementation of Fast Quantum Searching" by Chuang et al., Physical Review Letters, Volume 80, Issue 15, pp. 3408-3411) has been demonstrated in the laboratory, so your claims of quantum computation being a mere "mathematical abstraction" do not appear to be valid.

    I'm curious what motivates your objection to quantum mechanics. Do you reject the mathematical theory of quantum mechanics (in all of its various guises), which has held up rather well to experimental validation, or is it instead that the heuristic, post-Copenhagen interpretation of the theory (i.e. "spooky action at a distance") rubs you the wrong way? If the latter, then I think your objections are more semantic than substantive.
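
    For anyone curious what the Chuang et al. demonstration is actually computing, here is a back-of-the-envelope classical simulation of Grover's search over N = 8 items (plain numpy, my own illustration, not a description of the NMR experiment itself): after roughly (pi/4)*sqrt(N) iterations the amplitude piles up on the marked item.

    import numpy as np

    n_qubits = 3
    N = 2 ** n_qubits        # 8 "database" entries
    marked = 5               # the item the oracle recognizes

    state = np.ones(N) / np.sqrt(N)                    # uniform superposition

    iterations = int(round(np.pi / 4 * np.sqrt(N)))    # 2 iterations for N = 8
    for _ in range(iterations):
        state[marked] *= -1                  # oracle: flip the marked amplitude
        state = 2 * state.mean() - state     # inversion about the mean (diffusion)

    print("iterations:", iterations)
    print("P(marked item) =", state[marked] ** 2)      # about 0.95 for N = 8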
  • > physical state immune to certain types of information-corrupting "noise,"

    >In the corporate world, we call this "management"

    Actually, I was under the impression that management was the information-corrupting noise.

    Or, alternatively, that management works in a state immune to information, which they consider noise.

  • Semiconductors were developed based upon quantum mechanics. If quantum mechanics doesn't work then neither do semiconducting devices such as transistors, diodes, etc.
  • This is a far cry from understanding the basic principles of EPR.
  • It's been a long time since I saw the ellipsis used in place of the period, comma, or semicolon. After all, most people stop doing that a few weeks into their email experience.

    Thanks for the flashback!

  • Nobody has said anything about Michael Crichton's Timeline and quantum computing? Cool book. I would love a realistic application like that.
  • Fine, use a closed, observerless environment for both the photons and the cameras. Again, I ask, how are these two physical scenarios different?
  • No, it's not spooky enough yet. It sounds awfully contrived, as if it (the theory, that is) creates its own spookiness. The observation itself is not spooky. The observation is what I would expect from a simpler theory: that the particles each had a particular spin, and that the spins added to zero to preserve angular momentum, and the observations must therefore correlate.
  • I find the above description similar to the following scenario:

    Throw a rapidly spinning coin up in the air. Have two cameras facing it from opposite directions each take a photo of it using high-speed film. Each camera has a 50-50 chance of taking a picture of a heads or a tails. The cameras now contain a pair of "entangled exposures", so to speak. Send the cameras off in opposite directions by a couple of light years or so, and develop the film. As soon as one picture is developed and shows "heads", we know that the other camera's film will show "tails". Before we developed either picture, each one had a 50-50 shot at showing either heads or tails. But now, as soon as we develop one, the other one can be determined, as if the first camera sent an instantaneous message (ooooo! spoooky!) light years away to the other camera.

    How is this different? If it's not different, then why is either case spooky?
  • First I would just like to say that whatever 3rd grader posted the first post on that site should be dragged into the street and beaten with a frozen mackerel.

    Now, maybe quantum computing is a swindle; I'm not smart enough to know, so I'll leave that judgement to you. What I do know is that there are numerous examples of 'science fiction' ridiculousness becoming real science. And I think there is a real need to research every conceivable area of science, because what you're looking for is not always what you find. There are also many examples of research in one area leading to advances in completely different subjects. Your rant makes me think you're one of those people who always say it can't be done, while others are working on ways to make it happen.

    I think this quote is relevant here, and IMHO it's very interesting. I wish I had a link back to the /. article it was referring to, but I ctrl-C, ctrl-V'd this into my quote.txt file when I saw it in the comments. Whoever it is, it makes sense to me:

    On a similar note, it is implied by quantum physics that quantum particles appear to be aware of each other, and that this action at a distance has no time delay, such as the limit of the speed of light. I wonder if SETI is listening to the wrong thing. Imagine this technology coupled with the action-at-a-distance principle, so that you could choose coupled quantum particles to communicate over vast distances with no time delay. Couple these technologies with virtual reality, and it could be possible for communities, separated by vast distances of space, to communicate and interact in real time. Wouldn't it be surprising to find a universal (literally) quantum network that was in use, and that we have been looking in the wrong place all this time. I would imagine that any species' entry into the universal forum would be predicated on their discovery of the technologies, and their ability to apply them to interact with this universal network. Instead of physically traveling in a space ship to distant worlds, we instead project our consciousness over vast distances with the help of virtual reality and quantum communication. Perhaps ET isn't going to show up in a galactic cruiser, but instead is patiently waiting for us to pick up the damn quantum telephone. We just haven't heard it ring yet.
    -Master Switch, one more element in the machine


    I don't want my tax money to pay for silliness that could win an Ig Nobel, but then again, how do you know what will lead to the next major advancement?

  • The idea behind the "quantum leap" phrase is that it is discrete. A quantum transition from, e.g., an N=1 to an N=2 state never passes through "N=1.5" (which doesn't even exist, or make sense at all). So people adopted the term in a colloquial fashion to refer to an instantaneous jump, as opposed to a gradual "classical" advancement.

    Of course, like every other colloquialism, it quickly became grossly over- and mis-used.
  • Would this work on two pieces of equipment that WEREN'T attached by optical fibers?

    I don't think it has to be fiber.. That's mainly a photonic medium. Theoretically I believe it could be a superconducting wire, or even a virtual superconducting channel through space. Essentially it's anything that will not disturb the messenger quantum particle(s). Light is the easiest thing to deal with as far as I know. As an EE undergrad, we've studied electrons out the wazoo, and we know of their relationship to photons, but I never fully got comfortable with them.. They're discrete transmitter energy packets for charged particles. When an electron slows down, it emits its momentum energy in the form of a photon.. When an electron speeds up, it's because it was hit by a photon. (Though it's also possible for physical collisions, which carry momentum on the normal macroscopic scale.. They're called phonons, I believe.) The amount of energy released is the frequency of either the photon or phonon. In both cases (I believe), the medium dictates the ratio of wavelength to frequency (e.g. the speed limit).

    From what I understand, a photon travelling through space is affected by forces such as gravity (though I don't think any others), but otherwise travels uni-directionally through space until it collides with another charged particle (quark or lepton / sub-nucleid or electron), where it transfers the energy. Its path however works like a wave.. If you consider the medium to be water, and the wave-front itself to be the photon, then it makes more sense, except that the wave's amplitude is so low that only one thing in the entire ocean will ultimately feel it. If enough discrete particles emit photons (even at random frequencies), then the effect will be more like a river wave-front. I don't fully understand how the wave-particle chooses its path. It's not really attracted to a charged particle, yet at the same time, its collision rate is substantially higher than say a neutrino (a nearly mass-less, chargeless lepton (a brother of the electron)), which can pass through entire galaxies w/o incident.

    The thing that bugs me about quantum physics is the quote, "if it doesn't completely confound you, then you don't understand it". Well, I've always thought it seemed intuitive, so I must be missing something.. The intuition is that it seems to act very similarly to macroscopic physics, so long as you consider invisible forces to be virtual springs. For example, the "Einstein Podolsky Rosen Paradox", where nothing can be known about either photon until one hits its final polarized destination, and somehow they're transmitting information back and forth instantaneously.. The idea that I read suggests that they are independent somehow in all ways except that they must be orthogonally polarized (I guess I'd have to learn more about how you can guarantee their orthogonality). That when they collide, they pull from some localized "random information database" and know what the state of the other one was.. It's supported, I believe, because you can not monitor their states without messing up the experiment. But it seems to me that there was prior knowledge between the two states.. at the time of their creation, and that they simply carried their information in separate directions.

    The Schrodinger's cat experiment (that I recall anyway) said that if you had a cat in a box, and cut the box in half, the cat would be in one of the two boxes (though it was unknown), and that if you separated the boxes by 100 light years, then opened one, you've immediately transmitted the information to the other box.. The other cat will either be or not be. It's a high-level analogy for various quantum properties, but it seems asinine to suggest that the info was being transmitted only at the time of measurement.. The darn cat _knew_ which box he was in from the beginning. Randomness dictated which box the quantum particle was closer to when they were separated, and they just altered their quantum orbital path to the new confines and inertial frame. The whole abstraction of quantum physics that we lay people get makes it difficult for us to get what's really going on.. All the analogies I've been told do not express the "quantum-weirdness" that everybody always talks about; they're logically founded.
  • But continuing the problem, doesn't it all boil down to random distribution of quantum particles? You had a particle that was created through an interaction of some kind - the electron radiated a photon, or two particles collided/interacted and formed 1 or more resultant particles. Everything about those particles is known to those particles, just not the observer. I understand that you can't "measure" the particle without disturbing it, but the particles themselves aren't magical.. They just move through space and time (and whatever other curled-up dimensions) with the potential to react to one of a discrete set of events.

    If two particles are known to have orthogonal properties (such as polarity), then each particle knew all along what they were, they were just created or selected by some special process right? (I don't really know why Einstein calls it a paradox).

    -Michael
  • Ok, now think about giving a single electron some sort of spin to represent a data state on a computer. Sounds great? Yah, it will be great until someone rubs their sock on your box sending your computer into chaos.

    Hmm, this is right. Sounds just as silly as storing a bunch of electrons on an array of millions of small capacitors, and then refreshing them dozens of times per second, to store data bits. And if extra charge comes along, or it's not refreshed often enough, the fragile data gets wiped out. Oh wait a second, that's how DRAM memory works.

    Okay, but it's certainly as ridiculous (sp) as making little tiny EXTREMELY static-sensitive transistors wired in a feedback path with current maintaining them in 2 possible states so they can flip and flop between the two, and thus store data bits. Yeah, and if someone rubs their sock on said transistor substrate, they'll send the data into total chaos. Hey wait a minute, that's how SRAM works.

    So you can see these seemingly delicate scenarios are both in use today, almost certainly both put to use in the computer you've used to post your sarcastic little statement. BTW, Dr. Kool, since you're at Harvard University, you're probably very aware of some of your fellow Harvard faculty research [harvard.edu] with quantum structures, and, I believe, quantum bits.

    Good day.

  • Haha

    Manipulating individual electrons is a fool's errand.

    Are you the same AC from the beginning of this thread? If it's a fool's errand, then it's a bigger fool who stays with yesterday's 'comfortable' technology instead of pushing forward. You'd probably still be using vacuum tubes instead of solid-state with your forward-thinking ideals. Do you still keep your slide-rule around, because these pesky calculators of today just don't cut it?

    You can't even imagine how small a single electron is.

    Well, if you think of a single isolated electron as a spin-1/2 realization of the Poincare algebra, you may be right, but such cases are pure thought experiments. It's the actions of the electron in the presence of other particles that matter. So, look at the spherical harmonic solutions of Schrodinger's equation for an electron-proton pair (i.e., the hydrogen atom) to get a feel for the size of an electron cloud in the smallest of its common configurations. Probably on the order of Angstroms.

    The electron probability cloud probably gets much bigger, fuzzier, and far more complex when you think of electrons in metals and other matter (organics maybe?). In terms of low-dimensional systems (quantum dots / 2-D electron gases) there may be some interesting numbers for the 'size' of the electron, but I don't know these cases offhand.

    In terms of not imagining the scale, here you are perhaps right. It's difficult to visually comprehend 10 or so orders of magnitude in distance scale. But I can get some idea of it.

  • Sorry, only have time for a brief comment, but we were going over the EPR in more depth today, and Bell's Inequality, which is roughly an example derived from making spin measurements in the case that hidden variables are present. You go through some calculations assuming there are all sorts of hidden variables that contain the spin information in all the directions, in the case we did today. This yields an equation, which ultimately says one quantity must be less than or equal to another quantity. One can then use this inequality to show that hidden variables aren't present, because actual quantum measurements (as experimentally verified) adhere to a different set of conditions which violates Bell's inequality. So, thus, there are no hidden variables.

    But it's not as simple as that; it's still a very confusing philosophical notion. Feynman supposedly said anyone who's not bothered by this has rocks in their head. To top it all off, the professor suggested we all go home tonight, take a bong hit, and think about it some more.
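
    Roughly the arithmetic behind that, as a sanity check (a toy Python calculation of my own, not the derivation from class): a brute-force scan over deterministic local hidden-variable strategies never pushes the CHSH combination past 2, while the quantum singlet-state prediction E(a,b) = -cos(a-b) reaches 2*sqrt(2) at the standard angles.

    import itertools
    import numpy as np

    # Standard CHSH measurement angles
    a, a2 = 0.0, np.pi / 2
    b, b2 = np.pi / 4, -np.pi / 4

    # 1) Local hidden variables: each side pre-decides a +/-1 answer for each
    #    of its two settings, so E(x, y) is just a product of those answers.
    best_classical = 0.0
    for Aa, Aa2, Bb, Bb2 in itertools.product([-1, 1], repeat=4):
        S = Aa * Bb + Aa * Bb2 + Aa2 * Bb - Aa2 * Bb2
        best_classical = max(best_classical, abs(S))

    # 2) Quantum prediction for the spin singlet: E(x, y) = -cos(x - y)
    def E(x, y):
        return -np.cos(x - y)
    S_quantum = abs(E(a, b) + E(a, b2) + E(a2, b) - E(a2, b2))

    print("best local hidden-variable |S|:", best_classical)   # 2.0
    print("quantum singlet |S|:", S_quantum)                    # ~2.828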

  • Actually, I wasn't dissing slide rules, they're one of the coolest mathematical gadgets around. You're right, you really get to learn logarithms when you use them. <tongue-in-cheek old-fart speak> Nowadays, these pesky kids type an equation (in INFIX, not even RPN) into their super-fancy graphing calculators, and get an answer (and can plot it, do least squares, etc). </old fart>

    I was just using the slide rule as an example back to antiquity, but of course you probably realized that :-)

    I like one of the scenes in Apollo 13, where they're checking the gimbal coordinates, and you see the NASA engineers with their slide rules hacking away. it's great.

  • Can you provide some cool-looking Quantum Mathematical symbols too please? I am a sucker for punishment

    I just found this page [phys.rug.nl] with some descriptions, and a taste of funky math :-) I haven't really checked it out fully, but it looks like it's probably a good place to get a basic idea of some of these principles (and hopefully they have some decent movies too).

    enjoy.

  • by wass ( 72082 ) on Tuesday October 31, 2000 @09:34PM (#659113)
    This is great. Just today in my quantum mechanics class we were talking about the Einstein-Podolsky-Rosen paradox, and two entangled spin 1/2 particles sent in opposite directions. The particles were entangled, such that their combined angular momentum was in an S=0 state. That is, total angular momentum=0.

    This means that if the spin of one particle is measured in any direction (say out of X or Y or Z for cartesian coordinates), then the spin for the other particle is going to be opposite that measured for the first particle, BUT ONLY IF IT'S MEASURED IN THE SAME DIRECTION. So if you measure the z component of particle 1, you get either h-bar/2 or -h-bar/2, and you know that particle 2, if measured in the z direction, gives the opposite one. This will work if both measurements are in the x, or y, or any other combination of directions. But they must be the same direction.

    One fundamental aspect of spin is that spin operators in different directions don't commute. that is, if one measures the spin in one direction, say Z, then another direction, say X, and then measures the Z direction spin again, it won't necessarily be the same. That is, measuring the X direction between the two Z measurements changed the state of the system.

    So the part of this thought experiment that bothered Einstein and company is that if both particles are entangled such that any spin measurement made will be opposite the other particle's measurement, providing the spin direction being measured is the same, then this implies that there are some sort of hidden variables in nature to account for this. Namely, the particles are entangled in seemingly all directions, until that first measurement is made. Surely, then, nature must possess some knowledge about all three orthogonal directions simultaneously.

    But what Bohr and Heisenberg maintained is that one cannot simultaneously measure the X,Y,Z spins. That is, we CANNOT ask about measurements that could be made but were not made, we can only talk about those measurements that were made.

    So it's a bit different than the analogy the article gives about two pennies, one being heads up and one being heads down, because if your penny is heads up, it'll always be heads up, as that is not a fundamental spin-1/2 particle.

    Sorry if this post makes ZERO sense, I'm just blabbering about what was pretty cool in quantum class. Hopefully tomorrow we'll learn s'more to make it make more sense.
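
    That non-commuting bit is easy to play with numerically. Here's a toy Python/numpy sketch of my own (idealized spin-1/2 measurements, nothing to do with the Los Alamos experiment): measure Z, then X, then Z again, and the second Z result disagrees with the first about half the time, because the intervening X measurement scrambled the state.

    import numpy as np

    rng = np.random.default_rng(1)

    # Eigenstates for spin measurements along z and along x
    z_basis = [np.array([1, 0], dtype=complex),                # |z+>
               np.array([0, 1], dtype=complex)]                # |z->
    x_basis = [np.array([1, 1], dtype=complex) / np.sqrt(2),   # |x+>
               np.array([1, -1], dtype=complex) / np.sqrt(2)]  # |x->

    def measure(state, basis):
        """Sample an outcome via the Born rule, then collapse onto it."""
        probs = np.array([abs(np.vdot(b, state)) ** 2 for b in basis])
        outcome = rng.choice(2, p=probs / probs.sum())
        return outcome, basis[outcome]

    disagree, trials = 0, 10000
    for _ in range(trials):
        state = np.array([1, 1], dtype=complex) / np.sqrt(2)   # some initial state
        z1, state = measure(state, z_basis)
        _,  state = measure(state, x_basis)    # the intervening X measurement
        z2, state = measure(state, z_basis)
        disagree += (z1 != z2)

    print("fraction where the second Z differs from the first:", disagree / trials)  # ~0.5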

  • damn...things should work now. Pair networks sucks!!
  • Look at this [std.com].
  • Yeah... but it doesn't have much... Just a few quotes, a funny Dilbert cartoon, and some amusing quotes from half a century ago.... (oh.. and the people who are working on the projects)--- But nothing of any hard content as to progress/work/etc that I could find.
  • A brief introduction to Quantum Cryptography [qubit.org] Looks interesting... I wish I could find more info on the Los Alamos site about what they've done with crypto.

    (Sorry about the double-post; didn't think when I completed the subject line)

  • This works out only if you are in a closed, observer-less environment when you take the pictures. If you are present in the room (and even if you are not looking at the coin) then some information about the "head-tail" decision will leak to you, and the negatives will choose the state accordingly before you even start to separate your cameras by a few light years.

    There is of course a whole problem in the definition of an "observer". No one knows what it is supposed to be.
  • They aren't, I think. State superposition doesn't have to happen only at the atomic level; whole macroscopic objects (such as a roll of film, or a cat) can also be in superposition. I often think of Shroedinger's cat being in two macroscopic quantum states as a consequence of whether the geiger counter detected a radiation emission or not, an atomic-scale event that trickled down to all the other atoms of the box until two distinct universes coexisted simultaneously inside it. Your scenario with the cameras seems perfectly valid to me.

    Now, as to whether the cat itself is an observer and can make the superposition collapse... do physicists know? What does QM say about that?

    (IANA physicist btw)
  • Oh, hey. That's right. Quantum computers WOULD make it trivial to compromise the encryption, say, on a DVD...

    And therefore be illegal under the DMCA.

  • Quantum cryptography is the only encryption method that eliminates person-in-the-middle attacks. Naturally there has been a lot of interest from the security industry. Photons are a good medium for communications since they don't give off much of an electronic or magnetic "wake" that could be detected.

    Slashdot has covered the implications of increased speed in factoring and how that might affect the PGP key space in past articles. The only thing new that I found in the article was how superposition in quantum computing could speed database searching.

    (insert Natalie Portman joke here)
  • You can't affect the outcome of a random event. The article does a good job of explaining this in simple terms using the example of two quarters. Say one quarter is left in an unopened box on Earth, and the other quarter is sent to Mars to be flipped. The box on Earth can only be opened AFTER the Martian flips his quarter on Mars, or it voids the experiment. If the Martian quarter comes up heads, they will find the quarter in the box on Earth heads up as well. The problem with using this phenomenon for communication is that the Martian cannot affect the outcome of the coin toss.
  • I imagine a chess program written for a quantum computer would always play black and you would never win.
  • The government will classify a working model so they can keep their secrets.
  • Do you have a reference to the actual paper describing decoherence free subspaces and one describing what was actually performed in this experiment? I've always been sceptical of quantum computing papers because it has seemed pretty obvious to me that decoherence effects grow exponentially so I'd love to see a good paper contradicting me!
    --
  • If you encrypt a message and send it away while keeping a copy for yourself, and someone makes a copy of your message and tries to crack it, you would know immediately because their actions would alter your original copy, and you could trash their copy by altering your copy? Eh?
  • In all likelihood, if these machines ever become a reality it will be illegal in most nations to own one.

    A few years ago you had to have a license to use a DEC Alpha, as it was classified as a supercomputer.

  • by mat catastrophe ( 105256 ) on Tuesday October 31, 2000 @08:21PM (#659128) Homepage
    ...scientists lost their research when the hard drive it was on was reported missing.

    "Well, this guy in overalls showed up and said he needed to take the drive out to be cleaned. I'm a scientist, for Christ's sake, not a technician. How was I supposed to know," one scientist, who spoke to us on condition of anonymity told us.

    Officials at Los Alamos are confident that, by following the lead of the State Department and offering a $25,000 reward, they will soon recover their lost data.

    "Otherwise," said the scientist, "we are, like, so totally fucked. I mean, this project has been hell! Sixty-hour weeks for six months is harsh!"

  • by Trinition ( 114758 ) on Wednesday November 01, 2000 @03:57AM (#659129) Homepage

    Look, Quantum Theory is a theory just like any theory except that it tends to explain a few things that more rudimentary theories cannot. It is not the ultimate reality, though. All of these theories are just casting our eyes further away from the shadow we perceive as reality and towards what is casting that shadow.

    I did research for a paper on quantum computing a couple of years ago. There have been demonstrated uses of quantum effects in this field, as well as the dozens of other fields where quantum theory is applied.

    In fact, they have been able to use NMR to glean the bulk spin of the composing atoms of a liquid and perform simple operations. In another angle of quantum computing, they've been able to use lasers to super-cool cesium atoms and manipulate their quantum states to the same effect.

    All of this is pointing to the fact that quantum theory correctly predicted the ability of such quantum computing. It is enabling the theories from long ago, as well as newer ones, to finally be applied. It has the potential to disrupt (and even re-invent, but that's another story) encryption as we know it.

    Furthermore, quantum entanglement is not a requirement for these processes.

    You seem to believe that reality is what you see. You cannot see quantum states from our macroscopic world. Nor can you see relativity in action. Hundreds of years ago, people couldn't "see" gravity, either. And long before that, people couldn't even "see" air. Please, see the light. Quantum theory, like all theory, is a mathematical abstraction of how the world works. And each successive model is getting closer and closer to what can actually be experimentally observed.

  • More clueless news reporters who want to sound hip inserting terms like "a quantum leap in technology." I can hardly wait.

  • Does anybody else think that the term "coherent" could be substituted for "de-coherence free"? Aren't we making things a little bit more complicated than they should be or is this already so complicated that a little bit more can't hurt?
  • Quantum Computing is NOT the same thing as Quantum Cryptography even though they are both rumored to involve QM ;->.

    Here's an intro on quantum computing for non-physicists [umn.edu]...

  • From the way you describe it here, you seem to be personifying those photons. (Not knocking you, just one way of putting it here.) I thought I'd just give another way of saying it here for others. Two points of view being better than one an' all that.

    That spookiness, I think, comes from our inability to know the polarisation before the event of taking the measurement. As you said, but I think that you can also say that since you don't know the orientation, then the polarisation can be in ANY arbitrary orientation. If you think about it, there will then be a 50% chance of being correctly oriented to pass through the filter. If this is the case, then the other photon will always have a 100% chance of passing through if the first is blocked, or a 100% chance of being blocked if the first passes through.

    If you concentrate on the orientation of the first photon as being a probability question and then second photon being of opposite polarisation then you don't see it as a case of many possibilities occupying the same space simultaneously, merely 1 value that you don't happen to know at this time. This point of view seems to give me fewer headaches than others. Hope it does the same to you.

    dnnrly

  • lemme see:

    the Los Alamos team demonstrated is a state in what is called a "decoherence-free subspace." Subspace!

    and even better (although I admit it's a bit of a stretch): "Decoherence in Kwiat's system is intentionally created by passing the entangled photons through a roughly 10 millimeter piece of quartz." Ummmm - dilithium, anyone?

    Sorry, these just seemed like ironic direct correlations...

  • In other news today, the Internet has been declared a "mathematical abstraction with no bearing on how the real world works." All servers are to be shut down at midnight EST tonight.
  • Guess NSA can't crack RSA yet, or they wouldn't be interested in this technology :)

    m
  • So do I, I'm a physicist who has been working with computers long enough to remember using a soldering iron to 'write' a program.

    Lighten up. It's just a stupid goof, and to be fair it was only ONE moderator who thought it was funny and I never even expected that.
  • Ok, I was wrong, it was two. My mistake, but someone had the good sense to knock it down one.

    I can't be held personally responsible for the moderators.
  • by kfg ( 145172 ) on Tuesday October 31, 2000 @08:59PM (#659139)
    " We got increadable speeds out of this puppy, but we arn't certain just what the data actually is" tester says.

    "When we nailed the data down the thing slowed to a crawl" he continued.

    "To make things worse, sometimes we weren't even sure where the damn thing WAS."

    We'll keep you posted on continuing development, but it looks like it has a ways to go.
  • Silicon chips are vulnerable to things like EM interference and other things. That's why we have ECC (error correcting code) memory. It tries to cut down on the number of mistakes for situations like 24/7 servers that can't afford to corrupt data over time. Even so, these chips still aren't _perfectly_ error-proof. So, as long as the probability of failure for the QC is about the same as the probability of failure for a silicon chip, then it's all good.
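
    For concreteness, the error correction in ECC memory boils down to something like this Hamming(7,4) toy (my own Python sketch, not the actual chip logic): four data bits get three parity bits, and any single flipped bit can be located and repaired from the parity-check syndrome.

    import numpy as np

    # Hamming(7,4) generator and parity-check matrices over GF(2)
    G = np.array([[1, 0, 0, 0, 1, 1, 0],
                  [0, 1, 0, 0, 1, 0, 1],
                  [0, 0, 1, 0, 0, 1, 1],
                  [0, 0, 0, 1, 1, 1, 1]])
    H = np.array([[1, 1, 0, 1, 1, 0, 0],
                  [1, 0, 1, 1, 0, 1, 0],
                  [0, 1, 1, 1, 0, 0, 1]])

    def encode(data4):
        """4 data bits -> 7-bit codeword (data bits stay in the first 4 slots)."""
        return (np.array(data4) @ G) % 2

    def decode(word7):
        """Locate and fix a single flipped bit, then return the 4 data bits."""
        syndrome = (H @ word7) % 2
        if syndrome.any():
            # the syndrome equals exactly one column of H: that's the bad bit
            bad = next(i for i in range(7) if np.array_equal(H[:, i], syndrome))
            word7 = word7.copy()
            word7[bad] ^= 1
        return word7[:4]

    data = [1, 0, 1, 1]
    noisy = encode(data)
    noisy[2] ^= 1                                 # a "cosmic ray" flips one bit
    print("recovered:", decode(noisy).tolist())   # [1, 0, 1, 1]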
  • by deglr6328 ( 150198 ) on Tuesday October 31, 2000 @09:45PM (#659141)
    In 1935, Einstein met with Boris Podolsky and Nathan Rosen to formulate a theory which basically said that particles intrinsically possess certain properties before these properties are measured. A "side effect" of the theory is known as the Einstein Podolsky Rosen Paradox (EPR Paradox).

    Suppose you entangled a pair of photons polarised at 90 degrees to each other. You can't know what the polarisations are until you measure them; they could be vertical, horizontal or any angle in between. All you do know for sure is that they are perpendicular to each other. You send these photons off in different directions. At some point as they shoot off into the distance the photons will run into polarising filters you've cunningly put in their path.

    Suppose one photon passes straight through a vertically aligned filter. It must be vertically polarised, so its partner must be horizontally polarised. The second photon would therefore pass through any horizontal filter in its way, but not through a vertical filter. So far so good. One photon is vertically polarised, the other is horizontally polarised, so they are at right angles as they should be, and all's well with the world.

    Not quite. Until the first photon hits the filter, you have no idea whether it will go through or not. And for that matter, the photon doesn't know what sort of filter it is going to hit until it gets there. Since you know nothing about either photon's individual polarisation until you make a measurement, you only know that the odds of it going through are fifty-fifty, no matter what angle the filter is set at. So the second photon can't know what the first photon will do until it actually does it. Yet the actions of the first photon determine the actions of the second. The second photon has to get some sort of tip-off from the first, even though they are physically a long way from each other.

    What's more, this tip-off has to be instantaneous, because it has to work even if the two photons hit their filters at exactly the same time. It's impossible to predict what either photon will do, and yet the two of them must act in concert so that their polarisations have the correct relationship to each other. This is the "spookiness" that Einstein, Podolsky, and Rosen took such exception to.
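
    To put some numbers on that (a small numpy sketch of my own, assuming an ideal perpendicular-polarized pair and perfect polarizers, not the Los Alamos setup): either photon on its own passes its filter 50% of the time no matter how the filter is turned, yet the joint outcomes are locked to the relative angle between the two filters.

    import numpy as np

    # Two-photon state with perpendicular polarizations: (|HV> - |VH>)/sqrt(2)
    H_, V_ = np.array([1.0, 0.0]), np.array([0.0, 1.0])
    psi = (np.kron(H_, V_) - np.kron(V_, H_)) / np.sqrt(2)

    def pass_vec(theta):
        """Polarization state transmitted by a linear polarizer at angle theta."""
        return np.array([np.cos(theta), np.sin(theta)])

    def p_both_pass(a, b):
        """Probability photon 1 passes a filter at angle a AND photon 2 one at b."""
        amp = np.kron(pass_vec(a), pass_vec(b)) @ psi
        return amp ** 2

    a = np.deg2rad(30)
    for b_deg in (30, 60, 90, 120):
        p = p_both_pass(a, np.deg2rad(b_deg))
        print(f"filters at 30/{b_deg} deg: P(both pass) = {p:.3f}")
    # parallel filters (30/30): 0.000 -- perpendicular filters (30/120): 0.500

    # Photon 1 alone is 50/50 regardless of its filter angle:
    p_first = p_both_pass(a, 0.0) + p_both_pass(a, np.pi / 2)   # sum over photon 2
    print("P(photon 1 passes):", round(p_first, 3))             # 0.5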


  • Good point!

    That's what scientists call 'hidden parameters'. (Hidden, because unlike the colour of the ball, you can't simply look at the photon and see its polarisation.)

    I'm sorry I can't simply explain to you why this assumption is wrong; it's a couple of pages in a quantum physics book, and even that explanation is too short and a little bit inaccurate.

    The point is that you can never measure all of your hidden parameters, because one measurement destroys the other parameters. But still, the hidden parameters, if they exist, do change the results of some statistical calculations. Therefore it's possible to do an experiment to decide whether the hidden parameters had some certain value even before you measured them.

    These experiments showed that, in fact, the parameters you don't measure don't have a certain value at all. It's not hidden, it's simply non-existent.

    (There is a difference in quantum physics between 'you don't know some value' and 'it doesn't have a certain value')
  • Its vital connection with the real world is based on a highly dubious (even outright absurd, according to some physicists, including Einstein) conjecture about entangled quantum states (roughly, a special kind of "mystical" non-local correlation among events) which was actually never confirmed experimentally.

    According to many other physicists quantum entanglement does occur, and there is supporting experimental evidence. This evidence is disputed by some people, however the majority accept it as valid.

    I note that Caroline Thompson says on her page that she suspects the experimenters have produced [possibly without realising it] detectors that mimic quantum theory. A non-specialist in the field who is skeptical of the results is not the same thing as a definite disproof by a trained experimentalist. Please note that Caroline Thompson offers no experimental results, only some suggested experiments that have not been tested, which could disprove QM as it currently stands.

  • (It is an abiding sadness of mine that there is no accepted text depiction of the noise that is made when contestants get things wrong in Family Fortunes. I propose that we start using

    \/
    /\

    as that's the sign that appears on the screen, anyway back to the point)

    \/ \/ \/ \/
    /\ /\ /\ /\

    Entanglement has a firm experimental footing as well as a fantastically strong theoretical basis. There've been some experiments in Italy (so no ref) that have been running over the past couple of years (not "the 1980's") that have "proved" entanglement over very large distances (kilometres).

    requiring no belief in mystical instant action-at-a-distance

    It was shown quite beautifully and clearly by one of my Philosophy of Physics lecturers why EPR doesn't actually break causality; it's not mystical and requires no "belief" (taking you to mean the religious version rather than the scientific one).

    Anyway, now might be good time to remember when the phrase "action-at-a-distance" was first used. It was Newton, trying to explain something he didn't quite understand, which puts you in lofty company indeed.

    No dictionaries were harmed, or indeed used, during the production of this post

    -------------------------------------------

  • Can you read? There are a number of good posts here describing clearly what EPR is and why it is important.

    I was expecting to have to come in here and "kick some ass" as my American neighbour would say, but so far all I've found is valid well backed up comment (apart from this one, of course, and that chap further up who doesn't believe in QM).

    If you want to read real numpty science take a look at this baby [slashdot.org]

    -------------------------------------------

  • Doh! Out physiked again. That's what I get for believing New Scientist.

    I stand corrected.

    -------------------------------------------

  • That's an interesting way to reformulate the Schrodinger's Cat "Paradox" (and one that connects it to the EPR "paradox"), but the classical version usually involves killing the cat. Basically, you put a cat in a box with something that could kill it (eg. a device which releases poison gas). Connect the device to a purely random quantum system, like a particle which has a 50% chance of decaying in a given time period. Then, after the given time period, you could say that the cat is in a superposition of live and dead states. Then you open the box and "the waveform collapses" and the cat is either fully alive or fully dead.

    Note that the paradox doesn't really say anything new. It just takes a strange microscopic process and bumps it up to the level of an equally strange macroscopic process. I guess this might serve the purpose of convincing some people of its strangeness, but beyond that it's pretty pointless.
  • When people aren't confusing quantum computing with quantum cryptography, they're busy confusing the purposes of the two. Quantum cryptography (of which the above is a reasonably accurate description) is a handy method of realtime authentication between two parties. Despite its name, however, it is not in the most general sense encryption. In real life, people need to encrypt files. They need to be able to leave them on their hard drive in a secure form. They need to send them multiple times to various people. They need to have the capability to use public and private keys and an existing infrastructure (the internet). Quantum cryptography will be useful for a very small handful of organizations who can afford dedicated connections (and who will need other means to secure data once it's been sent). It is in no way a replacement for encryption.

    I'm not an expert, but if and when quantum computing becomes feasible, I see no way that encryption could exist in the way it does today. At best, individuals will have to rely on security through obscurity, creating complicated algorithms using proprietary (quantum) hardware that would be prohibitively expensive for another party to reverse engineer. Or maybe someone will come up with something really new that doesn't rely on quantum-computable math. In any case, the age of PGP and the like will be over.
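
    For the curious, the key-agreement part looks roughly like this BB84-style sketch in Python (my own simplified toy: no noise, no error correction, and the classical channel's authentication is ignored). The two parties keep only the bits where their random basis choices happened to match, and an eavesdropper who measures in the wrong basis scrambles some of those bits, which shows up as errors they can detect by comparing a sample.

    import random

    random.seed(42)
    N = 32  # raw photons sent

    # Alice picks random bits and random bases (0 = rectilinear, 1 = diagonal)
    alice_bits  = [random.randint(0, 1) for _ in range(N)]
    alice_bases = [random.randint(0, 1) for _ in range(N)]

    def measure(bit, prep_basis, meas_basis):
        """Matching bases pass the bit through; mismatched bases give a coin flip."""
        return bit if prep_basis == meas_basis else random.randint(0, 1)

    # Optional eavesdropper: measures in her own random bases and re-sends
    eavesdrop = True
    channel_bits, channel_bases = alice_bits, alice_bases
    if eavesdrop:
        eve_bases = [random.randint(0, 1) for _ in range(N)]
        channel_bits = [measure(bit, ab, eb)
                        for bit, ab, eb in zip(alice_bits, alice_bases, eve_bases)]
        channel_bases = eve_bases

    # Bob measures in his own random bases
    bob_bases = [random.randint(0, 1) for _ in range(N)]
    bob_bits = [measure(bit, cb, bb)
                for bit, cb, bb in zip(channel_bits, channel_bases, bob_bases)]

    # Sifting: keep only positions where Alice's and Bob's bases agree
    sifted = [(a, b) for a, b, ab, bb in zip(alice_bits, bob_bits,
                                             alice_bases, bob_bases) if ab == bb]
    errors = sum(a != b for a, b in sifted)
    print(f"sifted key length: {len(sifted)}, mismatches: {errors}")
    # With no eavesdropper the mismatch count is 0; with Eve, ~25% of sifted bits differ.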
  • The story hasn't been up for 10 minutes, and look what we have:

    Warning: MySQL Connection Failed: Can't connect to MySQL server on db11.pair.com (61) in /usr/www/users/davew/b2tb/geeklog/public_html/common.php on line 79

    Oh well... I'll just have to keep on hitting "reload". :)
  • Even though we cannot instantly transmit information using quantum entanglement, transmitting randomness is still very useful.

    [snip]

    Note that in quantum encryption, we are transmitting the code instantly. The actual message will arrive much more slowly -- at only the speed of light.

    Why can't we transmit information this way? If observing or altering the particle at my end causes the particle at the other end to be altered, I can send a message easily. I believe the experiment you refer to did just that: a change was made to one particle which was recognized in the other particle. I could be wrong here - feel free to point that out. But I believe in essence the idea is that by touching one particle in some way, it necessitates that the other particle was always conjoined to the first, and so a message could in fact be passed.

    If this is true, one could decide on a predefined synchronization series of changed particles within the background radiation of the universe (like a kind of modem sync). I could change a whole buncha particles and you could look at a whole buncha particles for the signal. Hopefully some of the particles you look at would be paired with some I changed. Once synchronized, we could send messages.

    If this were possible you would even be able to send messages across vast interstellar distances (once you had the sync scheme). Maybe alien races already communicate this way :-) Nah, there probably aren't enough paired particles from the big bang in sufficient quantities in the two separate places. No free lunch and that kind of thing - but it might be theoretically possible.

    Again, I'm way out of my league here, so feel free to point out if I misunderstood the experiment. I just think it's a fun speculation.

  • The whole field of Quantum Computing is a mathematical abstraction (fine, as any pure math is, as long as you don't try to claim that's how the real world works). Its vital connection with the real world is based on a highly dubious (even outright absurd, according to some physicists, including Einstein) conjecture about entangled quantum states (roughly, a special kind of "mystical" non-local correlation among events) which was actually never confirmed experimentally. And without that quantum entanglement the whole field is an exercise in pure abstract math with no bearing on reality.

    You could be saying the same thing about aeroplanes.

    Research and experimentation are rarely a waste of effort, whether to prove or disprove, because we have to find out. The skeptics said we would never build aircraft, split the atom, travel safely in railway carriages, go to the moon, etc etc etc.

    I think it is far better to err on the side of curiosity than to just sit around never asking any questions. History proves this irrefutably. We learn from our successes. We sometimes learn from our mistakes too. It is worth the effort.

    So let these guys go and discover what they can! Let the engineers make it work! Just like the early experiments with aircraft, there will be hundreds of failures before someone gets it right, and when and if it happens, all those who made the effort will be vindicated.

    --

  • by photozz ( 168291 ) <photozz&gmail,com> on Tuesday October 31, 2000 @09:44PM (#659152) Homepage
    physical state immune to certain types of information-corrupting "noise,"

    In the corporate world, we call this "management"

  • This is where the crazy theory gets proved.

    People who didn't understand Newton said the same thing about rockets.

    They were wrong too.

    Which is not to say this will work.

    Just to say don't trust "common sense" over pure maths.

  • A very good book for anyone who's interested in this 'spooky' stuff is 'Speakable and Unspeakable in Quantum Mechanics' by J. Bell - the man who invented Bell's inequalities (the statistical quantities which were alluded to earlier).
  • More on Quantum Cryptography at:

    http://qso.lanl.gov/qc/ [lanl.gov]

    For those of us who are Paranoically Inclined (tm)

    I want the future now!
  • Sure, quantum computing can factor enormous numbers really fast, but it's been pointed out a number of times that as Quantum Computing Taketh Away, it also Giveth:

    Encryption Destroyed and Resurrected

    As mentioned above, the classic problem that a quantum computer is ideally suited for is cracking encryption codes, which relies on factoring large numbers. The strength of an encryption code is measured by the number of bits that needs to be factored. For example, it is illegal in the United States to export encryption technology using more than 40 bits (56 bits if you give a key to law-enforcement authorities). A 40-bit encryption method is not very secure. In September 1997, Ian Goldberg, a University of California at Berkeley graduate student, was able to crack a 40-bit code in three and a half hours using a network of 250 small computers. [15] A 56-bit code is a bit better (16 bits better, actually). Ten months later, John Gilmore, a computer privacy activist, and Paul Kocher, an encryption expert, were able to break the 56-bit code in 56 hours using a specially designed computer that cost them $250,000 to build. But a quantum computer can easily factor any sized number (within its capacity). Quantum computing technology would essentially destroy digital encryption.

    But as technology takes away, it also gives. A related quantum effect can provide a new method of encryption that can never be broken. Again, keep in mind that, in view of the Law of Accelerating Returns, "never" is not as long as it used to be.

    This effect is called quantum entanglement. Einstein, who was not a fan of quantum mechanics, had a different name for it, calling it "spooky action at a distance." The phenomenon was recently demonstrated by Dr. Nicolas Gisin of the University of Geneva in a recent experiment across the city of Geneva. [16] Dr. Gisin sent twin photons in opposite directions through optical fibers. Once the photons were about seven miles apart, they each encountered a glass plate from which they could either bounce off or pass through. Thus, they were each forced to make a decision to choose among two equally probable pathways. Since there was no possible communication link between the two photons, classical physics would predict that their decisions would be independent. But they both made the same decision. And they did so at the same instant in time, so even if there were an unknown communication path between them, there was not enough time for a message to travel from one photon to the other at the speed of light. The two particles were quantum entangled and communicated instantly with each other regardless of their separation. The effect was reliably repeated over many such photon pairs.

    The apparent communication between the two photons takes place at a speed far greater than the speed of light. In theory, the speed is infinite in that the decoherence of the two photon travel decisions, according to quantum theory, takes place at exactly the same instant. Dr. Gisin's experiment was sufficiently sensitive to demonstrate the communication was at least ten thousand times faster than the speed of light.

    So, does this violate Einstein's Special Theory of Relativity, which postulates the speed of light as the fastest speed at which we can transmit information? The answer is no -- there is no information being communicated by the entangled photons. The decision of the photons is random -- a profound quantum randomness -- and randomness is precisely not information. Both the sender and the receiver of the message simultaneously access the identical random decisions of the entangled photons, which are used to encode and decode, respectively, the message. So we are communicating randomness -- not information -- at speeds far greater than the speed of light. The only way we could convert the random decisions of the photons into information is if we edited the random sequence of photon decisions. But editing this random sequence would require observing the photon decisions, which in turn would cause quantum decoherence, which would destroy the quantum entanglement. So Einstein's theory is preserved.

    Even though we cannot instantly transmit information using quantum entanglement, transmitting randomness is still very useful. It allows us to resurrect the process of encryption that quantum computing would destroy. If the sender and receiver of a message are at the two ends of an optical fiber, they can use the precisely matched random decisions of a stream of quantum entangled photons to respectively encode and decode a message. Since the encryption is fundamentally random and nonrepeating, it cannot be broken. Eavesdropping would also be impossible, as this would cause quantum decoherence that could be detected at both ends. So privacy is preserved.

    Note that in quantum encryption, we are transmitting the code instantly. The actual message will arrive much more slowly -- at only the speed of light.

    -Ray Kurzweil, The Age of Spiritual Machines [penguinputnam.com], pg. 115
  • Yup... basically they use the random data from quantum processes (random data, which as an earlier post pointed out goes to both parties) to generate a One-Time Pad.

    One-Time Pads are guaranteed unbreakable, since amongst the set of keys that the encrypter might have used are those keys which will generate meaningful but wrong messages of the same length.

    eg: I could encrypt "hello world" using a simple OTP and if you tried to crack it you would find amongst your possible decryptions every meaningful sentence or word of 11 characters (assuming a fairly basic XOR encryption). This would include "Hello Frank", "Big fat gun" and so on.

    Furthermore, Quantum Cryptography means I can essentially guarantee that you won't be able to listen in on the transmission channel of this OTP, so I can be sure that the pad will get to the person I'm communicating with in a secure manner.
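
    To make the "equally plausible decryptions" point concrete, here is a minimal sketch in Python (my own illustration, not from the article; plain XOR one-time pad, with made-up messages and keys):

    import os

    def xor_bytes(a: bytes, b: bytes) -> bytes:
        # XOR two equal-length byte strings together.
        return bytes(x ^ y for x, y in zip(a, b))

    message = b"hello world"            # 11 bytes
    key = os.urandom(len(message))      # one-time pad: random, as long as the message, used once
    ciphertext = xor_bytes(message, key)

    print(xor_bytes(ciphertext, key))   # b'hello world' -- the real key recovers the real message

    # An eavesdropper can always construct a key that "decrypts" the ciphertext
    # to any 11-character message at all, so the ciphertext alone proves nothing:
    fake_key = xor_bytes(ciphertext, b"Hello Frank")
    print(xor_bytes(ciphertext, fake_key))   # b'Hello Frank'

    The quantum part of the scheme only concerns delivering that pad to the other party in a way that makes undetected eavesdropping impossible.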

  • how about the definitions of angular momentum:

    J x J = i*hbar J

    where J is the angular momentum operator, J = (Jx, Jy, Jz), and 'x' denotes the vector cross product.

    For a state represented by the ket |j m>, we define j and m such that:

    (J.J) |j m> = j(j+1) hbar^2 |j m>

    Jz |j m> = m hbar |j m>

    Normalized kets labelled by distinct eigenvalues are orthonormal: the inner product is 1 if the eigenvalues are the same, 0 otherwise;

    and so on and so forth....
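
    A quick numerical sanity check of these relations for j = 1, in units where hbar = 1 (my own sketch using numpy, not part of the definitions above):

    import numpy as np

    j = 1
    ms = np.arange(j, -j - 1, -1)            # m values: [1, 0, -1]
    Jz = np.diag(ms).astype(complex)

    # Raising operator: <m+1| J+ |m> = sqrt(j(j+1) - m(m+1))
    Jp = np.zeros((2 * j + 1, 2 * j + 1), dtype=complex)
    for col, m in enumerate(ms):
        if m < j:
            Jp[col - 1, col] = np.sqrt(j * (j + 1) - m * (m + 1))
    Jm = Jp.conj().T

    Jx = (Jp + Jm) / 2
    Jy = (Jp - Jm) / (2 * 1j)

    # One component of J x J = i J (hbar = 1): [Jx, Jy] = i Jz
    print(np.allclose(Jx @ Jy - Jy @ Jx, 1j * Jz))            # True

    # J.J = j(j+1) * identity, i.e. every |j m> is an eigenket with eigenvalue j(j+1)
    J2 = Jx @ Jx + Jy @ Jy + Jz @ Jz
    print(np.allclose(J2, j * (j + 1) * np.eye(2 * j + 1)))   # True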

    Dave

  • Just don't register www.entangled-photons.com. I'm sure I've heard Spock mention that in at least one episode, and before you know it Shatner will be at your door demanding you relinquish the URL.

    Oh and I suppose you'd have to include www.entangled-photons-suck.com

  • . . . but after reading all of the recent insane patent/copyright/censorship articles and listening to the 2600 radio show tonight where I FINALLY heard the audio of the reasoning behind the DeCSS decision, I just can't rejoice about this. In 15 years I'll have a shiny new quantum machine that I can play freecell on and write papers that automatically get mailed to m$. Hooray! We're all doomed, it seems.
  • One way of interpreting "spooky action at a distance" (ie: apparent FTL communication between entangled photons) is to revisit the "Standard" interpretation of Quantum Physics. An interesting book I read this summer, called Schrödinger's Kittens, suggested that you can look at the phenomenon this way:

    Entangled photons are created and move off in opposite directions

    One photon, still entangled, hits a measuring device that determines its polarization.

    This information allows knowledge of the polarization of the other (no longer entangled) photon, instantaneously. The two photons seem to have a communications channel that defies special relativity.

    But if photons travel at c (which I think everyone agrees they do), then they experience all events simultaneously. Hence, the second event (the measurement) actually allows the choice to be made during the first one (the creation).

    This requires throwing out our macro-world concept of past and future, but I don't think that's much of a leap, really. I challenge anyone here to prove to me that photons prefer to move FORWARDS in time, or tell me what "time" is... to a photon.

    This explanation allows for an extended form of "relativity" in which events are connected out of time to each other, as if they were part of a continuous medium formed at right angles to what we think of as space-time.

    The foundation for this concept comes from the fact that although a particle of light travels along ALL possible paths (not just the shortest one... it can travel a squiggly, chaotic, fractal path too, and does) from A to B, the reinforcing action of probability makes it appear that it "chose" just one. (ie: the angle of incidence does NOT equal the angle of reflection, it just averages out that way)

    Basically the entangled photons chose their path and states at the beginning, and it was a mystery to us (the rest of the universe) until something happens that measures that information. The problem being, that the photons "anticipated" the measuring event... or conversely, the measuring event allowed the choice to be made earlier on.

    No need for FTL, really, because the photon knows in advance about the entire history of the universe, only restricted to events along its path.

    One unrelated question: how many electrons are there in the universe? Is it the same one electron/positron bouncing back and forth through time, and it only seems to be 10eWhatever of them?

  • Define practical in your context.
    I have witnessed controlled fusion experiments both in real time, via a camera into the core of a reactor, and through post-experiment film.

    Controlled fusion reactions are possible. I believe that fusion will make it into usage during my lifetime (I'm 24).

    Don't be so quick to poo-poo fusion just because the field is in its infancy. We have a long way to go in Materials Science before fusion becomes a mainstream energy source. Were the first bicycles practical? Absolutely not, they were fixed gear and had iron wheels. Prior to that the velocipedes had no gears of any kind, or pedals, and they weighed over 100 lbs.

    All technology goes through a refining process.
  • by dabacon ( 221175 ) on Wednesday November 01, 2000 @08:07AM (#659169) Homepage
    OK, being as I am a researcher who has done work on decoherence-free subspaces (DFSs...they are also known as quantum error avoiding codes or noiseless subspaces...damn nomenclaturese) I thought I'd give all you netadmins a real simple explanation of what a DFS is. Of course, being a simple explanation, it will fuzz over a bit. But I thought I'd at least try!

    Suppose you are trying to send some bits down a noisy communication channel (sending an email from Timbuktu to Weed, CA). Now the noise will cause the bits that you send on one end of the line to sometimes come out different on the other end of the line. Many of you know how we get around this in real world situations: we use error correction. The basic idea of error correction is to use redundancy to transmit information. Thus, for example, instead of sending the bit 0 you might send ten 0's and instead of sending the bit 1 you might send ten 1's. If the channel isn't too noisy then the receiver can figure out what bit you meant to send by looking at the ten bits he receives and deducing whether more of those ten are 0's or 1's. Basically you can reduce the noise rate of information transmission at the cost of increasing the number of bits you need to send in order to transmit one bit of information. (Sorry for those of you who know this shit like the back of your hand).

    Decoherence-free subspaces work on a similar "encode the information" principle (i.e. 0 -> ten 0's, 1 -> ten 1's), but they "use symmetry" to protect the information.

    Suppose that after extensive testing of the phone line you are using, you notice that if you send two bits down the line in rapid succession, the line either does nothing to these two bits or flips both of them. Thus, for example, if you send 00, the receiver always gets either 00 (no error) or 11 (error!), and if you send 01, the receiver always gets either 01 (no error) or 10 (error!). Your phone line has a symmetry! How do you exploit this symmetry?

    Well, what you do is simply encode the information you want to send into the parity of the two bits. This simply means that if you want to send 0 down the line, you send 00 (or 11), and if you want to send 1 down the line, you send 01 (or 10). Now the noise can flip 00 to 11 (or vice versa) but it cannot change 00 to 01. Thus you can perfectly recover the information you sent down the line regardless of whether an error occurred. What is neat about this is that it doesn't depend on the strength of the noise (the probability that an error occurs, for example). By using the symmetry of the noise you can avoid the noise completely! Symmetry => protection.
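
    A tiny simulation of exactly this two-bit parity trick (my own sketch; the "flip both bits or neither" channel is idealized just as in the description above):

    import random

    def channel(two_bits, flip_probability=0.4):
        # Collective noise: with some probability, flip BOTH bits; never just one.
        if random.random() < flip_probability:
            return tuple(b ^ 1 for b in two_bits)
        return two_bits

    def encode(bit):
        # Put the information in the parity: 0 -> 00 (even), 1 -> 01 (odd).
        return (0, bit)

    def decode(two_bits):
        # Parity = XOR of the two received bits; immune to the collective flip.
        return two_bits[0] ^ two_bits[1]

    sent = [random.randint(0, 1) for _ in range(10000)]
    received = [decode(channel(encode(b))) for b in sent]
    print(all(s == r for s, r in zip(sent, received)))   # True, no matter how noisy the channel is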

    What I've explained to you is an example of a decoherence-free subsystem (a generalization of decoherence-free subspaces, but the same basic idea) in the real "classical" world. To build a quantum computer we need to deal with similar problems but in the "quantum" world.

    When Peter Shor (quantum computing god) invented a quantum computing algorithm for factoring (the one that breaks RSA), one of the main problems in actually implementing such a computer was quickly understood to be noise. Noise in a quantum system is called decoherence (at least by me) and is much more nasty than the classical noise you get when (say) you are talking on your cell phone. The problem with quantum systems is that if they interact with external systems they completely lose their quantum nature. And making this problem even harder, whenever you observe a quantum system it also loses its quantum nature.

    But following his work in discovering the factoring algorithm, Peter Shor noticed that he could do error correction on quantum systems to avoid this decoherence problem (hence Peter Shor = quantum computing god). A huge host of people then developed the theory of quantum error correction, which showed that the decoherence problem could be overcome. This is probably one of the most amazing new ideas of the past decade: that quantum information can in principle be shielded from its environment by suitable error correction.

    Anyway, decoherence-free subspaces are like quantum error correction in that you encode quantum information, but they, like the example above, use the fact that often noise has some sort of symmetry. Think about it this way: decoherence of a quantum system is like you looking at the system (you are interacting with the system!). But say you have two atoms which are so close to each other that you cannot distinguish atom A from atom B. Then there is a symmetry in the way in which you observe the system: you cannot distinguish whether atom A is on the left or on the right. Such a symmetry can then be shown to produce encodings of information which are protected from your observation!

    Ah well, I had to try. Thanks to anyone who made it this far without "man I want to kill this dork" thoughts.

    dave bacon
  • The Los Alamos lab HAS managed to make a quantum-based computer; however, it has disappeared.

    Staff are vigorously checking underneath every photocopy machine to see if it is with the hard drives containing information about disarming nuclear warheads.

  • reeeeeallly tiny radio collars
  • Patent entangled photons.
  • by NanoProf ( 245372 ) on Wednesday November 01, 2000 @07:10AM (#659194)

    OK, the definition of a decoherence-free subspace:

    Quantum mechanical wavefunctions are described in terms of their projection onto a set of basis functions. This is exactly analogous to a Fourier transform of a function (i.e. the projection of the function onto a set of sine and cosine functions).

    As the wavefunction of say an electron evolves with time, the weights of the various basis functions will typically change. If the wavefunction is coupled to other systems (i.e. other electrons, surrounding atoms, molecules, etc.), then the wavefunction becomes very complicated as the pieces that describe the single electron mix up with the pieces describing the other parts of the system. This is termed decoherence.

    The subspace referred to in the posting is a subset of the full set of basis functions (like taking a finite bandpass of the Fourier space). For wavefunctions that can be described completely in terms of sums of the basis functions in this subspace, the wavefunctions will maintain their coherence, which means that as they evolve in time, they don't get mixed up with the wavefunctions describing the surrounding environment.

    This isolation of a subset of the degrees of freedom describing a system (i.e. the decoherence-free subspace) is essential for quantum computing, as a quantum computer uses the subtle correlations within a wavefunction to perform what are essentially massively parallel computations. Should the system decohere, the subtle structures in the wavefunction are lost.
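
    As a concrete toy example of such a protected subspace (my own sketch, not the photon experiment from the article): if two qubits always pick up the same unknown phase ("collective dephasing"), any state built from |01> and |10> is left alone, while a state built from |00> and |11> loses its coherence once that phase is averaged over.

    import numpy as np

    def collective_dephasing(theta):
        # Both qubits acquire the same phase: U = exp(-i*theta*Z) on each qubit.
        # Basis ordering: |00>, |01>, |10>, |11>.
        u1 = np.diag([np.exp(-1j * theta), np.exp(1j * theta)])
        return np.kron(u1, u1)

    protected = np.array([0, 1, 1, 0], dtype=complex) / np.sqrt(2)    # (|01> + |10>)/sqrt(2)
    unprotected = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)  # (|00> + |11>)/sqrt(2)

    rng = np.random.default_rng(0)
    for label, psi in [("inside the DFS", protected), ("outside the DFS", unprotected)]:
        # Average the density matrix over many random collective phases.
        rho = np.zeros((4, 4), dtype=complex)
        thetas = rng.uniform(0, 2 * np.pi, 2000)
        for theta in thetas:
            out = collective_dephasing(theta) @ psi
            rho += np.outer(out, out.conj())
        rho /= len(thetas)
        print(label, "purity =", round(np.real(np.trace(rho @ rho)), 3))
        # inside the DFS: purity ~ 1.0 (still pure); outside: ~ 0.5 (decohered)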

  • by Nightlight3 ( 248096 ) on Tuesday October 31, 2000 @09:49PM (#659214)
    "If one existed, a quantum computer would be extremely powerful; building one, however, is extremely challenging,"

    Extremely challenging, like in "it can't work and it won't ever work, but I hope the government and the industry sponsors won't find that out, at least until I retire, preferably after I am dead."

    The whole field of Quantum Computing is a mathematical abstraction (fine, as any pure math is, as long as you don't try to claim that's how the real world works). Its vital connection with the real world is based on a highly dubious (even outright absurd, according to some physicists, including Einstein) conjecture about entangled quantum states (roughly, a special kind of "mystical" non-local correlation among events) which was actually never confirmed experimentally. And without that quantum entanglement the whole field is an exercise in pure abstract math with no bearing on reality.

    While there have been a number of claims of an "almost" confirmation of this kind of quantum correlation (the so-called Bell inequality tests), there is always a disclaimer (explicit or, in recent years, between the lines as the swindle got harder to sell), such as "provided the combined setup and detection efficiency in this situation can be made above 82%" (even though it is typically well below 1% overall in the actual experiment; the most famous of its kind, the Aspect experiment from the early 1980s, had only 0.2% combined efficiency, while 82% is needed for an actual, "loophole free" proof), or provided we assume that the undetected events follow such and such statistics, etc. The alternative explanations of those experiments (requiring no belief in mystical instant action-at-a-distance), which naturally violate those wishful assumptions, are ignored, or ridiculed as unimportant loopholes when the "mystical" faction is forced to debate the opposition. After all, without believing their conjecture all the magic of quantum computing, quantum cryptography, quantum teleportation, along with funding, would vanish.

    For those interested in the other side of these kinds of claims - why it doesn't work and why it will never work - check the site of the reputable British physicist Trevor Marshall, who has been fighting, along with a small group of allies, the "quantum magic" school for years:

    "Quantum Mechanics is not a Science" [demon.co.uk]

    Unfortunately, the vast bulk of the research funding in this area goes to the mystical faction. As long as there are fools with money, there will always be swindlers who will part the two.

    For a more popular account of the opposing view, accessible to non-physicists, you can check the site of the practical statistician (and general sceptic) Caroline Thompson:

    Caroline Thompson's Physics [aber.ac.uk]
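
    For what it's worth, the numbers being argued about here are easy to pin down. A minimal sketch (my own, just to show where the figures come from) of quantum theory's predicted CHSH value for the singlet state, the local-realist bound of 2, and the roughly 82% detection efficiency below which a CHSH test has to lean on a fair-sampling assumption:

    import numpy as np

    def E(a, b):
        # Quantum prediction for the singlet-state correlation of spin
        # measurements along directions at angles a and b.
        return -np.cos(a - b)

    # Standard CHSH measurement angles.
    a1, a2 = 0.0, np.pi / 2
    b1, b2 = np.pi / 4, 3 * np.pi / 4

    S = abs(E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2))
    print("quantum CHSH value S =", S)                       # 2.828... = 2*sqrt(2)
    print("local-realist bound  =", 2)
    print("efficiency threshold =", 2 / (1 + np.sqrt(2)))    # 0.828..., the "82%" quoted above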

  • Quantum mechanics is the current scientific dogma because it has successfully explained a _vast_ array of experiments. Further, it has successfully _predicted_ a huge number of phenomena. These are good reasons to begin to believe that a theory will generally give you a good answer, and quantum has given enough that it's not unreasonable to believe that it may be more accurate in its predictions than we are in our experimental capabilities. In response to "failures" such as the experiments you refer to (which I must admit to not having examined closely), two things should be done. First, as you suggest, alternative theories should be proposed and tested, perhaps more widely than is done today, I don't know, but it's important to consider the possibility that the theory is flawed. Second, more careful experiments should be carried out to try to nail down exactly what happens, because it's also important to consider the possibility that the experiment is flawed.

    If nothing else, quantum computing, cryptography, error correction, communications, etc, push experiments that test this regime of quantum mechanics. The scientific community won't push forever if entanglement really can't be demonstrated. You (and Marshall, et al.) may very well prove correct -- maybe this quantum mumbo jumbo is all hogwash and students in the year 2100 will laugh at today's physics community like we laugh at the physicists of a century ago. It will take time -- quantum has been right in enough cases that it deserves some trust.

    Anyway, I know a number of folks who are working on various bits of QC research. These are not shady scientists looking to push alternative theories under the rug and secure government grants on false promises. Rather, they are passionate scientists who see the potential opportunities that may come from this speculative research. I would wager that this is true of the majority of members of the QC community.
  • by honkycat ( 249849 ) on Wednesday November 01, 2000 @01:04AM (#659220) Homepage Journal
    While Shor's factoring algorithm (which permits polynomial-time rather than super-polynomial-time integer factoring, and therefore could undermine the security of RSA) may well be a "killer app" for quantum computing, it's worth pointing out that it's not yet been shown that QC can help us with general computing problems.

    The big win in QC comes from the superposability of states -- it is possible for the system to be in all of its states at the same time. For n quantum bits (qubits), this is 2**n (two to the n) states. Operations on a system that is in such a superposed state are performed on every possible state at once. Great, neat, cool. But there's a catch -- the information you want can't come from a single measurement of the resulting system. The exponentially large amount of data you've computed is stored in the probability distribution (in some sense). In order to read this out, you need to repeat the experiment again and again to measure out the distribution instead of a single instance of the random variable.

    Guess what, in order to get out the exponentially large amount of information from the probability distribution, you need to make an exponentially large number of measurements. So you're no better off, right?

    Well, in general, maybe not. But there may be special cases. In the cases we've found so far, something funny happens in the quantum mechanical phase space that lets us actually read out the correct answer. Grover's search algorithm is a particularly clear example of what happens. In this case, the "right answer" can be read out because there is a computation that can cause a particular state to be selected with near 100% probability -- this state is the "winning" state that is being searched for (see L.K. Grover, Phys Rev Lett, 79, 325 (1997)).

    Anyway, QC is only useful for those problems that can be computed in such a way that the answers can be read out of the QC in polynomial time. Right now, that's factoring (admittedly a biggie, but not likely something that'll, eg, get you 200 fps at quake, which you don't need anyway... oh wait, wrong thread), Grover's search, and a few other examples. For now, though, QCs look like they'll be special-purpose code breakers. Hmm. Colossus? ...
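
    To see what "the winning state gets selected with near 100% probability" looks like in practice, here is a small classical state-vector simulation of Grover's algorithm (my own sketch; a classical machine gains nothing by running it, it just shows where the amplitude ends up, and the marked index is an arbitrary choice):

    import numpy as np

    n = 10                       # qubits
    N = 2 ** n                   # size of the search space
    marked = 789                 # index of the "winning" item (arbitrary choice)

    state = np.full(N, 1 / np.sqrt(N))          # uniform superposition over all N states

    # Roughly (pi/4)*sqrt(N) rounds of (oracle, inversion about the mean).
    for _ in range(int(round(np.pi / 4 * np.sqrt(N)))):
        state[marked] *= -1                     # oracle: flip the sign of the marked amplitude
        state = 2 * state.mean() - state        # diffusion: reflect about the average amplitude

    print("P(single measurement finds the marked item) =", state[marked] ** 2)   # ~0.999

    So a single final measurement reads out the answer with near certainty, after only about sqrt(N) iterations instead of about N classical lookups.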
