Science

Physicists Discover a Way Around Heisenberg's Uncertainty Principle

Hugh Pickens writes "Science Daily Headlines reports that researchers have applied a recently developed technique to directly measure the polarization states of light, overcoming some important challenges posed by Heisenberg's famous Uncertainty Principle and demonstrating that it is possible to measure key related variables, known as 'conjugate' variables, of a quantum particle or state directly. Such direct measurements of the wave-function had long seemed impossible because of a key tenet of the uncertainty principle — the idea that certain properties of a quantum system could be known only poorly if certain other related properties were known with precision. 'The reason it wasn't thought possible to measure two conjugate variables directly was because measuring one would destroy the wave-function before the other one could be measured,' says co-author Jonathan Leach. The direct measurement technique employs a 'trick' to measure the first property in such a way that the system is not disturbed significantly and information about the second property can still be obtained. This careful measurement relies on the 'weak measurement' of the first property followed by a 'strong measurement' of the second property. First described 25 years ago, weak measurement requires that the coupling between the system and what is used to measure it be, as its name suggests, 'weak,' which means that the system is barely disturbed in the measurement process. The downside of this type of measurement is that a single measurement only provides a small amount of information, and to get an accurate readout, the process has to be repeated multiple times and the average taken. The researchers passed polarized light through two crystals of differing thicknesses: the first, a very thin crystal that 'weakly' measures the horizontal and vertical polarization state; the second, a much thicker crystal that 'strongly' measures the diagonal and anti-diagonal polarization state. Because the first measurement is performed weakly, the system is not significantly disturbed, and therefore the information gained from the second measurement is still valid. This process is repeated several times to build up accurate statistics. Putting all of this together gives a full, direct characterization of the polarization states of the light."
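A minimal numerical sketch of the weak-then-strong scheme may help. It uses a toy von Neumann pointer model rather than the paper's crystal optics, and every parameter below is invented for illustration:

    import numpy as np

    # Toy weak measurement: a Gaussian pointer is weakly coupled to the
    # H-polarization projector, then the photon is post-selected in the
    # diagonal state |D> = (|H> + |V>)/sqrt(2). For real amplitudes a, b,
    # the mean post-selected pointer shift divided by the coupling g
    # approximates the weak value a/(a + b), so averaging many runs reads
    # out the wave-function amplitudes directly.

    a, b = 0.6, 0.8        # |psi> = a|H> + b|V>, with a^2 + b^2 = 1
    g = 0.05               # coupling strength
    sigma = 1.0            # pointer width; g << sigma makes it "weak"

    x = np.linspace(-12, 12, 24001)
    phi = lambda shift: np.exp(-(x - shift) ** 2 / (4 * sigma ** 2))

    # Joint state after coupling: a|H> phi(x - g) + b|V> phi(x).
    # Projecting the photon onto |D> leaves this pointer amplitude:
    pointer = (a * phi(g) + b * phi(0.0)) / np.sqrt(2.0)
    prob = pointer ** 2
    prob /= np.trapz(prob, x)

    mean_shift = np.trapz(x * prob, x)
    print("estimated weak value:", mean_shift / g)  # ~0.4286
    print("exact a/(a + b):     ", a / (a + b))     # 0.42857...

Note that a single pointer reading is dominated by the width sigma, which is the sense in which one weak measurement yields almost nothing; only the average over many repetitions is informative, exactly as the summary says.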
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward on Monday March 04, 2013 @11:18AM (#43068795)

    So, is the damned cat dead or alive?

    • by Anonymous Coward on Monday March 04, 2013 @11:19AM (#43068823)
      Yes.
    • by elysiuan ( 762931 ) on Monday March 04, 2013 @11:21AM (#43068865) Homepage

      My favorite part of this thought experiment is that Schrödinger constructed it to point out the ridiculousness of quantum theory and how it couldn't possibly be correct if it allowed for such a thing. Reality sure is strange, maybe the strangest thing is that we can understand it at all.

      • by hedwards ( 940851 ) on Monday March 04, 2013 @11:24AM (#43068901)

        That's more a matter of the way the brain selectively ignores and forgets things which would lead to inconsistency. Which until relatively recently wasn't that big of a deal: there was a small enough set of observers that things could easily be kept in sync, and without extensive records, there wasn't anything to contradict the agreement of the folks talking.

        These days though, that's changed and it's going to be interesting to see what the effects are.

        • by K. S. Kyosuke ( 729550 ) on Monday March 04, 2013 @12:39PM (#43069919)

          That's more a matter of the way the brain selectively ignores and forgets things which would lead to inconsistency.

          Or perhaps it's simply due to the fact that our brains evolved only to cope with a severely limited range of environments. We can't imagine complicated local geodesics because we didn't evolve near a black hole. We can't imagine the weird effects of special relativity because we haven't evolved at relativistic speeds. We can't grok the fractal-like nature of the subatomic world and its physics because we aren't molecule-sized and so never notice it. Perhaps those "inconsistencies" are no more inconsistent than, say, the hydrostatic "paradox" is paradoxical. (In fact, the very existence of the word "paradox" seems to suggest that we all too often get confused by perfectly normal things that are simply outside the realm of our daily experience.)

          • More likely it's just a case of use it or lose it. We don't generally start studying physics formally until high school, so we grow up thinking about the world in a way that isn't, strictly speaking, real. We then have to unlearn what we know so that we can understand physics if we wish to be physicists. And when you get to the quantum and relativistic areas, it's so unlike what we've conditioned ourselves to see that it can be a hard leap to make.

            I like how my previous post got modded funny, when that's pre

        • Which until relatively recently wasn't that big of a deal

          Yes it was.

          • On precisely what basis are you saying that? The brain itself accounts for about 20% of the caloric needs of a person, so having a lot of neurons hanging around that aren't needed was never desirable. Now we can more readily feed ourselves, so it's not as big of an issue. But really, up until relatively recently lack of sufficient food was a much bigger concern than the ins and outs of reality.

            • On precisely what basis are you saying that?

              On the basis of it being a joke (which I, and others, had also assumed your post to be), in that I was disagreeing with you in such a way as to imply that your brain had selectively ignored and forgotten something. And now I've had to ruin it by explaining it/collapsing its wave function :(

              Not entirely sure what caloric needs have got to do with anything of this, either...

        • "The most merciful thing in the world, I think, is the inability of the human mind to correlate all its contents. We live on a placid island of ignorance in the midst of black seas of infinity, and it was not meant that we should voyage far. The sciences, each straining in its own direction, have hitherto harmed us little; but some day the piecing together of dissociated knowledge will open up such terrifying vistas of reality, and of our frightful position therein, that we shall either go mad from the reve
        • The most merciful thing in the world, I think, is the inability of the human mind to correlate all its contents. We live on a placid island of ignorance in the midst of black seas of infinity, and it was not meant that we should voyage far. The sciences, each straining in its own direction, have hitherto harmed us little; but some day the piecing together of dissociated knowledge will open up such terrifying vistas of reality, and of our frightful position therein, that we shall either go mad from the revel
      • I didn't realize Schroedinger was so mystical. From http://en.wikiquote.org/wiki/Erwin_Schrödinger [wikiquote.org]:

        Nirvana is a state of pure blissful knowledge... It has nothing to do with the individual. The ego or its separation is an illusion. Indeed in a certain sense two "I"'s are identical namely when one disregards all special contents — their Karma. The goal of man is to preserve his Karma and to develop it further... when man dies his Karma lives and creates for itself another carrier.

    • by History's Coming To ( 1059484 ) on Monday March 04, 2013 @12:39PM (#43069923) Journal
      They're measuring the average state of multiple cats. It's not a way around the uncertainty principle, it's a way of building up a statistical picture, which is exactly what QM does. Over-hyped article.
      • This has all been covered/discussed in so many books before; I'm shocked they think it's an advance at all.

    • by jellomizer ( 103300 ) on Monday March 04, 2013 @01:33PM (#43070579)

      The Cat is Dead now. Otherwise Schrodinger would be famous for finding a way to greatly extend the life of Cats.
       

      • The Cat is Dead now. Otherwise Schrodinger would be famous for finding a way to greatly extend the life of Cats.

        This fails to account for the nine lives of a cat.

    • So, is the damned cat dead or alive?

      Yes. But there are two cats, and we're not sure which is alive and which is dead.

    • Turns out the cat will not stay in the box. So the experiment can never be done.

    • by tyrione ( 134248 )

      So, is the damned cat dead or alive?

      Could be the Cat is just an illusion.

    • You know, from the cat's point of view, it's the physicist who keeps cutting his probability of existence in half every time he performs the experiment. She might wonder why he commits this series of half-suicides. If she cared.

  • by arse maker ( 1058608 ) on Monday March 04, 2013 @11:20AM (#43068841)

    This is old news.
    It doesn't violate the uncertainty principle.

  • Uncertainty (Score:5, Funny)

    by ISoldat53 ( 977164 ) on Monday March 04, 2013 @11:20AM (#43068849)
    Are you sure?
  • I'm betting this is barely significant.
  • by Anonymous Coward

    I thought the premise behind QKD was that you couldn't measure the polarization of one of a pair of entangled photons on two different bases at the same time, so once you perform the measurement in either basis, you're stuck with it and can't recreate that photon to forward it to the receiver (you'll only get the right basis half of the time). If this means you can get information about the photon on both the horizontal/vertical and diagonal bases, doesn't that mean you can MITM QKD?

    • by johndoe42 ( 179131 ) on Monday March 04, 2013 @11:34AM (#43069027)

      No, because the summary is (as usual) thoroughly overstated. This experiment, like any other form of quantum state tomography [wikipedia.org] lets you take a lot of identical quantum systems and characterize them. For it to work, you need a source of identical quantum states.

      As a really simple example, take a polarized light source and a polarizer (e.g. a good pair of sunglasses). Rotate the polarizer and you can easily figure out which way the light is polarized. This is neither surprising nor a big deal -- there are lots of identically polarized photons, so the usual uncertainty constraints don't apply.
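      A toy version of that sunglasses experiment, with invented numbers: scan the polarizer angle over an ensemble of identically prepared photons and the transmitted counts trace out Malus's law, pinning down the polarization angle with no uncertainty-principle obstruction.

        import numpy as np

        # Malus's law scan: transmitted intensity ~ cos^2(theta - theta0).
        theta0 = np.deg2rad(25.0)                   # "unknown" polarization
        angles = np.deg2rad(np.arange(0.0, 180.0, 5.0))
        counts = np.random.poisson(1e4 * np.cos(angles - theta0) ** 2)
        # The peak of the scan estimates theta0 (it may land one 5-degree
        # bin off due to shot noise).
        print(np.rad2deg(angles[np.argmax(counts)]))  # ~25 degrees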

      The whole point of QKD (the BB84 and similar protocols) is that you send exactly one photon with the relevant state. One copy = no tomography.
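      And here is a sketch of why a single copy is different; this is a toy BB84-style intercept-resend model, not a full protocol implementation. An eavesdropper who must guess the basis for each lone photon corrupts about a quarter of the sifted key, which is exactly the disturbance the protocol is designed to detect.

        import random

        # A wrong-basis projective measurement yields a random bit.
        def measure(bit, prep_basis, meas_basis):
            return bit if prep_basis == meas_basis else random.randint(0, 1)

        trials, errors = 100_000, 0
        for _ in range(trials):
            bit = random.randint(0, 1)
            basis = random.choice("+x")       # '+' = H/V, 'x' = D/A
            eve_basis = random.choice("+x")   # Eve must guess: one copy only
            eve_bit = measure(bit, basis, eve_basis)
            bob_bit = measure(eve_bit, eve_basis, basis)  # sifted-key case
            errors += bob_bit != bit
        print(errors / trials)                # ~0.25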

    • by mbone ( 558574 )

      Without looking at the original paper, who knows?

      If I had to bet, right now, I would bet on Heisenberg. For now. Subject to change, of course.

    • by mikael ( 484 )

      Some time ago (in New Scientist, I believe), researchers claimed that they had been able to visualize a photon travelling through space. They took a sealed container, filled it with inert gas, and fired single photons through a pair of windows. The frequency of the photon was chosen so that it would have enough energy to interact with electrons around atoms, but not enough energy to dislodge them. They could actually visualize the location of the photon through changes in the state of the atoms (temperature).

  • Another Star Trek gadget may come true?

  • Editors at it again (Score:5, Interesting)

    by Anonymous Coward on Monday March 04, 2013 @11:26AM (#43068935)

    certain properties of a quantum system could be known only poorly if certain other related properties were known with precision.

    This careful measurement relies on the 'weak measurement' of the first property followed by a 'strong measurement' of the second property.

    Weak measurements are not precise. They can become statistically significant with a large data set, but on an individual event basis, they give you effectively nothing. There's no violation of the Uncertainty Principle here.

  • by Wrath0fb0b ( 302444 ) on Monday March 04, 2013 @11:35AM (#43069039)

    What they are doing is assuming that their light source is broadly uniform and averaging over the double-measurement (which is clever, no doubt). So we still haven't learned anything about a particular photon that violates the uncertainty principle, only something about the entire population. If we assume that the population is uniformly polarized (which is reasonable in this case) then we can conclude that the average reflects the properties of the individual photons. If the population was not uniform, however, then the average tells us very little about the properties of the individual photons.

    And before someone too clever tries to argue that you can take a single input photon, make multiple copies, and send them through this process to get results about that one photon, the No-Cloning Theorem [wikipedia.org] is here to prevent that maneuver.

    So really they haven't gone around Heisenberg (whose principle talks only about individual wave-functions) but used multiple compound measurements and an assumption about the properties of the group to infer something that Heisenberg says they can't measure directly -- which is quite clever, but Herr Doctor's principle still stands quite strong.
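    For reference, the standard textbook argument behind the No-Cloning Theorem mentioned above (general quantum mechanics, not specific to TFA): linearity alone rules out a universal copier.

      % Suppose a unitary U cloned arbitrary states: U|psi>|0> = |psi>|psi>.
      % Applying U to a superposition and using linearity gives
      U\bigl(\alpha|0\rangle + \beta|1\rangle\bigr)|0\rangle
        = \alpha|0\rangle|0\rangle + \beta|1\rangle|1\rangle ,
      % while cloning the superposition itself would require
      \bigl(\alpha|0\rangle + \beta|1\rangle\bigr)\bigl(\alpha|0\rangle + \beta|1\rangle\bigr)
        = \alpha^{2}|0\rangle|0\rangle + \alpha\beta|0\rangle|1\rangle
          + \alpha\beta|1\rangle|0\rangle + \beta^{2}|1\rangle|1\rangle .
      % The two agree only when alpha*beta = 0, so no such U exists.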

    • by ceoyoyo ( 59147 ) on Monday March 04, 2013 @12:13PM (#43069587)

      Except that the no hidden variables results suggest that the photon really doesn't have both those properties at the same time. You can measure the average, but that's all it is - it doesn't tell you anything you shouldn't know about the state of a single photon, even if they are all quantum mechanically "identical." So Heisenberg gets to be right in the strong sense, as well as the weak.

      • by Dr. Spork ( 142693 ) on Monday March 04, 2013 @12:59PM (#43070169)
        Thank you, I think that's exactly right. The "no hidden variables" issue was settled in the 80s, and this does nothing to overturn those results. The summary makes it sound like they weakly measured a hidden variable and strongly measured an orthogonal variable. They didn't. Quantum mechanics, including Heisenberg's own 1926 formulation of it, predicts these measurements. So let's not pretend that any theoretical results got overturned by experiment! Quantum mechanics is the same as it ever was.
        • by ceoyoyo ( 59147 )

          Which is too bad actually. Bell's theorem has an out: it holds only if the universe is local. So if someone DOES figure out a way to measure hidden variables then it implies the universe is non-local, which might mean all kinds of fun sci fi technology.

        • Not true: tests of Bell's inequality have only ruled out some very limited classes of hidden variable theories. There are still lots of them that are very much on the table.

          In fact, the whole concept of "weak measurements" originally came from studies of the two-state vector model of quantum mechanics, which is a hidden variable theory. It arises very naturally from this model (and from other time-reversible interpretations of quantum mechanics, all of which are hidden variable theories). It was later shown that standard quantum mechanics makes the same predictions.

          • Not true: tests of Bell's inequality have only ruled out some very limited classes of hidden variable theories. There are still lots of them that are very much on the table.

            Specifically local hidden variables, as in ones that would obey the speed of light and get rid of the "spooky action at a distance" that inspired Einstein et al. to write the EPR Paradox paper claiming quantum mechanics had to be incomplete.

            The experimental violation of Bell's Inequality means that while there may be some kind of hidden variable, it can't be a kind that gets rid of the quantum weirdness.

            It's often been pointed out that, although standard QM does predict weak measurements should work, it's unlikely anyone would ever have discovered that if time reversible QM hadn't made the prediction first.

            Freaky the way science works sometimes, isn't it?

            • There are more assumptions underlying Bell's inequality than just locality. Some of them are very subtle and hard to even recognize as assumptions, until you come across a theory that doesn't share them.

              Time-reversible interpretations are about as "unspooky" as they get. They're simple, deterministic, local, and generally very easy to understand. Their only "strange" feature is that you have to let go of your presentist view of time (which we've known for a century is almost certainly wrong, since it conflicts with the relativity of simultaneity).

              • Ah, interesting. But, um, this...

                information can flow both directions in time. The future influences the past in exactly the same way that the past influences the future.

                ... is more than a little spooky. Time-reversible relativistic laws of physics are nothing new; in fact that's pretty much every law we have, with the 2nd Law of Thermodynamics being the only apparent indication/cause of the arrow of time. But Relativity also assumes causality, cause preceding effect.

                How is this compatible with relativity if we have information going backward in time? That's not less spooky than interactions appearing to break out of the light cone, a c

                • I guess that depends on what you consider to be "spooky". :) But there are no cats that are alive and dead at the same time, and no mysterious actions at a distance with no apparent mechanism to cause the action, so that's unspooky in my book.

                  The only assumption behind special relativity is that the laws of physics (including the speed of light) are the same in all inertial reference frames. I'm not sure what "causality" means in this case (it's a notoriously hard word to define), but there's no requirement built into the theory that causes precede their effects.

                  • Causality means that any transmission of information from event A to event B means that event A must precede B in time. An example would be an electron emitter emitting an electron, and a detector detecting an electron. If the detector went off before the electron was emitted, that would be a violation of causality.

                    Relativity of simultaneity does nothing to prevent such a global evaluation, it only restricts the sets of events that could possibly be causes of other events.

                    As long as events A and B are separated by a timelike or lightlike interval, all observers agree on their temporal order.

          • by ceoyoyo ( 59147 )

            http://en.wikipedia.org/wiki/Bell's_inequality [wikipedia.org]

            What I said is true. Your first paragraph is not, which casts suspicion on the second as well.

            • Wrong. See http://en.wikipedia.org/wiki/Loopholes_in_Bell_test_experiments [wikipedia.org]. Note, however, that while that page gives lots of accurate information, it also is somewhat biased by referring to the whole subject as "loopholes". A more accurate statement is to say that no test of Bell's inequality has ever been performed; that by making a set of "supplementary assumptions" (several of which are discussed on that page), you can derive a different inequality that is similar in form to Bell's inequality; and it is that derived inequality, not Bell's own, that the experiments have actually tested.
    • by sjames ( 1099 )

      In fact, their experiment is almost exactly the same as using a half-silvered mirror to send half of the photons to one detector and half to the other.

      Meanwhile, Heisenberg wasn't talking about a binary condition. The uncertainty principle actually suggests that the experiment in TFA should get these results.

    • What if you send the single photon through a lot of 'weak' detectors placed in a row, before it finally hits the 'strong' one?
      Would that count, or would they cease to be 'weak'?

  • Does anyone have a link to an actual pdf of the actual paper?

    Has this interpretation of the original work been subject to peer review?

  • by mpoulton ( 689851 ) on Monday March 04, 2013 @11:38AM (#43069105)
    Like many non-rigorous descriptions, the summary makes the mistake of describing the uncertainty principle as if it is a measurement problem, where the lack of precision somehow arises from inadequate measurement technology. This is not a correct statement of the uncertainty principle.

    The fundamental issue is that the conjugate variable values are linked on a quantum level, such that there is a certain amount of natural, inherent uncertainty in their collective values due to the statistical/wavelike nature of the quantum particle. With perfect measurement, there is still uncertainty in the pair of values for any conjugate variables because the uncertainty lies in the actual values themselves.

    Position and momentum are the quintessential conjugate pair. The Heisenberg uncertainty principle is sometimes framed as the idea that you cannot know the speed and position of a particle at the same time. But it's more correct to say that a particle does not HAVE an exact speed and position at the same time. This weak measurement technique is certainly useful and interesting since it allows some observations of wavefunctions without collapse, but it does not actually allow the measurement of conjugate variables more precisely than the uncertainty principle allows - because the values themselves do not exist more precisely than that.

    *This description is based on one of the multiple interpretations of quantum mechanics, and probably does not accurately represent physical reality, only our human understanding of a part of reality that we have not really figured out completely yet.
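    The state-not-apparatus point above is exactly what the general (Robertson) form of the uncertainty relation expresses; quoting the standard textbook result:

      % For any observables A, B and any state |psi>,
      \sigma_A \, \sigma_B \;\ge\; \tfrac{1}{2}\,\bigl|\langle\psi|[\hat{A},\hat{B}]|\psi\rangle\bigr| .
      % For the conjugate pair x and p, [x, p] = i\hbar, giving
      \sigma_x \, \sigma_p \;\ge\; \hbar/2 .
      % Nothing on the right-hand side refers to a measuring device:
      % the bound is a property of the state itself.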
    • by ceoyoyo ( 59147 )

      *This description is based on one of the multiple interpretations of quantum mechanics, and probably does not accurately represent physical reality, only our human understanding of a part of reality that we have not really figured out completely yet.

      Bell's theorem combined with all the experiments that have been done based on it, rule out local hidden variable theories. So either (1) your description is correct and the particle doesn't have an exact speed and position at the same time, (2) a LOT of experiments have suffered from horrible systematic errors, (3) the universe is non-local, (4) the universe is superdetermined or (5) mathematics doesn't work properly.

      (1) seems the most likely right now, but I'm personally rooting for (3). Instantaneous communication, teleportation, etc.

      • (6) Some weird hypothesis that neither you nor anybody else has thought about.

        There is never any guarantee that you have identified all alternatives, no matter how carefully you think about it.
        • by ceoyoyo ( 59147 )

          What you're probably thinking of is covered by (3), (4) or (5). (3), for example, sounds simple, but if it turned out to be the case it would mean all sorts of weird things.

          We're discussing general results about the limitations of classes of theories, not of specific hypotheses or even whole theories.

      • Bell's theorem combined with all the experiments that have been done based on it, rule out local hidden variable theories. So either (1) your description is correct and the particle doesn't have an exact speed and position at the same time, (2) a LOT of experiments have suffered from horrible systematic errors, (3) the universe is non-local, (4) the universe is superdetermined or (5) mathematics doesn't work properly.

        (1) seems the most likely right now, but I'm personally rooting for (3). Instantaneous communication, teleportation, etc.

        No... please stop muddying already murky waters. Von Neumann ruled out hidden-variable theories only for non-dynamical systems (systems that do not evolve with time). Unfortunately, von Neumann's proof was incorrectly interpreted by physicists to apply to all systems. What Bell actually did was merely demonstrate that physicists (including Bell himself) had been misinterpreting von Neumann for 35 years. With that said, loopholes in the Bell test experiments [wikipedia.org] leave the door open for viable hidden-variable theories.

    • by pclminion ( 145572 ) on Monday March 04, 2013 @12:23PM (#43069723)
      For those with a signal processing background, it can be explained like this. The conjugate pair of momentum and position are related to each other by the Fourier transform -- the Fourier transform of the wavefunction in spatial coordinates yields the wavefunction in momentum coordinates. Anybody who has worked with a Fourier transform knows that if the input is band-limited, the output will not be, and vice versa. To know the position of a particle with exactness implies that its wavefunction is impulse-like in the spatial domain, which causes the momentum wavefunction to be a wave that extends infinitely throughout momentum-space. When you squeeze the bandwidth in one domain it grows in the other. Because the Fourier transform of a Gaussian is another Gaussian, a particle with Gaussian distribution in either space or momentum-space constitutes the most localizable wavefunction one could possibly achieve. The limit of the resolution is given by the Heisenberg relation, but this is a purely mathematical result, having nothing to do with measurement technique.
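      That tradeoff is easy to check numerically; here is a small sketch with arbitrary grid parameters: build Gaussians of different widths, transform them, and the product of the two standard deviations stays pinned at the minimum value 1/2 (i.e. hbar/2 with hbar = 1).

        import numpy as np

        n, dx = 4096, 0.02
        x = (np.arange(n) - n // 2) * dx
        k = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(n, d=dx))

        for sigma_x in (0.25, 0.5, 1.0):
            psi = np.exp(-x**2 / (4 * sigma_x**2))    # |psi|^2 has std sigma_x
            phi = np.fft.fftshift(np.fft.fft(np.fft.ifftshift(psi)))
            p = np.abs(phi)**2
            p /= np.trapz(p, k)                       # normalize |phi|^2
            sigma_k = np.sqrt(np.trapz(k**2 * p, k))  # its std (mean is 0)
            print(f"sigma_x={sigma_x:.2f}  sigma_k={sigma_k:.3f}  "
                  f"product={sigma_x * sigma_k:.3f}")  # ~0.500 each time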
      • by ceoyoyo ( 59147 )

        You would be surprised at how many people who work in signal processing don't really grasp that concept. Publishing a big chunk of my PhD took a lot longer than it should have because experts in signal processing were having trouble with that concept.

    • Like many non-rigorous descriptions, the summary makes the mistake of describing the uncertainty principle as if it is a measurement problem, where the lack of precision somehow arises from inadequate measurement technology. This is not a correct statement of the uncertainty principle.

      That's not quite right. Heisenberg's "uncertainty principle", as originally formulated by Heisenberg, is a measurement problem. Heisenberg observed that any measurement will disturb the system being measured, such that its states before and after are different. This limits your ability to perform multiple measurements in a row. Physicists later came to identify the uncertainty with the intrinsic impossibility of having a system be in eigenstates of two conjugate variables at the same time. But these are two logically distinct statements.

  • The direct measurement technique employs a "trick" to measure the first property in such a way that the system is not disturbed significantly and information about the second property can still be obtained.

    So... the system is disturbed. And

    The downside of this type of measurement is that a single measurement only provides a small amount of information, and to get an accurate readout, the process has to be repeated multiple times and the average taken.

    This is therefore a statistical experiment and thus subject to statistical uncertainty.

    • all experiments are subject to error...

      But the HUP is made for the case of a single strong measurement. This describes using multiple weak measurements, which were proposed back in 1993. Good to see it is finally coming to light as a useful tool.

  • I'm probably misunderstanding things here, but assuming, just for a moment, that this allows information to travel instantaneously, Special Relativity asks us to consider that this information is actually traveling back in time, doesn't it? Then are we finally going to get a proper model of time travel, so we can make Hollywood stop making shoddy movies like Looper?

    • assuming, just for a moment, that this allows information to travel instantaneously

      Why would it?

      • I'm not the best one to explain since I barely understand it myself

        https://en.wikipedia.org/wiki/Relativity_of_simultaneity [wikipedia.org]

        Basically, this: let A, B, C, D be points in spacetime, in other words a place and a date (obviously, a date relative to an observer at that time and place). A, B, C, D also each have an observer there, conveniently called A, B, C, D too.

        Nothing is faster than light, so the fastest possible interaction A can have with B is light traveling from A to B; we say "B could see A", which means "A could influence B".

        • Teleportation, or any form of FTL travel messes this up.

          I think that's your unwarranted assumption. Teleportation is not a form of FTL, and nor is entanglement or uncertainty or (to the best of my understanding) any other quantum effect.

          B actually happened before D for one traveler and vice versa for the other. Why? I'm not really sure. I'm not a physicist.

          Try this: if you're mapping events in 2D space, the intuitive thing is to divide the space up into an XY grid. But two people could place the grid differently - I could place mine at 45 degrees to yours. My coordinates won't match yours, but we're both describing the same events. The same goes for spacetime - although the "rotation" between frames isn't arbitrary; it depends on relative velocity.

  • "This process is repeated several times to build up accurate statistics." If I'm reading this right, this means that you need multiple identical copies of the photon, and it's impossible to duplicate the quantum state of an object without knowing its exact state. You can't repeat the process on the same photon because that strong measurement destroys the photon's original state. This would work if your source was producing a number of identical photons, but it wouldn't be very useful in, for example, bre

  • by John Hasler ( 414242 ) on Monday March 04, 2013 @12:15PM (#43069601) Homepage

    As in false: not true. It isn't just distorted or exaggerated. It's wrong.

  • No, this does not invalidate the Heisenberg principle, because it is done on an ensemble and the measurement is just an average over the statistics that they gather.

    This has no bearing on the fundamental validity of the uncertainty principle for a single quantum system. Never quite understood the point of these experiments [wavewatching.net]. But to advertise them in this misleading fashion is just asinine.

    Way to go to confuse an already science-sceptic public.

  • I'm in the habit, when I see a headline like this, of just *assuming* it's bullshit and moving straight to the comments to find the physicists who refute the claims. Even if it's true, given the sensationalism Slashdot headlines started bringing us, I'm going to assume false.

  • by peter303 ( 12292 ) on Monday March 04, 2013 @01:46PM (#43070715)
    You can only know the quantities sampled on a discrete digital grid to a certain resolution, due to the limits of the grid. Take a Fourier transform of a quantity sampled on that grid: you can only reliably compute frequencies with wavelengths at least two grid points wide. Otherwise "aliasing" allows you to fit an arbitrary number of smaller wavelengths to the same sample points.

    In nature the Planck unit of action discretizes the universe into the smallest quantities you can resolve.
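    The aliasing half of that claim is easy to demonstrate with toy numbers: two tones whose frequencies differ by exactly the sampling rate produce identical samples, so the grid cannot tell them apart.

      import numpy as np

      fs = 10.0                          # samples per unit time
      t = np.arange(20) / fs             # 20 sample instants
      s1 = np.sin(2 * np.pi * 3.0 * t)   # 3 Hz tone
      s2 = np.sin(2 * np.pi * 13.0 * t)  # 13 Hz tone aliases onto 3 Hz
      print(np.allclose(s1, s2))         # True: indistinguishable samples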
    • by ceoyoyo ( 59147 )

      Close, but your description confuses frequency resolution with the Nyquist frequency. And a nitpick: the measurement can be anything; it doesn't have to be digital.

      To use your example, the limit on the resolution of the frequency does not depend on the resolution of the sampling; it depends on the extent or field of view of the sampling. If you sample for twice as long you can resolve frequencies half as far apart. The maximum frequency you can represent (the Nyquist frequency), which is like the field of view in the frequency domain, is what depends on the sampling resolution.

  • How does this bode for quantum encryption? Me thinks it makes it a bit less hack-proof. I guess the good news is that it sounds like Star Trek transporters are another tiny step closer to reality. ;-)

  • ...how soon until we have Heisenberg Compensators?
