
LHC Research May Help Explain the Universe's Matter/Antimatter Imbalance

suraj.sun sends this excerpt from the BBC: "Particles called D-mesons seem to decay slightly differently from their antiparticles, LHCb physicist Matthew Charles told the HCP 2011 meeting on Monday. The result may help explain why we see so much more matter than antimatter. The team stresses that further analysis will be needed to shore up the result. At the moment, they are claiming a statistical certainty of '3.5 sigma' — suggesting that there is less than a 0.05% chance that the result they see is down to chance. The team has nearly double the amount of data that they have analyzed so far, so time will tell whether the result reaches the 'five-sigma' level that qualifies it for a formal discovery."
  • by Anonymous Coward on Friday November 18, 2011 @07:14PM (#38104436)

    Is such an imbalance dangerous for a universe this age? Does our universe need medical treatment?

  • Kaon decay (Score:5, Informative)

    by tylersoze ( 789256 ) on Friday November 18, 2011 @07:16PM (#38104446)

    CP violation in weak interactions has been known for some time, specifically in neutral Kaon decay. If I'm understanding these results correctly, the surprise here seems to be the magnitude of the CP violation in this case.

    • Re:Kaon decay (Score:5, Informative)

      by torako ( 532270 ) on Friday November 18, 2011 @07:26PM (#38104538) Homepage

      CP violation in Kaon decays can be explained by the Standard Model, but the magnitude of CP violation they claim to have found in the D system cannot be. It would be the first actual hint of physics beyond the Standard Model at the LHC. That would be some very exciting news (especially because everybody expected the big "discovery" detectors ATLAS and CMS to actually find something new first, i.e. the Higgs or Supersymmetry).

      • Careful: QCD hard! (Score:5, Informative)

        by Roger W Moore ( 538166 ) on Friday November 18, 2011 @08:54PM (#38105150) Journal

        CP violation in Kaon decays can be explained by the Standard Model, but the magnitude of CP violation they claim to have found in the D system cannot be.

        The calculations required to predict the amount of CP violation in meson systems are extremely hard to do. When I worked on the NA48 experiment, which measured direct CPV in the kaon system, the theorists were initially adamant that there was no way the parameter we measured (epsilon-prime over epsilon) could be above 0.001 in the Standard Model. Several years later, after both NA48 and KTeV had published results putting the parameter well above that, I saw a theory talk saying that these results were in perfect agreement with the Standard Model!

        The discrepancy seems a lot larger here, but nevertheless, even if the result holds, I'd give the theorists time to think about this and see whether they find problems in the calculations. I have a huge amount of respect for my theory colleagues, but QCD calculations like this are fantastically hard, so it is not at all uncommon for the results to change.

        • by Lawrence_Bird ( 67278 ) on Saturday November 19, 2011 @10:06AM (#38108168) Homepage

          Great comment, and my semi-learned mind says you are likely to be proven correct. One other thing to point out: even though this is reported as 3.5 sigma, that does not mean that additional statistics will push it to 5, and in fact many 3-sigma-ish 'discoveries' end up being background or explained by other parameters/variables.

    • by Skarecrow77 ( 1714214 ) on Friday November 18, 2011 @09:25PM (#38105312)

      I've spent too much time at encyclopedia dramatica recently, because I'm reading your statement way differently than I assume you mean it, based on my understanding of the meaning of the letters CP.

  • by Anonymous Coward on Friday November 18, 2011 @07:18PM (#38104466)

    A star made of antimatter would look exactly the same as one made of matter, wouldn't it? What if half of what we can see in the universe is antimatter?

    • The problem is that whenever matter and matching antimatter come into contact, we get to see what E=mc^2 means (Very Large Bang, for the TL;DR crowd). So an anti-star would most likely convert to a black hole before it could be seen.

    • by bsane ( 148894 ) on Friday November 18, 2011 @07:35PM (#38104598)

      The assumption is: if the universe had a fair amount of both, we'd see the gamma radiation leftovers from collisions, and we don't...

      • by ackthpt ( 218170 ) on Friday November 18, 2011 @07:44PM (#38104678) Homepage Journal

        The assumption is: if the universe had a fair amount of both, we'd see the gamma radiation leftovers from collisions, and we don't...

        May as well theorize that the equal and opposite reaction to the Big Bang was one of antimatter in an inverse universe. Not saying there's an anti-Cowboy Neal or anyone else in that universe; it's doing its own thing.

      • Could a stray bit of antimatter interacting with matter be the cause of some of the gamma-ray bursts?

        Maybe something like a comet [wikipedia.org]?

        • by Anonymous Coward on Friday November 18, 2011 @10:58PM (#38105748)

          Posting anonymously because I've moderated in this discussion, but the quick answer is no, probably not - we know the signatures of matter/anti-matter annihilations very well, and they simply don't describe gamma ray bursts well enough.

          Interestingly off-topic, but I once entertained myself in an astronomy project, filling in a cloudy night by calculating whether gamma ray bursts could possibly be accounted for by tightly-collimated electron/positron annihilations. My conclusion was that if they *weren't* collimated, it would involve almost as many electrons as seem likely to be in the universe, while if they were collimated it was possible... but you'd have issues with the red- or blue-shifting.

          Now, I wouldn't actually *trust* those results since I did them in my second year of university, but they were interesting nonetheless. From people I'd actually trust who do this, it doesn't seem too likely that gamma ray bursts are caused by matter/anti-matter annihilations. But hell, it's physics, you never actually know.

      • by Anonymous Coward on Friday November 18, 2011 @08:59PM (#38105170)

        The assumption is: if the universe had a fair amount of both, we'd see the gamma radiation leftovers from collisions, and we don't...

        That's not a great assumption. Contrary to popular belief, when random matter and antimatter collide, they don't always create gamma radiation. Anything is possible that is still consistent with conservation of energy, momentum, and quantum number(s). Although the most likely result of an electron/positron collision is 2 gamma ray photons, it is not impossible that some other "light-weight" particle is formed (say a neutrino/anti-neutrino pair, some unknown lighter particle, or even whatever people might be calling dark matter that is not easily detectable). It may be that there is some unknown field/symmetry that favors the creation of something else on the "lightweight" side instead of a photon. Or perhaps there is something that somehow segregates matter from antimatter (like dark energy with negative pressure) to prevent large-scale production of gamma radiation.

        However, the current wisdom is that antimatter just doesn't exist in large quantities in the universe, because the cosmic radiation detected in our neck of the woods is mostly of matter origin (high-energy protons and electrons), not of antimatter origin. If there were large numbers of antimatter galaxies and such, there probably wouldn't be this type of bias in cosmic radiation...

  • by Anonymous Coward on Friday November 18, 2011 @07:19PM (#38104472)

    I can't believe I saw this on Yahoo! before I saw it here on Slashdot. I also can't believe I still use Yahoo! either.

  • by Moheeheeko ( 1682914 ) on Friday November 18, 2011 @07:20PM (#38104478)
    Don't articles for the public on this kind of stuff need to be simplified for those of us not in the field? At least make the summary partly readable.
    • by Anonymous Coward on Friday November 18, 2011 @07:22PM (#38104502)

      Don't articles for the public on this kind of stuff need to be simplified for those of us not in the field? At least make the summary partly readable.

      This is already simplified...

    • by Anonymous Coward on Friday November 18, 2011 @07:27PM (#38104542)

      I'm a layman (artist of all careers) and even I can understand the summary.
      Funny thing, /.'ers typically complain that the science articles are too dumbed down.

      Work on your reading comprehension perhaps?

    • by Anonymous Coward on Friday November 18, 2011 @07:27PM (#38104548)

      Summary was great, if you knew anything about the LHC, five sigma, or particle physics in general, this wouldn't be an issue. Check out wikipedia and stop bitching.

    • by Anonymous Coward on Friday November 18, 2011 @07:28PM (#38104552)

      I'm not in the field, and I understood the entire summary.

    • by Anonymous Coward on Friday November 18, 2011 @08:02PM (#38104804)

      Ok, here we go again:

      LHCb sees where the antimatter's gone

      ALICE looks at collisions of lead ions

      CMS and ATLAS are two of a kind

      They're looking for whatever new particles they can find.

      The LHC accelerates the protons and the lead

      And the things that it discovers will rock you in the head.

    • by Surt ( 22457 ) on Friday November 18, 2011 @08:51PM (#38105128) Homepage Journal

      The universe we can see is primarily made up of matter. We know because there are characteristics of antimatter that would allow us to know if we were looking at an anti-galaxy, for example. But we don't know why there is so much matter, and not anti-matter, because the laws of physics we understand so far are neutral. So to explain the universe we see, there must be some rule we don't know about yet, which explains why the universe heavily favors matter.

      This story is about a high-energy physics experiment which revealed a result which will help to explain the discrepancy if it can be confirmed. It will guide us towards that new rule to explain this particular mystery of the universe.

  • Observable universe (Score:4, Interesting)

    by Lord Lode ( 1290856 ) on Friday November 18, 2011 @07:39PM (#38104632)

    What we see is just the observable universe. What if all this missing antimatter happens to be in a non-observable part? You'll never be able to see that! Unless those faster-than-light particles put an end to the theory of the observable universe, of course.

    • by bmuon ( 1814306 ) on Friday November 18, 2011 @08:01PM (#38104798)

      And where would the unobservable universe be? Unless you're thinking about antimatter being coiled up in extra spatial dimensions, everything points to there being a process by which the symmetry is broken.

      • by snowgirl ( 978879 ) on Friday November 18, 2011 @08:49PM (#38105112) Journal

        And where would the unobservable universe be? Unless you're thinking about antimatter being coiled up in extra spatial dimensions, everything points to there being a process by which the symmetry is broken.

        To quote Wikipedia [wikipedia.org]

        The current comoving distance to the particles which emitted the CMBR, representing the radius of the visible universe, is calculated to be about 14.0 billion parsecs (about 45.7 billion light years), while the current comoving distance to the edge of the observable universe is calculated to be 14.3 billion parsecs (about 46.6 billion light years),[1] about 2% larger.

        I'm kind of surprised that you don't understand that the universe is larger than our observable universe.

      • by Surt ( 22457 ) on Friday November 18, 2011 @08:54PM (#38105148) Homepage Journal

        The unobservable universe is the infinite portion beyond the light speed horizon.
        If you really want to be depressed, think about future civilizations in our galaxy for whom all other galaxies will have retreated beyond the light speed horizon. They will have a much harder time figuring out how the universe works.

        Now realize that we may already be one of those future civilizations from the perspective of the lucky folks who got to see the universe early on.

      • by Roger W Moore ( 538166 ) on Friday November 18, 2011 @09:03PM (#38105184) Journal

        And where would the unobservable universe be?

        So far away that light from it has not yet had a chance to reach us, and thanks to the accelerating expansion, never will. I vaguely remember seeing some discussion of this in relation to inflation - we end up in a region which is matter dominated and another region is antimatter dominated with the two regions being causally separated by inflation.

        However, I believe that these theories have problems because you'd expect to be able to see gamma rays from the edges of each region... unless we happen to be strangely right in the centre of a massive matter-dominated region and cannot see the edge. Plus, since CP violation does exist, we do know that there is a matter/anti-matter asymmetry, so it seems strange that, given this, there would be a completely unrelated mechanism to cause an imbalance.

      • by Lord Lode ( 1290856 ) on Saturday November 19, 2011 @06:18AM (#38107444)

        > And where would the unobservable universe be?

        Beyond the horizon where it is too far away for its information to ever reach us in our lifetime due to the lightspeed limit.

    • by Anonymous Coward on Friday November 18, 2011 @08:40PM (#38105056)

      Well, if it is not observable, it does not interact with our universe. And if it does not have any influence on it, it is completely IRRELEVANT. Forget about it.

    • by izomiac ( 815208 ) on Friday November 18, 2011 @09:40PM (#38105386) Homepage
      As I understand it, the theory is that anything galaxy-sized or smaller must be almost completely composed of either matter or antimatter, since otherwise it'd destroy itself. But if you had antimatter galaxies, then you'd expect to see gamma rays created when they interacted with matter galaxies.

      That hasn't been observed, so the prevailing theory is that the whole universe is almost exclusively composed of matter, and thus there must be some preference in the laws of physics for matter. Personally, I suspect we'll discover an alternate explanation for the missing gamma rays that doesn't require an asymmetry in physics, such as your idea, but I'm certainly not an expert on the topic ("neophyte" would be generous).
  • by ThorGod ( 456163 ) on Friday November 18, 2011 @07:42PM (#38104654) Journal

    Is this sigma terminology coming from some discipline? I've taken plenty of grad statistics and we've always called them alpha-significance levels.

    From wikipedia:

    The term Six Sigma originated from terminology associated with manufacturing, specifically terms associated with statistical modeling of manufacturing processes

    So...the MBAs went and redefined some terms? And we're using them to summarize an empirical physics paper's results...why?

    • by EvanED ( 569694 ) <{evaned} {at} {gmail.com}> on Friday November 18, 2011 @07:48PM (#38104704)

      Is this sigma terminology coming from some discipline? I've taken plenty of grad statistics and we've always called them alpha-significance levels.

      Surely if you've taken plenty of grad statistics, you've seen sigma used for the standard deviation.

      They're saying something like the observed difference is 3.5 times sigma. That corresponds to an alpha=0.05% (or is it 99.95%?); they're not saying that sigma itself is 0.05%.
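
      To make the conversion concrete, here is a minimal Python sketch (my own illustration, assuming a Gaussian distribution and using scipy, none of which is from the comment itself) showing that the 0.05% and 99.95% figures are two views of the same statement:

          from scipy.stats import norm

          sigma = 3.5
          # Two-sided tail: probability of a fluctuation at least 3.5
          # standard deviations from the mean, in either direction.
          tail = 2 * norm.sf(sigma)   # ~0.000465, i.e. ~0.05%
          coverage = 1 - tail         # ~0.9995,   i.e. ~99.95%
          print(f"P(|x| > {sigma} sigma) = {tail:.4%}")
          print(f"coverage within +/- {sigma} sigma = {coverage:.4%}")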

    • by Anonymous Coward on Friday November 18, 2011 @07:48PM (#38104708)

      5*sigma = 5 standard deviations.

    • Re:sigma? (Score:5, Informative)

      by Anonymous Coward on Friday November 18, 2011 @07:52PM (#38104742)
      • by John.P.Jones ( 601028 ) on Friday November 18, 2011 @09:31PM (#38105346)

        They are averaging the results of many collisions, which are presumed to be independent and identically distributed with finite variance. The central limit theorem then dictates that the measured average is normally distributed about the mean of the true distribution for a single collision. As they repeat the experiment n times, the variance of the mean falls off like 1/n (hence the standard deviation, the square root of the variance, falls off like 1/sqrt(n)). Once they have repeated the experiment enough times, the observed mean will be resolvable from the theoretical calculation (that is, if the theory is in error). They are waiting to verify that the expected (theoretical) result differs from the observed one (the measured average of many experiments) by at least five standard deviations; more experiments will lower the standard deviation while keeping the difference between theory and observation relatively static, or not. Then they will be certain that the theory is in error by however much they measure, and it will be time to revise the theory to match the observation (without breaking any other observations, and while predicting new results that can be tested experimentally).
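
        A toy simulation makes that 1/sqrt(n) scaling visible. This is just an illustrative numpy sketch, not anything from the actual analysis: quadrupling the number of samples halves the spread of the measured average, whatever the shape of the underlying distribution.

            import numpy as np

            rng = np.random.default_rng(0)

            def spread_of_mean(n, trials=10_000):
                # A skewed distribution (std dev = 1) shows that the CLT
                # does not care about the underlying shape.
                draws = rng.exponential(scale=1.0, size=(trials, n))
                return draws.mean(axis=1).std()

            for n in (100, 400, 1600):
                print(n, round(spread_of_mean(n), 4))  # ~0.1, ~0.05, ~0.025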

    • by hankwang ( 413283 ) * on Friday November 18, 2011 @08:37PM (#38105022) Homepage
      Funny that you mention alpha, since Wikipedia says: [slashdot.org] "In some fields, for example nuclear and particle physics, it is common to express statistical significance in units of the standard deviation σ of a normal distribution."
    • by MyLongNickName ( 822545 ) on Friday November 18, 2011 @08:38PM (#38105026) Journal

      How in the world do you take even ONE grad class and never hear of sigma or standard deviation? This is like the intro to the intro to statistics class and everything builds on it. You would have seen sigma dozens of times in each class...

      • by Theovon ( 109752 ) on Friday November 18, 2011 @08:54PM (#38105144)

        Sounds like he had a brain-fart. Right now, he's smacking his forehead and calling himself an idiot because he didn't connect this sigma with the sigma he knows as the standard deviation.

        This sort of thing happens to me all the time. (Sometimes I feel really old.) I hate it when it makes me look stupid in front of someone. Like the day I was in the office of a Linguistics professor and asked a really stupid question about the fridge magnet letters that just happened to be IPA characters. I know IPA like the back of my hand, so I don't know what I was thinking.

        I do other things that make me look stupider than I really am. Recently, I did a doozie in a Slashdot comment. But this time, I was just being lazy. They were talking about Bulldozer, and I said a bunch of things that were wrong, mostly because I had forgotten and didn't take the time to look it up. I'm getting a Ph.D. specializing in computer architecture, but my laziness made me look like a total idiot.

        Fortunately, my dissertation committee won't be looking at my slashdot comments. :)

        • by ThorGod ( 456163 ) on Friday November 18, 2011 @10:34PM (#38105642) Journal

          Oh, I knew they were trying to refer to the second parameter of a normal distribution. But whatever symbol we *use* for the standard deviation is just a symbol. We could call it "a", "alpha", "sigma", "theta", all with various subscripts, and so on and so forth. Ever heard of six-theta_2? Six-theta_2 refers to 6 times the standard deviation of an estimated normal curve. The term six-theta_2 only makes sense because we filled in the crucial parts (which shouldn't be left out).

          Saying "sigma" without any qualifications leaves much to be assumed. I'm being persnickety about terms because the terminology lacked definiteness.

    • by Anonymous Coward on Friday November 18, 2011 @08:48PM (#38105106)

      it means 99.95 percent.

      http://www.wolframalpha.com/input/?i=integral+between+-3.5+and+3.5+of+1%2Fsqrt%282pi%29*exp%28-x^2%2F2%29

    • by Rich0 ( 548339 ) on Sunday November 20, 2011 @02:30AM (#38114374) Homepage

      Well, the MBAs apply Six Sigma to all kinds of stuff that it really doesn't fit. However, the definition of six sigma is pretty straightforward:

      A six sigma process is one whose specification limits are at least six standard deviations away from the mean.

      So, if a space shuttle part needs to be 1 meter long, and if bad things happen if it is more than one cm off, then a six sigma process would need to produce parts that are 1 meter long with a standard deviation of 1/6th of a centimeter (and a normal distribution of sizes). If the process can do that, then the chances of a part ever coming off the line that is off by a centimeter are VERY low - so low that you don't really need to check them all.
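
      A quick check of those numbers (a Python sketch assuming, as the comment does, a normal distribution of part lengths; the variable names are mine):

          from scipy.stats import norm

          tolerance_m = 0.01         # bad things happen beyond +/- 1 cm
          sigma_m = tolerance_m / 6  # six sigma: spec limits sit 6 std devs out

          # Probability that a part falls outside the +/- 1 cm spec.
          p_defect = 2 * norm.sf(tolerance_m / sigma_m)  # = 2 * norm.sf(6)
          print(f"defect probability: {p_defect:.1e}")   # ~2.0e-09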

      Statistical process control is how the Japanese clobbered US industry after WWII. The US was stuck on outdated models where you test every part and reject the bad ones. The pioneer of SPC (an American) realized that you could ditch the testing, and take all that saved money and instead put it into improving your manufacturing process so that you don't make the bad parts to begin with. Modern process control is about randomly monitoring critical parameters and ensuring they all stay in range so that the final product is VERY likely to have the desired attributes.

      Alpha is used in hypothesis testing - as in, we can all be sure that 5% of all the clinical trial conclusions ever reached are downright wrong, and most likely those that actually get reported are wrong much more often than that.

  • by dltaylor ( 7510 ) on Friday November 18, 2011 @07:54PM (#38104750)

    Identifying a real-world mismatch with our models' predictions does not "explain" anything except that our models are incomplete.

    When spheres, and spheres on spheres, don't explain planetary motion, let's try another model: the ellipse.

    When "classical" mechanics can't explain why "orbiting" electrons don't fall into the nucleus of an atom due to electrostatic attraction, let's come up with a new model (while confusingly calling them "orbitals"): shells and quantum exclusion effects.

    When whatever synthesis of strings and quantum gravity and pixie dust (or something very different from all of them) can provide a mathematical basis (that isn't all adjusted parameters) to describe this universe's preference for "matter" vs "anti-matter" (maybe the seventh harmonic of the property of "charge" in 12- (13- ?) dimensional space has a more-natural resonance with the fourth harmonic of the property "mass" for matter than for anti-matter, or something): we'll have a better model, but still, probably, not an "explanation".

  • by Anonymous Coward on Friday November 18, 2011 @07:59PM (#38104784)

    If they are able to demonstrate a symmetry breaking strong enough to explain the preponderance of matter in the universe, then they are a very good bet for the Nobel prize in physics.

  • by anwyn ( 266338 ) on Friday November 18, 2011 @08:25PM (#38104948)
    All these experiments occurred on Earth, in the vicinity of a lot of matter. How do we know that if we performed the experiments on an anti-Earth we would not get an opposite result?
    • Because we did the experiment here, and these are the results we got. Feel free to doubt, but unless you are willing to create an experiment to falsify their findings, your claim has as much validity as a young-earth creationist's.

    • by osu-neko ( 2604 ) on Friday November 18, 2011 @10:08PM (#38105502)

      All these experiments occurred on Earth, in the vicinity of a lot of matter. How do we know that if we performed the experiments on an anti-Earth we would not get an opposite result?

      All these experiments occurred within 5000 years of 1AD. How do we know that if we performed the experiments before 5000 BC or after 5000 AD we would not get an opposite result?

      The answer to both your question and mine is: we don't, but unless we have evidence that we would see an opposite result, it would be silly to believe we would in the absence of any good reason for it. Waving your hands and saying "maybe all the matter around influences things" is silly unless you have evidence to support that claim.

  • by Hentes ( 2461350 ) on Friday November 18, 2011 @09:29PM (#38105340)

    The radiation of an antimatter star would be exactly the same as that of a matter star. There is no way of knowing that our visible Universe is mainly matter. That the Universe is made mostly of matter is a myth that isn't really backed up.

    • by Surt ( 22457 ) on Friday November 18, 2011 @09:48PM (#38105422) Homepage Journal

      We know what annihilation looks like. If there were anti-stars in our galaxy, we'd see substantial annihilation signatures in the mixing in nebulae, for example. Even if whole galaxies were antimatter, we'd see some signature where the galaxies mix. The smallest unit of mass that could be antimatter without our noticing is probably the supercluster. Even then, it's doubtful that we couldn't see annihilation signatures along the great walls, for example.

    • by Old Wolf ( 56093 ) on Saturday November 19, 2011 @01:35AM (#38106562)

      Firstly, it might not, as Nature respects neither C-symmetry (swapping matter for antimatter) nor CP-symmetry (swapping matter for antimatter and taking a reflection), as shown by TFA. So antimatter stars might behave differently or not even exist.

      Secondly, if there were large amounts of antimatter in the observable universe, there would be huge amounts of radiation produced along the boundary between it and the bits that are made of matter. ('Empty space' isn't empty; look up Interstellar medium and Intergalactic medium.)

  • by trojjan ( 994851 ) on Saturday November 19, 2011 @12:58AM (#38106408)
    A bit off topic, but this is a very interesting find, just a few weeks after the 'faster than light neutrinos'. Why can't we put money into projects like these instead of killing people in other countries? Err, correction: bringing democracy to other people.
  • Significance (Score:5, Informative)

    by kievit ( 303920 ) on Saturday November 19, 2011 @07:12AM (#38107598) Journal

    Being a physicist myself, I am very happy that this topic makes it into the news. But it is important to keep cool and skeptical. The statement that a statistical fluke has a probability of 0.05% implies that one is bound to happen if you let 2000 students do data analyses on independent data sets. There are indeed literally thousands of PhD students doing such analyses on LHC data, trying to address hundreds of specific research questions that each require different data selections. So it is very likely that some of them will find a result several standard deviations away from the expectation. Actually, 3.5 sigma deviations happen very often, because of all sorts of mistakes and inaccuracies in the analyses, but most of the time these mistakes are scrutinized away before loud public announcements are made. After all scrutiny, a few genuine statistical flukes should still remain, and be recognized as such.

    (For the xkcd inclined: green jellybeans linked to acne [xkcd.com].)
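
    To put a number on that, here is a back-of-the-envelope Python sketch (my own illustration, which treats the 2000 analyses as independent):

        # If each of 2000 independent analyses has a 0.05% chance of showing
        # a 3.5-sigma fluke, the chance that at least one of them does is:
        p_fluke = 0.0005
        n_analyses = 2000
        p_at_least_one = 1 - (1 - p_fluke) ** n_analyses
        print(f"{p_at_least_one:.1%}")   # ~63.2%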

    More caveats:

    • On slides 14 and 15 you see a summary of the estimated systematic errors and the final result: the deviation of the observed value from the expected value is 0.82 ± 0.21 (stat.) ± 0.11 (sys.) %. Estimating and combining systematic errors is almost by definition dark magic. It looks like the "3.5 sigma" was obtained by adding the statistical and systematic errors in quadrature, which yields a total error of 0.237, and 0.82/0.237 = 3.5 (see the quick numerical check after this list).
    • The statement that the probability of this 3.5 sigma deviation is 0.05% is based on the assumption that if you repeat this analysis several times on more data with exactly the same experimental setup, the deviations from expectation are distributed like a Gaussian (bell curve) with a sigma equal to the total error mentioned in the previous bullet point. That is a major idealization: it could be distributed in many other ways, and then the relation between the deviation (in units of sigma, which is also defined for non-Gaussian distributions) and "the fraction of events with such a deviation or larger" can be quite different. Furthermore, when repeating the identical experiment the systematic errors do *not* fluctuate (that is one of the aspects in which they differ from statistical errors), so the aforementioned idealized Gaussian would have an arbitrary offset with a magnitude of the order of the estimated systematic error (0.11), in either direction, and a width of the actual statistical error, 0.21. Depending on what this systematic error really is, the true statistical significance is much larger or much smaller than the quoted 3.5 sigma.
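
    As promised above, a quick numerical check of the quadrature combination in the first bullet (a Python sketch; the numbers are the ones quoted from the slides):

        import math

        deviation = 0.82                 # observed minus expected, in %
        stat_err, syst_err = 0.21, 0.11  # statistical and systematic errors, in %

        # Add the two errors in quadrature, then divide into the deviation.
        total_error = math.hypot(stat_err, syst_err)   # ~0.237
        significance = deviation / total_error         # ~3.46, quoted as "3.5 sigma"
        print(f"total error {total_error:.3f} -> {significance:.2f} sigma")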

    So this is a very interesting result, but more study is needed and in my experience such flukes almost always evaporate in the light of more data and scrutiny. Still, it's not completely excluded that this was indeed the first hint of a real discovery (otherwise no researcher would ever do all that work).

    OK, enough for now. Sorry for misinterpretations and other errors I might have made.

"Experience has proved that some people indeed know everything." -- Russell Baker

Working...