Landmark Calculation Clears the Way To Answering How Matter Is Formed

First time accepted submitter smazsyr writes "An international collaboration of scientists is reporting in landmark detail the decay process of a subatomic particle called a kaon – information that may help answer fundamental questions about how the universe began. The calculation in the study required 54 million processor hours on the IBM BlueGene/P supercomputer at Argonne National Laboratory, the equivalent of 281 days of computing with 8,000 processors. 'This calculation brings us closer to answering fundamental questions about how matter formed in the early universe and why we, and everything else we observe today, are made of matter and not anti-matter,' says a co-author of the paper."
This discussion has been archived. No new comments can be posted.

  • 281 days? (Score:3, Informative)

    by Gr8Apes ( 679165 ) on Monday May 28, 2012 @10:23PM (#40138849)
    Blue Gene uses quad core PowerPCs, with 8192 cores on the Argonne system. That's a heck of a lot of days of maxing out your CPUs!
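    As a sanity check on the headline numbers, here is a quick Python sketch using the summary's own figures (with 8192 cores it comes out a bit lower, around 275 days):

        processor_hours = 54_000_000          # total CPU time quoted in the summary
        processors = 8_000                    # processor count quoted in the summary
        hours = processor_hours / processors  # 6750 wall-clock hours
        days = hours / 24                     # 281.25 days
        print(hours, days)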
  • by DontLickJesus ( 1141027 ) on Monday May 28, 2012 @11:27PM (#40139145) Homepage Journal
    All I've read about this event is that the computers mapped the decay, and not one piece of information about what they learned. In that light, I'll fill in the blanks with the pieces of quantum physics I understand.

    Kaons are quarks with "strangeness". This typically includes Up, Down, Charm, Strange, and Bottom. Top doesn't participate due to size and shortness of life. Kaons ( http://en.wikipedia.org/wiki/Kaon [wikipedia.org] ) decaying into Pions ( http://en.wikipedia.org/wiki/Pion [wikipedia.org] ) is a great demonstration of quarks participating in the Weak Force. This work combines our understanding of particle oscillation and weak decay, and digitally maps out that entire process rather than simply relying on theory. Granted, they weren't actually watching this happen, but the generated map gives physicists what they need to compare against findings from places like the LHC.

    TL;DR? Basically, this group designed software and used a very fast computer to generate a set of results from theoretical predictions that can be compared against various super-collider findings. Specifically, these results concern kaon-to-pion decay, a weak-force interaction.
  • by Anonymous Coward on Tuesday May 29, 2012 @12:02AM (#40139257)

    There conceivably could be an infinite number of "parallel" universes, but there's a real philosophical problem with that. So long as we use real physicists' definitions and not something out of Stargate SG-1, those parallels will always remain undetectable. SF writers tell stories about interacting with other universes - physicists define them in ways that show they can't be interacted with to be verified.

              An untestable idea isn't part of science. If it can't be disproven, it's philosophy or religion or something instead. An infinite number of untestable ideas is even worse. Philosophers get to whip out Occam's Razor at that point. If I claim that there is not only a God, but 7 different orders of angels totaling 144,000 beings working for him, those numbers are still simpler, in the sense Occam's Razor usually means, and so are to be preferred as a hypothesis. The same goes for a million gods with an average of four arms each and a bunch of hidden cyclic time periods totaling quintillions of years for them to do their work in, or any of those models with a reasonably sized bunch of gods, and maybe some giants, dwarfs, dark elves, ninja-turtle pizza-delivery robots, a billion clones of an invisible pink unicorn who died for your sins, riding on a gigantic fiberglass replica of L. Ron Hubbard, and so on. Just about any other idea looks preferable to an idea that postulates an infinite number of unverifiable consequents.

    An untested idea isn't science?

    The scientific method is:
    State the problem. Are there multiple universes?
    Form a hypothesis. Yes there are other universes.
    Test your hypothesis using experimentation and observation. I examine black holes and the mathematics behind them. I also study the Cosmic Microwave Background, which seems to have a cold spot in it (Source: Through the Wormhole with Morgan Freeman). The cold spot is potentially another universe's gravity pulling on our universe.
    The hypothesis can be Proved, Disproved, or the results dismissed as inconclusive. Currently inconclusive. More data and experimentation is needed.
    Rinse and repeat.
    Please explain to me how running a simulation of the early universe on a supercomputer doesn't follow the scientific method?

    In your mind there are only two states, Proved and Disproved, but there is also Inconclusive. Inconclusive is potentially more important than Proved/Disproved because it forces us to continue to examine the universe and keep asking questions.

    I don't see a philosophical problem, as almost all religions already describe a multiversal structure. Buddhism and Hinduism have countless celestial and hellish realms and actually describe the universe as a giant net with a gem at each knot. Enlarge it and clusters of gems become universes within the net (multiversal) structure. Jews, Christians, and Muslims all believe in a three-universe system of Heaven, Hell, and Earth - you know, eternal life after death and all that - but just because it is a philosophical concept doesn't mean it can't also be scientific.

  • by Roger W Moore ( 538166 ) on Tuesday May 29, 2012 @12:38AM (#40139381) Journal
    Since the blog entry contains no reference - and the one hint it gives is wrong - here is the actual article reference: Phys. Rev. Lett. 108:141601 (2012), which was published on 6th April, not 30th March as the article states!

    Now onto the physics: sorry, but your summary is almost completely wrong. Kaons are mesons, which are a bound state of a quark and an anti-quark. In the case of neutral kaons this is a strange and an anti-down (or vice versa for the anti-kaon, IIRC). What is interesting about the kaon is that the neutral states can oscillate between kaon and anti-kaon through a weak interaction. What you end up with is a long-lived kaon (KL) and a short-lived one (KS). The simplest way to demonstrate that this system differentiates between matter and anti-matter is to look at the long-lived kaon decaying into muons (heavy cousins of the electron). The number of anti-muons will be about 0.1% different from the number of muons produced.
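    For reference, that matter/antimatter difference is conventionally quoted as the semileptonic charge asymmetry of the KL (standard definition, sketched here in LaTeX):

        \delta_L = \frac{\Gamma(K_L \to \pi^- \mu^+ \nu) - \Gamma(K_L \to \pi^+ \mu^- \bar{\nu})}{\Gamma(K_L \to \pi^- \mu^+ \nu) + \Gamma(K_L \to \pi^+ \mu^- \bar{\nu})}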

    However, the decay to pions is far more closely studied because it can tell us far more - in particular whether this symmetry breaking occurs in the decay mechanism itself (direct CP violation) or only in the weak mixing of kaon to anti-kaon (indirect CP violation). The experiment I worked on as a grad student, NA48, observed this direct CP violation unambiguously for the first time, confirming the previous NA31 result. This ruled out more exotic types of CP violation from a new "superweak" interaction and, in broad terms, was consistent with the Standard Model.
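    For context, what NA31 and NA48 actually measured is usually expressed through the double ratio of the two-pion decay rates (standard relation, sketched in LaTeX):

        R = \frac{\Gamma(K_L \to \pi^0\pi^0)\,/\,\Gamma(K_S \to \pi^0\pi^0)}{\Gamma(K_L \to \pi^+\pi^-)\,/\,\Gamma(K_S \to \pi^+\pi^-)} \approx 1 - 6\,\mathrm{Re}(\varepsilon'/\varepsilon)

    A non-zero Re(epsilon'/epsilon) is the signal of direct CP violation; epsilon alone parameterizes the indirect (mixing) effect.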

    However, this was not really confirmation of the Standard Model, because the amount of CP violation the SM predicts is really hard to calculate: it involves quark/W boson loops which must have contributions from all three generations of quarks (specifically including the top quark!). These so-called penguin diagrams [wikipedia.org] (blame the name on John Ellis' dart-playing skills!) are really hard to calculate - at least to the accuracy needed for CP violation in kaons. Kaons must decay through a weak interaction, because only the weak interaction can change the strange quark into an up quark, which is needed for the decay to pions. However, there is also a strong component to the decay.
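    Schematically, the short-distance weak physics here gets packaged into the standard Delta S = 1 effective Hamiltonian (sketched in LaTeX; the Q_i are four-quark operators, including the penguin operators just mentioned):

        H_{\mathrm{eff}}^{\Delta S = 1} = \frac{G_F}{\sqrt{2}} \, V_{ud} V_{us}^* \sum_i C_i(\mu) \, Q_i(\mu) + \mathrm{h.c.}

    The Wilson coefficients C_i carry the perturbative W/top-loop physics; the hard, non-perturbative part is the hadronic matrix elements <pi pi|Q_i|K>, which is exactly what a lattice calculation has to supply.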

    Strong (QCD) processes are really hard to calculate because perturbation theory does not work for them (the interaction is far too strong). One approach to solving this is lattice QCD, which literally simulates all the colour (QCD) fields on a 4D grid of space-time points. However, this is really CPU-intensive, so only small grids can be simulated. This is not too bad if you have a strong process because, being 'strong', it happens quickly in a small region. However, the weak part of the decay occurs more slowly over a larger area. What the authors seem to have done is overcome the problem of simulating both the weak and the strong forces in the same decay, which raises the prospect of accurate calculations of the CP violation in kaon decays - something that has never been possible before. For the technically minded, this paper calculates the isospin-2 decay amplitude (A_2), whose phase shift relative to the isospin-0 amplitude (A_0) is what makes direct CP violation visible. It's a really interesting paper - at least if you have ever been involved in kaon physics!
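    To get a feel for why only small grids are feasible, here is a rough back-of-envelope sketch in Python for a hypothetical 32^3 x 64 lattice (illustrative numbers only, not the dimensions used in the paper):

        L, T = 32, 64                    # spatial and temporal extent in lattice sites
        sites = L**3 * T                 # space-time points: ~2.1 million
        links = 4 * sites                # one SU(3) gauge link per direction per site
        bytes_per_link = 3 * 3 * 2 * 8   # 3x3 complex matrix of double-precision numbers
        gauge_field_gb = links * bytes_per_link / 1e9
        print(sites, links, round(gauge_field_gb, 2))   # ~8.4 million links, ~1.2 GB per configuration

    And that is just the storage for a single gauge configuration; the physics comes from averaging over many of them, with expensive quark-propagator solves on each.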
  • by Roger W Moore ( 538166 ) on Tuesday May 29, 2012 @09:01AM (#40141249) Journal
    Not that I want to make you more depressed, but the above post was at a level somewhat below what I'd expect final-year undergraduates to understand - at least the ones who have taken an undergrad particle course. The only exception is the A_0 and A_2 amplitudes, which are specialized kaon physics. If you are studying matter/antimatter interactions then you ought to know this stuff. There is a good undergrad book by Griffiths, "Introduction to Elementary Particles", which has a section on CP violation including the B meson sector. I'd also happily share my lecture slides on this, but my university has not yet implemented public access to course material.
  • by IICV ( 652597 ) on Tuesday May 29, 2012 @09:54AM (#40141715)

    The phrase "a solution looking for a problem" was originally coined for the newly invented laser [wikipedia.org] - everyone could tell that it was wicked cool, but nobody could come up with a good use for it besides maybe pumping a ton of power into it and setting fire to something far away.

  • by Anonymous Coward on Tuesday May 29, 2012 @10:24AM (#40142005)

    Heinrich Hertz, just after his famous experiment in which he generated and received radio waves, was interviewed by a newspaper reporter about the practical uses of this new science. His response: "It's of no use whatsoever... this is just an experiment that proves Maestro Maxwell was right."

  • by Brannoncyll ( 894648 ) on Tuesday May 29, 2012 @10:31AM (#40142079)

    Oops - apologies for the empty post!

    Disclaimer - I am an author on the paper.

    Your comment about the weak interaction occurring over large distances is not correct - the weak interaction scale is ~90 GeV, which is much, much higher than the hadronic energy scale of ~1 GeV. In lattice calculations, where the interaction scales are on the order of femtometres, the weak interactions can be simulated to very high accuracy (sub-1%) using simply a point-like vertex. Due to the separation of scales, the actual weak component of the calculation can be completely separated out and calculated using standard perturbative techniques - the hard part has always been the calculation of the strong-interaction component. While the perturbative calculations just take a few guys a couple of months to sort out the factors of 2, the lattice calculation takes many months to run on state-of-the-art supercomputers and combines techniques developed over 40 years of work.
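    To put that separation of scales in distance units (a standard conversion, not from the paper): with hbar*c ~ 197.3 MeV fm,

        \frac{\hbar c}{90\ \mathrm{GeV}} \approx \frac{197.3\ \mathrm{MeV\,fm}}{9\times 10^{4}\ \mathrm{MeV}} \approx 2\times 10^{-3}\ \mathrm{fm} \ll 1\ \mathrm{fm},

    so on a femtometre-sized lattice the W exchange really does look like a point, which is why the point-like vertex works so well.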

  • by orgelspieler ( 865795 ) <w0lfie@@@mac...com> on Tuesday May 29, 2012 @01:46PM (#40145145) Journal

    I think you have totally missed the point. We don't know how useful a theory is until decades after its discovery. The technologists and engineers have to have time to shape it into something useful. In 1772, when Lagrange points were discovered, do you think anybody ever dreamed of having a satellite perched at one of them to warn us about solar storms? Of course not. It was pure theory, an interesting quirk in the solution to a purely mathematical problem.

    Quantum dynamics was purely theoretical physics just a few decades ago. Now we have microscopes and hard drives that depend on quantum effects to function properly. What about general and special relativity? Without them we wouldn't have GPS. It's all just "theoretical physics," right?
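    As a concrete illustration of the GPS point (rough textbook numbers in a Python sketch, not from the article): relativity shifts the satellite clocks by tens of microseconds per day, which would be kilometres of ranging error if left uncorrected.

        c = 299_792_458.0            # speed of light, m/s
        GM = 3.986004418e14          # Earth's gravitational parameter, m^3/s^2
        R_earth = 6.371e6            # mean Earth radius, m
        r_orbit = 2.6561e7           # GPS orbital radius, m (~20,200 km altitude)
        day = 86400.0
        v2 = GM / r_orbit            # orbital speed squared (circular orbit)
        sr = -0.5 * v2 / c**2 * day * 1e6                     # special relativity: ~ -7.2 us/day
        gr = GM / c**2 * (1/R_earth - 1/r_orbit) * day * 1e6  # gravity: ~ +45.7 us/day
        print(round(sr, 1), round(gr, 1), round(sr + gr, 1))  # net ~ +38 us/day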

    Who knows, maybe in 50 years we'll be using string theory on a daily basis to teleport to dimension X to mine trilithium for our ludicrous-speed drives. You don't know. Trying to classify theory and science as two different things is a false dichotomy at its worst.
