How Cosmological Supercomputers Evolve the Universe All Over Again

the_newsbeagle writes "To study the mysterious phenomena of dark matter and dark energy, astronomers are turning to supercomputers that can simulate the entire evolution of the universe. One such simulation, the Bolshoi project, recently completed a full run-through. It started from the state of the universe around 13.7 billion years ago (not long after the Big Bang) and modeled the evolution of dark matter and dark energy up to the present day. The run used 14,000 CPUs on NASA's fastest supercomputer."

  • by Yvanhoe ( 564877 ) on Wednesday October 03, 2012 @12:20AM (#41534435) Journal
    There is not enough energy in the universe to store all the information of the universe in a computer.
    If, on the other hand, you focus on some information of special interest to you (human minds, for instance)...
  • by Black Parrot ( 19622 ) on Wednesday October 03, 2012 @01:13AM (#41534705)

    There is not enough energy in the universe to store all the information of the universe in a computer.

    I subscribe to the view that the universe is computing its own final state.

    Or more precisely, always computing its next state; apparently there isn't going to be a final one.

  • by Black Parrot ( 19622 ) on Wednesday October 03, 2012 @01:23AM (#41534763)

    We all did

    I shouted out, "Who killed the Kennedys?" when after all... it was you and me.

  • by Anonymous Coward on Wednesday October 03, 2012 @03:24AM (#41535215)

    You fail to realize something. If the reality we are experiencing is in fact a simulation, then it doesn't matter whether one Planck-step of the simulation takes an hour or a universe's worth of time to compute -- to us within the simulation, time remains locally constant. Likewise, the supercomputers can simulate the entire evolution of the universe by imposing acceptable error rates (epsilon).

    A very low-resolution simulation would simply count down from 1.0 (maximum universal energy) to 0.0 (heat death) over one universe's worth of time steps; the fastest such simulation is a single constant approximation: 0.42.

    A higher-resolution simulation could produce far more detail by using many more time steps. Interestingly, the quantum error rate can be predicted from within the simulation via observation: Heisenberg calculated the epsilon of our universe, and Planck calculated the physics step size.

    In short: one can indeed calculate an entire universe within another if one allows a high enough "acceptable" error rate and a low enough resolution. Quantum uncertainty may be proof that such corner-cutting has already happened at a higher dimension. (For a toy illustration of the step-size/error trade-off, see the sketch after this thread.)

  • by Anonymous Coward on Wednesday October 03, 2012 @03:26AM (#41535227)

    Yes, Bolshoi's initial conditions were in some sense designed to match observations. Specifically, the five main parameters which describe cosmology (including the abundance of dark energy and dark matter) were derived from observations of the cosmic microwave background, supernovae, and galaxy clustering. These five numbers were used to generate the initial conditions for Bolshoi; a supercomputer spent the time to figure out how the universe should evolve from several million years after the Big Bang to the present day. Thus, Bolshoi serves as a giant consistency check of the model: i.e., it tests whether the five parameters are enough to explain everything we observe about the evolution of the universe, as well as whether observers calculated them correctly.

    However, the fact that Bolshoi matches observations now is no guarantee that a future observation won't come along and break things. The previous large simulation (the Millennium Simulation), run in 2005, was also designed to match all observations up to that point. Since then, we've made observations that contradicted its results, and those discrepancies have indeed taught us new things.

    Finally, to address the specific point that you raise: we don't know what dark matter and dark energy are, but to our knowledge, gravity doesn't care about the type of matter/energy involved. This assumption could be wrong, of course. So far, however, making that assumption has led to predictions which seem to match observations. (So it would definitely be interesting if someone made an observation that proved otherwise!) The "linear equations" the author is referring to are simply Taylor expansions of the gravitational potential. Since the density fluctuations in the early universe are tiny (variations of +/- 0.001% even 300,000 years after the Big Bang), using a linear approximation doesn't introduce significant errors. However, once the density fluctuations grow to +/- 10% or so, the linear approximation is no longer as useful; that's when the supercomputer takes over to do more accurate computations of gravity. (A rough sketch of that handoff is included just below this thread.)
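
Where the comment above says the supercomputer "takes over" once the fluctuations are no longer tiny, the idea can be sketched in a few lines of toy Python. Everything here is an illustrative assumption (the growth law, the 5% growth per step, the exact 10% threshold), not the actual Bolshoi pipeline:

# Toy sketch of the linear-then-N-body handoff described in the comment above.
# The growth law, per-step growth factor, and thresholds are assumptions chosen
# for illustration only; they are not the real Bolshoi machinery.

AMPLITUDE_EARLY = 1e-5      # ~0.001% density contrast, roughly the CMB era
NONLINEAR_THRESHOLD = 0.1   # ~10% contrast: linear theory stops being reliable

def linear_growth(delta, growth_factor=1.05):
    """Linear approximation: perturbations simply scale with the growth factor."""
    return delta * growth_factor

delta = AMPLITUDE_EARLY
steps = 0
while delta < NONLINEAR_THRESHOLD:
    delta = linear_growth(delta)
    steps += 1

print(f"Cheap linear approximation used for {steps} steps (delta = {delta:.3f}).")
print("Past this point an N-body solver would compute gravity directly.")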
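
The earlier comment's point about "acceptable error rates" versus resolution can also be made concrete with a toy model. This is purely an illustrative sketch, not anything from the article: it integrates a made-up energy decay (dE/dt = -E) with forward Euler at several resolutions and shows how the error shrinks as the time step does:

import math

# Toy model: energy decays as dE/dt = -E over one unit of "universe time".
# The model and the step counts below are illustrative assumptions only.

def simulate(n_steps, t_end=1.0, e0=1.0):
    """Forward-Euler integration of dE/dt = -E using n_steps equal steps."""
    dt = t_end / n_steps
    e = e0
    for _ in range(n_steps):
        e -= e * dt
    return e

exact = math.exp(-1.0)  # closed-form answer for the toy model
for n in (1, 10, 100, 10000):
    approx = simulate(n)
    print(f"{n:>6} steps -> E = {approx:.6f}, error = {abs(approx - exact):.6f}")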

"Here's something to think about: How come you never see a headline like `Psychic Wins Lottery.'" -- Comedian Jay Leno

Working...