Simulating the Whole Universe

Roland Piquepaille writes "An international group of cosmologists, the Virgo Consortium, has completed the first simulation of the entire universe, starting 380,000 years after the Big Bang and running up to the present day. In 'Computing the Cosmos,' IEEE Spectrum writes that the scientists used a 4.2-teraflops system at the Max Planck Society's Computing Center in Garching, Germany, to do the computations. The whole universe was simulated with ten billion particles, each having a mass a billion times that of our sun. Because computing the gravitational interactions between each of the ten billion mass points and all the others by brute force would have taken about 60,000 years, the computer scientists devised a couple of tricks to reduce the amount of computation. In June 2004, the first simulation of our universe was completed. The resulting data, about 20 terabytes, will be available to everyone in the months to come, at least to people with a high-bandwidth connection. Read more here about the computing aspects of the simulation, but if you're interested in cosmology, the long original article is a must-read."
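A rough back-of-the-envelope calculation suggests why the brute-force approach was out of reach; the flops-per-interaction and timestep counts below are illustrative assumptions, not figures from the article:

```python
# Back-of-the-envelope estimate of brute-force (all-pairs) gravity at the
# scale of this simulation. The flops-per-interaction and timestep counts
# are assumptions for illustration, not numbers from the article.

N = 10_000_000_000      # ten billion simulation particles
flops_per_pair = 30     # assumed cost of one pairwise force evaluation
timesteps = 5_000       # assumed number of integration steps
machine_flops = 4.2e12  # 4.2 teraflops, as quoted in the article

pairs_per_step = N * (N - 1) / 2                 # every particle with every other
total_flops = pairs_per_step * flops_per_pair * timesteps
seconds = total_flops / machine_flops
years = seconds / (3600 * 24 * 365)

print(f"~{years:,.0f} years of compute")         # on the order of tens of thousands of years
```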
This discussion has been archived. No new comments can be posted.
  • by Anonymous Coward on Saturday September 04, 2004 @05:51PM (#10159094)
    The article indicates that the "tricks" these researchers used were the octree and multipole expansion--both of which have been used in gravity and potential theory for many years. They reduce the N^2 interaction problem to N or N Log(N), depending upon implementation. The story makes it sound like these researchers invented the technique; I assume the writer misunderstood the scientists, because it certainly predates them.
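For illustration, a minimal Barnes-Hut-style tree sketch of the "octree" trick described in the comment above (not the consortium's code; production tree codes add multipole expansions, periodic boundaries, and much better data structures, and the opening angle, softening, and toy setup below are arbitrary choices). Grouping distant particles by their centre of mass is what turns the O(N^2) force sum into roughly O(N log N):

```python
# Minimal Barnes-Hut-style octree in Python, purely for illustration.
import numpy as np

G = 1.0       # gravitational constant in arbitrary code units
THETA = 0.5   # opening angle: smaller means more accurate but more expensive
EPS = 1e-2    # softening length, keeps forces finite at tiny separations

class Node:
    """One cubic cell of the octree."""
    def __init__(self, center, size):
        self.center = np.asarray(center, float)  # geometric centre of the cube
        self.size = size                         # side length of the cube
        self.mass = 0.0                          # total mass inside this cell
        self.com = np.zeros(3)                   # centre of mass of the cell
        self.children = None                     # eight sub-cells, made on demand
        self.particle = None                     # (position, mass) if leaf holds one particle

    def insert(self, pos, m):
        pos = np.asarray(pos, float)
        if self.mass == 0.0 and self.children is None:
            self.particle = (pos, m)             # empty leaf: just store the particle
        else:
            if self.children is None:            # occupied leaf: split and push down
                self._split()
                old_pos, old_m = self.particle
                self.particle = None
                self._child_for(old_pos).insert(old_pos, old_m)
            self._child_for(pos).insert(pos, m)
        # keep the cell's aggregate mass and centre of mass up to date
        self.com = (self.com * self.mass + pos * m) / (self.mass + m)
        self.mass += m

    def _split(self):
        h = self.size / 4.0
        self.children = [Node(self.center + h * np.array([sx, sy, sz]), self.size / 2.0)
                         for sx in (-1, 1) for sy in (-1, 1) for sz in (-1, 1)]

    def _child_for(self, pos):
        i = (4 if pos[0] > self.center[0] else 0) \
          + (2 if pos[1] > self.center[1] else 0) \
          + (1 if pos[2] > self.center[2] else 0)
        return self.children[i]

    def force_on(self, pos):
        """Approximate gravitational acceleration at `pos` due to this cell."""
        if self.mass == 0.0:
            return np.zeros(3)
        d = self.com - pos
        r = np.sqrt(d @ d + EPS ** 2)
        # a leaf, or a cell far enough away to be treated as a single point mass
        if self.children is None or self.size / r < THETA:
            return G * self.mass * d / r ** 3
        return sum(child.force_on(pos) for child in self.children)

# toy usage: 1,000 random unit-mass particles in a box of side 2
rng = np.random.default_rng(0)
points = rng.uniform(-1.0, 1.0, size=(1000, 3))
root = Node(center=(0.0, 0.0, 0.0), size=2.0)
for p in points:
    root.insert(p, 1.0)
accelerations = np.array([root.force_on(p) for p in points])
```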
  • Re:Wait wait wait (Score:2, Informative)

    by iroll ( 717924 ) on Saturday September 04, 2004 @05:59PM (#10159125) Homepage
    The article answers said questions nicely.

    a) They didn't start at the beginning; they started at 380,000 years, the "snapshot" of which has been reconstructed from the cosmic microwave background radiation.

    b) Using telescopes, they've observed very large-scale structures in the universe (arrangements of clusters of galaxies), and they are hoping to see similar large-scale structures in their model.
  • Re:Kind of useless? (Score:5, Informative)

    by mangu ( 126918 ) on Saturday September 04, 2004 @06:17PM (#10159186)
    "I think they would've done a much better job with 1 million particles of possibly different types, simulating several other forces."


    No. Of the four known forces in the universe, only gravity is important in the long range, which defines the overall structure of the universe.


    The other three are the electromagnetic force and the two nuclear forces. The nuclear forces are *very* short range, acting only within the atomic nucleus. The electromagnetic force is long range, but because there are two opposite electric charges, which balance out, there isn't any perceptible electrical attraction over long distances.

  • by TMB ( 70166 ) on Saturday September 04, 2004 @08:27PM (#10159764)
    "Are they modeling any of the physical (star formation, etc.) interactions of matter or just the gravitational interaction? It seemed like the latter, but the article did mention the apparent non-interaction of dark matter."
    From the article:

    The recently completed Millennium Run gave them the universe's broad distribution of matter as dictated by gravity. In upcoming simulations, other forces will come into play. Onto the web of matter the scientists will graft the electromagnetic aspects of normal matter, which by radiating photons allows gas to cool down and condense into spiral disks that originate stars. At the same time, hydrodynamic pressure, which ultimately derives from the fact that two atoms cannot overlap each other because of repulsion between their electrons, redistributes matter along the cosmic web's strands and nodes.
    So this run is just gravity, but they will do more runs that include hydro, cooling, and presumably star formation. And to answer your first question

    "that works out to 100 to 200 data points to represent our galaxy. I wonder if they will get recognizable spiral structures, etc?"
    Without hydro or cooling, all you get are ellipsoidal dark matter halos, no disks.

    [TMB]
  • by xtal ( 49134 ) on Saturday September 04, 2004 @08:33PM (#10159783)
    http://www.simulation-argument.com/

    This paper argues that at least one of the following propositions is true: (1) the human species is very likely to go extinct before reaching a "posthuman" stage; (2) any posthuman civilization is extremely unlikely to run a significant number of simulations of their evolutionary history (or variations thereof); (3) we are almost certainly living in a computer simulation. It follows that the belief that there is a significant chance that we will one day become posthumans who run ancestor-simulations is false, unless we are currently living in a simulation. A number of other consequences of this result are also discussed.
  • Re:Kind of useless? (Score:1, Informative)

    by Anonymous Coward on Saturday September 04, 2004 @11:38PM (#10160642)

    "You're forgetting the Big Bang, which is the most interesting part of the history of the universe."


    Nobody knows how to simulate the Big Bang; nobody knows even what laws applied then. The researchers started out 380,000 years after the Big Bang for a reason: it's the time at which we have a lot of information about the laws and the initial data (the CMBR spectrum at that time). Gravity is what is relevant afterwards.


    "Using a full 10 billion particles for just gravity also makes very little sense because the distribution of matter in the universe is quite homogeneous."


    This is wrong. They needed ten billion particles precisely because they needed to model the inhomogeneity, which increases with time as locally dense regions cluster under their self-gravitation.


    "A lot more insight could've been gained by focussing on a smaller subset of the universe."


    That kind of computer time ain't cheap to go squandering on pointless precision. If they could have gotten by with fewer particles, they would have. Trust me, these guys knew what they were doing.


    "What they're doing is like simulating a glass of water using 10 billion water molecules. You can do this, or you can just look at the interesting parts like the air/water interface and the meniscus around the sides of the glass, and either get the same results faster or use more particles and a more detailed model to get more detailed results."


    No, they needed to simulate the entire volume to find out the interesting bits. This isn't like Monte Carlo in stat mech, where everything's in equilibrium and you can study representative samples. This is like molecular dynamics, where you have to follow each individual particle step by step to figure out what happens.
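As an illustration of "following each individual particle step by step", here is a minimal direct-summation N-body integrator (a kick-drift-kick leapfrog). This is the brute-force O(N^2) approach, not the tree/multipole method the real run used, and the units, particle count, and timestep are arbitrary:

```python
# Minimal direct-summation N-body integrator (leapfrog), for illustration only.
import numpy as np

def accelerations(pos, mass, eps=0.05, G=1.0):
    """Pairwise softened gravitational accelerations, O(N^2)."""
    d = pos[None, :, :] - pos[:, None, :]        # displacement of particle j relative to i
    r2 = (d ** 2).sum(-1) + eps ** 2             # softened squared separations
    inv_r3 = r2 ** -1.5
    np.fill_diagonal(inv_r3, 0.0)                # no self-force
    return G * (d * (mass[None, :, None] * inv_r3[:, :, None])).sum(axis=1)

def leapfrog(pos, vel, mass, dt=0.01, steps=1000):
    """Advance the system step by step with kick-drift-kick updates."""
    acc = accelerations(pos, mass)
    for _ in range(steps):
        vel += 0.5 * dt * acc                    # half kick
        pos += dt * vel                          # drift
        acc = accelerations(pos, mass)
        vel += 0.5 * dt * acc                    # half kick
    return pos, vel

# toy usage: 200 equal-mass particles starting from rest
rng = np.random.default_rng(1)
n = 200
pos = rng.normal(size=(n, 3))
vel = np.zeros((n, 3))
mass = np.full(n, 1.0 / n)
pos, vel = leapfrog(pos, vel, mass)
```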
  • by Anonymous Coward on Saturday September 04, 2004 @11:45PM (#10160670)
    The Heisenberg uncertainty principle doesn't apply to classical simulations such as this one; it does apply to quantum simulations. In those simulations, a particle doesn't have a simultaneously well-defined position and velocity (just like in real life); instead, you're simulating its wavefunction (which describes the probability of it being somewhere, with some velocity).
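To make the contrast concrete, a minimal sketch of what "simulating the wavefunction" means: a 1D free Gaussian wave packet evolved with the split-step Fourier method (an illustration only; units with hbar = m = 1, and the grid, timestep, and packet parameters are arbitrary choices):

```python
# Classical simulation: each particle is a (position, velocity) pair.
# Quantum simulation: you evolve a wavefunction instead. 1D free-particle sketch.
import numpy as np

n, L = 1024, 100.0
x = np.linspace(-L / 2, L / 2, n, endpoint=False)
k = 2 * np.pi * np.fft.fftfreq(n, d=L / n)

# initial Gaussian wave packet: centred at x0, width sigma, mean momentum k0
x0, sigma, k0 = -20.0, 2.0, 1.5
psi = np.exp(-(x - x0) ** 2 / (4 * sigma ** 2) + 1j * k0 * x)
psi /= np.sqrt(np.sum(np.abs(psi) ** 2) * (L / n))   # normalise

dt, steps = 0.05, 400
for _ in range(steps):
    # free evolution is exact in momentum space: psi_k -> psi_k * exp(-i k^2 dt / 2)
    psi = np.fft.ifft(np.exp(-0.5j * k ** 2 * dt) * np.fft.fft(psi))

prob = np.abs(psi) ** 2                              # probability density over position
mean_x = np.sum(x * prob) * (L / n)
print(f"<x> after t={dt * steps:g}: {mean_x:.1f}")   # packet drifts with group velocity k0
```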
  • by Anonymous Coward on Sunday September 05, 2004 @02:08AM (#10161068)
    The article says each simulation particle is a billion solar masses, that is, 1e9 M_sun, which immediately implies that Andromeda (at the mass you quote) requires 1000 particles. And hence our galaxy would as well. I think your miscalculation comes from a mistake about the Sun's mass; the link you give is incorrect. The mass of the Sun is 2e30 kg [google.com].
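A quick check of that arithmetic; Andromeda's total mass is assumed here to be roughly 1e12 solar masses, since the parent comment's figure isn't quoted:

```python
# Arithmetic check. Andromeda's mass (~1e12 solar masses) is an assumption
# for illustration; the particle mass and Sun's mass are as quoted above.

M_SUN_KG = 2e30                    # mass of the Sun in kilograms
particle_mass = 1e9                # simulation particle mass, in solar masses
andromeda_mass = 1e12              # assumed Andromeda mass, in solar masses

particles = andromeda_mass / particle_mass
print(f"{particles:.0f} simulation particles")               # ~1000
print(f"one particle = {particle_mass * M_SUN_KG:.1e} kg")   # ~2e39 kg
```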
