
Simulating the Whole Universe

Roland Piquepaille writes "An international group of cosmologists, the Virgo Consortium, has carried out the first simulation of the entire universe, starting 380,000 years after the Big Bang and running up to the present day. In 'Computing the Cosmos,' IEEE Spectrum writes that the scientists used a 4.2-teraflops system at the Max Planck Society's Computing Center in Garching, Germany, to do the computations. The universe was modeled with ten billion particles, each with a mass a billion times that of our sun. Since computing the gravitational interactions between each of the ten billion mass points and all the others would have taken about 60,000 years, the computer scientists devised a couple of tricks to reduce the amount of computation. In June 2004, the first simulation of our universe was completed. The resulting data, about 20 terabytes, will be made available to everyone in the months to come, at least to people with a high-bandwidth connection. Read more here about the computing aspects of the simulation, but if you're interested in cosmology, the long original article is a must-read."
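As a rough sanity check on that 60,000-year figure, here is a back-of-the-envelope sketch in Python. The particle count and the 4.2-teraflops machine speed come from the article; the cost per pairwise force evaluation and the number of timesteps are assumptions picked purely for illustration.

# Back-of-the-envelope cost of brute-force pairwise gravity for the
# simulation described above.  N and the machine speed come from the
# article; flops_per_pair and timesteps are assumed for illustration.

N = 10_000_000_000          # 10 billion mass points (from the article)
machine_flops = 4.2e12      # 4.2 teraflops (from the article)
flops_per_pair = 30         # assumed cost of one force evaluation
timesteps = 5_000           # assumed number of steps over cosmic history

pairs_per_step = N * (N - 1) / 2
total_flops = pairs_per_step * flops_per_pair * timesteps
seconds = total_flops / machine_flops
years = seconds / 3.15e7

print(f"{pairs_per_step:.1e} interactions per step")
print(f"~{years:.0f} years of wall-clock time at {machine_flops:.1e} flop/s")

With those assumed numbers the brute-force estimate lands in the same ballpark as the 60,000 years quoted above, which is presumably why the Virgo team needed the "couple of tricks" mentioned in the summary rather than summing every pair directly.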
This discussion has been archived. No new comments can be posted.


  • by Anonymous Coward on Saturday September 04, 2004 @05:21PM (#10158941)
    This isn't an ad for anything on ThinkGeek!

    Has a mistake been made?

    NOT AN ADVERTISEMENT FOR THINK GEEK!!
  • by kirkjobsluder ( 520465 ) <kirk@@@jobsluder...net> on Saturday September 04, 2004 @05:50PM (#10159093) Homepage
    How can you accurately simulate the computer that is simulating the entire universe?

    The same way you simulate anything else. You simplify the problem down to a manageable number of particles that represent larger units of whatever you are simulating. Since it looks like they are interested in mass and gravity at the galactic-supercluster scale, they can use particles that weigh much more than any individual star.

    So the fundamental challenge for the Virgo team is to approximate that reality in a way that is both feasible to compute and fine-grained enough to yield useful insights. The Virgo astrophysicists have tackled it by coming up with a representation of that epoch's distribution of matter using 10 billion mass points, many more than any other simulation has ever attempted to use.


    These dimensionless points have no real physical meaning; they are just simulation elements, a way of modeling the universe's matter content. Each point is made up of normal and dark matter in proportion to the best current estimates, having a mass a billion times that of our sun, or 2000 trillion trillion trillion (2 x 10^39) kilograms. (The 10 billion particles together account for only 0.003 percent of the observable universe's total mass, but since the universe is homogeneous on the largest scales, the model is more than enough to be representative of the full extent of the cosmos.)
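    For anyone who wants that mass figure spelled out, a two-line check reproduces it; the rounded solar mass is the only input assumed here.

    # The particle mass quoted above (solar mass rounded to 2e30 kg).
    M_SUN_KG = 2.0e30
    particle_mass_kg = 1e9 * M_SUN_KG          # a billion solar masses
    print(f"{particle_mass_kg:.0e} kg")        # 2e+39 kg, i.e. 2 x 10^39 kg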

  • Re:Why bother? (Score:1, Insightful)

    by Anonymous Coward on Saturday September 04, 2004 @05:58PM (#10159122)
    The answer is 42. Just google it

    Might as well go with that. Trying to simulate the whole universe with a computer is like trying to simulate all data on earth with an md5 checksum.
  • by mangu ( 126918 ) on Saturday September 04, 2004 @06:04PM (#10159146)
    What kind of useful calculations can you make when you vary that significantly from your target system?


    To you and all the other (-1, Redundant) posts on how the system can't simulate every single detail in the Universe: it's a *simulation*, not the real thing, OK?


    The first thing you need to do when you plan a simulation is to determine exactly what's significant or not. In this case, they decided that a set of particles with a billion times the mass of our sun would be appropriate. That's because what they are studying is mostly the long range effects of gravitation, where "long range" is defined as a sphere that contains a mass of ten billion suns.


    If and when someone wants to study the workings of the Universe at a smaller scale than that, they will have to simulate at that smaller scale. Phew, people are so dense! Next thing they will say that because a photograph didn't capture every single hair on a person's head or every single pore in their skin, that photo doesn't represent that person at all...
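    To put a length on that "long range" sphere, here is a small sketch; the cosmological parameters below are round present-day values I'm assuming, not numbers taken from the article or the comment.

    # Rough comoving size of a sphere containing ten billion solar masses
    # at the mean matter density.  Omega_m, the Hubble constant, and the
    # solar mass are assumed round values.
    import math

    G = 6.674e-11                    # m^3 kg^-1 s^-2
    H0 = 73 * 1000 / 3.086e22        # ~73 km/s/Mpc in s^-1 (assumed)
    omega_m = 0.25                   # assumed matter fraction
    M_SUN = 2.0e30                   # kg

    rho_crit = 3 * H0**2 / (8 * math.pi * G)     # critical density, kg/m^3
    rho_matter = omega_m * rho_crit              # mean matter density

    mass = 1e10 * M_SUN                          # ten billion suns
    volume = mass / rho_matter                   # m^3
    radius_m = (3 * volume / (4 * math.pi)) ** (1 / 3)
    print(f"radius ~ {radius_m / 3.086e22:.2f} Mpc")   # roughly 0.4 Mpc

    On this back-of-the-envelope accounting, structure smaller than roughly half a megaparsec simply falls below the scale the simulation was built to capture.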

  • by Jeremy Erwin ( 2054 ) on Saturday September 04, 2004 @06:11PM (#10159169) Journal
    The Read more here link leads to a few pithy sentences framing lengthy excerpts from the IEEE article.

    BTW, the machine in question, the Max-Planck-Gesellschaft MPI/IPP, is currently ranked 66th [top500.org]. It looks to be a fairly ordinary cluster with none of the exoticism that Cray says we so desperately need [slashdot.org].
  • by JamesTRexx ( 675890 ) on Saturday September 04, 2004 @06:20PM (#10159194) Journal
    Aren't scientists nerds too?
  • by Anonymous Coward on Saturday September 04, 2004 @06:27PM (#10159222)
    He doesn't give a shit about the story; Slashdot is just a means to an end. As long as people keep clicking, he will keep spamming.
    The dickhead is just a plagiarist: he steals content and then reposts it for profit. It's pretty obvious the editors are getting kickbacks, so just add his site(s) to your hosts file and ignore the French fuck:

    127.0.0.1 radio.weblogs.com
    127.0.0.1 www.blogads.com
    127.0.0.1 blogads.com

  • by TMB ( 70166 ) on Saturday September 04, 2004 @08:09PM (#10159663)
    (disclaimer: I Am An N-Body Modeller, and although I'm not part of the Virgo collaboration, a large fraction of what I do is study cosmological models like the one described)

    It doesn't quite come out in the article, but what's really groundbreaking about this work is the number of particles they're using. When you make models like these, you always have to prioritize how large a volume you want to simulate (the more volume you have, the more representative a fraction of the universe you have and the larger number of structures you can analyze) vs how massive the particles are (the smaller the particles, the smaller structures you can analyze).

    The more total particles you have, the less you need to compromise your volume or particle mass. Until now, simulating such a large fraction of the universe (NOTE: unlike what the submitter said, this is not the full universe; as the article itself says, it's about 0.003 of the Hubble volume) required such large particles that it was impossible to say anything about individual galaxies.

    However, with 10^10 particles, the mass of their particles is only about 10^9 solar masses, so they can reliably resolve structures of 10^11 solar masses. For reference, the mass of the Milky Way is roughly 10^12 solar masses. This is a fantastic leap forward - most other modern simulations have 10^8 - 10^9 particles, and so either can only simulate a much smaller fraction of the universe (like the simulations I study), or cannot say anything about galaxies, only massive galaxy clusters.

    [TMB]
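    The resolution arithmetic in the comment above is easy to reproduce. The particle mass and the Milky Way mass come from the comment itself; the "~100 particles per resolved halo" rule of thumb is an assumption for illustration, not a Virgo figure.

    # Reproducing the resolution arithmetic above.
    particle_mass = 1e9            # solar masses per simulation particle
    particles_per_resolved_halo = 100   # assumed rule of thumb

    smallest_resolved = particle_mass * particles_per_resolved_halo
    milky_way = 1e12               # solar masses, rough figure

    print(f"smallest resolved structure: ~{smallest_resolved:.0e} solar masses")
    print(f"particles in a Milky-Way-sized halo: ~{milky_way / particle_mass:.0f}")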
  • by Anonymous Coward on Saturday September 04, 2004 @09:51PM (#10160200)
    You're forgetting the Big Bang, which is the most interesting part of the history of the universe. That's when forces other than gravity had a significant impact. Of course, you'd also need to simulate different types of particles to get this right (first protons/neutrons then atoms were being created, lots of light was emitted, etc).

    Using a full 10 billion particles for just gravity also makes very little sense because the distribution of matter in the universe is quite homogeneous. That is, although there are lots of particles pulling on you from far away, you're being pulled from all directions pretty much equally. The only groups of particles that really matter are those that are very close to you (the Earth, Sun and Moon in our case). A lot more insight could've been gained by focussing on a smaller subset of the universe.

    What they're doing is like simulating a glass of water using 10 billion water molecules. You can do this, or you can just look at the interesting parts like the air/water interface and the meniscus around the sides of the glass, and either get the same results faster or use more particles and a more detailed model to get more detailed results.
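    The cancellation argument above is easy to check numerically. In this sketch the particle counts, shell radii, and unit masses are invented purely for illustration: a roughly uniform, distant distribution pulls almost equally from all directions, so its net pull is tiny next to a single nearby mass.

    # Net pull of many distant masses vs. one nearby mass (G = m = 1 units).
    import numpy as np

    rng = np.random.default_rng(0)

    # 100,000 unit masses spread roughly uniformly in a far-away shell.
    n = 100_000
    r = rng.uniform(100.0, 200.0, n)
    v = rng.normal(size=(n, 3))
    directions = v / np.linalg.norm(v, axis=1, keepdims=True)
    positions = directions * r[:, None]

    # Inverse-square attraction toward each shell particle, summed up.
    forces = positions / np.linalg.norm(positions, axis=1, keepdims=True) ** 3
    net_far = np.linalg.norm(forces.sum(axis=0))

    # One unit mass sitting at distance 1 from the observer at the origin.
    near = 1.0 / 1.0 ** 2

    print(f"net pull from {n} distant masses: {net_far:.3e}")
    print(f"pull from one mass at distance 1: {near:.3e}")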
  • by Quantum Jim ( 610382 ) <jfcst24&yahoo,com> on Sunday September 05, 2004 @02:57AM (#10161207) Homepage Journal

    While I find this exercise interesting, I also find it a tad ridiculous. So many simplifications have to be made to even attempt it, and the whole thing is based on some assumptions that are not necessarily cold hard fact...

    The primary goal of computer simulations such as these is to understand how and why they don't work - not to test current theories.

    For instance, many different attempts had been conducted before a computer model of Earth's magnetic field exhibited a magnetic field reversal - and even then, it wasn't exactly what the geological evidence shows. However, it suggests a basic model that can be adjusted to more accurately describe our planet's core.

    It is the same with this attempt to simulate a universe. The goal is to understand how things interact, how the simulated universe differs from our Universe, and why it differs. Some discrepancies will be due to the problems of running computer simulations on a von Neumann machine (such as the "three-body problem"). Other errors will be caused by problems with our current model. If the two effects can be separated and analysed, then advances could be made in both computer science (e.g. weather forecasting) and cosmology. That's the point of this exercise.
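    As a toy illustration of that numerical-sensitivity point, the sketch below integrates the same three-body system twice, differing only by a 1e-9 nudge in one coordinate, and prints how far the two runs end up apart. The masses, positions, and step size are invented for the demo (G = 1 units); it is not how the Virgo code works, just a picture of why numerical and model errors have to be disentangled.

    import numpy as np

    def accelerations(pos, masses):
        """Pairwise Newtonian accelerations with a small softening term."""
        acc = np.zeros_like(pos)
        for i in range(len(masses)):
            for j in range(len(masses)):
                if i == j:
                    continue
                d = pos[j] - pos[i]
                r2 = d @ d + 1e-6          # softening avoids divide-by-zero
                acc[i] += masses[j] * d / r2**1.5
        return acc

    def integrate(pos, vel, masses, dt=0.001, steps=20_000):
        """Leapfrog (kick-drift-kick) integration."""
        pos, vel = pos.copy(), vel.copy()
        acc = accelerations(pos, masses)
        for _ in range(steps):
            vel += 0.5 * dt * acc
            pos += dt * vel
            acc = accelerations(pos, masses)
            vel += 0.5 * dt * acc
        return pos

    masses = np.array([1.0, 1.0, 1.0])
    pos0 = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
    vel0 = np.array([[0.0, 0.0], [0.0, 0.5], [-0.5, 0.0]])

    pos_a = integrate(pos0, vel0, masses)
    pos_b = integrate(pos0 + np.array([[1e-9, 0.0], [0.0, 0.0], [0.0, 0.0]]),
                      vel0, masses)

    print("final separation between the two runs:",
          np.linalg.norm(pos_a - pos_b))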
