Simulating the Whole Universe
Roland Piquepaille writes "An international group of cosmologists, the Virgo Consortium, has completed the first simulation of the entire universe, starting 380,000 years after the Big Bang and running up to the present day. In 'Computing the Cosmos,' IEEE Spectrum writes that the scientists used a 4.2-teraflops system at the Max Planck Society's Computing Center in Garching, Germany, to do the computations. The universe was simulated by ten billion particles, each with a mass a billion times that of our sun. Because computing the gravitational interactions between each of the ten billion mass points and all the others would have taken an estimated 60,000 years, the computer scientists devised a couple of tricks to reduce the amount of computation. In June 2004, the first simulation of our universe was completed. The resulting data, about 20 terabytes, will be made available to everyone in the months to come, at least to people with a high-bandwidth connection. Read more here about the computing aspects of the simulation, but if you're interested in cosmology, the long original article is a must-read."
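A back-of-the-envelope sketch of why those "tricks" matter: direct summation of pairwise gravity scales as O(N²), while the tree methods commonly used in cosmological N-body codes bring the cost down to roughly O(N log N). (Whether the Virgo group used exactly this approach is an assumption here; this is just the generic scaling argument.)

```python
import math

N = 10_000_000_000  # ten billion particles, as in the simulation

# Direct summation: every particle pair must be evaluated once.
direct_pairs = N * (N - 1) // 2

# Tree code (e.g. Barnes-Hut): distant clumps of particles are
# approximated by single pseudo-particles, giving roughly
# O(N log N) force evaluations per timestep instead of O(N^2).
tree_evals = int(N * math.log2(N))

print(f"direct summation: {direct_pairs:.1e} pair interactions")
print(f"tree method:      {tree_evals:.1e} force evaluations")
print(f"rough speedup:    {direct_pairs / tree_evals:.1e}x")
```

A hundred-million-fold reduction per timestep is what turns "60,000 years of computing" into something a teraflops-class cluster can finish in months.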
NOT AN ADVERTISEMENT FOR THINK GEEK !! (Score:0, Insightful)
Has a mistake been made?
NOT AN ADVERTISEMENT FOR THINK GEEK !!
RTFA (Re:Umm, Paradox?) (Score:5, Insightful)
The same way you simulate anything else. You simplify the problem down to a manageable number of particles that represent larger units of whatever you are simulating. Since it looks like they are interested in mass and gravity at the galactic-supercluster scale, they can use particles that weigh much more than any individual star.
Re:Why bother? (Score:1, Insightful)
Might as well go with that. Trying to simulate the whole universe with a computer is like trying to simulate all data on earth with an md5 checksum.
define "significantly" (Score:5, Insightful)
To you and all the other (-1, Redundant) posts on how the system can't simulate every single detail in the Universe: it's a *simulation*, not the real thing, OK?
The first thing you need to do when you plan a simulation is to determine exactly what's significant or not. In this case, they decided that a set of particles with a billion times the mass of our sun would be appropriate. That's because what they are studying is mostly the long range effects of gravitation, where "long range" is defined as a sphere that contains a mass of ten billion suns.
When and if someone wants to study the workings of the Universe at a smaller scale than that, then they will have to simulate at a smaller scale. Phew, people are so dense! Next thing they will say that because a photograph didn't capture every single hair on a person's head or every single pore in their skin, that photo doesn't represent that person at all...
Oh Joy! Another Roland Piquepaille post! (Score:4, Insightful)
BTW, the machine in question, the Max-Planck-Gesellschaft MPI/IPP, is currently ranked 66th [top500.org]. It looks to be a fairly ordinary cluster with none of the exoticism that Cray says we so desperately need [slashdot.org].
Re:Obligatory question (Score:2, Insightful)
Keep clicking Roland Piquepaille spammer ! (Score:1, Insightful)
the dickhead is just a plagiarist; he steals content and then reposts it for profit. It's pretty obvious the editors are getting kickbacks, so just add his site(s) to your hosts file and ignore the french fuck
127.0.0.1 radio.weblogs.com
127.0.0.1 www.blogads.com
127.0.0.1 blogads.com
Lots and lots of particles (Score:5, Insightful)
It doesn't quite come out in the article, but what's really groundbreaking about this work is the number of particles they're using. When you make models like these, you always have to trade off how large a volume you want to simulate (the more volume you have, the more representative a fraction of the universe you get and the larger the number of structures you can analyze) against how massive the particles are (the smaller the particles, the smaller the structures you can analyze).
The more total particles you have, the less you need to compromise on volume or particle mass. Until now, simulating such a large fraction of the universe (NOTE: unlike what the submitter said, this is not the full universe; as the article itself says, it's about 0.003 of the Hubble volume) required such large particles that it was impossible to say anything about individual galaxies.
However, with 10^10 particles, the mass of their particles is only about 10^9 solar masses, so they can reliably resolve structures of 10^11 solar masses. For reference, the mass of the Milky Way is roughly 10^12 solar masses. This is a fantastic leap forward - most other modern simulations have 10^8 - 10^9 particles, and so either can only simulate a much smaller fraction of the universe (like the simulations I study), or cannot say anything about galaxies, only massive galaxy clusters.
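The arithmetic above is easy to sanity-check; a minimal sketch, assuming the common rule of thumb that a structure needs on the order of 100 particles before its properties can be trusted at all (the 100-particle threshold is my assumption, not a figure from the article):

```python
particle_mass = 1e9            # solar masses per particle, as stated above

# Assumed rule of thumb: a halo needs ~100 particles to be resolved.
min_particles_per_structure = 100
resolved_mass = particle_mass * min_particles_per_structure

milky_way_mass = 1e12          # solar masses, rough figure

print(f"smallest resolvable structure: {resolved_mass:.0e} solar masses")
print(f"Milky Way vs. resolution limit: {milky_way_mass / resolved_mass:.0f}x")
```

That puts the resolution floor at 10^11 solar masses, an order of magnitude below a Milky Way-sized galaxy, which is exactly why individual galaxies come into reach at this particle count.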
[TMB]
Re:Kind of useless? (Score:1, Insightful)
Using a full 10 billion particles for just gravity also makes very little sense because the distribution of matter in the universe is quite homogeneous. That is, although there are lots of particles pulling on you from far away, you're being pulled from all directions pretty much equally. The only groups of particles that really matter are those that are very close to you (the Earth, Sun and Moon in our case). A lot more insight could have been gained by focusing on a smaller subset of the universe.
What they're doing is like simulating a glass of water using 10 billion water molecules. You can do this, or you can just look at the interesting parts like the air/water interface and the meniscus around the sides of the glass, and either get the same results faster or use more particles and a more detailed model to get more detailed results.
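The cancellation argument above can be demonstrated numerically. A toy sketch (nothing to do with the Virgo setup; the shell radius, particle count, and unit masses are all arbitrary choices for illustration): scatter many particles on a distant isotropic shell, add one nearby particle, and compare their net pulls at the origin with G = 1.

```python
import numpy as np

rng = np.random.default_rng(0)

# Many equal-mass particles scattered isotropically on a shell at r = 100...
far = rng.normal(size=(100_000, 3))
far = 100.0 * far / np.linalg.norm(far, axis=1, keepdims=True)

# ...and a single particle nearby at r = 1.
near = np.array([[1.0, 0.0, 0.0]])

def net_accel(positions, m=1.0):
    """Newtonian acceleration at the origin from unit-G point masses."""
    r = np.linalg.norm(positions, axis=1, keepdims=True)
    return (m * positions / r**3).sum(axis=0)

print(np.linalg.norm(net_accel(far)))   # small: the isotropic far field mostly cancels
print(np.linalg.norm(net_accel(near)))  # 1.0: the single neighbour dominates
```

Even though the shell holds 100,000 times the mass of the neighbour, its pulls nearly cancel, which is the intuition behind approximating distant matter coarsely (though large-scale simulations still need the far field, since cosmic structure is not perfectly homogeneous).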
Re:Speed of Gravitational attraction ? (Score:4, Insightful)
The primary goal of computer simulations such as these is to understand how and why they don't work, not to test current theories.
For instance, many different attempts had been made before a computer model of Earth's magnetic field exhibited magnetic field reversal, and even then it wasn't exactly like what the geological evidence shows. However, it suggests a basic model that can be adjusted to more accurately describe our planet's core.
It is the same with this attempt to simulate a universe. The goal is to understand how things interact, how the simulated universe differs from our Universe, and why it differs. Some differences would be due to problems with running computer simulations on a von Neumann machine (such as the "three-body problem"). Other errors will be caused by problems with our current model. If the two effects can be separated and analysed, then advances could be made in both computer science (e.g. weather forecasting) and cosmology. That's the point of this exercise.