How Cosmological Supercomputers Evolve the Universe All Over Again
the_newsbeagle writes "To study the mysterious phenomena of dark matter and dark energy, astronomers are turning to supercomputers that can simulate the entire evolution of the universe. One such simulation, the Bolshoi simulation, recently completed a full run-through. It started with the state the universe was in around 13.7 billion years ago (not long after the Big Bang) and modeled the evolution of dark matter and dark energy up to the present day. The run used 14,000 CPUs on NASA's fastest supercomputer."
Let's qualify that sentence just a bit... (Score:4, Informative)
I'm thinking the intent here is that this is qualified "up to a certain point in time," because claiming it as a general, even theoretical, possibility runs into a Gödelian-type logical impossibility. Since the supercomputers would be part of the universe you are simulating, you would have to simulate the supercomputer's own simulation, which requires simulating the simulation of the computer simulating the computer... ad infinitum.
But then again, I may be wrong. Best simulate my thought processes to be sure.
Re:Any Cosmologists Here? (Score:5, Informative)
First off, "entire evolution of the universe" should obviously be qualified with "on cosmological scales", unless they've built the matrix. That said, how big is the domain? Is it just set to match the observable universe? 2048 grid points across the entire universe (or just the observable universe) seems rather... low-res. The TFA mentions an adaptive grid, but fails to mention what factor that can increase the local resolution by.
As you point out, the 'entire evolution ...' phrase is a bad way of saying that the simulated volume and mass are large enough to be statistically representative of the large-scale structure and evolution of the entire universe. It's 2048^3 particles total, which is a heck of a lot: 8,589,934,592 particles, each pulling on every other one simultaneously. It's an enormous computational problem. The particles are put into a box ~250 megaparsecs on a side; for comparison, the Milky Way is ~0.03 megaparsecs in diameter, and it's ~0.8 megaparsecs from here to the Andromeda galaxy, our nearest large galaxy. 250 megaparsecs is a huge slice, more than enough to ensure that local variations (individual galaxies) won't dominate the statistics. The ART code starts with a grid covering 256^3 points, but can subdivide wherever some density threshold is passed, up to 10 times if I remember correctly, giving a resolution limit of around 0.001 megaparsecs. My memory is hazy, and the distances are scaled according to the Hubble constant at any given time, but I think they're in the ballpark.
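The numbers above are easy to sanity-check yourself. A back-of-the-envelope calculation (assuming the 250 Mpc box, 256^3 base grid, and 10 refinement levels I half-remember, so treat the inputs as approximate):

```python
# Rough effective-resolution estimate for an adaptive-mesh code like ART.
# All input numbers are the approximate values quoted in the comment above.
box_mpc = 250.0          # comoving box size, megaparsecs
base_grid = 256          # base grid points per dimension
max_refinements = 10     # times a cell can be subdivided (each halves the cell)

coarse_cell = box_mpc / base_grid                # ~0.977 Mpc per base cell
finest_cell = coarse_cell / 2**max_refinements   # ~0.00095 Mpc at max refinement

print(f"coarse cell: {coarse_cell:.4f} Mpc")
print(f"finest cell: {finest_cell:.6f} Mpc")
```

That finest-cell size of roughly 0.001 Mpc (about 1 kiloparsec) is where the "around 0.001 megaparsecs" figure comes from, if my recalled inputs are right.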
Also, how exactly do we model dark matter when we don't really know WTF it is beyond the fact that it has gravitational mass? Does it work because gravitational effects are the only thing that really matters on cosmological scales?
Essentially, yes; gravity absolutely dominates at these scales compared to all other forces considered. The feedback from stars and galaxies into their environment as they form and evolve changes lots of important things, but simulations like Bolshoi seek to capture the largest-scale structures in the universe. Smaller subsections of the simulation can be picked out to run detailed N-body simulations of Milky Way-type galaxies, or to statistically match the dark matter clumps (which will form galaxies) against huge databases like the Sloan Digital Sky Survey. Both of those are pretty active things-to-do in cosmology right now.
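For anyone curious what "gravity-only N-body simulation" means concretely: Bolshoi itself uses the ART adaptive-mesh code, but the core idea can be sketched with a toy direct-summation integrator. Everything here (unit choices, softening value, the O(N^2) force sum) is illustrative and not Bolshoi's actual method; production codes use trees or grids to avoid the N^2 cost:

```python
import numpy as np

G = 1.0  # gravitational constant in arbitrary code units (illustrative)

def accelerations(pos, mass, softening=0.01):
    """Direct-summation gravitational acceleration on every particle.

    O(N^2) pairwise sums; a softening length keeps close encounters
    from blowing up, a standard trick in collisionless N-body work.
    """
    n = len(pos)
    acc = np.zeros_like(pos)
    for i in range(n):
        d = pos - pos[i]                        # vectors to all other particles
        r2 = (d**2).sum(axis=1) + softening**2  # softened squared distances
        r2[i] = np.inf                          # exclude self-interaction
        acc[i] = G * (mass[:, None] * d / r2[:, None]**1.5).sum(axis=0)
    return acc

def leapfrog(pos, vel, mass, dt, steps):
    """Kick-drift-kick leapfrog: the time integrator most N-body codes use,
    chosen because it conserves energy and momentum well over long runs."""
    acc = accelerations(pos, mass)
    for _ in range(steps):
        vel += 0.5 * dt * acc   # half kick
        pos += dt * vel         # drift
        acc = accelerations(pos, mass)
        vel += 0.5 * dt * acc   # half kick
    return pos, vel
```

Because the pairwise forces obey Newton's third law, total momentum is conserved step to step, which is a handy correctness check when writing one of these.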