How Cosmological Supercomputers Evolve the Universe All Over Again

the_newsbeagle writes "To study the mysterious phenomena of dark matter and dark energy, astronomers are turning to supercomputers that can simulate the entire evolution of the universe. One such effort, the Bolshoi simulation, recently did a complete run-through. It started with the state the universe was in around 13.7 billion years ago (not long after the Big Bang) and modeled the evolution of dark matter and dark energy up to the present day. The run used 14,000 CPUs on NASA's fastest supercomputer."

  • by Empiric ( 675968 ) on Wednesday October 03, 2012 @12:14AM (#41534387)
    > ...astronomers are turning to supercomputers that can simulate the entire evolution of the universe.

    I'm thinking the intent here is for this to be qualified with "up to a certain point in time," since I'm pretty sure that stating it as a general, even theoretical, possibility is a Gödelian-type logical impossibility. Since the supercomputer would be part of the universe being simulated, you would have to simulate the simulation running on the supercomputer, which requires simulating the computer simulating the computer... ad infinitum.

    But then again, I may be wrong. Best simulate my thought processes to be sure.
  • by mendelrat ( 2490762 ) on Wednesday October 03, 2012 @02:23AM (#41534981)
    I'm not a cosmologist, but I am an astronomer. Most of the questions you ask are answered in the papers associated with Bolshoi, but science writers tend to leave them out because the numbers are so huge and hard to relate to. I'm going to use megaparsecs for distances: 1 megaparsec = 1 million parsecs = 3.26 million light years, or roughly 200 billion astronomical units; 1 astronomical unit is ~93 million miles, the distance from the Earth to the Sun (there's a quick conversion sketch after this comment).

    > First off, "entire evolution of the universe" should obviously be qualified with "on cosmological scales", unless they've built the Matrix. That said, how big is the domain? Is it just set to match the observable universe? 2048 grid points across the entire universe (or just the observable universe) seems rather... low-res. The TFA mentions an adaptive grid, but fails to mention by what factor it can increase the local resolution.

    As you point out, the "entire evolution ..." phrase is a bad way of saying that the simulated volume and mass are large enough to be statistically representative of the large-scale structure and evolution of the entire universe. It's 2048^3 particles in total, which is a heck of a lot: 8,589,934,592 particles, each pushing and pulling on every other one simultaneously. It's an enormous computational problem. The particles are put into a box ~250 megaparsecs on a side; the Milky Way is ~0.03 megaparsecs in diameter, and it's ~0.8 megaparsecs from here to the Andromeda galaxy, our nearest large galaxy. 250 megaparsecs is a huge slice, more than enough to ensure that local variations (galaxies) won't dominate the statistics. The ART code starts with a grid of 256^3 points, but it can subdivide cells where a density threshold is passed, up to 10 levels if I remember correctly, giving a resolution limit of around 0.001 megaparsecs (the arithmetic is sketched after this comment). My memory is hazy, and the distances are scaled according to the Hubble constant at any given point, but I think they're in the ballpark.

    > Also, how exactly do we model dark matter when we don't really know WTF it is beyond the fact that it has gravitational mass? Does it work because gravitational effects are the only thing that really matters on cosmological scales?

    Essentially, yes; gravity absolutely dominates at these scales compared to all other forces considered. The feedback that stars and galaxies put into their environments as they form and evolve changes lots of important things, but simulations like Bolshoi seek to capture the largest-scale structures in the universe (a toy gravity-only sketch follows this comment). Smaller subsections of the simulation can be picked out to run detailed N-body simulations of Milky Way-type galaxies, or to statistically match the dark matter clumps (which will form galaxies) against huge databases like the Sloan Digital Sky Survey. Both of those are pretty active areas of work in cosmology right now.
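
A quick sanity check on the distance units quoted above. This is just back-of-the-envelope arithmetic with standard conversion factors; the constants are generic values, not numbers from the Bolshoi papers themselves.

    # Rough conversions for the scales quoted in the comment above.
    PARSEC_IN_AU = 206_265        # 1 parsec in astronomical units
    PARSEC_IN_LY = 3.2616         # 1 parsec in light years
    AU_IN_MILES  = 93.0e6         # 1 astronomical unit in miles (approx.)

    MPC_IN_PARSECS = 1_000_000    # 1 megaparsec = 1 million parsecs

    mpc_in_ly = MPC_IN_PARSECS * PARSEC_IN_LY   # ~3.26 million light years
    mpc_in_au = MPC_IN_PARSECS * PARSEC_IN_AU   # ~2.06e11, i.e. roughly 200 billion AU

    print(f"1 Mpc ~ {mpc_in_ly:.3g} light years")
    print(f"1 Mpc ~ {mpc_in_au:.3g} AU")
    print(f"1 AU  ~ {AU_IN_MILES:.3g} miles")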
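
The box size, base grid, refinement depth, and particle count quoted above imply the following resolution arithmetic. The figures are the commenter's recollection rather than the published Bolshoi parameters, so treat this as an illustrative sketch.

    # Resolution arithmetic for a Bolshoi-like adaptive-mesh (ART) run,
    # using the figures quoted in the comment above.
    box_size_mpc = 250.0        # box side length in megaparsecs
    base_grid = 256             # root grid of 256^3 cells
    max_refinements = 10        # up to ~10 levels of subdivision (recollection)
    particles_per_side = 2048   # 2048^3 dark matter particles

    n_particles = particles_per_side ** 3
    base_cell_mpc = box_size_mpc / base_grid
    finest_cell_mpc = base_cell_mpc / 2 ** max_refinements

    print(f"total particles:  {n_particles:,}")             # 8,589,934,592
    print(f"base cell size:   {base_cell_mpc:.3f} Mpc")      # ~0.977 Mpc
    print(f"finest cell size: {finest_cell_mpc:.5f} Mpc")    # ~0.001 Mpc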
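
To illustrate the point that gravity is the only interaction these particles need, here is a deliberately tiny direct-summation N-body step in Python with NumPy. It is a toy sketch, not how ART or any production cosmological code works (real codes use adaptive meshes or trees precisely because direct summation over 2048^3 particles is hopeless), and every name and parameter below is made up for illustration.

    import numpy as np

    def gravity_step(pos, vel, mass, dt, G=1.0, softening=1e-2):
        """Advance an N-body system one step with gravity as the only force.

        Direct O(N^2) summation: fine for a toy, hopeless for 2048^3 particles,
        which is why codes like ART use adaptive meshes instead.
        """
        # Pairwise separation vectors: diff[i, j] = pos[j] - pos[i]
        diff = pos[None, :, :] - pos[:, None, :]
        dist2 = (diff ** 2).sum(axis=-1) + softening ** 2   # softened distances squared
        inv_r3 = dist2 ** -1.5
        np.fill_diagonal(inv_r3, 0.0)                       # no self-interaction
        acc = G * (diff * inv_r3[:, :, None] * mass[None, :, None]).sum(axis=1)
        vel = vel + acc * dt                                # kick
        pos = pos + vel * dt                                # drift
        return pos, vel

    # Toy usage: 1,000 equal-mass particles scattered in a unit box.
    rng = np.random.default_rng(0)
    pos = rng.random((1000, 3))
    vel = np.zeros((1000, 3))
    mass = np.ones(1000) / 1000
    for _ in range(10):
        pos, vel = gravity_step(pos, vel, mass, dt=1e-3)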
