How Cosmological Supercomputers Evolve the Universe All Over Again

the_newsbeagle writes "To study the mysterious phenomena of dark matter and dark energy, astronomers are turning to supercomputers that can simulate the entire evolution of the universe. One such effort, the Bolshoi simulation, recently completed a full run-through: it started from the state the universe was in around 13.7 billion years ago (not long after the Big Bang) and modeled the evolution of dark matter and dark energy up to the present day. The run used 14,000 CPUs on NASA's fastest supercomputer."
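
For a sense of what such a run actually computes, here is a minimal, purely illustrative Python sketch of the core operation: evolving self-gravitating particles forward in time from near-uniform initial conditions. Everything below (the particle count, the code units, the direct-summation force loop) is a stand-in assumption chosen for illustration; production cosmology codes use billions of particles, comoving coordinates, and far more sophisticated tree or mesh force solvers.

    # Toy gravitational N-body sketch, NOT the Bolshoi code: a few hundred
    # particles evolved with direct-summation Newtonian gravity and a
    # kick-drift-kick leapfrog integrator. All parameters are arbitrary
    # code units chosen for illustration.
    import numpy as np

    G = 1.0           # gravitational constant in code units (assumption)
    SOFTENING = 0.05  # force softening to avoid singular forces at close range

    def accelerations(pos, mass):
        """Direct O(N^2) pairwise gravitational accelerations."""
        diff = pos[np.newaxis, :, :] - pos[:, np.newaxis, :]   # r_j - r_i
        dist2 = (diff ** 2).sum(axis=-1) + SOFTENING ** 2
        inv_r3 = dist2 ** -1.5
        np.fill_diagonal(inv_r3, 0.0)                           # no self-force
        weighted = diff * inv_r3[:, :, np.newaxis] * mass[np.newaxis, :, np.newaxis]
        return G * weighted.sum(axis=1)

    def evolve(pos, vel, mass, dt=1e-3, steps=1000):
        """Advance the system with a leapfrog (kick-drift-kick) scheme."""
        acc = accelerations(pos, mass)
        for _ in range(steps):
            vel += 0.5 * dt * acc              # half kick
            pos += dt * vel                    # drift
            acc = accelerations(pos, mass)
            vel += 0.5 * dt * acc              # half kick
        return pos, vel

    if __name__ == "__main__":
        rng = np.random.default_rng(42)
        n = 256                                     # toy particle count
        pos = rng.uniform(-1.0, 1.0, size=(n, 3))   # near-uniform initial density
        vel = np.zeros((n, 3))
        mass = np.full(n, 1.0 / n)                  # equal-mass particles
        pos, vel = evolve(pos, vel, mass)
        print("mean distance from box centre:", np.sqrt((pos ** 2).sum(axis=1)).mean())

The difference between this toy and the real thing is mostly scale and the force solver: at billions of particles the O(N^2) pairwise sum above is hopeless, which is why large cosmological runs lean on tree and particle-mesh methods.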
  • How long until... (Score:2, Interesting)

    by bejiitas_wrath ( 825021 ) <johncartwright302@gmail.com> on Wednesday October 03, 2012 @12:09AM (#41534353) Homepage Journal

    How long will it be until we can build a supercomputer that spans the Universe, so that if the Universe suffers a heat death it could simply remake the whole Universe from the state of everything it has stored? That way humanity could survive even the end of the Universe, 100,000,000,000,000 years from now. The short story "The Last Question" made quite an impression on me, and surely with the current pace of technology we could eventually create a God computer that exists outside of anything we can comprehend. That would be mind-blowing.

  • by Required Snark ( 1702878 ) on Wednesday October 03, 2012 @01:56AM (#41534893)
    This is clearly good work, but I believe the article glosses over real problems with these kinds of simulations. The short version is that agreement between the model and the observations doesn't by itself provide much confidence in the model being tested. It appears that both the model and the starting setup are predisposed to produce results that match observations.

    There has been no perturbation testing of the model. It does not seem that they did any runs intended to produce a result that does not match observations, so they have no idea what range of inputs or modeling changes still produces a result that matches observations.

    The greatest utility of these simulations is when they don't match observations. This opens the possibility that the current ideas are incorrect, and that new ideas are needed.

    I also wonder about scaling issues. The three simulations at different scales are unconnected, so there is no way to see how events at one scale affect events at other scales.

    The author also said one specific thing that bothered me:

    Astrophysicists can model the growth of density fluctuations at these early times easily enough using simple linear equations to approximate the relevant gravitational effects.

    I am not a physicist or cosmologist, but that seems to be a huge assumption. We have no idea what dark energy or dark matter are, yet they can supposedly be modeled by "simple linear equations." (A toy version of that linear-growth calculation is sketched after this comment.)

    I know that the sheer cost and complexity of these computational experiments means that they are hard to carry out. Even so, I will be less skeptical about their value when they are run in ways that test how the simulations can fail, as well as how they confirm current ideas.
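
On the "simple linear equations" quoted in the comment above: in the linear regime every Fourier mode of the matter density contrast grows by the same factor D(a), and for a flat matter-plus-Lambda universe the growing mode can be written D(a) ∝ E(a) ∫_0^a da' / (a' E(a'))^3 with E(a) = H(a)/H0. Here is a toy evaluation of that formula; the parameter values and helper names are assumptions for illustration, not those of any particular simulation.

    # Illustrative linear growth factor D(a) for a flat matter + Lambda
    # universe; the parameter values are round-number assumptions.
    import numpy as np

    OMEGA_M = 0.27            # present-day matter density parameter (assumed)
    OMEGA_L = 1.0 - OMEGA_M   # flatness: the remainder is dark energy

    def hubble_ratio(a):
        """E(a) = H(a)/H0 for a flat matter + Lambda cosmology."""
        return np.sqrt(OMEGA_M / a**3 + OMEGA_L)

    def growth_factor(a, n_steps=20000):
        """Growing mode D(a) ~ E(a) * integral_0^a da'/(a' E(a'))^3, normalised to D(1) = 1."""
        def unnormalised(a_end):
            ap = np.linspace(1e-6, a_end, n_steps)
            integrand = 1.0 / (ap * hubble_ratio(ap)) ** 3
            # trapezoidal integration, kept dependency-free on purpose
            integral = 0.5 * np.sum((integrand[1:] + integrand[:-1]) * np.diff(ap))
            return hubble_ratio(a_end) * integral
        return unnormalised(a) / unnormalised(1.0)

    if __name__ == "__main__":
        # Deep in the matter era D(a) tracks the scale factor a itself;
        # dark energy suppresses growth at late times.
        for a in (0.02, 0.1, 0.5, 1.0):
            print(f"a = {a:4.2f}   D(a) = {growth_factor(a):.4f}")

That is all "linear equations" buys you: while the fluctuations are small, their growth is this kind of one-dimensional calculation, and the expensive N-body machinery is only needed once they turn nonlinear.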

  • by Mr0bvious ( 968303 ) on Wednesday October 03, 2012 @08:15AM (#41536637)

    Nick Bostrom has a paper on this [simulation-argument.com]; from the intro:

    This paper argues that at least one of the following propositions is true: (1) the human species is very likely to go extinct before reaching a “posthuman” stage; (2) any posthuman civilization is extremely unlikely to run a significant number of simulations of their evolutionary history (or variations thereof); (3) we are almost certainly living in a computer simulation. It follows that the belief that there is a significant chance that we will one day become posthumans who run ancestor-simulations is false, unless we are currently living in a simulation. A number of other consequences of this result are also discussed.

  • Re:How long until... (Score:5, Interesting)

    by rgbatduke ( 1231380 ) <rgb@@@phy...duke...edu> on Wednesday October 03, 2012 @09:29AM (#41537365) Homepage
    Depends on how seriously you take information theory and the information content of the Universe. If, as seems rather reasonable, the information content of the (visible) Universe is irreducible/incompressible, it would take a supercomputer with at least as many bits of storage as there are bits of information in the specification of the Universe's state. That requires a computer strictly larger than the Universe itself, in the sense of having at least as much "stuff" devoted to storing all of those bits. Finally, since the supercomputer is part of the Universe (at least, if we built it), it also has to be self-referential and store its own state information. If it is to have any processing capability at all, it is then in a deadly game of catch-up, adding bits to describe every elementary particle in its own processors and memory and losing the race even if it requires only one elementary particle to store the bit content of another (which will never be the case).

    In the end, it is provably, mathematically impossible to build a supercomputer that stores the complete state of the Universe, where the Universe is cleanly defined to be everything with objective existence. The same proof works to prove that there can be no omniscient God, since God suffers from precisely the same issues with information content and storage. A processing system cannot even precisely specify its own encoded state unless it is a truly bizarre fully compressible self-referential system the likes of which we cannot even begin to schematize, and there are lovely theorems on the rates of production of entropy in state switching on top of any actual physical mechanism for computation, all of which make this an interesting but ultimately absurd proposition.

    If you don't like information theory, then there are the limitations of physics itself, at least so far. We can only see back to shortly after the big bang (the end of The Great Dark), some ~14 bya. It is literally impossible for us to extract state information from outside of a sphere some 27.5 billion light years across. However, making reasonable assumptions of isotropy and continuity and the coupling of the "cosmic egg" that was the early post-BB unified field state, cosmological measurements suggest that the Universe is no less than 200 times larger than this, that is, a ball some 500 billion light years across (where it is most unlikely that we are at the center of any actually compact Universe). Obviously, we cannot get any state information at all, beyond indirect inference of mere existence, about the 1 - (1/200)^3 of the actual Universe that lies beyond our horizon, unless and until we have new transluminal physics.

    And from the first argument, even if you turned this 99.99999% of the actual Universe into a computer to fully describe only the 0.00001% visible sphere that we actually inhabit, you'd barely have enough material to create the bits needed to hold the information at current peak matter-per-bit levels (and then there is the problem of the free energy needed to drive any computation, and the need for a cold reservoir into which to dump the entropy, but I digress). So it is safe to say that it is also physically impossible to build a supercomputer that can store/duplicate the information content of the entire Universe (and again, the same argument works against the existence of a God, presuming only that this deity requires internal switching mechanisms on top of some sort of medium in order to store information and process it). The volume arithmetic behind those fractions is sketched after this comment.

    The only exception to both is the specific case where the Universe and/or God are one and the same entity, and its "storage" of information is the irreducible representation of the information content of mass-energy in the mass-energy itself, and the irreducible computational mechanism is the laws of physics themselves.

    But of course you really do understand this, if you get outside of the willing suspension of disbelief required of science fiction (and yeah
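
As a quick check of the fractions in the comment above, treating its factor of 200 as a ratio of linear sizes (that factor is the comment's own assumption, not a measured value):

    # Volume fraction implied by a linear size ratio of 200: the (1/200)^3
    # and 1 - (1/200)^3 figures used in the parent comment.
    scale_factor = 200
    visible_volume_fraction = (1.0 / scale_factor) ** 3
    print(f"visible fraction of total volume: {visible_volume_fraction:.3e}")        # 1.250e-07
    print(f"unobservable fraction:            {1.0 - visible_volume_fraction:.7%}")  # 99.9999875%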
