
How Cosmological Supercomputers Evolve the Universe All Over Again 144

the_newsbeagle writes "To study the mysterious phenomena of dark matter and dark energy, astronomers are turning to supercomputers that can simulate the entire evolution of the universe. One such simulation, Bolshoi, recently did a complete run-through. It started with the state the universe was in around 13.7 billion years ago (not long after the Big Bang) and modeled the evolution of dark matter and energy up to the present day. The run used 14,000 CPUs on NASA's fastest supercomputer."
  • I call (Score:1, Funny)

    by Anonymous Coward

    Bolshoit!

    • by mcgrew ( 92797 ) *

      I'm thinking WOW, what a computer... modeling the 14 billion year history of every galaxy, stellar system, planet, moon, molecule, atom, subatomic particle in... how long did the simulation run?

      I'm sure they got some good theories (or at least hypotheses) out of this simulation, but come on... it isn't as impressive as TFS makes it seem.

  • How long will it be until we can build a supercomputer that can span the Universe and if the Universe suffers a heat death it could just remake the whole Universe as it stored the state of everything within? Therefore humanity could survive even the end of the whole Universe in 100,000,000,000,000 years time. The short story the Last Question made quite an impression on me and surely with the current evolution in technology we could create a God computer eventually that would exist outside of anything we could comprehend. That would be mind-blowing.

    • by Anonymous Coward

      You should probably also read The Last Answer, also freely available online. An equally thought-provoking short story.

      • You should probably also read The Last Answer, also freely available online. An equally thought-provoking short story.

        I believe the story you're referring to is "The Last Question" by Isaac Asimov.

        • I've read it. I'm looking for sci-fi books like this. Do you guys remember any others related to these matters?
          • Frederik Pohl's "The World at the End of Time" deals tangentially with the heat death of the universe, and also has superbeings tossing stars at each other. Can't miss it. :)

          • by Zordak ( 123132 )
            Ten years or so ago I read a book called "Darwinia" by Robert Charles Wilson, though to say exactly what it has to do with this topic is kind of a spoiler.
          • by mcgrew ( 92797 ) *

            Niven, Asimov, and Heinlein have all written stories like this. One (Asimov iirc, but it could have been Heinlein) wrote a short story about a time traveller who went to the heat death of the universe. Sorry, it's been years since I read it.

            Niven's A World Out of Time [wikipedia.org] doesn't go to the end of the universe, but it's a really good read, involving relativistic speeds.

        • You should probably also read The Last Answer, also freely available online. An equally thought-provoking short story.

          I believe the story you're referring to is "The Last Question" by Isaac Asimov.

          https://en.wikipedia.org/wiki/The_Last_Answer [wikipedia.org]

          • You should probably also read The Last Answer, also freely available online. An equally thought-provoking short story.

            I believe the story you're referring to is "The Last Question" by Isaac Asimov.

            https://en.wikipedia.org/wiki/The_Last_Answer [wikipedia.org]

            No, I think GP really does mean The Last Question. [wikipedia.org] That's the Asimov story with the Cosmic AC that [spoiler alert] re-creates the universe after the Heat Death.

        • The GGP mentioned The Last Question, so I think the AC really was referring to The Last Answer [thrivenotes.com].
          • You're right... it's my fault for not reading the GGP fully. I only skimmed it, saw the premise of "The Last Question" and assumed that's what the GP meant. I am aware of the story "The Last Answer", but don't think it has much at all to do with the topic at hand, so assumed the AC suggesting it really meant The Last Question.

        • by dissy ( 172727 )

          Twice now it has been posted "Not only should you read The Last Question, but you should ALSO read The Last Answer"

          And twice now you have attempted to claim they intended to say "You should not only read The Last Question, but also read a totally different story called The Last Question"

          Why do you refuse to believe there are two stories by the same author with different names?

          The Last Question: http://filer.case.edu/dts8/thelastq.htm [case.edu]
          The Last Answer: http://www.thrivenotes.com/the-last-answer/ [thrivenotes.com]

          Perhaps you should make yourself aware of both of them, before attempting to correct others who know what they mean to say.

          • Perhaps you should make yourself aware of both of them, before attempting to correct others who know what they mean to say

            Perhaps you should read that I apologised twice for my mistakes already - it was due to reading comprehension failure (which I attribute to nicotine withdrawal - I'm quitting smoking) and not due to a lack of knowledge of the books. Regardless of the cause, it was my failure though, and I apologised for it.

            • Good luck with quitting!
            • by mcgrew ( 92797 ) *

              I'm quitting smoking

              Perhaps I can help. [kuro5hin.org]

              • Thanks for that - I just read through it and it really did help.

                I planned this months in advance. My first attempt was before my daughter was born, as I REALLY wanted her to have a non-smoking Dad. After one week though, my wife told me to buy cigarettes, since my mood was so foul she just couldn't handle it.

                This time, we planned it so that my wife and daughter (now 18 months old) would be on holiday without me. I've gone cold turkey - no patches, no gum, nothing. The first three days were AGONY... phys

    • by Nyder ( 754090 ) on Tuesday October 02, 2012 @11:16PM (#41534399) Journal

      How long will it be until we can build a supercomputer that can span the Universe and if the Universe suffers a heat death it could just remake the whole Universe as it stored the state of everything within? Therefore humanity could survive even the end of the whole Universe in 100,000,000,000,000 years time. The short story the Last Question made quite an impression on me and surely with the current evolution in technology we could create a God computer eventually that would exist outside of anything we could comprehend. That would be mind-blowing.

      I'm pretty sure Douglas Adams covered all that.

    • Someone better call the Dixie Flatline!

    • Re: (Score:3, Insightful)

      by Yvanhoe ( 564877 )
      There is not enough energy in the universe to store all the information of the universe in a computer.
      If you focus on some information (human minds for instance) of special interest to you, on the other hand...
      • Re: (Score:3, Insightful)

        by Black Parrot ( 19622 )

        There is not enough energy in the universe to store all the information of the universe in a computer.

        I subscribe to the view that the universe is computing its own final state.

        Or more precisely, always computing its next state; apparently there isn't going to be a final one.

      • The Planck constant just gets bigger as you go.

    • ... run Windows 9 with it.

    • by thej1nx ( 763573 )
      That will in all likelihood be much, much longer than humankind is likely to survive. So, the answer is never.
    • by Tablizer ( 95088 )

      Maybe that's what we actually are, and "God" is really a lonely Linux admin (equivalent) playing with his human ant farm.

      • by Anonymous Coward

        You jest, but that is actually one possible solution to the Fermi Paradox / Drake Equation... The longer we go without discovering other life in the universe, the more likely this is all a simulation.

    • Comment removed based on user account deletion
        so many people these days are throwing the concept around so easily, e.g. Google "itself," the iPhone "itself," the Internet "itself," the Earth "itself," the Law "itself," money "itself..."

        so it's really quite simple: it will just run on "itself" ;-)
      • by sFurbo ( 1361249 )
        If there were no dark energy, the temperature difference between a heat storage and an ever-cooling universe would allow you to do infinitely many calculations, at an ever-slowing rate. It seems that there is dark energy though, so the point where space-time recedes faster than c from you moves closer and closer. This is, essentially, an event horizon, so it will have Hawking radiation, meaning that the visible universe will not get arbitrarily cold, so only finitely many calculations can be done. It also b
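
        A rough way to make that quantitative (a sketch, using only Landauer's bound and the standard de Sitter horizon temperature, nothing specific to any one cosmology):

        ```latex
        % Horizon temperature set by the Hubble rate H, and Landauer's bound:
        T_{\mathrm{dS}} = \frac{\hbar H}{2\pi k_B}, \qquad
        E_{\mathrm{bit}} \ge k_B T \ln 2
        \;\Rightarrow\;
        N_{\mathrm{ops}} \le \frac{E}{k_B T_{\mathrm{dS}} \ln 2} < \infty
        ```

        Without dark energy, T can fall toward zero, the cost per irreversible operation falls with it, and a finite free-energy budget E buys unboundedly many (ever-slower) operations; with a horizon temperature T_dS > 0, the same budget buys only finitely many.
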
    • In "The Last Question", the "God computer" you're talking about is composed of... humans. (Humanity IS the computer) What I mean is that, in real life, *maybe* (or maybe not) there's nothing to be built except running the Good (or God) software in people's brains.
    • A very good book on that subject: Permutation City, by Greg Egan.
    • Re:How long until... (Score:5, Interesting)

      by rgbatduke ( 1231380 ) <`ude.ekud.yhp' `ta' `bgr'> on Wednesday October 03, 2012 @08:29AM (#41537365) Homepage
      Depends on how seriously you take information theory and the information content of the Universe. If, as seems rather reasonable, the information content of the (visible) Universe is irreducible/incompressible, it would take a supercomputer with at least as many bits of storage as there are bits of information in the specification of the Universe's state. This requires a computer that is strictly larger (in the sense of having at least as much "stuff" devoted to storage of all of those bits) than the Universe itself. Finally, since the supercomputer is part of the Universe (at least, if we built it), it also has to be self-referential and store its own state information. If it is to have any processing capability at all, it is then in a deadly game of catch-up, adding bits to describe every elementary particle in its processors and memory and losing the race even if it requires only one elementary particle to store the bit content of another (which will never be the case).

      In the end, it is provably, mathematically impossible to build a supercomputer that stores the complete state of the Universe, where the Universe is cleanly defined to be everything with objective existence. The same proof works to prove that there can be no omniscient God, since God suffers from precisely the same issues with information content and storage. A processing system cannot even precisely specify its own encoded state unless it is a truly bizarre fully compressible self-referential system the likes of which we cannot even begin to schematize, and there are lovely theorems on the rates of production of entropy in state switching on top of any actual physical mechanism for computation, all of which make this an interesting but ultimately absurd proposition.
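
      To make the catch-up game concrete, here is the one line of bookkeeping it amounts to (a sketch, assuming storing one bit costs at least one bit's worth of physical state, i.e. overhead c >= 1):

      ```latex
      B_{\mathrm{total}} = B_{\mathrm{rest}} + c\,B_{\mathrm{total}}
      \;\Rightarrow\;
      (1 - c)\,B_{\mathrm{total}} = B_{\mathrm{rest}}
      ```

      which has no positive solution for any c >= 1: the machine would have to encode each bit of its own state in strictly less than one bit's worth of "stuff".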

      If you don't like information theory, then there are the limitations of physics itself, at least so far. We can only see back to shortly after the big bang (the end of The Great Dark), some ~14 bya. It is literally impossible for us to extract state information from outside of a sphere some 27.5 billion light years across. However, making reasonable assumptions of isotropy and continuity and the coupling of the "cosmic egg" that was the early post-BB unified field state, cosmological measurements suggest that the Universe is no less than 200 times larger than this, that is, a ball some 5.5 trillion light years across (where it is most unlikely that we are in the center of any actually compact Universe). Obviously, we cannot get any state information at all, beyond indirect inference of mere existence, from no less than 1 - (1/200)^3 of the actual Universe unless and until we have new transluminal physics. And from the first argument, even if you turned this 99.99999% of the actual Universe into a computer to fully describe only the 0.00001% visible sphere that we actually inhabit, you'd barely have enough material to create the bits needed to hold the information at current peak matter-per-bit levels (and then there is the problem of the free energy needed to drive any computation, the need for a cold reservoir into which to dump the entropy, but I digress). So it is safe to say that it is also physically impossible to build a supercomputer that can store/duplicate the information content of the entire Universe (and again, the same argument works against the existence of a God, presuming only that this deity requires internal switching mechanisms on top of some sort of medium in order to store information and process it).
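
      For what it's worth, the volume fraction above is pure arithmetic (a trivial check; the factor of 200 is the lower bound quoted above, not a measured value):

      ```python
      # Sanity check on the visible-volume fraction quoted above.
      size_ratio = 200                           # lower bound on (whole / visible) diameter

      visible_fraction = (1 / size_ratio) ** 3   # volume scales as diameter cubed
      print(f"visible fraction of volume: {visible_fraction:.3e}")               # 1.250e-07
      print(f"as a percentage:            {visible_fraction * 100:.7f}%")        # ~0.0000125%
      print(f"fraction outside:           {(1 - visible_fraction) * 100:.5f}%")  # ~99.99999%
      ```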

      The only exception to both is the specific case where the Universe and/or God are one and the same entity, and its "storage" of information is the irreducible representation of the information content of mass-energy in the mass-energy itself, and the irreducible computational mechanism is the laws of physics themselves.

      But of course you really do understand this, if you get outside of the willing suspension of disbelief required of science fiction (and yeah
  • by catmistake ( 814204 ) on Tuesday October 02, 2012 @11:12PM (#41534373) Journal

    It started with the state the universe was in around 13.7 billion years ago (not long after the Big Bang) and modeled the evolution of dark matter and energy up to the present day.

    so... what happened when it reached the simulation of the simulation, and then eventually the simulation of the simulation of the simulation? I've long been told that it's turtles all the way down, but I'd like to see a citation.

  • Did anyone else think this was going to be about some sort of Universe-scale natural phenomenon being modeled as a supercomputer?

  • by Empiric ( 675968 ) on Tuesday October 02, 2012 @11:14PM (#41534387)
    ...astronomers are turning to supercomputers that can simulate the entire evolution of the universe.

    I'm thinking the intent here is to mean this qualified "up to a certain point in time", as I'm pretty sure that to say this as a general, even theoretical, possibility is a Godelian-type logical impossibility. Since the supercomputers would be part of the universe you are simulating, you have to simulate the simulation of the supercomputer, which requires simulating the simulation of the computer simulating the computer... ad infinitum.

    But then again, I may be wrong. Best simulate my thought processes to be sure.
    • But what if you build the computer outside the Universe, so it is not part of the Universe? The AC was in hyperspace... BTW, is the One in the Last Answer story the cosmic AC?

        But what if you build the computer outside the Universe, so it is not part of the Universe? The AC was in hyperspace... BTW, is the One in the Last Answer story the cosmic AC?

        The story is called "The Last Question", not "The Last Answer"... seems to be a common mistake; but it completely defeats the point of the story. The point is that the question remains the same throughout the ages and is always answered the same way; until the very end, when there is finally a way to answer the last question -- however, the answer is never given, since only through demonstrating the answer can there again be someone to give it to.

        As for the name of the computer, it changes for each "time

        • by Maritz ( 1829006 )

          The story is called "The Last Question", not "The Last Answer"... seems to be a common mistake; but it completely defeats the point of the story.

          I hadn't heard of it until reading these comments, but there is a 'last answer' by Asimov as well. The common mistake seems to be people correcting those who mention it...!

            I hadn't heard of it until reading these comments, but there is a 'last answer' by Asimov as well. The common mistake seems to be people correcting those who mention it...!

            I am aware of the story "The Last Answer", but didn't think it had much at all to do with the topic at hand, and had just replied to someone else about "The Last Question", so mistakenly assumed the reference here should also be to it. Totally my fault on lack of comprehension, since on re-reading the GP post he clearly did mean "The Last Answer" and was wondering about the relationship between the stories...

            • by Maritz ( 1829006 )
              Indeed ;) Shall have to check out The Last Answer, as The Last Question really is one of the coolest short stories I've read.
        • by Sique ( 173459 )

          There are two stories, one is The Last Answer and the other The Last Question. Both are by Isaac Asimov.

    • what about it becoming a feedback loop, where the simulation of the computer simulating ends up powering the simulation of the computer simulating

    • by JWW ( 79176 )

      What part of "turtles all the way down" don't you understand?

      • by Empiric ( 675968 )
        Good question. What part of the relative timestamps of that post and mine don't you understand?
        • by Anonymous Coward

          Good question. What part of quantum tachyon dynamics do I need to understand to follow the jokes on this thread?

    • ...astronomers are turning to supercomputers that can simulate the entire evolution of the universe.
      I'm thinking the intent here is to mean this qualified "up to a certain point in time", as I'm pretty sure that to say this as a general, even theoretical, possibility is a Godelian-type logical impossibility. Since the supercomputers would be part of the universe you are simulating, you have to simulate the simulation of the supercomputer, which requires simulating the simulation of the computer simulating the computer... ad infinitum.

      Almost without exception, simulations are simpler than the thing being simulated. You use simulations when the real thing would be impossible, or too dangerous or expensive.

      • by Empiric ( 675968 )
        Indeed. And without the qualifier "entire", I wouldn't have commented. That suggests complete algorithmic, rather than heuristic, simulation.
    • Re: (Score:2, Insightful)

      by Anonymous Coward

      You fail to realize something. If the reality we are experiencing is in fact a simulation, then it doesn't matter if one Planck-step of the simulation takes an hour or a Universe's worth of time to compute -- to us within the simulation, time remains locally constant. Likewise, the supercomputers can simulate the entire evolution of the universe by imposing acceptable error rates (epsilon).

      A very low-resolution simulation would simply count down from 1.0 (max Universal energy) to 0.0 (heat death) over one u

      • by Anonymous Coward

        Wow. In case you missed it, an AC just provided both the Ultimate Answer to Life, the Universe and Everything, AND demonstrated the Ultimate Question too. Protip: the U.Q. seems to be something like "What is 'normal', anyway?" -- Deep Thought simply used a scale of 100 to 0 instead of 1.0 to 0.0

        There may be hope for humanity yet!

  • by PPH ( 736903 ) on Tuesday October 02, 2012 @11:28PM (#41534495)

    ... put this supercomputer to work generating all possible Slashdot logos.

  • If you think slashdotters are tired of First Posts....

  • by steppedleader ( 2490064 ) on Tuesday October 02, 2012 @11:46PM (#41534593)
    First off, "entire evolution of the universe" should obviously be qualified with "on cosmological scales", unless they've built the Matrix. That said, how big is the domain? Is it just set to match the observable universe? 2048 grid points across the entire universe (or just the observable universe) seems rather... low-res. TFA mentions an adaptive grid, but fails to mention by what factor it can increase the local resolution.

    Also, how exactly do we model dark matter when we don't really know WTF it is beyond the fact that it has gravitational mass? Does it work because gravitational effects are the only thing that really matters on cosmological scales?

    I must say I like the use of periodic boundary conditions, though, simply because it makes their simulated universe conform to the Modest Mouse lyric "The universe is shaped exactly like the earth, if you go straight long enough you end up where you were".
    • by mendelrat ( 2490762 ) on Wednesday October 03, 2012 @01:23AM (#41534981)
      I'm not a cosmologist, but I am an astronomer. Most of the questions you ask are answered in the papers associated with Bolshoi, but science writers just leave them out because the numbers are so huge and hard to relate to -- I'm going to use megaparsecs for distances; 1 megaparsec = 1 million parsecs = 3.26 million light years = ~206 billion astronomical units. 1 astronomical unit is ~93 million miles, the distance from the Earth to the Sun.

      First off, "entire evolution of the universe" should obviously be qualified with "on cosmological scales", unless they've built the Matrix. That said, how big is the domain? Is it just set to match the observable universe? 2048 grid points across the entire universe (or just the observable universe) seems rather... low-res. TFA mentions an adaptive grid, but fails to mention by what factor it can increase the local resolution.

      As you point out, the 'entire evolution ...' phrase is a bad way of saying that the simulated volume and mass are large enough to be statistically representative of the large-scale structure and evolution of the entire universe. It's 2048^3 particles total -- 8,589,934,592 of them, each pushing and pulling on every other simultaneously. It's an enormous computational problem. The particles are put into a box ~250 megaparsecs on a side; the Milky Way is ~0.03 megaparsecs in diameter, and it's ~0.8 megaparsecs from here to the Andromeda galaxy, our nearest large galaxy. 250 megaparsecs is a huge slice and more than enough to ensure that local variations (galaxies) won't dominate the statistics. The ART code starts with a grid covering 256^3 points, but can subdivide to higher resolution where some threshold is passed, up to 10 times if I remember correctly, giving a limit of around 0.001 megaparsecs. My memory is hazy, and the distances are scaled according to the Hubble constant at any given point, but they're in the ballpark I think.
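
      Those numbers hang together; here is the back-of-the-envelope check (trivial arithmetic; the refinement count is, again, my hazy recollection):

      ```python
      # Back-of-the-envelope check of the Bolshoi numbers above.
      n_side = 2048
      print(f"particles: {n_side ** 3:,}")          # 8,589,934,592

      box_mpc = 250.0        # simulated box side, in megaparsecs
      base_grid = 256        # ART base grid points per side
      refinements = 10       # maximum subdivisions (from memory)

      coarse_cell = box_mpc / base_grid             # ~0.98 Mpc
      finest_cell = coarse_cell / 2 ** refinements  # ~0.00095 Mpc, i.e. ~0.001 Mpc
      print(f"coarse cell: {coarse_cell:.3f} Mpc")
      print(f"finest cell: {finest_cell:.5f} Mpc")
      ```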

      Also, how exactly do we model dark matter when we don't really know WTF it is beyond the fact that it has gravitational mass? Does it work because gravitational effects are the only thing that really matters on cosmological scales?

      Essentially, yes; gravity absolutely dominates at these scales compared to all other forces considered. The role of stellar and galactic feedback into their environment when forming (and as they evolve) changes lots of important things, but simulations like Bolshoi seek to simulate the largest-scale structures in the universe. Smaller subsections of the simulation can be picked out to run detailed N-body simulations of Milky Way-type galaxies, or to statistically match the dark matter clumps (which will form galaxies) to huge databases like the Sloan Digital Sky Survey. Both of those are pretty active things-to-do in cosmology now.
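
      Bolshoi itself uses the adaptive-mesh ART code rather than direct summation, but to give a feel for what "each particle pushing and pulling on every other" means, here is a toy O(N^2) gravity stepper (arbitrary units, softened to avoid singularities; this brute-force approach is fine for a few thousand particles and hopeless for Bolshoi's ~8.6 billion):

      ```python
      import numpy as np

      def gravity_step(pos, vel, mass, dt, soft=0.01, G=1.0):
          """One kick-drift step of a toy direct-summation N-body integrator.

          pos, vel: (N, 3) arrays; mass: (N,) array.  O(N^2) work per step."""
          diff = pos[None, :, :] - pos[:, None, :]   # diff[i, j] = pos[j] - pos[i]
          r2 = (diff ** 2).sum(axis=-1) + soft ** 2  # softened squared separations
          inv_r3 = r2 ** -1.5
          np.fill_diagonal(inv_r3, 0.0)              # a particle exerts no self-force
          acc = G * (diff * (mass[None, :, None] * inv_r3[:, :, None])).sum(axis=1)
          vel = vel + acc * dt                       # kick
          pos = pos + vel * dt                       # drift
          return pos, vel

      # A tiny random "universe": 512 equal-mass particles in a unit box.
      rng = np.random.default_rng(42)
      pos = rng.uniform(-1.0, 1.0, size=(512, 3))
      vel = np.zeros_like(pos)
      mass = np.full(512, 1.0 / 512)
      for _ in range(100):
          pos, vel = gravity_step(pos, vel, mass, dt=1e-3)
      ```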

      • Ah, thank you. I'm a numerical modeler myself, but I'm studying meteorology and thus generally focus on things that happen a bit more nearby and in a fluid medium. Good to know my physics BS allows me to at least ask somewhat intelligent questions about this sort of stuff, though!
      • You may think it's a long way to the chemist's but that's peanuts compared to space!

        (Just kidding! Thanks for your informative post!)

  • Scientists build the ultimate computer. The first thing they ask it is, "Is there a God?" The computer answers, "There is now!"
  • Or maybe I just watch too many sci-fi movies. Feels like one of those "knowledge man was not yet ready to possess" storylines in the works.

    • General Relativity has been around for almost a century, and it's been understood pretty well since the '60s.

  • by Required Snark ( 1702878 ) on Wednesday October 03, 2012 @12:56AM (#41534893)
    This is clearly good work, but I believe that the article glosses over real problems with these kinds of simulations. The short version of the problem is that agreement between the model and the observations doesn't provide a huge degree of confidence in the model being tested. It appears that both the model and the starting setup are predisposed to produce results that match observations.

    There has been no perturbation testing of the model. It does not seem that they did any runs that were intended to produce a result that did not match observations. They have no idea what range of inputs or modeling changes produces a result that matches observations.

    The greatest utility of these simulations is when they don't match observations. This opens the possibility that the current ideas are incorrect, and that new ideas are needed.

    I also wonder about scaling issues. The three simulations at different scales are unconnected. There is no way to see how events at one scale affect events at other scales.

    The author also said one specific thing that bothered me:

    Astrophysicists can model the growth of density fluctuations at these early times easily enough using simple linear equations to approximate the relevant gravitational effects.

    I am not a physicist or cosmologist, but that seems to be a huge assumption. We have no idea what dark energy or dark matter are, but they can be modeled by "simple linear equations."

    I know that the sheer cost and complexity of these computational experiments means that they are hard to accomplish. Even so, I will be less skeptical about their value when they are done in ways that test how the simulations fail, as well as how they verify current ideas.

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      Yes, Bolshoi's initial conditions were in some sense designed to match observations. Specifically, the five main parameters which describe cosmology (including the abundance of dark energy and dark matter) were derived from observations of the cosmic microwave background, supernovae, and galaxy clustering. These five numbers were used to generate the initial conditions for Bolshoi; a supercomputer spent the time to figure out how the universe should evolve from several million years after the Big Bang to

      • Thank you very much for the informative response. I did not intend to dismiss the simulation results, I just wanted to get a better understanding of what they did or did not imply. Viewing the results as a consistency check provides a useful context.

        I know that the experimenters would like to be able to explore the limits of the current modeling techniques. These simulations are so time-consuming that running examples that are expected to fail (not match known observational data) is hard to justify. Ho

    • by Anonymous Coward

      We have no idea what dark energy or dark matter are, but they can be modeled by "simple linear equations."

      That bit's fine. Dark matter and dark energy are fudge factors in the equations that make them generate answers corresponding to what we see. (This isn't a bad thing: an electron is also a fudge factor that makes lots of different equations produce results that correspond to experimental results.) At the moment, simple linear terms for dark matter and energy are enough to produce results consistent with observations. If they weren't, that would tell us that our simple concepts of dark matter and energy
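
      For reference, the "simple linear equations" being alluded to are presumably the standard linearized growth equation for the matter density contrast (delta = delta-rho / rho-bar):

      ```latex
      \ddot{\delta} + 2H\,\dot{\delta} - 4\pi G\,\bar{\rho}\,\delta = 0
      ```

      where H(t) is the Hubble rate. Dark energy enters only through H, and dark matter only through \bar{\rho}; the equation holds while delta << 1, which is why it is only used for the early universe.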

  • And if the beginning parameters of the model were off from actual history by even the tiniest fraction, the extrapolated results wouldn't be worth much. We pretend otherwise, but we really still don't know the current state and composition of the universe, much less how it started... assuming it started. There's a reason that they're called theories.

  • I'm sure eventually we'll get enough size and resolution:

    It's the simulation that doesn't end.
    Yes, it goes on and on my friend.
    Someone started running it not knowing what it was,
    And they'll continue singing it forever just because...

    Well to be perfectly honest:
    (1) Most people don't realise they're in a simulation
    (2) The few with "suspicions" have no idea where the off switch is
  • I recommend you get rid of this Stalin fella in the next iteration. He's up to no good.

  • Comment removed based on user account deletion
  • And that cluster is 11th? Nice!!
