
First Full Observable-Universe Simulation

Posted by timothy
from the not-counting-the-big-blue-room dept.
First time accepted submitter slashmatteo writes "The goal of the DEUS project (Dark Energy Universe Simulation) is to investigate the imprints of dark energy on cosmic structure formation through high-performance numerical simulations. To do so, the project has conducted a simulation of the structuring of the entire observable universe, from the Big Bang to the present day. Thanks to the Curie supercomputer, the simulation has made it possible to follow the evolution of 550 billion particles. Two other complementary runs are scheduled by the end of May. More details in the press release."
This discussion has been archived. No new comments can be posted.

  • by Narrowband (2602733) on Sunday April 22, 2012 @06:47PM (#39766193)
    When in the simulation does it reach the point where it starts simulating the Curie supercomputer simulating it?
    • by zAPPzAPP (1207370) on Sunday April 22, 2012 @06:49PM (#39766205)

      It's right there, in particle 4153341989.

    • Forget that, what happens when it starts simulating thousands of gaming machines running Crysis on Vista?

      • by c0lo (1497653)

        Forget that, what happens when it starts simulating thousands of gaming machines running Crysis on Vista?

        Bingo.

        The performance of universe rendering (a clever particle/wave algorithm) will start degrading.
        Now, our lead developer chose to degrade the rendering of details proportionally with the distance to the observer (they call it the Hubble constant)... Anyway, rendering the details far away will be done with lower priority, so they'll see the results later; also, when considering the radiosity rendering part (the part that deals with the wave nature of rendering), they'll see the light waves with a r

    • by BSAtHome (455370)

      The Thirteenth Floor, anyone?

      Recursion is a bitch.

    • by Brad1138 (590148) <brad1138@yahoo.com> on Sunday April 22, 2012 @10:13PM (#39767145)
      (obligatory Space Balls reference)
      You're looking at now, sir. Everything that happens now is happening now.
      What happened to then?
      We passed then.
      When?
      Just now. We're at now now.
      Go back to then.
      When?
      Now!
      Now?
      Now!
      I can't.
      Why?
      We missed it.
      When?
      Just now.
      When will then be now?
      Soon.
    • Look there- that one is petitioning his local school board to keep intelligent design out of the curriculum! Isn't that adorable? Let's simulate some lightning bolts and a flood and see what he does.
  • by IndustrialComplex (975015) on Sunday April 22, 2012 @06:47PM (#39766195)

    All we need is a pointer to Earth that says 'You are here.' and it's game over for us all!

  • DEUS... (Score:3, Insightful)

    by Nrrqshrr (1879148) on Sunday April 22, 2012 @06:52PM (#39766217)
    I didn't RTFA, but DEUS sounds like the perfect name for this project.
  • by stox (131684) on Sunday April 22, 2012 @06:52PM (#39766219) Homepage

    if there was a way to reduce entropy in the Universe yet?

  • by NetFusion (86828) on Sunday April 22, 2012 @06:55PM (#39766239)
    Gradually, the multiverse calculations our universe spawns will become more complex and longer-lived, until the secret of a self-sustaining calculation that uses the very fabric of spacetime as its computational engine is found, and grows rapidly with inflation to consume our universe and give birth to new ones. /tin foil
  • by Anonymous Coward on Sunday April 22, 2012 @06:59PM (#39766265)

    From Wikipedia's page "Galaxy":
    "There are probably more than 170 billion (1.7 × 10^11) galaxies in the observable universe."

    550 billion particles to simulate the observable universe means just over three particles per galaxy. I don't know exactly what they're doing but it doesn't sound like much of a simulation..?
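    The parent's arithmetic checks out; a quick sketch in Python (using the galaxy estimate quoted from Wikipedia above):

```python
# Particles per galaxy if every simulation particle were spread evenly
# across the observable universe's estimated galaxy count.
particles = 550e9   # DEUS simulation particles
galaxies = 170e9    # estimated galaxies in the observable universe
print(round(particles / galaxies, 2))  # -> 3.24
```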

    • by Anonymous Coward

      It's a simulation; the map is not the territory.

      • by tehcyder (746570)

        It's a simulation; the map is not the territory.

        No, but if you simplify a map too much it becomes useless. For example if you draw a big circle on a piece of paper and label it "the world", it's not going to help you circumnavigate the globe, although it's not actually wrong.

    • by Fluffeh (1273756) on Sunday April 22, 2012 @07:16PM (#39766323)

      550 billion particles to simulate the observable universe means just over three particles per galaxy. I don't know exactly what they're doing but it doesn't sound like much of a simulation..?

      That really depends on what you are trying to achieve. If you are not interested in the interactions going on inside each galaxy, but rather in the interactions between galaxies themselves, as well as things like filaments, clusters, and superclusters, this is more than enough particles. In fact, if each particle is assumed to be a galaxy, then the surplus may well have been introduced to catch failed galaxies, or to find where initial seeds may not have turned into fully fledged galaxies. They may also account for a small portion of the vast numbers of dwarf galaxies, to see how these interact with larger objects.
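      To make the "particle = galaxy-scale tracer" idea concrete, here is a minimal collisionless N-body step in Python. This is purely illustrative: direct summation with a softening length, in arbitrary units chosen here for the sketch. The actual DEUS runs use far more sophisticated solvers to handle 550 billion particles.

```python
import numpy as np

# Illustrative only: each "particle" stands in for a galaxy-scale mass
# tracer, not a star. G and SOFTENING are arbitrary sketch values.
G = 1.0
SOFTENING = 0.1  # avoids singular forces at small separations

def accelerations(pos, mass):
    # diff[i, j] = pos[j] - pos[i]; shape (N, N, 3)
    diff = pos[None, :, :] - pos[:, None, :]
    dist2 = (diff ** 2).sum(axis=-1) + SOFTENING ** 2
    inv_d3 = dist2 ** -1.5
    np.fill_diagonal(inv_d3, 0.0)  # no self-force
    return G * (diff * inv_d3[..., None] * mass[None, :, None]).sum(axis=1)

def leapfrog_step(pos, vel, mass, dt):
    # Kick-drift-kick: the standard symplectic integrator for this problem.
    half_vel = vel + 0.5 * dt * accelerations(pos, mass)
    new_pos = pos + dt * half_vel
    new_vel = half_vel + 0.5 * dt * accelerations(new_pos, mass)
    return new_pos, new_vel
```

      Two equal masses placed on the x-axis will fall toward each other under this step, and total momentum is conserved to machine precision.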

    • I'm not impressed. Now, when they can run a simulation with more particles than there are atoms in the computer, I'll be impressed. Heck, I'll make it easy: when they run a simulation with more particles than transistors in their CPUs, I'll be impressed. Let's see, 92,000 CPUs @ ~2B transistors/CPU = ~184T. Now that's a simulation.
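      The back-of-the-envelope number above, sketched out (the per-CPU transistor count is the commenter's rough figure, not a measured spec):

```python
# Total transistors across the machine, per the parent's rough figures.
cpus = 92_000
transistors_per_cpu = 2e9   # commenter's ballpark estimate
print(f"{cpus * transistors_per_cpu:.2e}")  # -> 1.84e+14 (~184 trillion)
```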

      • by doshell (757915)
        FWIW, "more particles than the atoms in the computer" would be impossible with current technology since (presumably) you'd need to store at least one bit of state per particle, and current computers need more than an atom to store a single bit.
        • Woosh! Guess I didn't trigger your sarcasm detector.

          • by doshell (757915)
            Indeed, you did not. Blame it on the ever decreasing quality of Slashdot comments; these days, some people here actually mean stuff like that when they say it. I do apologise for misjudging your intelligence.
            • I actually had a concern as I typed it that including the second sentence would detract from the sarcasm, guess I should have left it off.

              Great sig.

        • FWIW, "more particles than the atoms in the computer" would be impossible with current technology since (presumably) you'd need to store at least one bit of state per particle, and current computers need more than an atom to store a single bit.

          All you need is two bits. Map all the ones in your data to one location and all the zeros to the other.

      • by jovius (974690)

      Perhaps the most important question is what we are doing with the particles/waves/quantum states that we've got at our disposal.

        I mean I'm able to run a perfect simulation by crunching it with my 4 core CPU: the crunching of a CPU with an object modeled from metal and wood.

    • by AC-x (735297)

      As another comparison, an average grain of salt contains around 1.2×10^18 atoms versus this simulation's 5.5×10^11 particles. (source [physlink.com])

  • I was led to believe there would be faerie cake.

  • 550 billion particles? That's it? How exactly does that equate to a "full observable-universe simulation"? Last I checked, the minimum estimate for our galaxy alone was 100 billion stars. Multiply that by at least 100 billion other *galaxies* and we're looking at... uh... a much larger number to even begin to simulate the entire observable universe.

    I'm sure I'm significantly misunderstanding something about the simulation parameters though.

    • The number of particles is not, by itself, the relevant measure.
      You can run a one-billion-particle simulation of a single galaxy or of the whole universe; the purpose is different.
      In this respect, a particle can represent a single star in one galaxy or a single galaxy in the universe. Large scale structures in the universe don't depend on the exact location of each star in each galaxy.

      • by jbengt (874751)

        Large scale structures in the universe don't depend on the exact location of each star in each galaxy.

        Let's do a simulation to check your hypothesis! We'll only need a few billion more DEUSs.

  • Does the simulated universe contain intelligent lifeforms who have built universe-simulating supercomputers?

    • by jim_deane (63059)

      Why not? This one does.

      • Fortunately the universe is simulated on windo..... Wow, I didn't know the last thing I saw would be white writing on a blue background.

        In other news, I kind of doubt these muckabouts considered running the simulation with 4+ dimensional dark matter/energy.

        It's like they don't even know that non-3D matter cannot interact directly with 3D matter except through gravitation! Ric Romero reporting...

    • Does the simulated universe contain intelligent lifeforms who have built universe-simulating supercomputers?

      Current replies:

      No, why should it? This one doesn't either.

      Why not? This one does.

      I suppose that covers all the bases.

  • And when it returns a response of "42," Douglas Adams will die laughing...

    no, wait...

  • Nice Machine (Score:5, Interesting)

    by kramulous (977841) on Monday April 23, 2012 @02:04AM (#39768023)

    Interesting to note that they didn't bother with many GPU nodes. This reflects what we see with our users, despite the abundance of marketing material from Nvidia.

    5040 'standard' compute nodes: dual E5-2680 processors; 64GB RAM
    360 'bulk' compute nodes: quad EX-X7560; 128GB RAM
    144 GPU nodes: dual M2050

    Another 90 'super' nodes on order: 128core, 512GB RAM

    Cores: 103,680
    GPUs: 288

    Almost token GPU offering. These guys must do real work on it.

    • Unless of course you count the 896 GPU cores per GPU node (448 cores per card, ×2 per box, ×144 boxes, for a cool 129,024 GPU cores).

      Yeah, token GPU work... Seems to me like they sized the CUDA-capable portion of the machine quite appropriately, considering it's only a specific set of operations that can be accelerated in the first place.

      Take a look at what gets accelerated by BOINC projects on NVIDIA / ATI GPU cores. Some projects cannot be sped up at all by CUD
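      The GPU-core arithmetic above is easy to verify:

```python
# GPU cores across the machine's GPU partition.
cores_per_card = 448   # NVIDIA Tesla M2050 CUDA cores
cards_per_node = 2
gpu_nodes = 144
print(cores_per_card * cards_per_node * gpu_nodes)  # -> 129024
```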

      • by kramulous (977841)

        Sure. You can quote whatever numbers you like.
        CPU cores: 103,680
        GPU cores: 129,024

        The total machine is 2 petaflops, and the GPUs contribute less than 10% of that.

        Look, I realise that for the right job the GPU is superior. But it is nowhere near what we are being led to believe (again, according to the marketing material). I'm sure the people commissioning this machine knew what they were doing, and what they needed was raw x86_64 grunt.

        I'm not interested in a single program's performance on the GPU. I'm in

  • ...and now I'm on the lookout for a girl with purple hair named Miang.

  • then they probably know where I'll be tomorrow (just don't tell my girlfriend)
  • Is any genuine science being done here? Running simulations to model, say, the weather or ocean currents makes sense. You can calibrate them to past data and use them predictively. How does a simulation of the "universe" tell you anything?
    • Is any genuine science being done here? Running simulations to model, say, the weather or ocean currents makes sense. You can calibrate them to past data and use them predictively. How does a simulation of the "universe" tell you anything?

      Take the starting state and mechanisms suggested by theory, run it, compare the result to the actual universe now.

  • How are they going to verify it experimentally?

  • There are 3.34×10^22 molecules of H2O in one gram of water. That is sixty billion or so times more particles than are in this simulation. Astro calcs have just been including more and more particles since the first one, with two interacting particles. The number of objects (stars/solar systems/galaxies/clusters/superclusters, etc.) that each particle is supposed to represent has just been getting smaller as we get faster and faster computers.
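    The water comparison can be checked the same way:

```python
# Molecules in one gram of water versus particles in the simulation.
water_molecules = 3.34e22   # molecules of H2O per gram
sim_particles = 5.5e11      # DEUS simulation particles
print(f"{water_molecules / sim_particles:.1e}")  # -> 6.1e+10
```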
