Space Supercomputing

Simulating Galaxies With Supercomputers 120

An anonymous reader writes "Over in the UK, Durham University is tasking its supercomputing cluster with nothing less than recreating how galaxies are born and evolve over the course of billions of years. Even with 800 AMD processor cores at its disposal, the university is still hitting the limits of what is possible."
  • by mehrotra.akash ( 1539473 ) on Tuesday September 14, 2010 @01:02PM (#33577550)

    "800 AMD processor cores" is useless knowledge on its own; we need more info. Are they ultra-low-power cores like Atom/Bobcat, or extremely high-clocked ones such as the i7 980x / Phenom x6 1090, etc.?

    Also, the article says they have 1600GB of RAM; isn't RAM normally in powers of 2?

    • by blueg3 ( 192743 )

      Key to puzzling this out: it's a computing cluster, and 800 isn't a power of 2 either.

    • Re: (Score:3, Interesting)

      by Antisyzygy ( 1495469 )
      I'd guess they are running Opteron CPUs, maybe up to 8 cores each. So that means 50-100 machines in a cluster: 1600/50 = 32 GB per machine, or 1600/100 = 16 GB per machine.
    • by gerddie ( 173963 )
      How about 100 nodes of 2x Phenom x4 @16GB?
    • I'd guess they are running Opteron CPUs, maybe up to 8 cores each. So that means 50-100 machines in a cluster: 1600/50 = 32 GB per machine, or 1600/100 = 16 GB per machine.

      And that does not preclude them from using dual-socket motherboards with two 4-core Opterons each.

    • by rubycodez ( 864176 ) on Tuesday September 14, 2010 @01:30PM (#33578090)

      Mostly Opteron 175s (528 of them at 2.2 GHz, with 1056GB RAM total) and 285s (256 of them at 2.6 GHz, with 512GB RAM total), so about 2GB of RAM each.

      They run Solaris 10 U3.

      http://icc.dur.ac.uk/icc.php?content=Computing/Cosma [dur.ac.uk]

      • The mystery is solved. Their cluster is old and slow.

        • A school isn't going to dump its supercomputer in the garbage every three years like a PC gamer. Heck, it probably took a year just to get the hardware project proposed and approved.

          Funny how many here assumed it would be an octo-core monster. Instead, we find that hardware a few years old is actually useful.

          • Perhaps they're angling for another grant.

            Dr Lydia Heck, the ICC's computer cluster manager, said the ICC had maxed out its supercomputing cluster's processors and memory by running a simulation of the effect of dark matter on how galaxies are formed.

            And the maxed-out cluster is not even using large-scale models.

            Physicists have to simplify the cosmological models they use in order to get ones that produce data sets small enough to be accurately processed by the 64-bit chips in the supercomputing cluster, and which can fit into the cluster's available memory.

    • Re: (Score:2, Informative)

      by Anonymous Coward

      Details are here:
      http://icc.dur.ac.uk/icc.php?content=Computing/Cosma

      The Cosmology Machine (COSMA) was first switched on in July 2001. From the original system, only 2 TByte of dataspace is still online. 64 SunBlade 1000s with a total of 64 GByte of RAM were donated by the ICC in February 2006 to the Kigali Institute for Science and Technology in Kigali, the capital of Rwanda. Sun Microsystems paid for their transport by air.

      In February 2004, QUINTOR was installed. QUINTOR consists of a 256 SunFire V210s w

    • 1600GB RAM = 2GB RAM per core. That's a power of 2.
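For what it's worth, here is a quick sanity check of the figures quoted in this thread (node counts and RAM totals are taken from the comments above, not independently verified; the remaining ~16 cores / ~32GB of the article's "800 cores / 1600GB" are presumably service nodes not listed there):

```python
# Recompute RAM per core from the cluster specs quoted above.
nodes = {
    "Opteron 175 partition": {"cores": 528, "ram_gb": 1056},
    "Opteron 285 partition": {"cores": 256, "ram_gb": 512},
}

for name, n in nodes.items():
    print(f"{name}: {n['ram_gb'] / n['cores']:.1f} GB per core")

total_cores = sum(n["cores"] for n in nodes.values())
total_ram = sum(n["ram_gb"] for n in nodes.values())
print(f"total: {total_cores} cores, {total_ram} GB RAM, "
      f"{total_ram / total_cores:.1f} GB per core")
# -> 2.0 GB per core in both partitions; 784 cores and 1568 GB here,
#    close to the 800 cores / 1600 GB quoted in the article summary.
```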

  • Easier way (Score:4, Funny)

    by Yvan256 ( 722131 ) on Tuesday September 14, 2010 @01:02PM (#33577564) Homepage Journal

    They should have asked The Doctor to simply record the event when he re-booted the Universe.

    • The BBC already did. Let's just hope they don't overwrite the tapes.

    • No, we just need the Doctor to loan us the TARDIS. Scientists could learn how the universe started and how we aren't alone, maybe even about the threats to our planet, including the Daleks, the Cybermen, and the real policemen of the universe, the Judoon, the real enforcers of the universe.

    • Just find where the MiB storage is ('the galaxy is in Orion's belt'). Not really a simulation, though, but an actual galaxy.

  • Gaaahhhh (Score:1, Insightful)

    by Anonymous Coward

    Let's simulate a single cell, then an organism, then aging. Then we can start extending our lifespan. THEN we can start living, not just this handful of years between being a powerless child and a weak, aging adult. Then you can worry about galaxies.

    • by tom17 ( 659054 )
      This. And quickly; I'm running out of time.
    • First of all, people are trying to do this already, probably on much more powerful hardware. Second, simulating an organism to the level that we need is probably a lot more demanding than simulating a galaxy to the level that these people need.
    • Re: (Score:3, Insightful)

      Let's simulate a single cell, then an organism, then aging. Then we can start extending our lifespan. THEN we can start living, not just this handful of years between being a powerless child and a weak, aging adult. Then you can worry about galaxies.

      What does an astrophysicist know about cellular biology? Probably about as much as a biologist knows about astrophysics.

      Compounding that, we wouldn't have made a fraction of the scientific progress to date if we focused on a single discipline until it was maste

      • I wonder if, theoretically, in a simulated galaxy in which all particles and physical rules have been accounted for, some kind of simulated life-form could evolve from the interactions of those particles and rules. And would that simulated life be any different from real life (reference to "The Thirteenth Floor")? I mean, for the living forms inside the simulation it would be just like real life. I'm wondering if there is anything in theory to contradict this. Is it possible?
    • You mean like this?

      http://www.popularmechanics.com/technology/engineering/extreme-machines/4337190 [popularmechanics.com]

      Well, it's simulating neurons... I suppose that's close enough.

    • THEN we can start living, not just this handful of years between being a powerless child and a weak, aging adult. Then you can worry about galaxies.

      Nature already has a process for eternal renewal - death and birth. Our species as a whole has no pre-determined expiration date, and the ability to pass information through the generations. What difference does it really make whether it's us individually that live on, or our descendants?

      • I am not my descendants. I'd love to see the new discoveries that will be made about our place in the cosmos, the new technologies and art and culture that will be created in the distant future for myself, as would a great many others.
      • And old people need to die or no progress would be made; racism didn't start dying till the slave owners started dying.
  • Did anyone consider ahead of time how many calculations would be necessary before they invested all that money?
  • Don't want to wait 15 billion years to see the next Blue Screen of the Big Bang.
  • by Sonny Yatsen ( 603655 ) * on Tuesday September 14, 2010 @01:19PM (#33577890) Journal

    It's interesting to think that the university is attempting to use 800 processor cores to simulate galaxies, when IBM uses 147,456 processors to do a neuron-by-neuron simulation of the human brain.

    • Well, I'd imagine it takes quite a bit longer than the IBM supercomputer to do an equal amount of work.
      • Well, I'd imagine it takes quite a bit longer than the IBM supercomputer to do an equal amount of work.

        An equal amount of computation, sure. But how much computation is necessary to get useful results? Both may not be working on problems of equal magnitude.

        A backhoe can move more dirt than I can with a shovel, but if all I have to move is 1 cubic meter, and the backhoe has to move 1000... my workload is still a lot less.

    • He's too busy fighting shambling men in rubber masks.
    • by tibit ( 1762298 )

      They don't do a simulation of the entire brain, just a part of the cortex. And their simulation runs at maybe 1% of real time.

    • There's a big difference in the problem. Namely, it's possible to work at a coarser level of granularity when dealing with galaxies. You might not be able to simulate individual stars, but you can simulate star clusters and the clumps of dark matter to get approximations. With the brain simulation, it's not possible to abstract away as much detail, hence the higher hardware requirements.
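To put a toy example behind the granularity point: seen from far enough away, a whole cluster of stars pulls almost exactly like a single particle of the same total mass sitting at the cluster's centre of mass, which is why cosmological codes can use far fewer simulation particles than there are stars. A minimal NumPy sketch (my own illustration, not anything from Durham's actual code):

```python
import numpy as np

rng = np.random.default_rng(42)

# A "cluster": 10,000 equal-mass stars in a small blob (arbitrary units, G = 1).
n_stars = 10_000
star_mass = 1.0 / n_stars                      # total cluster mass = 1
stars = rng.normal(scale=0.05, size=(n_stars, 3))

observer = np.array([10.0, 0.0, 0.0])          # a point well outside the blob

def accel(masses, positions, point):
    """Direct-sum gravitational acceleration at `point` (G = 1)."""
    d = positions - point
    r = np.linalg.norm(d, axis=1)
    return np.sum((masses / r**3)[:, None] * d, axis=0)

# Exact: sum over every individual star.
a_exact = accel(np.full(n_stars, star_mass), stars, observer)

# Coarse-grained: one "super-particle" of total mass at the centre of mass.
com = stars.mean(axis=0)
a_coarse = accel(np.array([1.0]), com[None, :], observer)

err = np.linalg.norm(a_exact - a_coarse) / np.linalg.norm(a_exact)
print(f"relative error from coarse-graining: {err:.2e}")   # roughly 1e-4 or smaller
```

The error grows as the observer gets closer to the cluster, which is exactly the trade-off that tree codes manage with their opening criteria.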

  • by cowtamer ( 311087 ) on Tuesday September 14, 2010 @01:24PM (#33577988) Journal

    The galaxies in the simulation develop planets, scientists, and their own Galaxy Simulators???

    Has anyone else been bothered by the fact that energy is quantized? It always made me feel like we were looking at pixels we weren't supposed to see :)

    • by tom17 ( 659054 )
      Well, it'd be a bit of a round-the-houses method, but we'd have invented AI... Finally!
    • The galaxies in the simulation develop planets, scientists, and their own Galaxy Simulators???

      Has anyone else been bothered by the fact that energy is quantized? It always made me feel like we were looking at pixels we weren't supposed to see :)

      Why should I be bothered? If you look at it just the right way, it looks like...

      Turtles.

    • Re: (Score:3, Insightful)

      by dominious ( 1077089 )
      Ref. to "The 13th floor". I just posted something similar a few posts before:)
    • by mangu ( 126918 ) on Tuesday September 14, 2010 @04:22PM (#33580522)

      Has anyone else been bothered by the fact that energy is quantized?

      Even more significant is that there's an intrinsic speed limitation [wikipedia.org] in a simulation.

      When you simulate a continuous medium by dividing it into small space and time steps, there's a speed "c", equal to the space step divided by the time step (c = dx/dt), which cannot be exceeded by anything in the simulation.
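A toy illustration of that limit (generic, not from TFA): with a nearest-neighbour update rule, a disturbance can reach at most one more cell per time step, so nothing in the simulation ever moves faster than dx/dt.

```python
# 1D lattice, one "lit" cell in the middle.  Each step, a cell becomes lit
# if it or either neighbour was lit; this is the fastest any local
# nearest-neighbour rule can spread information.
dx, dt = 1.0, 1.0                 # space step and time step (arbitrary units)
n = 101
lit = [False] * n
lit[n // 2] = True

for step in range(1, 11):
    lit = [lit[i] or (i > 0 and lit[i - 1]) or (i < n - 1 and lit[i + 1])
           for i in range(n)]
    front = max(abs(i - n // 2) for i, on in enumerate(lit) if on)
    speed = front * dx / (step * dt)
    print(f"step {step:2d}: front at {front} cells, speed = {speed:.1f}")
# The measured speed never exceeds dx/dt = 1.0: the lattice has its own "c".
```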

    • The simulation argument [simulation-argument.com] paper proposes a philosophical argument about this sort of thing. The consequences that they come up with are pretty interesting. Of course, there are arguments [pagesperso-orange.fr] against [imminst.org] such a configuration of the universe as well...

  • Waste of Time (Score:1, Informative)

    by MrTripps ( 1306469 )
    Let me save those guys some time: 42
  • Obligatory XKCD http://xkcd.com/505/ [xkcd.com]
  • by vlm ( 69642 ) on Tuesday September 14, 2010 @01:38PM (#33578234)

    Even with 800 AMD processor cores at its disposal, the university is still hitting the limits of what is possible.

    Meaningless, uninformed journalist BS filler puff. What is possible is simulating every subatomic particle in the universe at Planck-time intervals for the total age of the universe, repeatedly, for an infinite combination of different cosmological constants, to see what you get. That will never be done, of course.

  • by peter303 ( 12292 ) on Tuesday September 14, 2010 @01:49PM (#33578406)
    Every year they can run more detailed models. And they become cleverer in their modeling; for example, by aggregating gravity fields.
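"Aggregating gravity fields" presumably means tree/mesh-style approximations (Barnes-Hut and friends): distant groups of particles are replaced by their total mass at their centre of mass, cutting the force calculation from O(N^2) toward O(N log N). A crude flat-grid version of the idea, just to show its shape (my own sketch, not the ICC's code):

```python
import numpy as np

rng = np.random.default_rng(0)
G = 1.0

# Toy setup: N equal-mass particles scattered in a unit box.
n = 20_000
pos = rng.random((n, 3))
mass = np.full(n, 1.0 / n)

# Bin particles into a coarse grid; each cell's total mass and centre of mass
# are the pre-computed "aggregate gravity fields".
cells = 8
idx = np.minimum((pos * cells).astype(int), cells - 1)
flat = np.ravel_multi_index(idx.T, (cells, cells, cells))

cell_mass = np.zeros(cells**3)
np.add.at(cell_mass, flat, mass)
cell_com = np.zeros((cells**3, 3))
np.add.at(cell_com, flat, mass[:, None] * pos)
occupied = cell_mass > 0
cell_com[occupied] /= cell_mass[occupied, None]

def accel(point, own_cell):
    """Acceleration at `point`: exact sum over its own cell, aggregated
    centre-of-mass contribution from every other occupied cell."""
    a = np.zeros(3)
    near = flat == own_cell                  # particles sharing the cell
    d = pos[near] - point
    r = np.linalg.norm(d, axis=1)
    keep = r > 0                             # skip the particle itself
    a += G * np.sum((mass[near][keep] / r[keep]**3)[:, None] * d[keep], axis=0)
    far = occupied & (np.arange(cells**3) != own_cell)
    d = cell_com[far] - point
    r = np.linalg.norm(d, axis=1)
    a += G * np.sum((cell_mass[far] / r**3)[:, None] * d, axis=0)
    return a

print("acceleration on particle 0:", accel(pos[0], flat[0]))
```

Real tree codes refine this with an opening-angle criterion: a cell is only used in aggregated form if it is far enough away relative to its size; otherwise it is split into sub-cells.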
  • Let's assume that they are trying to simulate the formation of a small galaxy... that would be no more than 100 million stellar masses. That's still a lot of points, a whole lot of calculations.
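Rough numbers (my own back-of-envelope, naively giving each stellar mass its own particle, which real codes avoid for exactly this reason):

```python
# Cost of direct pairwise gravity for ~1e8 particles.
n = 100_000_000                   # one particle per stellar mass
pairs = n * (n - 1) // 2          # interactions per time step (direct summation)
flops_per_pair = 30               # very rough cost of one softened force kernel
steps = 10_000                    # illustrative number of time steps

total = pairs * flops_per_pair * steps
print(f"{pairs:.1e} pairs per step, ~{total:.1e} flops for the whole run")
# ~5e15 pairs per step and ~1.5e21 flops in total, which is why approximation
# schemes (trees, particle-mesh) rather than brute force are used at this scale.
```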

  • by Anonymous Coward

    The GRAPE-5 does N-body simulations using specialized hardware that is faster than a standard CPU: http://en.wikipedia.org/wiki/Gravity_Pipe [wikipedia.org]
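For context, the inner loop that GRAPE-style boards bake into silicon is essentially the softened pairwise force sum; in plain Python it looks something like this (a sketch of the maths only, nothing like the actual hardware pipeline):

```python
def pairwise_accels(masses, positions, eps=1e-3, G=1.0):
    """Direct O(N^2) gravitational accelerations with Plummer softening:
    the per-pair kernel that GRAPE-type hardware evaluates."""
    n = len(masses)
    acc = [[0.0, 0.0, 0.0] for _ in range(n)]
    for i in range(n):
        xi, yi, zi = positions[i]
        for j in range(n):
            if i == j:
                continue
            dx = positions[j][0] - xi
            dy = positions[j][1] - yi
            dz = positions[j][2] - zi
            inv_r3 = (dx * dx + dy * dy + dz * dz + eps * eps) ** -1.5
            f = G * masses[j] * inv_r3
            acc[i][0] += f * dx
            acc[i][1] += f * dy
            acc[i][2] += f * dz
    return acc

# Two unit masses one unit apart: accelerations of ~1.0 toward each other.
print(pairwise_accels([1.0, 1.0], [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]))
```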

  • What happens when the simulation gets to the point where humanity is 'advanced' enough technologically to try to model the universe with supercomputers? It's an obvious infinite loop that will cause the universe to crash... and they are professors? Sheesh.
  • Simulating galaxies?? Why not use it for something useful -- like ray tracing Wolf3d?!

  • Eight hundred cores of any type is TINY as far as supercomputers go. Most large US universities generally have at least one (if not several) supercomputers that are multiple times (if not an order of magnitude) larger than this. Never mind that most research projects on supercomputers NEVER use the whole system at once - it's more of a timeshare thing where you book however many threads for a certain length of time. Yes, the prospect of the research is interesting. That being said, other than that there
  • Let them simulate the Milky Way. I'm curious as to whether or not they will be able to simulate the genesis of life on Earth. That will be interesting.
    Hey, maybe if they let the simulation run long enough, the simulated earthlings will make their own simulation.
  • There must be a principle out there somewhere that says the universe cannot be accurately simulated by anything smaller than the universe. And if there isn't can I invent it and call it The Principle of Computational Hopelessness?
    • Technically it would depend on the properties of the universe. If it's Turing complete, we can already simulate it given enough time and/or space. Since we're not doing a simulation of the whole timeframe of the universe but only its beginning, time (and hence space) is not an issue.
  • by Prune ( 557140 ) on Tuesday September 14, 2010 @04:19PM (#33580476)
    Be more careful with article summaries. They're worse than newspaper headlines these days. "Over in the UK, Durham University is tasking its supercomputing cluster with nothing less than recreating how galaxies are born and evolve over the course of billions of years" could describe any of the countless galaxy-evolution simulations that have been done for a couple of decades already at various places, and gives no indication as to what's new about this instance. In other words, the headline is at best absolutely uninformative and at worst misleading.
  • How are the numbers over at Cambridge University? I guess David Braben is working on this too...
  • ... they could simply ask Ceiling Cat to create a new galaxy and record it on IMAX.

  • the supercomputer in the virtual galaxy that is simulating a galaxy?
