Simulating Galaxies With Supercomputers
An anonymous reader writes "Over in the UK, Durham University is tasking its supercomputing cluster with nothing less than recreating how galaxies are born and evolve over the course of billions of years. Even with 800 AMD processor cores at its disposal, the university is still hitting the limits of what is possible."
Re: (Score:3, Informative)
Re: (Score:2)
Gravity doesn't have a range.
That's a rather 'tricky' statement, don't you think? First, I'll agree with you that gravity doesn't technically reach zero. But it does appear to have to propagate. In a system many thousands of lightyears across, the propagation delay would be significant.
Not only that, but wouldn't the galaxy have expanded several million, if not billions, of miles in the 27,000 years it would take for light to travel from one end to the other? (I'm not trusting my back-of-the-envelope calcua
Re: (Score:2)
Gravity doesn't have a range.
That's a rather 'tricky' statement, don't you think? First, I'll agree with you that gravity doesn't technically reach zero. But it does appear to have to propagate. In a system many thousands of lightyears across, the propagation delay would be significant.
I'm not sure how this makes my statement about gravity not having a range "tricky"; but it's definitely something worth thinking about. Several thousand lightyears is actually a pretty small-scale simulation; the simulation volume wouldn't be large enough to contain a typical bright galaxy. Cosmological simulations incorporating galaxy formation typically use volumes tens or even hundreds of megaparsecs (Mpc) on a side. Imagine you're working with a 100 Mpc per-side cube (for the cognoscenti: taking h=1
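To put rough numbers on a box like that (an illustrative sketch, not figures from the article: the 512^3 particle count and Omega_m = 0.3 are assumptions; the 100 Mpc box and h = 1 come from the comment above):

```python
# Rough particle-mass estimate for a cosmological N-body box
# (illustrative numbers: 100 Mpc box, 512^3 particles, Omega_m = 0.3, h = 1).
import math

G = 6.674e-11                    # gravitational constant, m^3 kg^-1 s^-2
MPC = 3.086e22                   # metres per megaparsec
H0 = 100e3 / MPC                 # h = 1, i.e. 100 km/s/Mpc, in s^-1
M_SUN = 1.989e30                 # solar mass in kg

rho_crit = 3 * H0**2 / (8 * math.pi * G)   # critical density, kg/m^3
omega_m = 0.3                              # matter fraction (assumed)
box = 100 * MPC                            # box side length in metres
n_particles = 512**3

m_particle = omega_m * rho_crit * box**3 / n_particles
print(f"{m_particle / M_SUN:.2e} solar masses per particle")
```

That works out to several hundred million solar masses per simulation particle, which is why individual stars are nowhere near resolved at these volumes.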
Re: (Score:1)
In a simulation like this, the most important physical effect to model is gravity.
No, it's not. Or it is. Actually, you can't say until you run it and compare the results with other setups and, better yet, with what you see in the Cosmos.
Imagine that every galaxy is, more or less, surrounded by all the others; thus, unless you have superclusters in proximity, gravity cancels out. Seen as a system with components, though, the gravitational interactions between individual stars (from which the shape of the galaxy emerges) are important. Yet forces such as e/m might be mild, but operat
800 AMD processor cores (Score:4, Interesting)
"800 AMD processor cores" on its own is useless; we need more info. Are they ultra-low-power cores like Atom/Bobcat, or extremely high-clocked ones such as the i7 980X/Phenom X6 1090, etc.?
Also, the article says they have 1600GB of RAM; isn't RAM normally in powers of 2?
Re: (Score:2)
Key to puzzling this out: it's a computing cluster, and 800 isn't a power of 2 either.
Re: (Score:3, Interesting)
Re: (Score:2)
Re: (Score:2)
I'd guess they are running Opteron CPUs, maybe up to 8 cores each. So that means 50-100 machines in a cluster: 1600 / 50 = 32 GB per machine, or 1600 / 100 = 16 GB per machine.
And that does not preclude them using dual-socket motherboards with two 4-core Opterons.
Re:800 AMD processor cores (Score:4, Informative)
Mostly Opteron 175s (528 of them at 2.2 GHz with 1056GB RAM total) and 285s (256 of them at 2.6GHz with 512GB RAM total), so about 2GB RAM each.
They run Solaris 10 u3.
http://icc.dur.ac.uk/icc.php?content=Computing/Cosma [dur.ac.uk]
Re: (Score:2)
The mystery is solved. Their cluster is old and slow.
Re: (Score:2)
A school isn't going to dump its supercomputer in the garbage every three years like a PC gamer. Heck, it probably took a year to get the hardware project proposed and approved.
Funny how many here assumed it would be an octo-core monster. Instead we find hardware a few years old is actually useful.
Re: (Score:2)
Perhaps they're angling for another grant.
Dr Lydia Heck, the ICC's computer cluster manager, said the ICC had maxed out its supercomputing cluster's processors and memory by running a simulation of the effect of dark matter on how galaxies are formed.
And the maxed-out cluster is not even using large scale models.
Physicists have to simplify the cosmological models they use in order to get ones that produce data sets small enough to be accurately processed by the 64-bit chips in the supercomputing cluster, and which can fit into the cluster's available memory.
Re: (Score:2, Informative)
Details are here:
http://icc.dur.ac.uk/icc.php?content=Computing/Cosma
The Cosmology Machine (COSMA) was first switched on in July 2001. From the original system, only 2 TByte of dataspace is still online. 64 SunBlade 1000s with a total of 64 GByte of RAM were donated by the ICC in February 2006 to the Kigali Institute for Science and Technology in Kigali, the capital of Rwanda. Sun Microsystems paid for their transport by air.
In February 2004, QUINTOR was installed. QUINTOR consists of 256 SunFire V210s w
Re: (Score:2)
1600GB RAM = 2GB RAM per core. That's a power of 2.
Easier way (Score:4, Funny)
They should have asked The Doctor to simply record the event when he re-booted the Universe.
Re: (Score:2)
The BBC already did. Let's just hope they don't overwrite the tapes.
Re: (Score:1)
No, we just need the Doctor to loan us the TARDIS. Scientists could learn how the universe started and how we aren't alone; maybe even about the threats to our planet, including the Daleks, the Cybermen, and the real policemen of the universe, the Judoon, the real enforcers of the universe.
Re: (Score:2)
Just find where the MiB storage is - 'the galaxy is in Orion's belt' - not really a simulation though but an actual galaxy.
Re: (Score:2)
Yeah but a single Galaxy is nothing compared to the Universe.
Gaaahhhh (Score:1, Insightful)
Let's simulate a single cell, then an organism, then aging. Then we can start extending our lifespan. THEN we can start living, not just this handful of years between being a powerless child and a weak, aging adult. Then you can worry about galaxies.
Re: (Score:2)
Re: (Score:2)
I'm still waiting for my cyborg body...
Re: (Score:2)
Re: (Score:3, Insightful)
Let's simulate a single cell, then an organism, then aging. Then we can start extending our lifespan. THEN we can start living, not just this handful of years between being a powerless child and a weak, aging adult. Then you can worry about galaxies.
What does an astrophysicist know about cellular biology? Probably about as much as a biologist knows about astrophysics.
Compounding that, we wouldn't have made a fraction of the scientific progress to date if we focused on a single discipline until it was maste
Re: (Score:2)
Re: (Score:2)
You mean like this?
http://www.popularmechanics.com/technology/engineering/extreme-machines/4337190 [popularmechanics.com]
Well, it's simulating neurons... I suppose that's close enough.
Re: (Score:2)
Nature already has a process for eternal renewal - death and birth. Our species as a whole has no pre-determined expiration date, and the ability to pass information through the generations. What difference does it really make whether it's us individually that live on, or our descendants?
Re: (Score:1)
Re: (Score:1)
too many atoms, not enough processors (Score:1)
Re: (Score:2)
We wouldn't have had this story if they had.
Does it run Windows? (Score:2)
Brain vs. Galaxy Simulation (Score:4, Interesting)
It's interesting to think that the university is attempting to use 800 processor cores to simulate galaxies, when IBM uses 147,456 processors to do a neuron-by-neuron simulation of the human brain.
Re: (Score:2)
Re: (Score:2)
Well, I'd imagine it takes quite a bit longer than the IBM supercomputer to do an equal amount of work.
An equal amount of computation, sure. But how much computation is necessary to get useful results? Both may not be working on problems of equal magnitude.
A backhoe can move more dirt than I can with a shovel, but if all I have to move is 1 cubic meter, and the backhoe has to move 1000... my workload is still a lot less.
Re: (Score:2)
Re: (Score:2)
They don't do a simulation of the entire brain, just a part of the cortex. And their simulation runs at maybe 1% of real time.
Re: (Score:2)
There's a big difference in the problem. Namely, it's possible to work at a coarser level of granularity when dealing with galaxies. You might not be able to simulate individual stars, but you can simulate star clusters and the clumps of dark matter to get approximations. With the brain simulation, it's not possible to abstract away as much detail, hence the higher hardware requirements.
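A back-of-the-envelope sketch of that granularity gap (illustrative order-of-magnitude numbers, not from the article):

```python
# Why coarse-graining helps for galaxies but not brains
# (illustrative order-of-magnitude numbers).
stars_in_galaxy = 1e11        # a bright spiral galaxy, roughly
sim_particles = 1e8           # what a cluster of this class might afford
print(stars_in_galaxy / sim_particles)   # -> 1000.0 stars lumped per particle
# Gravity depends only on the mass distribution, so lumping a thousand
# stars into one particle is a fair approximation. A neuron, by contrast,
# cannot stand in for a thousand of its neighbours.
```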
Simulating a Galaxy on a computer? (Score:2)
How long before ... (Score:3, Interesting)
The galaxies in the simulation develop planets, scientists, and their own Galaxy Simulators???
Has anyone else been bothered by the fact that energy is quantized? It always made me feel like we were looking at pixels we weren't supposed to see :)
Re: (Score:2)
Re: (Score:2)
The galaxies in the simulation develop planets, scientists, and their own Galaxy Simulators???
Has anyone else been bothered by the fact that energy is quantized? It always made me feel like we were looking at pixels we weren't supposed to see :)
Why should I be bothered? If you look at it just the right way, it looks like...
Turtles.
Re: (Score:3, Insightful)
Speed of light in a simulation (Score:5, Interesting)
Even more significant is that there's an intrinsic speed limitation [wikipedia.org] in a simulation.
When you simulate a continuous medium by dividing it into small space and time steps, there's a speed "c" that's equal to the space step divided by the time step which cannot be exceeded by anything in the simulation.
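A minimal toy demonstration of that limit (a hypothetical nearest-neighbour update rule, nothing to do with the ICC's actual code): with any rule that only couples adjacent cells, information can move at most one cell per step, i.e. at speed dx/dt.

```python
# In a grid simulation where each cell's update depends only on its
# nearest neighbours, a disturbance spreads at most one cell per step,
# so the effective "speed of light" is (cell size) / (time step).
import numpy as np

def step(field):
    # toy update rule: each cell averages itself with its two neighbours
    return (np.roll(field, 1) + field + np.roll(field, -1)) / 3.0

n = 101
field = np.zeros(n)
field[50] = 1.0          # point disturbance in the centre

steps = 10
for _ in range(steps):
    field = step(field)

reached = np.nonzero(field)[0]                 # cells the signal has touched
max_distance = int(np.abs(reached - 50).max())
print(max_distance)      # -> 10: exactly one cell per step, never more
```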
Re: (Score:2)
In conclusion, God sucks at collision checking.
reminds me of the "Simulation Argument" (Score:2, Interesting)
The simulation argument [simulation-argument.com] paper proposes a philosophical argument about this sort of thing. The consequences that they come up with are pretty interesting. Of course, there are arguments [pagesperso-orange.fr] against [imminst.org] such a configuration of the universe as well...
Waste of Time (Score:1, Informative)
Re: (Score:1, Interesting)
Let me save those guys some time: 42
What were the input params again?
Re: (Score:3, Informative)
Re: (Score:3, Funny)
Re: (Score:2)
Obligatory XKCD (Score:1, Funny)
Meaningless uninformed journalist bs (Score:4, Insightful)
Even with 800 AMD processor cores at its disposal, the university is still hitting the limits of what is possible.
Meaningless uninformed journalist BS filler puff. What is possible is simulating every subatomic particle in the universe at Planck-time intervals for the total age of the universe, repeatedly, for an infinite combination of different cosmological constants to see what you get. That will never be done, of course.
Re: (Score:1)
Re: (Score:1)
Perhaps the simulator's universe is bigger than ours, or their computer is in a 4D universe or they found a way to tap into a 4th dimension.
Re: (Score:1)
Re: (Score:1)
I don't think I'd be their first choice if they had such ability.
Re: (Score:1)
computing astronomers doing this for decades (Score:4, Insightful)
Of course it is overwelmed (Score:2)
Let's assume that they are trying to simulate the formation of a small galaxy... that would be no more than 100 million stellar masses. That's still a lot of points, a whole lot of calculations.
grape processors are faster (Score:2, Informative)
The GRAPE-5 does N-body simulations using specialized hardware that is faster than a standard CPU: http://en.wikipedia.org/wiki/Gravity_Pipe [wikipedia.org]
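For context, the kernel a GRAPE board hardwires is just softened pairwise Newtonian gravity. A sketch of that kernel in NumPy (G = 1; the softening value is chosen for illustration, not taken from GRAPE's spec):

```python
# Softened direct-summation gravity: the operation GRAPE implements in
# hardware, written out in NumPy (G = 1).
import numpy as np

def accelerations(pos, mass, eps=1e-4):
    """pos: (N, 3) positions; mass: (N,) masses; eps: softening length."""
    diff = pos[None, :, :] - pos[:, None, :]      # r_j - r_i for all pairs
    dist2 = (diff ** 2).sum(-1) + eps ** 2        # softened squared distance
    inv_r3 = dist2 ** -1.5
    np.fill_diagonal(inv_r3, 0.0)                 # exclude self-interaction
    # a_i = sum_j m_j (r_j - r_i) / |r_j - r_i|^3
    return (diff * (mass[None, :, None] * inv_r3[:, :, None])).sum(1)

# two unit masses at unit separation: each feels |a| = 1 toward the other
pos = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
acc = accelerations(pos, np.array([1.0, 1.0]))
print(acc[0])   # ~[1, 0, 0]
```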
Recursive Loop? (Score:1)
Simulating universe on big scale is (Score:2)
and in a very interesting way.
Better uses (Score:2)
Simulating galaxies?? Why not use it for something useful -- like ray tracing Wolf3d?!
Compared to what? (Score:1)
curious (Score:1)
Hey, maybe if they let the simulation run long enough, the simulated earthlings will make their own simulation.
Fidelity of Mathematical Models (Score:2)
Re: (Score:2)
Please Slashdot editors, (Score:4, Interesting)
Cambridge University (Score:1)
Don't know why they go through this when... (Score:2)
... they could simply ask Ceiling Cat to create a new galaxy and record it on IMAX.
Does it take into account (Score:1)
Re: (Score:2)
Re: (Score:2)
Not necessarily; 100 VMs means 8 cores and 16GB RAM per VM.
Assuming these cores are not equivalent to Atom cores but something faster, it still doesn't say anything about the graphics hardware.
Re: (Score:3, Interesting)
Re: (Score:2)
Re: (Score:2)
I don't believe that at all; it sounds like marketing-speak. Intel's using, what, eight CPUs to do real-time RAY TRACING, and that's MORE demanding than the rasterizing paradigm that modern GPUs are based on. Certainly a GPU is more specialized and efficient than a similar-scale general-purpose CPU, but I think the performance ratio is closer to 4:1 than 100:1.
Re: (Score:2)
IIRC those CPUs have some GPU components built into the die, which is why that is possible. In a straight competition on most normal graphics-rendering-type equations, the actual ratio is closer to 100:1 than 4:1.
Please correct me if I'm wrong. However, it very much depends on the equations. There are things that GPUs can do but are bad at, and things they can't do at all.
Either way, having a good setup that makes use of the strengths of both types of processor is going to be the optimal solution.
Re: (Score:2)
I don't believe that at all; it sounds like marketing-speak. Intel's using, what, eight CPUs to do real-time RAY TRACING, and that's MORE demanding than the rasterizing paradigm that modern GPUs are based on. Certainly a GPU is more specialized and efficient than a similar-scale general-purpose CPU, but I think the performance ratio is closer to 4:1 than 100:1.
It's all in the design. (YAY! Car analogy time)
You have to transport 1,000 people from NY to Miami faster than another person. Complete the chal
Re: (Score:2)
Re: (Score:1)
Re: (Score:2)
But does it run Linux?
Guess it doesn't run Windows, though I would like to see the processor graphs of an 800-core machine in the Task Manager.
Re: (Score:2)
the task manager alone would take up a few dozen cores
Re: (Score:2)
It runs on storks.
Re:Should have... (Score:5, Insightful)
So they have a problem that takes more than one rack of modern computers to handle?
Re: (Score:2)
That's what I was thinking. The university I work for has not just one but three clusters, two of them having 4,096 CPU cores and 848 CPU cores respectively.
Re: (Score:2)
Do you mean 4096 CPU cores and 848 actual CPUs?
Re: (Score:2)
More likely he meant one had 4096 cores and another had 848 cores.
Re: (Score:1, Interesting)
They should have talked to SuperMicro.
That's just over 8 enclosures (4 nodes/enclosure) and fits in 18U.
It looks like this [supermicro.com].
Re: (Score:2)
Even better, this:
http://www.dell.com/us/en/enterprise/servers/pedge_m1000e/pd.aspx?refid=pedge_m1000e&s=biz&cs=555 [dell.com]
4 of those maxed out gives you nearly the same capacity: 16 slots * 12 cores per slot * 4 enclosures = 768 cores.
Dual GTX 480 (Score:5, Funny)
Or two GPUs [nvidia.com].
If it can run Crysis it can simulate galaxies.
Re: (Score:1)
You've modded this funny, but it's true (disclaimer: I do numerical simulations for a living).
On a physical problem that can be described by a handful of equations (as in a per-particle simulation of gravity and electromagnetism, or a per-mesh-element simulation of a fluid), the calculations are very simple; the caveat is that they have to be iterated a couple of trillion times before getting a result. GPUs are designed to do exactly this (probably as a collateral of how they are designed to handle vertices). Why
Re:Should have... (Score:5, Funny)
This is nothing. For my kindergarten thesis, I used galaxies to simulate supercomputers.
Re: (Score:1)
It's likely to be far more realistic, since it's based on the real physics of the universe rather than on simulations of made-up rules for the universe.