Simulated Universe
anonymous lion writes "A story in the Guardian Unlimited reports on The Millennium Simulation, saying that it is 'the biggest exercise of its kind'. It required 25 million megabytes (25 terabytes) of memory to evolve our universe's initial conditions forward under the known laws of physics." From the article: "The simulated universe represents a cube of creation with sides that measure 2bn light years. It is home to 20m galaxies, large and small. It has been designed to answer questions about the past, but it offers the tantalising opportunity to fast-forward in time to the slow death of the galaxies, billions of years from now."
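For context, the Millennium Run was a dark-matter N-body simulation: roughly ten billion particles evolved under gravity with the GADGET-2 tree/particle-mesh code. The toy Python integrator below is only a minimal sketch of the same idea, using brute-force direct summation, a handful of particles, and arbitrary units; it resembles the production code in principle only.

```python
# Toy direct-summation N-body integrator (leapfrog / kick-drift-kick).
# Purely illustrative: the real Millennium Run used a tree/particle-mesh
# code (GADGET-2) with ~10 billion particles; this brute-force O(N^2)
# version only works for a handful of bodies. All units are arbitrary.
import numpy as np

G = 1.0          # gravitational constant in code units (assumption)
SOFTENING = 0.1  # force softening to avoid singularities at r -> 0

def accelerations(pos, mass):
    """Pairwise gravitational acceleration on every particle."""
    # pos: (N, 3) positions; mass: (N,) masses
    d = pos[None, :, :] - pos[:, None, :]        # d[i, j] points from i to j
    r2 = (d ** 2).sum(axis=2) + SOFTENING ** 2   # softened squared distances
    inv_r3 = r2 ** -1.5
    np.fill_diagonal(inv_r3, 0.0)                # no self-force
    return G * (d * (mass[None, :] * inv_r3)[:, :, None]).sum(axis=1)

def leapfrog(pos, vel, mass, dt, steps):
    """Kick-drift-kick integration, second-order accurate in dt."""
    acc = accelerations(pos, mass)
    for _ in range(steps):
        vel += 0.5 * dt * acc                    # half kick
        pos += dt * vel                          # drift
        acc = accelerations(pos, mass)
        vel += 0.5 * dt * acc                    # half kick
    return pos, vel

# Example: 100 random particles in a unit box, evolved for 1000 steps.
rng = np.random.default_rng(0)
pos = rng.uniform(-1, 1, (100, 3))
vel = np.zeros((100, 3))
mass = np.full(100, 0.01)
pos, vel = leapfrog(pos, vel, mass, dt=0.01, steps=1000)
```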
So where can I download it? (Score:3, Informative)
25 TB? That's nothing. (Score:3, Informative)
Brief article, with pictures:
University of Wisconsin deploys nearly 200TB of Xserve RAID storage [alienraid.org] (Google cache [google.com])
The storage is used for, among other things, particle physics simulations in support of research projects at sites such as the Large Hadron Collider [web.cern.ch] at CERN [cern.ch]. More information on GLOW and its initiatives can be found here [wisc.edu].
Text of the above article:
The University of Wisconsin - Madison has deployed 35 Xserve RAID storage arrays of 5.6TB each in a single research installation as part of an ongoing scientific computing initiative.
The Grid Laboratory of Wisconsin (GLOW), a partnership between several research departments at the University of Wisconsin, has installed almost 200TB, or 200,000GB, of Xserve RAID arrays. As a comparison, 200TB of storage is enough to hold 2.75 years of high definition video, 25,000 full length DVD movies, 323,000 CDs, 20 printed collections of the Library of Congress, or over 1000 Wikipedias.
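Those equivalents hold up as rough arithmetic if you pick plausible per-item sizes. A quick sanity check in Python; the per-item sizes (dual-layer DVD ~8 GB, CD ~620 MB, printed Library of Congress ~10 TB, HD video ~18 Mbit/s) are my assumptions, not figures from the article:

```python
# Back-of-the-envelope check of the storage comparisons above.
# The per-item sizes are assumptions chosen to match common-era estimates;
# they are not from the original article.
TB = 10 ** 12  # decimal terabyte, as storage vendors count

total = 200 * TB
print(total / 8.0e9)      # ~25,000 full-length DVDs at ~8 GB each
print(total / 620e6)      # ~323,000 CDs at ~620 MB each
print(total / (10 * TB))  # ~20 printed Library of Congress collections

# 2.75 years of HD video implies a sustained bitrate of roughly:
seconds = 2.75 * 365.25 * 24 * 3600
print(8 * total / seconds / 1e6, "Mbit/s")  # ~18 Mbit/s, a typical HD rate
```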
The GLOW storage installation is physically split between the departments of Computer Sciences and High Energy Physics. Each Xserve RAID is attached via an Apple Fibre Channel PCI-X Card to a dedicated Linux node running Fedora Core 3; the storage is then either accessed directly over gigabit ethernet or aggregated using tools such as dCache.
The storage primarily acts as a holding area for large amounts of data from experiments such as the Compact Muon Solenoid (CMS) and ATLAS at the Large Hadron Collider at CERN.
Aside from the GLOW initiative, the university also has Xserve RAID storage systems in use in other areas.
Full disclosure: I am the administrator of alienraid.org and am affiliated with the University of Wisconsin.
Re:25 TB? That's nothing. (Score:5, Informative)
Re:I thought (Score:5, Informative)
No, they don't. Age problems have cropped up a few times in the past, e.g., before the different populations of stars were known about, but currently there isn't an age problem.
We don't know what dark matter is, but we know enough about its gravitational properties -- that's why it was postulated to exist, after all -- to simulate its effects on these scales.
The models we have are not as badly flawed as you think they are. But even if they are flawed, that's the point of the simulation: to test the validity of the model. If the simulation's results don't agree with observations, then that tells us about where the model fails.
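To make that last point concrete, "testing the model" in practice means comparing some simulated statistic, such as galaxy counts per mass bin, against the observed one and asking whether the discrepancy is larger than the error bars. A minimal chi-square sketch, with entirely synthetic numbers standing in for both the simulation and the observations:

```python
# Minimal sketch of comparing a simulated statistic against observations
# via chi-square. The bin counts below are synthetic placeholders, not
# Millennium Simulation output or real survey data.
import numpy as np
from scipy import stats

observed = np.array([520, 260, 120, 55, 20])   # e.g. galaxy counts per mass bin
simulated = np.array([500, 270, 130, 50, 25])  # model prediction for same bins
sigma = np.sqrt(observed)                      # Poisson errors (assumption)

chi2 = np.sum((observed - simulated) ** 2 / sigma ** 2)
dof = len(observed)  # no fitted parameters in this toy comparison
p_value = stats.chi2.sf(chi2, dof)

# A tiny p-value would flag a mismatch, pointing at where the model fails.
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.3f}")
```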
Re:25 TB? That's nothing. (Score:5, Informative)
That's confirmed on page 18 of their paper: http://arxiv.org/PS_cache/astro-ph/pdf/0504/05040
Re:Predicting the future (Score:4, Informative)
Re:25 TB? That's nothing. (Score:2, Informative)