
Simulating the Whole Universe

Roland Piquepaille writes "An international group of cosmologists, the Virgo Consortium, has completed the first simulation of the entire universe, starting 380,000 years after the Big Bang and going up to now. In 'Computing the Cosmos,' IEEE Spectrum writes that the scientists used a 4.2 teraflops system at the Max Planck Society's Computing Center in Garching, Germany, to do the computations. The whole universe was simulated by ten billion particles, each having a mass a billion times that of our sun. As it was necessary to compute the gravitational interactions between each of the ten billion mass points and all the others, a task that would have taken 60,000 years, the computer scientists devised a couple of tricks to reduce the amount of computation. And in June 2004, the first simulation of our universe was completed. The resulting data, which represents about 20 terabytes, will be available to everyone in the months to come, at least to people with a high-bandwidth connection. Read more here about the computing aspects of the simulation, but if you're interested in cosmology, the long original article is a must-read."
This discussion has been archived. No new comments can be posted.

  • by BinBoy ( 164798 ) on Saturday September 04, 2004 @04:20PM (#10158938) Homepage
    Does the simulation include simulated scientists simulating the universe?
  • by Anonymous Coward on Saturday September 04, 2004 @04:20PM (#10158939)
    "I always wanted to be God." said Dr. Johnson. "When they announced this project, the first words out of my mouth were 'Dibs on God!' I even have plans to introduce a son in a few billion simulated years. This is going to be exciting."
  • by Sialagogue ( 246874 ) <> on Saturday September 04, 2004 @04:21PM (#10158942)

    I can search it to find out where I left my cell phone last night.

    • I can search it to find out where I left my cell phone last night
      Have you tried calling it to see who answers? I've recovered mine three times that way (over the years, not in the last week!)

    • Any simulation depends on the initial conditions. Since we hardly know very much about reality - you for instance don't even know where your cell phone is - any result is acceptable.

      The idea of a simulation is useful for insight into possibilities but not for specifics. We can focus the simulation into a personal world where initial conditions are meticulously gathered. This can only go so far since outside influences cannot be fully observed or predicted. Even the imposition of boundaries will skew re
  • bittorrent (Score:2, Funny)

    by dioscaido ( 541037 )
    Fire up your bittorrents, people!
  • Why bother? (Score:5, Funny)

    by sometwo ( 53041 ) on Saturday September 04, 2004 @04:21PM (#10158945)
    The answer is 42. Just google it []
    • Well, maybe they hoped to find the question?
    • The minute one of you start seeing pan-dimensional mice - I'm outta here
    • by Anonymous Coward
      The best part is that if you search for "42", you don't get calculator results, but you do get the ads:

      Buy 42 on ebay. Low
      prices. Wide selection. aff

      eBay: 42
      Low Prices, Huge Selection, Easy to
      Shop. Get Started on eBay Now! -Aff

      I know eBay ads come up often, but I just find the fact that ebay bought ads for "42" (and apparently other numbers) just astounding. Do they really think a lot of people want to buy numbers on eBay? I, for one, buy my numbers down on the corner fr

    • the answer to what? What was the question...
  • by Anonymous Coward on Saturday September 04, 2004 @04:21PM (#10158947)
    Do you have a 1:1 scale map of the world I can use?

    Uh yes, but it's being used right now.
  • I was going to comment on the absurdity of the claim to simulate everything, but due to a slashcode hiccup, I got "nothing to see here, move along." What The? I thought that EVERYTHING was here?

    Oh, and the ObStevenWright: You can't have (simulate?) everything. Where would you put it?

    • Great (Score:2, Funny)

      We slashdotted the entire universe. Way to go.
    • The last time I checked, we didn't know a whole lot about the nature of the universe as it stands... how the hell are these guys claiming that they

      a) Know how it was at the start


      b) can compare it to whats out there already when its done?

      • well was the last time you checked in 1930?
      • Re:Wait wait wait (Score:2, Informative)

        by iroll ( 717924 )
        The article answers said questions nicely.

        a) They didn't start at the beginning; they started at 380K years--the "snapshot" of which has been developed by looking at cosmic background radiation.

        b) Using telescopes, they've observed very large-scale structures in the universe (arrangements of clusters of galaxies), and they are hoping to see similar large-scale structures in their model.
  • by Libor Vanek ( 248963 ) <> on Saturday September 04, 2004 @04:22PM (#10158953) Homepage
    Now just imagine a beowulf cluster of... damned!
  • Does anyone else see the striking similarities to the Hitchhiker's Guide?

    Simulating the Universe to find answers and the number 4.2 can't be a coincidence, can it?
  • Sounds like the first Hyperspace Nav-Computer to me...
  • Umm, Paradox? (Score:2, Interesting)

    by Lord Kano ( 13027 )
    How can you accurately simulate the computer that is simulating the entire universe?

    Basically, you'd end up infinitely short on processing power. The faster you make the computer, the faster you need the computer to be. It's like working out so that you can get strong enough to pick yourself up by the bootstraps. The stronger you get, the more you weigh and you make the impossible less possible.

    • by kirkjobsluder ( 520465 ) <kirk&jobsluder,net> on Saturday September 04, 2004 @04:50PM (#10159093) Homepage
      How can you accurately simulate the computer that is simulating the entire universe?

      The same way you simulate anything else. You simplify the problem down to a manageable number of particles that represent larger units of whatever you are simulating. Since it looks like they are interested in mass and gravity at the galactic supercluster scale, they can use particles that weigh much more than any individual star.

      So the fundamental challenge for the Virgo team is to approximate that reality in a way that is both feasible to compute and fine-grained enough to yield useful insights. The Virgo astrophysicists have tackled it by coming up with a representation of that epoch's distribution of matter using 10 billion mass points, many more than any other simulation has ever attempted to use.

      THESE DIMENSIONLESS POINTS have no real physical meaning; they are just simulation elements, a way of modeling the universe's matter content. Each point is made up of normal and dark matter in proportion to the best current estimates, having a mass a billion times that of our sun, or 2000 trillion trillion trillion (2^39) kilograms. (The 10 billion particles together account for only 0.003 percent of the observable universe's total mass, but since the universe is homogeneous on the largest scales, the model is more than enough to be representative of the full extent of the cosmos.)

      • Sorry but the universe is not homogeneous on the largest scales. It is not so at the planetary, interstellar, intergalactic, intercluster, intersupercluster, inter-Great Wall [] scales. The basic cosmological assumption that the universe is homogeneous has been falsified by observation. Of course someone can claim that it would be if seen at an even larger scale but at this point it feels improbable and scientifically questionable. I personally cringe at the "creationist" big bang theory but I won't get into tha
    • If the computer does things that can be simplified then you could make smart, but simple, algorithm to simulate the computer. If you do this multiple times, you end up with 0 processing power.
    • He wasn't a scientist, just a thriller writer, but sometimes he came up with some brilliant phrases. Like: "If God were omnipotent and omniscient in any literal sense, he wouldn't have bothered to make the Universe at all."
      ("Playback", 1958)
    • How can you accurately simulate the computer that is simulating the entire universe?

      How can you draw an accurate world map? It wouldn't fit on the planet?!

      The (obvious) answer is: Abstraction, my friend.
    • Dark Helmet: What the Hell am I looking at?! When does this happen in the movie?!
      Col. Sandurz: Now! You're looking at "now," sir. Everything that happens now is happening "now."
      Dark Helmet: What happened to "then?"
      Col. Sandurz: We passed it.
      Dark Helmet: When?
      Col. Sandurz: Just now. We're at now "now."
      Dark Helmet: Go back to "then."
      Col. Sandurz: When?
      Dark Helmet: Now.
      Col. Sandurz: Now?!
      Dark Helmet: Now!
      Col. Sandurz: I can't.
      Dark Helmet: Why?
      Col. Sandurz: We missed it.
      Dark Helmet: When?
      Col. San
    • recursion
    • That's more than possible. The world's champion woman weightlifter can lift twice her weight; the champion man can lift more than 2.5 times his weight.

      Anyone who can do chin-ups can lift their own weight..

      There are other physical reasons why you can't lift yourself by the bootstraps, including the fact that you really have to push the earth downwards somehow.

      Besides that, your point's taken.
  • But did they try to build a simulator that would simulate the entire universe in the simulated universe?

    Did they get a giant sign a million light years across floating in space, saying:

    Simulating universe in a simulated universe is not going to work.
    You just had to try it, didn't you?
  • that should only take around six and a half minutes to download []
  • by TWX ( 665546 )
    How long before denizens of the simulated universe start demanding equal rights and for our universe to stop negatively impacting their destiny?

    I wonder if our universe is just a simulation sometimes...
  • How many of you read that as "stimulating the whole universe" and immediately thought of pr0n?


    just me...
  • It's turtles all the way down.

    Now, where can I find the scientists working on a reality-hacking machine?
  • by hrieke ( 126185 ) on Saturday September 04, 2004 @04:40PM (#10159051) Homepage
  • by mikael ( 484 ) on Saturday September 04, 2004 @04:44PM (#10159067)
    ... the intergalactic branch of the RIAA has filed a "Cease and Desist" order against the scientists, citing Copyright law; that anyone giving out free copies of the universe without first seeking permission from the copyright holder is a violation of intergalactic intellectual property rights.

  • by Anonymous Coward on Saturday September 04, 2004 @04:51PM (#10159094)
    The article indicates that the "tricks" these researchers used were the octree and multipole expansion--both of which have been used in gravity and potential theory for many years. They reduce the N^2 interaction problem to N or N Log(N), depending upon implementation. The story makes it sound like these researchers invented the technique; I assume the writer misunderstood the scientists, because it certainly predates them.
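The tree-code idea this comment describes can be sketched in a few dozen lines. Below is a toy 2D quadtree (the article's octree is the 3D analogue): any cell that looks small from the query point (cell size / distance below an opening angle theta) is replaced by its total mass at its centre of mass, cutting the O(N^2) pairwise sum down to roughly O(N log N). All names, units, and the test setup are illustrative, not taken from the actual Virgo code.

```python
import random

G = 1.0  # gravitational constant, arbitrary units

def build(points, x0=0.0, y0=0.0, size=1.0):
    """Build a quadtree over `points`, each a (mass, x, y) tuple in [0,1)^2."""
    m = sum(p[0] for p in points)
    cx = sum(p[0] * p[1] for p in points) / m
    cy = sum(p[0] * p[2] for p in points) / m
    node = {"mass": m, "cx": cx, "cy": cy, "size": size,
            "leaf": len(points) == 1, "children": []}
    if len(points) > 1:
        half = size / 2.0
        for dx in (0, 1):
            for dy in (0, 1):
                cell = [p for p in points
                        if x0 + dx * half <= p[1] < x0 + (dx + 1) * half
                        and y0 + dy * half <= p[2] < y0 + (dy + 1) * half]
                if cell:
                    node["children"].append(
                        build(cell, x0 + dx * half, y0 + dy * half, half))
    return node

def tree_accel(node, x, y, theta=0.5):
    """Approximate acceleration at (x, y), opening only 'nearby-looking' cells."""
    dx, dy = node["cx"] - x, node["cy"] - y
    r2 = dx * dx + dy * dy
    if r2 == 0.0:
        return 0.0, 0.0
    r = r2 ** 0.5
    if node["leaf"] or node["size"] / r < theta:
        a = G * node["mass"] / r2      # treat the whole cell as one point mass
        return a * dx / r, a * dy / r
    ax = ay = 0.0
    for child in node["children"]:
        cax, cay = tree_accel(child, x, y, theta)
        ax, ay = ax + cax, ay + cay
    return ax, ay

def direct_accel(points, x, y):
    """Exact O(N) sum at (x, y); doing this for every point is the O(N^2) cost."""
    ax = ay = 0.0
    for m, px, py in points:
        dx, dy = px - x, py - y
        r2 = dx * dx + dy * dy
        if r2 == 0.0:
            continue
        r = r2 ** 0.5
        ax += G * m * dx / (r2 * r)
        ay += G * m * dy / (r2 * r)
    return ax, ay

# compare the tree walk against the exact sum at a probe point
random.seed(42)
pts = [(1.0, random.random(), random.random()) for _ in range(200)]
root = build(pts)
ax_t, ay_t = tree_accel(root, 1.5, 1.5)
ax_d, ay_d = direct_accel(pts, 1.5, 1.5)
```

With theta = 0.5 the tree answer typically agrees with the direct sum to around a percent; larger theta trades accuracy for fewer opened cells.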
  • Given that this group is called "the Virgo Consortium", is it any wonder that they have to resort to a "simulated" "Big Bang"?
  • by Anonymous Coward
    ...used by Duke Nukem: Forever.
  • First, the researchers divided the simulated cube into several billion smaller volumes. During the gravitational calculations, points within one of these volumes are lumped together--their masses are summed.
    a tree algorithm to simplify and speed up the calculations for this realm of short-distance interactions. Think of all 10 billion points as the leaves of a tree. Eight of these leaves attach to a stem, eight stems attach to a branch, and so on, until all the points are connected to the trunk. To e
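The first trick quoted above, lumping the points in a volume together by summing their masses, amounts to a simple binning pass. The sketch below is a toy version under that reading; the real codes build FFT-based particle-mesh machinery and the tree walk on top of this idea, and the function and grid size here are illustrative only.

```python
from collections import defaultdict

def bin_masses(points, cells_per_side):
    """Sum particle masses into a coarse cubic grid.

    `points` is a list of (mass, x, y, z) with coordinates in [0, 1).
    Returns a dict mapping (i, j, k) cell indices to total cell mass.
    """
    grid = defaultdict(float)
    for m, x, y, z in points:
        key = (int(x * cells_per_side),
               int(y * cells_per_side),
               int(z * cells_per_side))
        grid[key] += m
    return grid

# three particles; the first two fall in the same cell of a 4x4x4 grid
pts = [(2.0, 0.10, 0.10, 0.10),
       (3.0, 0.12, 0.11, 0.14),
       (5.0, 0.90, 0.50, 0.20)]
grid = bin_masses(pts, 4)
```

During the long-range force calculation, each occupied cell can then be treated as a single combined mass rather than as its individual particles.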

  • by Jeremy Erwin ( 2054 ) on Saturday September 04, 2004 @05:11PM (#10159169) Journal
    The Read more here link leads to a few pithy sentences framing lengthy excerpts from the IEEE article.

    BTW, the machine in question, the Max-Planck-Gesellschaft MPI/IPP, is currently ranked 66th []. It looks to be a fairly ordinary cluster with none of the exoticism that Cray says we so desperately need []
  • by Ira Sponsible ( 713467 ) on Saturday September 04, 2004 @05:19PM (#10159192) Journal
    I just hope it has a "You are Here-->" indicator so we all know where we are.
  • MOND (Score:2, Interesting)

    by sagman ( 465807 )
    Someone had to ask: wonder if anyone's simulated the universe using MOND []. How did the researchers account for all this dark matter [] that's supposed to be around? It's far more likely that we got the force law wrong. Do these dark matter guys still believe in Santa Claus? BTW has anyone successfully simulated a galaxy and produced results that correspond to observations? I think this problem is still open...
  • At this resolution (Score:5, Interesting)

    by crisco ( 4669 ) on Saturday September 04, 2004 @05:23PM (#10159208) Homepage
    that works out to 100 to 200 data points to represent our galaxy. I wonder if they will get recognizeable spiral structures, etc?

    Are they modeling any of the physical (star formation, etc) interactions of matter or just the gravitational interaction. It seemed like the latter, but the article did mention the apparent non-interaction of dark matter.

    • by TMB ( 70166 ) on Saturday September 04, 2004 @07:27PM (#10159764)
      Are they modeling any of the physical (star formation, etc) interactions of matter or just the gravitational interaction. It seemed like the latter, but the article did mention the apparent non-interaction of dark matter.
      From the article:

      The recently completed Millennium Run gave them the universe's broad distribution of matter as dictated by gravity. In upcoming simulations, other forces will come into play. Onto the web of matter the scientists will graft the electromagnetic aspects of normal matter, which by radiating photons allows gas to cool down and condense into spiral disks that originate stars. At the same time, hydrodynamic pressure, which ultimately derives from the fact that two atoms cannot overlap each other because of repulsion between their electrons, redistributes matter along the cosmic web's strands and nodes.
      So this run is just gravity, but they will do more runs that include hydro, cooling, and presumably star formation. And to answer your first question

      that works out to 100 to 200 data points to represent our galaxy. I wonder if they will get recognizeable spiral structures, etc?
      Without hydro or cooling, all you get are ellipsoidal dark matter halos, no disks.

    • Actually, in my previous post [], I calculated that there would be much fewer points than that, as in 10 million Andromedas per simulation element. Andromeda is almost the size of our galaxy. So I would think that no, you would not get to see parts of our galaxy, and even if there were that many, I doubt it would appear since the input data is probably not nearly accurate enough.
    • I run some N-body simulations myself at UCI, and having dealt with systems that have run for eons with 100 - 200 bodies, I can say that these systems tend to take the shape of a blob with perhaps some overall rotation. The spiral arms form because of local interaction between bodies having more effect on the motion than the central gravitational attraction. This doesn't really occur until the numbers of bodies simulating the galaxy gets way up there (into the millions).
  • by tmortn ( 630092 ) on Saturday September 04, 2004 @05:40PM (#10159265) Homepage
    Last I heard there was some question as to the speed of gravitational attraction. I.e., if the effect of gravity propagates only as fast as light, then the earth is being acted on by the sun's gravity from the point the sun was at 8 minutes ago or some such, while the sun is similarly being affected by the earth's position from 8 minutes ago.

    As these mass points get further and further apart this would have a huge effect on the results. Unless of course gravity is instantaneous across any distance, opening the door to some interesting possibilities. Namely the ability to communicate across large distances without delay. Perhaps even FTL travel.

    While I find this exercise interesting I also find it a tad ridiculous. So many simplifications have to be made to even attempt it, and the whole thing is based on some assumptions that are not necessarily cold hard fact... such as the mass of the universe. Theory says one thing, observation says another. Dark matter was invented to close the gap. Don't get me wrong, there are a lot of smart people that have come up with an awful lot of observation which seems to confirm its existence, but it could be that our point of view is insufficient. After all, by all observations the Ptolemaic model of the movement of the heavens was accurate, and it had all sorts of added rules for handling what was observed.

    Also there is the issue of the N-body problem where N is greater than 2. Did you know we cannot accurately model our solar system just using Kepler's laws? We have to create stabilising factors in the system to keep the planets' paths from becoming unstable in their orbits. And yet here they are attempting to simulate an N-body problem where N = 10 billion. /NCOR.11.16.D/display.html []

    That link shows what happens with a pure Keplerian system of equations for 9 bodies.

    Thus introducing such things as mass simplification for objects farther away (creating groupings etc.) and the tree approach for close objects introduces error into the equation. Furthermore, they have to use some means of stabilizing the equations, similar to solar system models, which is a value based on observation but with no understanding of what really controls it (if they don't do this then the system of equations can't model our own solar system, much less 10 billion mass points expanding since 380k years after the big bang). This is all a chance for more error to creep into the equation. Then with all of this they run a simulation of simplified mass points using simplified interactions with an unknown stabilizing force over the course of billions of years, and then expect people to believe that what they wind up with has any significant correlation to reality.

    Do not be deceived by impressive things like 4 teraflops and 20 terabytes of information. To me this seems an interesting intellectual exercise, but the chances of the results being meaningful are pretty slim.
    • You can model 3+ bodies with gravitation, it's just not exact, with no "closed form" integral solutions like the two-body problem. With a "simulation" like this, basically you have NxNxT calculations. T equals something akin to "time", where low T = low resolution, high T = high resolution. For example, say I wanted to run a calculation for 60 seconds, with 20 calcs per second: 60x20xN^2 steps. If I increase it to 100 calculations per second, it will be more "accurate". etc, etc. So you don't have to use Ke
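The NxNxT bookkeeping described in this comment can be sketched as a direct-summation integrator: each of the T timesteps costs O(N^2) pairwise force evaluations, and shrinking the timestep buys accuracy by raising T. Everything below uses toy 2D units; the softening term is a standard N-body trick (not specific to this thread) that avoids the blow-up when two points pass very close.

```python
G = 1.0
SOFTENING = 1e-3  # gravitational softening length, toy units

def step(masses, pos, vel, dt):
    """Advance one timestep (semi-implicit Euler): O(N^2) work per call."""
    n = len(masses)
    acc = []
    for i in range(n):
        ax = ay = 0.0
        xi, yi = pos[i]
        for j in range(n):
            if j == i:
                continue
            dx, dy = pos[j][0] - xi, pos[j][1] - yi
            r2 = dx * dx + dy * dy + SOFTENING ** 2
            inv_r3 = r2 ** -1.5
            ax += G * masses[j] * dx * inv_r3
            ay += G * masses[j] * dy * inv_r3
        acc.append((ax, ay))
    # update velocities first, then positions (keeps orbits stable)
    vel = [(vx + ax * dt, vy + ay * dt)
           for (vx, vy), (ax, ay) in zip(vel, acc)]
    pos = [(x + vx * dt, y + vy * dt)
           for (x, y), (vx, vy) in zip(pos, vel)]
    return pos, vel

# sanity check: two equal masses on a circular orbit keep separation ~1
masses = [1.0, 1.0]
pos = [(-0.5, 0.0), (0.5, 0.0)]
speed = 0.5 ** 0.5          # circular-orbit speed about the centre of mass
vel = [(0.0, -speed), (0.0, speed)]
for _ in range(1000):
    pos, vel = step(masses, pos, vel, 0.01)
```

Doubling the calls per unit of simulated time halves dt in this loop, which is exactly the T knob the comment describes.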
      • by tmortn ( 630092 ) on Saturday September 04, 2004 @06:34PM (#10159479) Homepage
        I am not real sure about gravity personally. Have seen pretty convincing arguments on both sides of the coin. Until we can detect and produce gravity waves it's pretty open to question, I think. In this case though the point is that we don't know, and it is an integral piece of knowledge to accurately simulate the interactions of 10 billion mass points over time and significant distances.

        On the other hand, I know about the increased accuracy from higher-fidelity time samples, but all that does is postpone the inevitable chaos in the equations. Most solar system models don't even use Kepler's equations. They use the information determined from solving them via a 2-body problem (planet and the sun) and then assume that orbital period is more or less sacrosanct. This creates a stable model which accurately represents what we have observed... but does not allow for the chaos that creeps in when we try to replicate observed motions using Kepler's laws to attempt to model all interactions. If you're really interested (or already know a lot about it), a fascinating subject based in reality is orbital mechanics... i.e. how do you accurately rendezvous with other planets when you are traveling in an N-body problem where N is greater than 3, over periods of time that are too great to be able to avoid the chaos? The simple answer is you make small corrective burns along the way based on observation to recalibrate the route. But the significance there is that you can't use Kepler's equations for more than a rough estimation for navigating in space at N > 2 (like landing the martian rovers).

        Kepler's laws work almost flawlessly for 2 bodies, which is why they are so powerful. However I think that is the problem. They work flawlessly for N=2 even when there is no real-world true N=2 problem to solve. Essentially, to solve the N=2 problem for any planet you assume the attraction from anything other than the sun is insignificant. This works amazingly well and is what led to the discovery of the last two or three planets, if memory serves.

        But as accurate as that is, there is no getting around the chaos of the 3-body equation no matter how fine-grained your time samples are. This is not true of the 2-body problem... i.e. it doesn't matter what your time sample is, the 2-body problem works. If it doesn't, it's because there is another source of significant gravitational attraction at work. However, over a great enough time span my guess is even the 2-body equation has inherent chaos in reality... i.e. a pure theoretical 2-body equation is perfect, but for the earth and the sun, sooner or later what is deemed insignificant in the 2-body problem for practical purposes will become significant over a long enough time frame.

        All in all it reminds me of the old parallax problem that led the Greeks to dismiss a heliocentric model of the solar system and choose Ptolemy's view of an earth-centered model. I think our frame of reference is such that the inherent error in Kepler's laws is not readily observable, just the same as the Greeks' frame of reference was insufficient to observe parallax.
        • For one, you aren't even simulating nebulas (to use English pluralization) or galaxies. In a rough calculation, Andromeda is roughly 1E12 Solar Masses [] and a solar mass is 2E20 kilograms []. So Andromeda is 2E32 kg. Each simulation element is 2E39 kg. So each element is simulating 10 million Andromedas! For an element of scale, the Local Group contains 30 galaxies, while the Local Supercluster contains only 100 galaxies and galaxy clusters. It is estimated that the Local Supercluster's mass is 10E15 solar []
        • It seems to me that if you were to just modify an experiment I read about where they were measuring the gravitational force of the sun (or something like that). They had a few large rods with a precisely known mass, and measured the change in forces during the day.

          If you were to do something like that and take into account the change in distance from the sun and tidal effects, you could compare the force data to the observed relative position of the sun, find the amount of delay, and then calculate the
    • Further more they have to use some means of stabilizing the equations similar to solar system models which is a value based on observation but with no understanding for what really controls it ( if they don't do this then the system of equations can't model our own solar system much less 10 billion mass points expanding since 380k years after the big bang )

      Actually, it doesn't work that way... we really don't care where each individual particle ends up, just what the overall phase space density is. The best

      • I can see where just wanting to know the density distribution would be good information. Seems on a smaller scale this is critical information for figuring out how a solar system forms and on larger scales how galaxies form.

        It seems to me you're saying this should create a system that acts similar to the universe while it is not supposed to be an exact model. Sort of like the chaos analogy with the drop of water.. no matter how precisely you drop it there is no real predicting how it will fall but you can m
    • by Quantum Jim ( 610382 ) <> on Sunday September 05, 2004 @01:57AM (#10161207) Homepage Journal

      While I find this exercise interesting I also find it a tad ridiculous. So many simplifications have to be made to even attempt it, and the whole thing is based on some assumptions that are not necessarily cold hard fact...

      The primary goal for computer simulations such as these is to understand how and why they don't work - not to test current theories.

      For instance, many different attempts were made before a computer model of Earth's magnetic field exhibited magnetic field reversal - and even then, it wasn't exactly like what the geological evidence shows. However, it suggests a basic model that can be adjusted to more accurately describe our planet's core.

      It is the same with this attempt to simulate a universe. The goal is to understand how things interact, how the simulated universe differs from our Universe, and why it differed. Some differences would be due to problems running computer simulations on a von Neumann machine (such as the "three body problem"). Other errors will be caused by problems with our current model. If the two effects can be separated and analysed, then advancements could be made in both computer science (e.g. weather forecasting) and cosmology. That's the point of this exercise.

    • General relativity gives a clear answer to the "speed of gravity" question. Essentially gravity does travel at the speed of light. However, it is not quite a simple "inverse-squared" attraction. Small corrections which you can think of as the properties of gravity waves mostly cancel out the "attracted to where the Sun was 8 minutes ago" effect, although a very tiny correction remains, so that the orbit of Mercury, for instance, is not exactly as predicted by Newton's laws -- although you need a good telesc
      • Mod parent up!
        This discussion thread reads as if Newton's law were the last theory there is about gravity. But Einstein invented/discovered general relativity nearly 90 years ago.

        And yes, gravitational waves have been at least indirectly observed (there was a Nobel prize for that discovery!).

        So, general relativity has some experimental/observational data which supports it. It's probably not the last word about gravity, but it is a significant refinement of Newton's theory.

        And, if you consider spe
  • Hmm, is this the beginning of his theories? ;)
    • No, psychohistory [] is all about people, crowds, civilizations, uninformed mobs. Billions upon billions upon trillions of people, really. And even then it was just one galaxy-full. Long-range gravitational simulation does not compare.
  • by Proc6 ( 518858 ) on Saturday September 04, 2004 @06:00PM (#10159357)
    REQ: plz repost RARs 43,491,400 - 296,102,232 of "Whole.Universe.Simulation.FULL.DATABASE.WinALL.FAiRLiGHT"


  • by geekoid ( 135745 )
    will they have enough power to simulate the simulation of the universe?
  • by TMB ( 70166 ) on Saturday September 04, 2004 @07:09PM (#10159663)
    (disclaimer: I Am An N-Body Modeller, and although I'm not part of the Virgo collaboration, a large fraction of what I do is study cosmological models like the one described)

    It doesn't quite come out in the article, but what's really groundbreaking about this work is the number of particles they're using. When you make models like these, you always have to prioritize how large a volume you want to simulate (the more volume you have, the more representative a fraction of the universe you have and the larger number of structures you can analyze) vs how massive the particles are (the smaller the particles, the smaller structures you can analyze).

    The more total particles you have, the less you need to compromise your volume or particle mass. Until now, simulating such a large fraction of the universe (NOTE: unlike what the submitter said, this is not the full universe; as the article itself says, it's about 0.003 of the Hubble volume) required such large particles that it was impossible to say anything about individual galaxies.

    However, with 10^10 particles, the mass of their particles is only about 10^9 solar masses, so they can reliably resolve structures of 10^11 solar masses. For reference, the mass of the Milky Way is roughly 10^12 solar masses. This is a fantastic leap forward - most other modern simulations have 10^8 - 10^9 particles, and so either can only simulate a much smaller fraction of the universe (like the simulations I study), or cannot say anything about galaxies, only massive galaxy clusters.
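The resolution arithmetic in this comment is easy to check. The "~100 particles to resolve a structure" threshold below is read off from the comment itself (10^11 / 10^9 solar masses), not from the article, and the Milky Way mass is the rough figure the commenter quotes.

```python
# Back-of-the-envelope check of the Millennium Run resolution numbers.
particle_mass = 1e9            # solar masses per simulation particle
particles_to_resolve = 100     # implied threshold: 10^11 / 10^9

smallest_resolved = particle_mass * particles_to_resolve   # solar masses
milky_way = 1e12               # rough Milky Way mass, solar masses

# a Milky Way-sized halo would be represented by ~1000 particles
particles_per_milky_way = milky_way / particle_mass
```

So a Milky Way analogue is comfortably above the resolution floor, which is the "fantastic leap forward" the comment describes.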

    • (from the article)
      "Each point is made up of normal and dark matter in proportion to the best current estimates, having a mass a billion times that of our sun, or 2000 trillion trillion trillion (2^39) kilograms."

      Actually, there appears to be an error in the article, with the author leaving out a "*10^". 2^39 is supposed to be 2x10^39 and that is the number of zeros used in one of the excerpts(sp?). That works out to only 1E19 solar masses, significantly more than the mass of our galaxy by, oh, seven or
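The correction in this comment checks out numerically. Using the usual rough value of 2e30 kg per solar mass (an assumption of this sketch, not a figure from the article):

```python
# A billion solar masses in kilograms, versus what "2^39" literally means.
solar_mass_kg = 2e30                     # rough value
per_particle_kg = 1e9 * solar_mass_kg    # mass of one simulation element

# "2000 trillion trillion trillion kilograms" = 2e39 kg, i.e. 2 x 10^39
assert abs(per_particle_kg / 2e39 - 1) < 1e-12

# whereas 2^39 is a mere ~5.5e11, off by 28 orders of magnitude
assert 2 ** 39 == 549755813888
```

So the article's "(2^39)" is indeed a misprint for 2 x 10^39.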
  • Those sissy Virgans are going about the sim backwards. Take their engine, plug in data on the Universe's particles' state now, and run the sim forward, discarding the data until they get to the simulated day after they publish the results. Then extract from that state the particles composing the simulated published report, and then actually publish it. By simulating only a few months, rather than 15By, they can increase the resolution enough that they won't even have to pick the font: it'll be in the simula
  • I knew it (Score:2, Funny)

    by bruce227 ( 810573 )
    Thought so. The universe is pre-alpha, which is why every feature sucks.
  • The resulting data, which represents about 20 terabytes, will be available to everyone in the months to come

  • 10 billion particles cannot simulate a glass of milk, let alone the universe.

    Just in terms of mass, each particle in the "universe" simulation must represent between 10 and 100 galaxies.

    So the simulation actually represents the gravitational interactions of groups of early galaxies, not really the whole "universe".
  • Sounds like a job for...

    *duh duh duh-duh!*


    That would be beautiful, thousands of HUGE files posted to one massive bittorrent tracker!
  • Whew! (Score:3, Interesting)

    by cmacb ( 547347 ) on Saturday September 04, 2004 @11:31PM (#10160854) Homepage Journal
    "The resulting data, which represents about 20 terabytes, will be available to everyone in the months to come, at least to people with a high-bandwidth connection."

    Well, at least we know that we will be around for a few months. Do we have to download the whole bloody thing to find out when the world ends?

Numeric stability is probably not all that important when you're guessing.