Science

IBM & CERN openlab for DataGrid Applications

Jules V.D. writes "CERN and IBM today announced that IBM is joining the CERN openlab for DataGrid applications to collaborate in creating a massive data-management system built on Grid computing. IBM's innovative storage virtualization and file management technology will play a pivotal role in this collaboration, which aims to create a data file system far larger than any that exists today, to help scientists at CERN understand some of the most fundamental questions about the nature of matter and the Universe."
  • by CrazyJim0 ( 324487 ) on Sunday April 06, 2003 @01:22PM (#5673678)
    Excellent.

    Much like the appeal of SETI@home was the search for AI... people now have a choice of which distributed net they want to support.

    It's a system, similar to voting, that will have every distributed net in the future trying to please.

    I foresee distributed nets of the future attempting to produce results, in order to keep people interested and donating their computer cycles.

    It's an interesting system that works a bit like democracy.
    • More about IBM and CERN: Grid Computing Planet [gridcomputingplanet.com]

      CERN and Java: Vnunet [vnunet.com]

      More about CERN: Hepwww [rl.ac.uk]
      The Large Electron Positron Collider at CERN: Hepwww [rl.ac.uk]

    • I was going to say this is nothing like democracy, but then I noticed key words in some of the phrases in your reply, such as:

      "in the future trying to please"

      or

      "I forsee distributed nets of the future attempting to produce results"

      and

      "to keep people interested and donating"

      then it all made sense.

    • I foresee distributed nets of the future attempting to produce results, in order to keep people interested and donating their computer cycles.

      Ain't gonna happen. Unlike SETI@home, the amount of data per CPU-minute of work in high-energy physics (aka particle physics) is much higher.

      As an order of magnitude: simulating a single collision of the upcoming LHC's proton-proton beams (the Large Hadron Collider is CERN's next accelerator, supposed to start up in 2007/2008) takes about 1 CPU-minute and generates ca. 1 MB of data.
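
      Rough numbers, for anyone who wants to check (quick Python; the ~1 CPU-minute and ~1 MB per event figures are only the order-of-magnitude guesses above):

        # Back-of-the-envelope: data a volunteer box would have to ship back
        # if it simulated LHC events around the clock, using the rough
        # per-collision figures quoted above (assumptions, not measurements).
        MB_PER_EVENT = 1.0         # assumed output per simulated collision
        MIN_PER_EVENT = 1.0        # assumed CPU time per simulated collision

        events_per_day = 24 * 60 / MIN_PER_EVENT            # ~1440 events/day
        gb_per_day = events_per_day * MB_PER_EVENT / 1024    # ~1.4 GB/day of uploads
        print(f"~{events_per_day:.0f} events/day, ~{gb_per_day:.1f} GB/day to upload")

      Compare that with a SETI@home work unit, which was roughly a few hundred KB for many CPU-hours of work; shipping gigabytes per day over a home connection just isn't practical.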
  • by ElJosho ( 642871 ) on Sunday April 06, 2003 @01:29PM (#5673706)
    Finally, IBM is building a computer powerful enough to answer the ultimate question of Life, the Universe, and Everything.
    • by Ridge ( 37884 )
      If you're in a war, instead of throwing a hand grenade at some guys, throw one of those little baby-type pumpkins. Maybe it'll make everyone think of how crazy war is, and while they're thinking, you can throw a real grenade.

      -- Jack Handey
  • by manseman ( 582150 ) on Sunday April 06, 2003 @01:31PM (#5673715)
    some of the most fundamental questions about the nature of matter and the Universe.

    I'll save them the effort.

    42.

    • No, you've got it wrong: they're trying to understand the questions, not the answers.
      • ...I thought the question and the answer couldn't exist in the same universe... Or maybe that understanding came from an alternate me in an alternate universe where "what do you get if you multiply six by nine" is forty-two.

        More on-topic though, I am utterly boggled by the amount of data they're looking at holding. I can see a stack of CDs and know they're a terabyte's worth of data -- but I barely grok what a terabyte can really hold -- maybe I don't and just think I do.

        But a petabyte. Wow. 1.5 million CDs. That's just... Just... *shrug*

        • But a petabyte. Wow. 1.5 million CDs. That's just... Just... *shrug*

          Think of it as being about a quarter of a Google.

          (I don't know exactly how big Google is now, but they were at 1.5PB a couple years ago, so they're probably somewhere around 4PB now.)
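
          As a sanity check on those numbers (a quick Python sketch, assuming 700 MB CDs and decimal petabytes):

            PETABYTE_MB = 1_000_000_000   # 1 PB = 10^9 MB (decimal units)
            CD_MB = 700                   # assumed capacity of one CD-R
            print(f"{PETABYTE_MB / CD_MB:,.0f} CDs per petabyte")   # ~1.43 million
            # At the ~4 PB guess for Google above, that's roughly 5.7 million CDs.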
  • by Anonymous Coward on Sunday April 06, 2003 @01:35PM (#5673729)
    CERN is definitely one of the coolest places on Earth. For a bid a couple of years ago, I had to do some research regarding their storage requirements and data management facilities. These people produce 10 *peta*bytes of data per year. For us mortals to grasp that number, it helps to convert it to a sustained rate: 300+ MB/s, around the clock, all year. In the basement we've got a 2 TB RAID; at that rate the people at CERN would fill it in under two hours, and our RAID (dual Fibre Channel, 10000 rpm SCSI discs) tops out at 120 MB/s, so it couldn't even keep up. See what I mean? Just try to grasp the kind of SAN these people need to keep that firehose flowing year after year.
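
    For anyone who wants to redo the conversion (quick Python, decimal units, taking the 10 PB/year figure at face value):

      SECONDS_PER_YEAR = 365 * 24 * 3600        # ~3.15e7 s
      PB = 10**15                               # 1 petabyte, decimal bytes

      rate = 10 * PB / SECONDS_PER_YEAR         # bytes per second, sustained
      print(f"~{rate / 1e6:.0f} MB/s sustained, all year")         # ~317 MB/s

      raid_bytes = 2 * 10**12                   # the 2 TB RAID mentioned above
      print(f"~{raid_bytes / rate / 3600:.1f} hours to fill the 2 TB RAID")  # ~1.8 hours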

    Filling out an application right away.
  • Correct me if I'm wrong, but aren't we going to get a taste of IBM's Grid technology on Sony's spanking new PlayStation 3?
  • Specifically... (Score:5, Informative)

    by Fritz Benwalla ( 539483 ) <randomregs@@@gmail...com> on Sunday April 06, 2003 @01:55PM (#5673807)

    This system stores, crunches, and distributes data generated by the Large Hadron Collider. They generate a million gig a year in data, and need to make it available in some functional way to physicists. Mandatory groovy collider pic here. [web.cern.ch]

    A major collaborator on this stuff is Globus [globus.org], which provides an API for grid applications. They're the same people who are partners with IBM on the Butterfly.net game grid.

    Maybe MTU can use it to store their students' Kazaa archives.


  • 1 Petabyte! That's a lotta porn!
