Supercomputing Science

CERN Launches Huge LHC Computing Grid

RaaVi writes "Yesterday CERN launched the largest computing grid in the world, which is destined to analyze the data coming from the world's biggest particle accelerator, the Large Hadron Collider. The computing grid consists of more than 140 computer centers from around the world working together to handle the expected 10-15 petabytes of data the LHC will generate each year." The Worldwide LHC Computing Grid will initially handle data for up to 7,000 scientists around the world. Though the LHC itself is down for some lengthy repairs, an event called GridFest was held yesterday to commemorate the occasion. The LCG will run alongside the LHC@Home volunteer project.
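
A back-of-the-envelope check on those figures (my own rough arithmetic in decimal units, not anything official from CERN): 15 PB/year spread across ~140 centers is on the order of 100 TB per center per year, and a fairly modest sustained data rate grid-wide.

    # Rough sanity check on the story's figures. The data volume and center
    # count are the numbers quoted above; the unit conversions are decimal.
    ANNUAL_DATA_PB = 15            # upper end of the 10-15 PB/year estimate
    NUM_CENTERS = 140              # computer centers in the grid
    SECONDS_PER_YEAR = 365 * 24 * 3600

    tb_per_center = ANNUAL_DATA_PB * 1000 / NUM_CENTERS       # PB -> TB
    gbit_per_sec = ANNUAL_DATA_PB * 8e6 / SECONDS_PER_YEAR    # PB -> Gbit

    print(f"~{tb_per_center:.0f} TB per center per year")     # ~107 TB
    print(f"~{gbit_per_sec:.1f} Gbit/s sustained, grid-wide") # ~3.8 Gbit/s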
This discussion has been archived. No new comments can be posted.
  • Finally... (Score:5, Funny)

    by Anonymous Coward on Saturday October 04, 2008 @10:54AM (#25256181)

    We can run Vista *with* Aero

  • Doomed (Score:5, Funny)

    by dexmachina ( 1341273 ) on Saturday October 04, 2008 @11:01AM (#25256221)
    I've heard this so-called "grid" might create an information black hole from all of the data. Is this true? We're all gonna die!
  • DOS, anyone? (Score:3, Interesting)

    by xdor ( 1218206 ) on Saturday October 04, 2008 @11:02AM (#25256223)
    So does this network "grid" rely on TCP? Can all this be rendered useless by Robert E. Lee's hack?
    • by Zibri ( 1063838 )

      Yes, TCP/IPv4. I saw an interview with a scientist working on ATLAS talking about the grid. The funny thing is, the interviewer was amazed: "Is it REAL internet? It is not IPv6?"

    • by mysidia ( 191772 )

      So does this network "grid" rely on TCP? Can all this be rendered useless by Robert E. Lee's hack?

      Yes, but it's not open to the world; it is only accessible to their computing centers and research partners.

      Your attack would probably be noticed relatively quickly. Given the distributed nature of the grid, and the fact that they were concerned about security and supposedly designed countermeasures into the system, an attack is likely to be rapidly quashed.

  • Porn? (Score:5, Funny)

    by Anonymous Coward on Saturday October 04, 2008 @11:13AM (#25256269)

    "As the grid is ready but no data is expected to be produced by the LHC for the next few months, engineers have received permission to temporarily fill all 15 petabytes with adult material in an effort to test the infrastructure."

    • Re:Porn? (Score:4, Funny)

      by kurzweilfreak ( 829276 ) on Saturday October 04, 2008 @12:59PM (#25256919) Journal
      Mmm, hot particle-on-particle action.
    • by mysidia ( 191772 )

      "As the grid is ready but no data is expected to be produced by the LHC for the next few months, engineers have received permission to temporarily fill all 15 petabytes with adult material in an effort to test the infrastructure."

      I have a better idea... let's concentrate the power on cracking RSA.

      Specifically, Microsoft's certificate private keys, to be able to sign software to run on the Xbox, and to sign software like the LILO and GRUB bootloaders to run in TCPA/Palladium "trusted" mode.

      They would be d

    • by RDW ( 41497 )

      '"As the grid is ready but no data is expected to be produced by the LHC for the next few months, engineers have received permission to temporarily fill all 15 petabytes with adult material in an effort to test the infrastructure."'

      The problem, of course, is what to do with all this 'material' when the collider eventually starts producing data. It's now well known that Tim Berners-Lee hurriedly developed the Web within months of the LEP going online in 1989, which was causing an equally serious storage problem.

    • Re: (Score:2, Funny)

      by Anonymous Coward
      You jest, but BTDT. A couple of years ago, when Audiogalaxy was the newest and greatest way to get music (you remember Audiogalaxy, right?), my university's computing centre and CS department got a Very Big Storage Box from IBM. I worked as an assistant there at the time. My boss told me he was "testing the box by downloading Audiogalaxy". At first I didn't get it, until he added, smirking: "...all of it." High-load stress testing was then done by leaking knowledge of this box's existence to a couple of v
  • From a circle to a grid, what will they come up with next?
  • by Anonymous Coward

    Great, just great. Whose idea was it to connect a wormhole machine to Skynet?

  • Now you can help suck up the world from the comfort of your own home.
  • This is fantastic news! If only the LHC could provide data to actually process, it would be an interesting project. ;) If I find any of the mystery particles on my computer, do I get a free black hole? Cash prizes, as some other distributed computing projects offer, would suck in comparison. Don't like someone? Cash won't always help, but a black hole will ruin anyone's day!

    • by mysidia ( 191772 )

      In their spare time, they can try analyzing data from other colliders... perhaps sell some outsourced computing services.

      Perhaps they can pitch some grid time to Google and to buyers of mainframe processing.

      You know... until the LHC comes back online, use the resources to spider and analyze the web. Who knows... the Higgs boson may be hiding on the interweb(TM) somewhere.

  • I don't know why they need such a big grid; according to the Inquirer they only create about 15 gigs of data each year. Whatever that means.

    They were bad, but now I'm 100% sure that they are nothing but a big gig themselves, and I've removed them from my bookmarks.

    Source:
    http://www.theinquirer.net/gb/inquirer/news/2008/10/03/lhc-spews-15million-gb [theinquirer.net]

    • by SnowZero ( 92219 )

      Did you miss the "million" in there?

    • Re: (Score:2, Informative)

      by timboe ( 932655 )

      I don't know why they need such a big grid; according to the Inquirer they only create about 15 gigs of data each year.

      No, 15 million gigs - now you see! And yes, a full detector readout consists of every non-zero channel in the entire detector, which comes to about 3 MB per event, and we read out ~200 events/sec. And there are 4 main detectors, each doing this. That's not even mentioning the processor power needed to run statistical analyses on these data sets!
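
      Plugging the parent's numbers straight into a quick script (the 1e7 seconds of beam time per year is my own assumption, a common rough figure for accelerator uptime, not something from the parent):

          # 3 MB/event * ~200 events/sec * 4 detectors, over an assumed
          # ~1e7 seconds of beam time per year (roughly 115 days of running).
          MB_PER_EVENT = 3
          EVENTS_PER_SEC = 200
          DETECTORS = 4
          BEAM_SECONDS_PER_YEAR = 1e7   # assumption, not from the parent

          total_mb = MB_PER_EVENT * EVENTS_PER_SEC * DETECTORS * BEAM_SECONDS_PER_YEAR
          print(f"~{total_mb / 1e9:.0f} PB/year")   # ~24 PB/year, same ballpark as 15 million GB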

    • by mysidia ( 191772 )

      It's much more massive than that.

      Also, they may not release all the data, i.e. they need to analyze much more data than will ever be published as "interesting" with regard to the experiment being run.

      You don't magically know which parts of the collected data are the most interesting until the analysis is completed on the grid.

  • I has a (Score:5, Funny)

    by markov_chain ( 202465 ) on Saturday October 04, 2008 @12:33PM (#25256769)

    hadron!

  • SLAC used to use the Crystal Ball detector for collecting data and pouring it into an Oracle 10g grid of Sun machines, and they'd mine the data to generate theories. http://en.wikipedia.org/wiki/Crystal_Ball_(detector) [wikipedia.org] I wonder what they call their detector for the LHC, and what database software they use. Also, I'd be curious to know what they use to mine the data, and how they go about it.

    • by Gromius ( 677157 ) on Sunday October 05, 2008 @04:23AM (#25262311)
      Traditionally, particle physics doesn't use the data to "generate" theories as such. We use the data to measure various properties (W mass, Z->ll mass spectrum, lepton pT spectra), looking for discrepancies with the theory predictions. Then we (hopefully) go: oops, this doesn't agree with the theory, we'd better come up with another explanation. Recently it's been: ah, SM predictions confirmed *again*.

      I can only really speak for CMS (one of the two big general-purpose experiments), but every experiment does similar things. Basically, the data is split into smaller datasets based on what we decided was interesting in the event (basically, which trigger fired). So we split it into events with electrons, muons, photons, jets (yes, events will have multiple of the above, but don't worry about how we deal with that). Then each physicist looking for a specific signature (i.e. a top quark, or in my case a high-mass e+e- pair) runs their custom homebrew statistical analysis (which uses common tools) to pick out the events they are interested in; a toy sketch follows below. There are also physicists who run custom-designed programs to pick out *any* discrepancy from the theory predictions, but as they are more general, they aren't as sensitive as a dedicated analysis on a single channel.
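
      For the curious, "pick out the events they are interested in" mostly means cuts on reconstructed quantities. Here is a toy sketch of what a high-mass e+e- selection could look like; the event format, thresholds, and function names are all invented for illustration and bear no relation to actual CMS code:

          import math

          def invariant_mass(e1, e2):
              """Invariant mass (GeV) of two electrons given as (pt, eta, phi),
              treating both as massless."""
              pt1, eta1, phi1 = e1
              pt2, eta2, phi2 = e2
              return math.sqrt(2 * pt1 * pt2 *
                               (math.cosh(eta1 - eta2) - math.cos(phi1 - phi2)))

          def select_high_mass_ee(events, min_pt=25.0, min_mass=200.0):
              """Keep events whose two leading electrons form a high-mass pair."""
              for ev in events:
                  good = [e for e in ev["electrons"] if e[0] > min_pt]
                  if len(good) >= 2 and invariant_mass(good[0], good[1]) > min_mass:
                      yield ev

          # usage (hypothetical dataset name):
          # candidates = list(select_high_mass_ee(electron_trigger_dataset))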
      • by c0d3r ( 156687 )

        Don't you think a different approach should be taken, where you use the data to create the theories rather than just to verify them? Machine learning, neural networks, and data mining are some of the biggest projects in corporate America these days, where a business uses its resulting data to optimize and refine its operations.
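
        Model-independent scans like the ones the parent comment mentions do exist; the data-mining flavor of the idea can be sketched with an off-the-shelf outlier detector. The features and data below are made up purely for illustration:

            import numpy as np
            from sklearn.ensemble import IsolationForest

            # Toy anomaly hunt: flag events whose kinematics look unlike the bulk.
            rng = np.random.default_rng(0)
            bulk = rng.normal([50, 0, 2], [10, 1, 1], size=(10000, 3))  # "SM-like" events
            odd = rng.normal([300, 3, 8], [20, 1, 1], size=(5, 3))      # planted outliers
            events = np.vstack([bulk, odd])  # columns: pt, eta, jet multiplicity

            model = IsolationForest(contamination=0.001, random_state=0).fit(events)
            flagged = (model.predict(events) == -1).sum()  # predict() marks outliers as -1
            print(f"{flagged} events flagged for a closer look")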
