


"Evolution of the Internet" Powers Massive LHC Grid 93

jbrodkin brings us a story about the development of the computer network supporting CERN's Large Hadron Collider, which will begin smashing particles into one another later this year. We've discussed some of the impressive capabilities of this network in the past. "Data will be gathered from the European Organization for Nuclear Research (CERN), which hosts the collider in France and Switzerland, and distributed to thousands of scientists throughout the world. One writer described the grid as a 'parallel Internet.' Ruth Pordes, executive director of the Open Science Grid, which oversees the US infrastructure for the LHC network, describes it as an 'evolution of the Internet.' New fiber-optic cables with special protocols will be used to move data from CERN to 11 Tier-1 sites around the globe, which in turn use standard Internet technologies to transfer the data to more than 150 Tier-2 centers. Worldwide, the LHC computing grid will comprise about 20,000 servers, primarily running the Linux operating system. Scientists at Tier-2 sites can access these servers remotely when running complex experiments based on LHC data, Pordes says. If scientists need a million CPU hours to run an experiment overnight, the distributed nature of the grid allows them to access that computing power from any part of the worldwide network."
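The million-CPU-hour scenario in the summary is, at its core, a scheduling problem: spread a compute budget over whatever capacity the grid sites have free. Here is a minimal, hypothetical Python sketch of that idea; the site names and capacities are invented for illustration, and real grid middleware is far more involved than a greedy allocator.

```python
# Hypothetical sketch: splitting a large analysis job across grid sites.
# Site names and free-capacity numbers (in CPU hours) are invented.

def allocate(cpu_hours_needed, sites):
    """Greedily spread a CPU-hour budget across sites, largest capacity first."""
    plan = {}
    remaining = cpu_hours_needed
    for name, free in sorted(sites.items(), key=lambda kv: -kv[1]):
        if remaining <= 0:
            break
        take = min(free, remaining)
        plan[name] = take
        remaining -= take
    if remaining > 0:
        raise RuntimeError("not enough free capacity on the grid")
    return plan

sites = {"fnal-tier1": 400_000, "in2p3-tier1": 350_000, "ral-tier1": 300_000}
print(allocate(1_000_000, sites))
```

A real scheduler would also weigh data locality (the job should run near the Tier-1 holding its input), but the budget-splitting logic is the same shape.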

"Evolution of the Internet" Powers Massive LHC Grid

  • But does it run... (Score:5, Interesting)

    by RiotingPacifist ( 1228016 ) on Wednesday April 23, 2008 @01:56PM (#23174040)
    Oh wait, of course it does. You've basically got science, which is fundamentally open source.
    Then you've got a bunch of scientists, who are fundamentally geeks.
    And it's all being set up in Europe, which isn't as much under the grip of MS.

    As a bonus:
    They need the ability to look back and explain all their analysis, which means they have to see the source.
    It costs a hell of a lot to get the data, so they don't want to lose any data anywhere.
    They have a lot of results to analyse, so they don't want to be waiting for a server to come back online.
    Could they have gone with BSD? Probably, but most science tools are developed for Linux.
  • by Anonymous Coward on Wednesday April 23, 2008 @01:56PM (#23174052)
    I2 is a US organization. The owner of the transatlantic cables is called the "LHC OPN" (Optical Private Network), I think. The full build-out will be about 80Gbps.

    I suspect the "special protocols" they are referring to are about the data transfer protocols (GridFTP for data movement), not some wonky Layer-1 protocol. However, these folks, like I2, have been investing in dynamic-circuit equipment, meaning that sites could eventually get dedicated bandwidth between any two installations.
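The 80Gbps figure above lends itself to a quick back-of-the-envelope check: how long would it take to ship one petabyte out of CERN? This is pure arithmetic, assuming the link runs flat out with no protocol overhead, which real transfers never achieve.

```python
# Sanity check on the 80 Gbps full build-out mentioned in the comment:
# time to move one petabyte, assuming a perfectly saturated link.

link_gbps = 80                    # full LHC OPN build-out, per the comment
petabyte_bits = 1e15 * 8          # 1 PB (decimal) expressed in bits
seconds = petabyte_bits / (link_gbps * 1e9)
print(f"{seconds / 3600:.1f} hours per petabyte")  # → 27.8 hours per petabyte
```

In practice GridFTP's parallel streams exist precisely to close the gap between this ceiling and what a single TCP stream can sustain over transatlantic latencies.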
  • by evanbd ( 210358 ) on Wednesday April 23, 2008 @02:12PM (#23174220)

    So, what have you done today to help make science fiction closer to reality?

    I worked on the board layout for my rocket test stand data acquisition system. Sure, it's far removed from a trip to Mars, but you have to start somewhere. I'll bet you can't even say that much.

    If you're unwilling to put forth any effort, quit bitching at those who are.

  • Some Realtime (Score:5, Interesting)

    by Roger W Moore ( 538166 ) on Wednesday April 23, 2008 @02:14PM (#23174254) Journal
    Actually not all of it is offline. One of the things I have a research grant for is to develop a realtime remote farm for monitoring the detector. This is to catch subtle detector problems quickly before we end up collecting 2 weeks of useless data.

    For the Tier-1 sites, a significant fraction of the data is raw 'sensor' (we call it detector) data. This allows the reconstruction program (which converts the data into physics objects like electrons, muons, jets, etc.) to be rerun on the data once bugs in the initial reconstruction have been fixed.
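The workflow described here, keeping raw detector data so a fixed reconstruction pass can be rerun without retaking data, can be sketched abstractly. The functions below are purely illustrative stand-ins, not real reconstruction code; actual reconstruction is a large calibrated software stack.

```python
# Illustrative sketch of why Tier-1 sites archive raw detector data:
# a corrected reconstruction pass can be rerun over the same input,
# with no new beam time needed. Numbers are invented.

def reconstruct_v1(raw_event):
    # first-pass reconstruction with a (deliberately) wrong calibration
    return [hit * 2.0 for hit in raw_event]

def reconstruct_v2(raw_event):
    # fixed version: same raw input, corrected calibration constant
    return [hit * 2.1 for hit in raw_event]

raw_events = [[1.0, 2.0], [3.0]]           # archived at the Tier-1
physics_v1 = [reconstruct_v1(e) for e in raw_events]
physics_v2 = [reconstruct_v2(e) for e in raw_events]   # the rerun
```

The key property is that `raw_events` is immutable archive data; only the derived physics objects are regenerated.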
  • So, what have you done today to help make science fiction closer to reality?

    I work in my spare time on an open source project called factdiv. The idea is to use FACTOR as a problem to learn how to attack complexity itself. Complexity problems underlie all the great open questions in science, so if you can solve those, you sort of solve them all.

    So far, results haven't been all that great, but someone will get there. If we do, then we can have computers answer questions like how to take 10,000,000 parts and build a spaceship, or how to take a model of science and devise experiments to probe its limits, wire it up automatically to manufacturing apparatus so it can do unattended science 24x7, and then file away knowledge that will last far longer than a mere human brain can live.

    But I can still only FACTOR 20-digit numbers and have no good complexity answers; as you said, though, you have to start somewhere.
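For scale, factoring 20-digit numbers is within reach of textbook methods. The sketch below is a generic Pollard's-rho factorizer with a deterministic Miller-Rabin primality test, not the factdiv code itself, just the standard approach one might start from.

```python
# Generic integer factoring sketch: Pollard's rho + Miller-Rabin.
# Handles ~20-digit inputs quickly; not the commenter's factdiv project.
import math
import random

def is_prime(n):
    """Deterministic Miller-Rabin for n < 3.3e24 using fixed bases."""
    if n < 2:
        return False
    small = (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37)
    for p in small:
        if n % p == 0:
            return n == p
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for a in small:
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = x * x % n
            if x == n - 1:
                break
        else:
            return False
    return True

def pollard_rho(n):
    """Find one nontrivial factor of composite n (Floyd cycle detection)."""
    if n % 2 == 0:
        return 2
    while True:
        x = random.randrange(2, n)
        y, c, d = x, random.randrange(1, n), 1
        while d == 1:
            x = (x * x + c) % n
            y = (y * y + c) % n
            y = (y * y + c) % n
            d = math.gcd(abs(x - y), n)
        if d != n:
            return d

def factor(n):
    """Return the sorted prime factorization of n > 1."""
    if is_prime(n):
        return [n]
    d = pollard_rho(n)
    return sorted(factor(d) + factor(n // d))

print(factor(10_000_000_000_000_000_001))  # a 20-digit number
```

Past roughly 40 digits this approach stalls and sieve-based methods take over, which is where the real complexity wall the comment is talking about begins.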

"Truth never comes into the world but like a bastard, to the ignominy of him that brought her birth." -- Milton