"Evolution of the Internet" Powers Massive LHC Grid 93
jbrodkin brings us a story about the development of the computer network supporting CERN's Large Hadron Collider, which will begin smashing particles into one another later this year. We've discussed some of the impressive capabilities of this network in the past.
"Data will be gathered from the European Organization for Nuclear Research (CERN), which hosts the collider in France and Switzerland, and distributed to thousands of scientists throughout the world. One writer described the grid as a 'parallel Internet.' Ruth Pordes, executive director of the Open Science Grid, which oversees the US infrastructure for the LHC network, describes it as an 'evolution of the Internet.' New fiber-optic cables with special protocols will be used to move data from CERN to 11 Tier-1 sites around the globe, which in turn use standard Internet technologies to transfer the data to more than 150 Tier-2 centers. Worldwide, the LHC computing grid will be comprised of about 20,000 servers, primarily running the Linux operating system. Scientists at Tier-2 sites can access these servers remotely when running complex experiments based on LHC data, Pordes says. If scientists need a million CPU hours to run an experiment overnight, the distributed nature of the grid allows them to access that computing power from any part of the worldwide network"
But does it run... (Score:5, Interesting)
Then you've got a bunch of scientists who are fundamentally geeks.
And it's all being set up in Europe, which isn't as much under the grip of MS.
As a bonus:
They need the ability to look back and explain all their analysis, which means they have to see the source.
It costs a hell of a lot to get the data, so they don't want to lose any data anywhere.
They have a lot of results to analyse, so they don't want to be waiting for the server to come back on-line.
Could they have gone with BSD? Probably, but most science tools are developed for Linux.
Re:"fiber-optic cables with special protocols" (Score:3, Interesting)
I suspect the "special protocols" they are referring to are about the data transfer protocols (GridFTP for data movement), not some wonky Layer-1 protocol. However, these folks, like I2, have been investing in dynamic-circuit equipment, meaning that sites could eventually get dedicated bandwidth between any two installations.
Re:All that and we still have no anti-gravity (Score:3, Interesting)
So, what have you done today to help make science fiction closer to reality?
I worked on the board layout for my rocket test stand data acquisition system. Sure, it's far removed from a trip to Mars, but you have to start somewhere. I'll bet you can't even say that much.
If you're unwilling to put forth any effort, quit bitching at those who are.
Some Realtime (Score:5, Interesting)
For the Tier-1 sites, a significant fraction of the data is raw 'sensor' (we call it detector) data. This allows the reconstruction program (which converts the data into physics objects like electrons, muons, jets, etc.) to be rerun on the data once bugs in the initial reconstruction program have been fixed.
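In other words, reconstruction is a pure function of (raw data, algorithm version), so archiving the raw events lets you rerun a fixed version later. A toy sketch of that idea, with entirely invented field names and a made-up calibration bug:

```python
# Hypothetical sketch: why Tier-1 keeps raw detector data. Fix the
# reconstruction code, rerun it over the same archived events.

def reconstruct(raw_event, version):
    """Turn a raw detector hit into a 'physics object'. Fields are invented."""
    gain = 1.02 if version == 1 else 1.0   # pretend v1 had a 2% calibration bug
    return {"object": raw_event["hit_type"],
            "energy": raw_event["adc_counts"] / 100 * gain}

raw_events = [{"hit_type": "muon", "adc_counts": 4500}]   # archived raw data

buggy = [reconstruct(e, version=1) for e in raw_events]
fixed = [reconstruct(e, version=2) for e in raw_events]   # rerun after the fix
print(buggy[0]["energy"], fixed[0]["energy"])
```

Without the raw data, the buggy "physics objects" would be all you ever had; with it, a bug fix just costs another pass over the archive.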
Re:All that and we still have no anti-gravity (Score:3, Interesting)
I work in my spare time on an open source project called factdiv. The idea is to use FACTOR as a problem to learn how to attack complexity itself. Complexity problems underlie all the great open questions in science, and so if you can solve those, you sorta solve them all.
So far, results haven't been all that great, but someone will get there. If we do, then we can have computers answer questions like: how to take 10,000,000 parts and build a spaceship; how to take a model of science and devise experiments to probe its limits; and how to wire it all up automatically to manufacturing apparatus so that it can pretty much do unattended science 24x7, then file away knowledge that will last far longer than a mere human brain can live.
But I can still only FACTOR numbers of about 20 digits, and I have no good complexity answers. As you said, though, you have to start somewhere.
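For scale, factoring ~20-digit numbers is roughly Pollard's rho territory; a minimal sketch (not the parent's factdiv code, just the standard textbook algorithm) fits in a page and handles that size in well under a second, while staying hopeless for cryptographic sizes:

```python
# Minimal Pollard's rho factoriser with a Miller-Rabin primality check.
# Fine for ~20-digit numbers; not the parent's project, just the classic method.
import math
import random

def is_probable_prime(n, rounds=20):
    """Miller-Rabin probabilistic primality test."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        if n % p == 0:
            return n == p
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False                     # definitely composite
    return True

def pollard_rho(n):
    """Find a nontrivial factor of an odd composite n (Floyd cycle detection)."""
    while True:
        c = random.randrange(1, n)
        f = lambda x: (x * x + c) % n
        x = y = random.randrange(2, n)
        d = 1
        while d == 1:
            x = f(x)
            y = f(f(y))
            d = math.gcd(abs(x - y), n)
        if d != n:                           # retry with a new c on failure
            return d

def factor(n):
    """Prime factorisation of n >= 1 as a sorted list."""
    if n == 1:
        return []
    if n % 2 == 0:
        return [2] + factor(n // 2)
    if is_probable_prime(n):
        return [n]
    d = pollard_rho(n)
    return sorted(factor(d) + factor(n // d))

n = 1000000007 * 1000000009                  # a 19-digit semiprime
print(factor(n))                             # -> [1000000007, 1000000009]
```

Rho's expected cost grows like the fourth root of the smallest factor, which is exactly why the 20-digit wall the parent mentions is where naive approaches start to hurt.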