IT at the LHC — Managing a Petabyte of Data Per Second
schliz writes "iTnews in Australia has published an interview with CERN's deputy head of IT, David Foster, who explains what last month's discovery of a 'particle consistent with the Higgs boson' means for the organization's IT department, why it needs a second 'Tier Zero' data center, and how it is using grid computing and the cloud. Quoting: 'If you were to digitize all the information from a collision in a detector, it's about a petabyte a second, or a million gigabytes per second. A lot of filtering of the data occurs within the 25 nanoseconds between each bunch crossing (of protons). Each experiment operates its own trigger farm – consisting of several thousand machines – that conducts real-time event selection within the LHC. These trigger farms decide, for example: was this set of collisions interesting? Do I keep this data or not? The non-interesting event data is discarded; the interesting events go through a second filter, or trigger farm, of a few thousand more computers, also on-site at the experiment. [These computers] have a bit more time to do some initial reconstruction – looking at the data to decide if it's interesting. Out of all of this comes a data stream of a few hundred megabytes to 1GB per second that actually gets recorded in the CERN data center, the facility we call "Tier Zero."'"
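The two-stage filtering Foster describes can be sketched as a toy pipeline: a cheap first-level cut that runs on every event, then a slower second-level cut with "a bit more time" that only sees the survivors. Everything here – the event model, the thresholds, the pass rates – is invented for illustration; it shows only the shape of the reduction, not CERN's actual trigger logic.

```python
import random

def level1_trigger(event):
    """Cheap first-level cut: keep only events with enough total energy.
    (Hypothetical criterion; real triggers use dedicated electronics.)"""
    return event["energy"] > 50.0

def level2_trigger(event):
    """Slower second-level cut: a crude 'reconstruction' requiring the
    event to also contain at least two particle tracks (again invented)."""
    return event["energy"] > 50.0 and event["tracks"] >= 2

def run_pipeline(events):
    """Apply the two cuts in sequence, as the on-site trigger farms do."""
    after_l1 = [e for e in events if level1_trigger(e)]
    after_l2 = [e for e in after_l1 if level2_trigger(e)]
    return after_l1, after_l2

# Simulate a batch of random "collisions" and watch the stream shrink.
random.seed(0)
events = [{"energy": random.uniform(0, 100), "tracks": random.randint(0, 5)}
          for _ in range(100_000)]
kept_l1, kept_l2 = run_pipeline(events)
print(f"raw: {len(events)}  after level 1: {len(kept_l1)}  "
      f"recorded: {len(kept_l2)}")
```

In the real system the first stage is custom electronics deciding within nanoseconds, and the recorded fraction is far smaller – roughly a petabyte per second in, a gigabyte per second out, about six orders of magnitude of reduction before anything reaches Tier Zero.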
Call the Interns! (Score:4, Funny)
Large Organization Has 2 Data Centers (Score:2, Funny)
Re:Keeping us humble... (Score:5, Funny)
And we jokingly call our data center the "Large Software Collider". Not as funny when the real thing is even bigger!
And Still. (Score:5, Funny)
The head researcher will STILL come to IT and ask them to please help him sync his Outlook contacts to his phone.