First 500 Terabytes Transmitted via LHC Global Grid
neutron_p writes "When the LHC Computer Grid starts operating in 2007, it will be the most data-intensive physics instrument on the planet. Today eight major computing centers successfully completed a challenge to sustain a continuous data flow of 600 megabytes per second on average for 10 days from CERN in Geneva, Switzerland to seven sites in Europe and the US. The total amount of data transmitted during this challenge -- 500 terabytes -- would take about 250 years to download using a typical 512 kilobit per second household broadband connection."
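The summary's arithmetic holds up; here's a quick sanity check as a Python sketch (decimal units assumed throughout, since the submission doesn't specify):

```python
# Back-of-the-envelope check of the figures in the summary.
# Decimal units assumed: 1 TB = 1e12 bytes, 512 kbit/s = 512e3 bit/s.

TOTAL_BYTES = 500e12      # 500 terabytes moved during the challenge
SUSTAINED_RATE = 600e6    # 600 megabytes per second, on average
BROADBAND_BPS = 512e3     # typical household broadband line, bits/s

days = TOTAL_BYTES / SUSTAINED_RATE / 86_400
years = TOTAL_BYTES * 8 / BROADBAND_BPS / (86_400 * 365)

print(f"500 TB at 600 MB/s  : {days:.1f} days")    # ~9.6 days
print(f"500 TB at 512 kbit/s: {years:.0f} years")  # ~248 years
```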
Not really. (Score:3, Insightful)
More to the point, your argument leaves out the time it takes to get the data onto and off the tapes. The bandwidth of a truck full of tapes is an old argument, but tapes are just so damn slow at both endpoints that they're not that useful after all.
When the data arrives through a network pipe, it's on disk ready to be crunched through whatever program you're running...
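To put numbers on that (purely illustrative figures, not anything claimed in the thread), here's a sketch of the effective throughput of a tape shipment once you count the time spent writing and reading the tapes at each end:

```python
# Sneakernet vs. network, counting both endpoints. All figures are
# hypothetical round numbers chosen for illustration.

def truck_throughput_mb_s(payload_tb, drive_hours, write_mb_s, read_mb_s):
    """Effective MB/s of shipping tapes, including tape write and read time."""
    payload_mb = payload_tb * 1e6
    total_s = (payload_mb / write_mb_s     # writing tapes at the source
               + drive_hours * 3600        # the drive itself
               + payload_mb / read_mb_s)   # reading tapes at the destination
    return payload_mb / total_s

# 500 TB on tape, a 12-hour drive, one 80 MB/s tape drive at each end:
print(f"truck  : {truck_throughput_mb_s(500, 12, 80, 80):.0f} MB/s effective")
print("network: 600 MB/s effective, and the data lands on disk ready to use")
```

With a single drive at each end the shipment nets out around 40 MB/s; the endpoints, not the truck, are the bottleneck.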
8 or 9 years ago, I used to work in the post-production industry in Soho, London. There's a network called 'Sohonet' through which lots of the major post-houses had ATM links to each other (hey, ATM was blazingly fast for the time).
Simon
What's in it for Joe Sixpack? (Score:2, Insightful)
Imagine 2007: the *AA has made it almost impossible to download any content. So I'm sitting on 600 MB/sec of bandwidth, checking /. and reading email.
Not sure why this is so notable (Score:5, Insightful)
OK... they lit up the equivalent of two OC-48s' worth of bandwidth. That's half of an OC-192, or half of a 10G Ethernet. There have been long-haul OC-192s for a number of years now. If I hook up a hardware-based traffic generator and run at 100% over an OC-192 for a few weeks, will I get a Slashdot article, too?
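His equivalences check out; a quick sketch against the standard SONET line rates (the 600 MB/s figure comes from the summary above):

```python
# SONET line rates are standardized; 600 MB/s is the challenge's
# sustained average from the article summary.

SONET_BPS = {"OC-48": 2.488e9, "OC-192": 9.953e9}  # bits per second
sustained_bps = 600e6 * 8                          # 600 MB/s -> 4.8 Gbit/s

for name, rate in SONET_BPS.items():
    print(f"{name:6}: {sustained_bps / rate:.2f} circuits' worth")
# OC-48 : 1.93  (~ "two OC-48s")
# OC-192: 0.48  (~ "half an OC-192")
```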