How the Tevatron Influenced Computing 66
New submitter SciComGeek writes "Few laypeople think of computing innovation in connection with the Tevatron particle accelerator, which shut down earlier this year. Mention of the Tevatron inspires images of majestic machinery, or thoughts of immense energies and groundbreaking physics research, not circuit boards, hardware, networks, and software. Yet over the course of more than three decades of planning and operation, a tremendous amount of computing innovation was necessary to keep the data flowing and physics results coming. Those innovations will continue to influence scientific computing and data analysis for years to come."
And the web... (Score:3)
And the web was created at CERN. Enough said.
Re:And the web... (Score:5, Informative)
U of I [illinois.edu] has had supercomputers for decades. Of course a lot of computation is needed for the Tevatron, from controlling the streams to analyzing the data. U of I is also home to the Tevatron. [illinois.edu]
Odd that people don't think of Illinois when they think of computing and physics.
Re: (Score:3)
U of I is not home to the Tevatron. That's a page for people who are at U of I and work on the Tevatron. Illinois (the state) is the home of the Tevatron.
Fermilab was built by the DOE and managed by a consortium of universities. It's now run as a partnership between that consortium and the U of Chicago.
Re: (Score:2)
Re: (Score:2)
They are/were a member as well, but the DOE under the Bush administration wanted a single university or company as the prime contractor.
http://www.fra-hq.org/ [fra-hq.org]
Rose-tinted glasses (Score:1)
And the web was created at CERN.
Not to mention Scientific Linux (which was frankly unusable, it was so out of date, until CERN took over), ROOT, and a whole host of other particle physics computing applications. In fact the whole article is the most rose-tinted, inaccurate view of Fermilab computing I have ever seen - perhaps they should have talked to some of the users of that computing on the experiments. Rather than leading the charge into Linux computing farms, Fermilab had to be dragged kicking and screaming away from their large mon
Re: (Score:2)
Not to mention Scientific Linux (which was frankly unusable, it was so out of date, until CERN took over)
I can't comment on the rest of what is written here, but this statement in particular is definitely false. CERN did not take over this project. Scientific Linux remains a collaboration between the two labs. See:
SL is a Linux release put together by Fermilab, CERN, and various other labs and universities around the world. Its primary purpose is to reduce duplicated effort of the labs, and to have a common install base for the various experimenters. -From http://www.scientificlinux.org/ [scientificlinux.org]
If you click on the "about" page, you'll see that there are two "main" developers from Fermilab, two from CERN, one from DESY, and one from ETHZ.
Re: (Score:1)
I can't comment on the rest of what is written here, but this statement in particular is definitely false. CERN did not take over this project. Scientific Linux remains a collaboration between the two labs.
Officially true. However, given the state of 'Fermilinux' when they combined and how much things improved, it was very clear where the project leadership was really coming from.
Your logic here is fallacious. Fermi Linux was only ever intended for internal use, and it had a small team with limited time to work on it. The improvement you cite could just as well be attributed to the fact that you can accomplish more with more people working on it, and the fact that they were now designing it for outside use.
As for how the project progressed, it came out of a HEPIX meeting in 2003. The Red Hat change was discussed, and the system's two Fermilab developers went home, repackaged it, and ret
Re: (Score:3)
Re: (Score:2)
Re: (Score:2)
I've seen plenty of SGI and Cray and VAX iron in the early nineties at the various physics departments when I was an undergrad.
Re: (Score:2)
Particle physics and astronomy are among the disciplines with the largest volumes of numbers to crunch.
The limits of what can be done often are major design constraints for the devices being built.
Everyone knows about CERN, but I'm currently working on LOFAR, a low-frequency radio telescope in NW Europe (NL, DE, SE, UK, FR)
Our regular data flow is 200 Gbit/s, and we can get up to 10 Tbit/s in burst mode. Even after heavy averaging we write about a petabyte a week to disk. Most data is only kept for up to 4
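For a rough sense of what those figures imply, here's a back-of-the-envelope sanity check (my own arithmetic, assuming decimal units; not from the post itself):

```python
# Sanity check of the LOFAR figures quoted above: writing ~1 PB/week to disk
# implies a sustained rate of roughly 13 Gbit/s, comfortably below the
# 200 Gbit/s regular data flow, so "heavy averaging" checks out.
PB = 1e15                       # bytes, decimal petabyte (assumption)
SECONDS_PER_WEEK = 7 * 24 * 3600

sustained_gbit_s = PB * 8 / SECONDS_PER_WEEK / 1e9
print(f"1 PB/week ~= {sustained_gbit_s:.1f} Gbit/s sustained")
```

So the archive rate is only about 1/15th of the regular inbound flow.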
TRUE innovation will always happen at places ... (Score:4, Insightful)
... where people need something new to fix a problem.
It will never really happen at places where people want to make a quick buck with it.
Re: (Score:3)
TRUE innovation will always happen at places where people need something new to fix a problem. It will never really happen at places where people want to make a quick buck with it.
What about "our customers are leaving us for the competition", is that a problem? Of course they try all sorts of other and sometimes quite innovative ways of keeping the customers too, but sometimes corporations do innovate to make a buck ;)
Re: (Score:3)
Your premise kind of excludes innovation. If competition exists, it implies you are working in an established field and market. True innovations, on the other hand, create new and unprecedented markets which have yet to be established.
Re: (Score:3)
More often than not actual innovation will be the last option considered after all others have failed. History has shown that innovative companies often lose out to aggressive and unscrupulous competitors.
Re: (Score:3)
Unfortunately it can often be more profitable to stifle or delay an innovation, and so in a purely profit driven organisation this will be the course of action taken.
How many highly innovative technologies exist behind closed doors because their release would obsolete an older but more profitable technology?
Re: (Score:2)
There is a world of difference between saying sometimes innovative technologies are held up for business reasons and saying that businesses NEVER innovate. It is obvious to anyone with half a brain that businesses DO in fact innovate. Just look at a smart phone, for instance. Everything from the battery to the display to the touch screen to the manufacturing processes that allow so much function to be packed in such a small space required innovation, most of it done by businesses looking to 'make a buck
Re: (Score:2)
Re: (Score:1)
Corporate innovation does of course exist. I'm only dismissing the short-sighted corporate planning of these days.
When, say, a hundred, fifty, or even thirty years ago an employee came up with an idea that didn't give short-term profit but might have become profitable 4-5 years down the road, the chances were WAY bigger that a company developed something in that direction.
In the climate today, where everything has to turn a profit in 6-12 months or it isn't tried, corporate research can of course make
Red Hat not open source??? (Score:2)
Fermi Linux enjoyed limited adoption outside of Fermilab, until 2003, when Red Hat Linux ceased to be open source.
A typo?
Re: (Score:1)
Re: (Score:1)
This is why we need the big projects (Score:5, Informative)
Re: (Score:2, Funny)
Impossible. Progress can only happen in space.
RAIT (Score:4, Interesting)
One of the National Labs was using a parallel array of fast tape drives, I think LTO, to get decent speed (1 GB/s or so) and decent capacity (10 TB). Good for recording all the data from one experiment.
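RAIT is essentially RAID-0 applied to tape: stripe the incoming stream round-robin across N drives so aggregate throughput and capacity scale with N. A minimal striping sketch (my own illustration, not any lab's actual software; `stripe`/`unstripe` and the 4-byte chunk size are hypothetical choices):

```python
from itertools import zip_longest

def stripe(data: bytes, n_drives: int, chunk: int = 4) -> list:
    """Fan a stream out across n_drives in round-robin chunks,
    the way a RAIT/RAID-0 layer feeds parallel tape drives."""
    drives = [[] for _ in range(n_drives)]
    for i in range(0, len(data), chunk):
        drives[(i // chunk) % n_drives].append(data[i:i + chunk])
    return [b"".join(d) for d in drives]

def unstripe(drives: list, chunk: int = 4) -> bytes:
    """Reassemble the original stream by reading chunks round-robin
    from each drive's stripe (zip_longest handles a ragged tail)."""
    parts = [[d[i:i + chunk] for i in range(0, len(d), chunk)] for d in drives]
    out = []
    for group in zip_longest(*parts, fillvalue=b""):
        out.extend(group)
    return b"".join(out)
```

With 16 drives at 60 MB/s each, this kind of layout is how you reach the ~1 GB/s aggregate the parent mentions; real RAIT systems also add parity stripes for drive failures.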
Re: (Score:2)
I worked on a "tiny" particle physics experiment ... in about 4 months of running, we collected 150TB of data. My current experiments will collect PBs of data, and LHC is expected to collect EB of data over its lifetime. 10TB would be considered peanuts these days :-)
Re: (Score:3, Insightful)
I worked on a mid-size experiment in the mid-'90s -- we acquired about 2 TB of data -- but that's back when the biggest readily available SCSI disk you could buy had just doubled in size to a whopping 18 GB; a big tape would hold maybe 10 GB compressed. Data volumes all depend on data flow capacities.
Two things hold: you will acquire more data faster until you hit a bottleneck then you'll move the bottleneck a few times; AND you will increase the computation being performed on a data set until it is slow --
Re: (Score:2)
Yeah, there are still some of those high-end tape systems in our basement. Used a lot in radio astronomy up until a few years ago. A single tape unit can do 500 Mbit/s, and we have 16, I think. 8 Gbit/s to two-decade-old hardware is still impressive. They don't get used much any more. No new recording, only playback of some old data. But when they are running it's impressive; IIRC they do 20 m/s tape speed.
Re: (Score:1)
Re: (Score:3)
Wrong. The first electronic computer (discounting the secret British one -- if it's secret, for all intents and purposes it doesn't exist) was ENIAC, [wikipedia.org] patented in 1946, a quarter century before Apollo 11 and six years before I was born.
I hope you're still in junior high, because if not your teachers REALLY suck.
Re: (Score:1)
Wrong. The first electronic computer (discounting the secret British one -- if it's secret, for all intents and purposes it doesn't exist) was ENIAC, [wikipedia.org] patented in 1946, a quarter century before Apollo 11 and six years before I was born.
I hope you're still in junior high, because if not your teachers REALLY suck.
Wrong. The first programmable computer was Konrad Zuse's Z1 [wikipedia.org], built between 1935 and 1938 (or his Z3, which was the first "Turing complete" computer, in 1941). Time to retire, mcgrew :)
PET/MRI (Score:5, Informative)
The article barely scratches the surface (Score:5, Informative)
I'll rattle off a half dozen from the top of my head:
According to Robert Young, one of the founders of Red Hat, Fermilab's adoption of Linux was one of the seminal events in the acceptance of Linux as a real operating system.
IBM's SP series of computers was inspired by the IBM RS6000 compute farms at Fermilab.
The original Linux CD driver was written by an experimenter at the DZero group at Fermilab.
Many parallel programming techniques were pioneered on the ACP/MAPS system designed, engineered, and built at Fermilab.
The term "compute farm" was coined at Fermilab.
Fermilab was the world's third web site, after CERN and SLAC.
Re: (Score:3)
Re: (Score:2)
remember the original purpose of the world wide web wasn't to distribute porn
We had usenet for porn. Porn over http didn't really take off until Joe Six Pack got online.
Re: (Score:1)
I didn't know writing a CD driver counted as "innovation".
The Software (Score:1)
And just think (Score:2)
Still going on (Score:3)
Tevatron eh? (Score:2)
Re: (Score:2)
I worked there for several years back in the '90s (Score:2)
Best job (all things considered) that I ever had. I got to participate at a pretty deep level in the construction of what was (for a few weeks at least, until it was overtaken by other systems) the fastest supercomputer on the planet; learned a lot about computing in general; and made a number of professional connections that persist to this day (I currently share an office with the same guy I shared an office with when I worked at Fermilab).
So why did I leave, you ask? Multiple reasons: Money (if you were
Tagged Photon Lab (Score:3)
I worked at the Tagged Photon Lab - that's my PhD advisor Don Summers loading tapes into the Great Wall of drives. We drove the poor folks at Exabyte nuts.
Eric Aitala