Supercomputing Science Technology

How the Tevatron Influenced Computing

New submitter SciComGeek writes "Few laypeople think of computing innovation in connection with the Tevatron particle accelerator, which shut down earlier this year. Mention of the Tevatron inspires images of majestic machinery, or thoughts of immense energies and groundbreaking physics research, not circuit boards, hardware, networks, and software. Yet over the course of more than three decades of planning and operation, a tremendous amount of computing innovation was necessary to keep the data flowing and physics results coming. Those innovations will continue to influence scientific computing and data analysis for years to come."
This discussion has been archived. No new comments can be posted.

  • by InfiniteZero ( 587028 ) on Friday December 23, 2011 @10:30AM (#38471510)

    And the web was created at CERN. Enough said.

    • Re:And the web... (Score:5, Informative)

      by mcgrew ( 92797 ) * on Friday December 23, 2011 @10:53AM (#38471822) Homepage Journal

      U of I [illinois.edu] has had supercomputers for decades. Of course a lot of computation is needed for the Tevatron, from controlling the streams to analyzing the data. U of I is also home to the Tevatron. [illinois.edu]

      Odd that people don't think of Illinois when they think of computing and physics.

      • by vondo ( 303621 )

        U of I is not home to the Tevatron. That's a page for people who are at U of I and work on the Tevatron. Illinois (the state) is the home of the Tevatron.

        Fermilab was built by the DOE and managed by a consortium of universities. It's now run as a partnership between that consortium and the U of Chicago.

    • by Anonymous Coward

      And the web was created at CERN.

      Not to mention Scientific Linux (which was frankly unusable, it was so out of date, until CERN took over) and ROOT and a whole host of other particle physics computing applications. In fact the whole article is the most rose-tinted, inaccurate view of Fermilab computing I have ever seen; perhaps they should have talked to some of the users of that computing on the experiments. Rather than leading the charge into Linux computing farms, Fermilab had to be dragged kicking and screaming away from their large mon

      • Not to mention Scientific Linux (which was frankly unusable, it was so out of date, until CERN took over)

        I can't comment on the rest of what is written here, but this statement in particular is definitely false. CERN did not take over this project; Scientific Linux remains a collaboration between the two labs. See:

        SL is a Linux release put together by Fermilab, CERN, and various other labs and universities around the world. Its primary purpose is to reduce duplicated effort of the labs, and to have a common install base for the various experimenters. -From http://www.scientificlinux.org/ [scientificlinux.org]

        If you click on the "about" page, you'll see that there are two "main" developers from Fermilab, two from CERN, one from DESY, and one from ETHZ.

      • What a bullshit fairy tale. Various clusters were built from SGI and IBM workstations and DEC VAXstations. There were CADD/CAE groups who liked SGI machines (and others who used PCs, and others with Sun), and some of those had SGI servers. Until 1995, "the mainframe" for physics use outside of the clusters was an Amdahl, an IBM mainframe clone on steroids (it actually won over competing supercomputer bids!). The first PC farm, with dual Pentium III CPUs at 333 MHz, was in 1998. I was there.
        • I've seen plenty of SGI and Cray and VAX iron in the early nineties at various physics departments when I was an undergrad.

    • Particle physics and astronomy are among the disciplines with the largest amounts of data to crunch.

      The limits of what can be done are often major design constraints for the devices being built.

      Everyone knows about CERN, but I'm currently working on LOFAR, a low-frequency radio telescope in northwestern Europe (NL, DE, SE, UK, FR).

      Our regular data flow is 200 Gbit/s, and we can get up to 10 Tbit/s in burst mode. Even after heavy averaging we write about a petabyte a week to disk. Most data is only kept for up to 4
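
      As a rough sanity check of those figures, here is a minimal back-of-envelope sketch in Python. The 200 Gbit/s input rate and the roughly one-petabyte-per-week figure are taken from the comment above; everything else (week length, decimal petabytes) is assumed just for the arithmetic.

```python
# Back-of-envelope check of the LOFAR data rates quoted above.
# Assumes 200 Gbit/s sustained input and ~1 PB/week written after averaging.

SECONDS_PER_WEEK = 7 * 24 * 3600              # 604,800 s

input_rate_bits_per_s = 200e9                 # 200 Gbit/s sustained
raw_bytes_per_week = input_rate_bits_per_s / 8 * SECONDS_PER_WEEK

written_bytes_per_week = 1e15                 # ~1 PB/week to disk

print(f"raw input per week : {raw_bytes_per_week / 1e15:.1f} PB")        # ~15.1 PB
print(f"written per week   : {written_bytes_per_week / 1e15:.1f} PB")    # 1.0 PB
print(f"implied reduction  : ~{raw_bytes_per_week / written_bytes_per_week:.0f}x")  # ~15x
```

      If those figures are right, the quoted numbers are mutually consistent: heavy averaging buys roughly a factor of fifteen before anything hits disk.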

  • by aix tom ( 902140 ) on Friday December 23, 2011 @10:34AM (#38471554)

    TRUE innovation will always happen at places where people need something new to fix a problem.

    It will never really happen at places where people want to make a quick buck with it.

    • by Kjella ( 173770 )

      TRUE innovation will always happen at places where people need something new to fix a problem. It will never really happen at places where people want to make a quick buck with it.

      What about "our customers are leaving us for the competition"? Is that a problem? Of course they also try all sorts of other, sometimes quite innovative, ways of keeping their customers, but sometimes corporations do innovate to make a buck ;)

      • Your premise kind of excludes innovation. If competition exists, it implies you are working in an established field and market. True innovations, on the other hand, create new and unprecedented markets which have yet to be established.

      • by Bert64 ( 520050 )

        More often than not, actual innovation will be the last option considered, after all others have failed. History has shown that innovative companies often lose out to aggressive and unscrupulous competitors.

    • I agree that the fact that basic research leads to unexpected spinoff technologies is not generally given sufficient recognition, which your comment seems to imply, aix tom. But don't forget that both the Tevatron and LHC computing architectures are based on the use of cheap commercial technology. Without affordable computing components and later PCs, they could not have accomplished all of these other things. It's a symbiosis. Of course, from your comment it isn't clear whether or not you meant to dismiss corp
      • by aix tom ( 902140 )

        Corporate innovation is of course there. And I definitely only dismiss the short-sighted corporate planning of today.

        When, say, a hundred, fifty, or even thirty years ago an employee came up with an idea that didn't bring short-term profit but might have become profitable 4-5 years down the road, the chances were WAY better that a company would develop something in that direction.

        In the climate today, where everything has to turn a profit in 6-12 months or it isn't tried, corporate research can of course make

  • Fermi Linux enjoyed limited adoption outside of Fermilab, until 2003, when Red Hat Linux ceased to be open source.

    A typo?

  • by sandytaru ( 1158959 ) on Friday December 23, 2011 @10:48AM (#38471754) Journal
    I got the mini tour at Lawrence Livermore National Lab a few years back. They've spent about three billion dollars on a proof-of-concept system for hot fusion. During the project they invented a process to extrude entire sheets of solid ruby crystal, along with hundreds of other innovations. Yes, three billion dollars is a lot of money. The things they had to create will reverberate throughout the private sector for decades, however, and they plan on selling off the final hot fusion plans to private companies, who will profit from them once they've got all the kinks worked out.
    • Re: (Score:2, Funny)

      by Anonymous Coward

      Impossible. Progress can only happen in space.

  • RAIT (Score:4, Interesting)

    by Smallpond ( 221300 ) on Friday December 23, 2011 @10:49AM (#38471778) Homepage Journal

    One of the National Labs was using a parallel array of fast tape drives, I think LTO, to get decent speed (1 GBPS or so) and decent capacity (10 TB). Good for recording all the data from one experiment. (A sketch of the striping idea appears at the end of this thread.)

    • by krlynch ( 158571 )

      I worked on a "tiny" particle physics experiment ... in about 4 months of running, we collected 150 TB of data. My current experiments will collect PBs of data, and the LHC is expected to collect EBs of data over its lifetime. 10 TB would be considered peanuts these days :-)

      • Re: (Score:3, Insightful)

        by Anonymous Coward

        I worked on a mid-size experiment in the mid-90s -- we acquired about 2 TB of data -- but that's back when the biggest readily available SCSI disk you could buy had just doubled in size to a whopping 18 GB; a big tape would hold maybe 10 GB compressed. Data volumes all depend on data flow capacities.

        Two things hold: you will acquire more data faster until you hit a bottleneck, then you'll move the bottleneck a few times; AND you will increase the computation being performed on a data set until it is slow --

    • Yeah, there are still some of those high-end tape systems in our basement. They were used a lot in radio astronomy up until a few years ago. A single tape unit can do 500 Mbit/s, and we have 16, I think. 8 Gbit/s from two-decade-old hardware is still impressive. They don't get used much any more; no new recording, only playback of some old data. But when they are running it's impressive, IIRC they do 20 m/s tape speed.
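
    Picking up the RAIT idea from the parent comment: RAIT applies RAID-style striping to tape, so that data blocks are spread across N drives and the array behaves like one device with roughly N times the bandwidth and capacity, usually plus parity tapes for redundancy. Below is a minimal, hypothetical Python sketch of just the striping idea; it does no real tape I/O, omits parity, and every name in it is made up for illustration.

```python
# Illustration of RAIT-style striping: blocks are written round-robin across
# N "drives" so aggregate bandwidth and capacity scale roughly with N.
# Real RAIT systems add parity tapes for redundancy; that is omitted here.

from typing import BinaryIO, List

BLOCK_SIZE = 256 * 1024  # 256 KiB per block (arbitrary choice)

def stripe(source: BinaryIO, drives: List[BinaryIO]) -> None:
    """Spread the source stream across the drives, one block per drive in turn."""
    i = 0
    while True:
        block = source.read(BLOCK_SIZE)
        if not block:
            break
        drives[i % len(drives)].write(block)
        i += 1

def unstripe(drives: List[BinaryIO], dest: BinaryIO) -> None:
    """Reassemble the original stream by reading blocks back in the same order."""
    i = 0
    while True:
        block = drives[i % len(drives)].read(BLOCK_SIZE)
        if not block:
            break
        dest.write(block)
        i += 1
```

    In a real array the per-drive transfers run in parallel, which is where the aggregate rate comes from; using the figures in the comment just above, 16 units at roughly 500 Mbit/s each add up to about 8 Gbit/s.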

  • PET/MRI (Score:5, Informative)

    by sirdude ( 578412 ) on Friday December 23, 2011 @10:56AM (#38471844)
    In medicine, one of the offshoots from CERN & the LHC has been the development/improvement of the MRI scanner [web.cern.ch].
  • by stox ( 131684 ) on Friday December 23, 2011 @10:59AM (#38471888) Homepage

    I'll rattle off a half dozen off the top of my head:

    According to Robert Young, one of the founders of Red Hat, Fermilab's adoption of Linux was one of the seminal events in the acceptance of Linux as a real operating system.

    IBM's SP series of computers was inspired by the IBM RS/6000 compute farms at Fermilab.

    The original Linux CD driver was written by an experimenter at the DZero group at Fermilab.

    Many parallel programming techniques were pioneered on the ACP/MAPS system designed, engineered, and built at Fermilab.

    The term "compute farm" was coined at Fermilab.

    Fermilab was home to the world's third web site, after CERN and SLAC.

    • About the web site: remember, the original purpose of the World Wide Web wasn't to distribute porn but to let researchers like those at CERN and Fermilab share information. Yes, email existed, but they needed something more akin to a kiosk for posting information to anyone in the world who was interested.
      • by lewiscr ( 3314 )

        remember the original purpose of the world wide web wasn't to distribute porn

        We had Usenet for porn. Porn over HTTP didn't really take off until Joe Six Pack got online.

    • by Bomazi ( 1875554 )

      I didn't know writing a CD driver counted as "innovation".

  • Great article. Well written, interesting and informative. Once more we are reminded that It's All About The Software.
  • I wonder what the hardware used for the LHC is going to spawn. High-speed networking, storage arrays, things of that nature are going to be interesting.
  • by fermion ( 181285 ) on Friday December 23, 2011 @11:32AM (#38472298) Homepage Journal
    My understanding is that the LHC currently involves a worldwide computing grid capable of distributing on the order of a petabyte of data a month, and doing basic analysis of much more. The thing is that the people who work at such places are highly intelligent problem solvers who are not going to throw out ideas simply because they don't meet some preconceived notion. They are not going to say "don't paint the roof white" simply because no one has done it before. They have problems to solve, and know how to get the funding to do it.
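
    To put "a petabyte a month" in perspective, here is a quick conversion to an average sustained line rate; the only figure taken from the comment is the 1 PB/month, the rest is straightforward arithmetic.

```python
# Convert ~1 PB/month of grid traffic into an average sustained network rate.
SECONDS_PER_MONTH = 30 * 24 * 3600        # ~2.6 million seconds
bytes_per_month = 1e15                    # 1 PB (decimal)

avg_rate_bits_per_s = bytes_per_month * 8 / SECONDS_PER_MONTH
print(f"average rate: {avg_rate_bits_per_s / 1e9:.1f} Gbit/s")  # ~3.1 Gbit/s
```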
  • Sounds like a Transformer, probably a Decepticon.
  • Best job (all things considered) that I ever had. I got to participate at a pretty deep level in the construction of what was (for a few weeks at least, until it was overtaken by other systems) the fastest supercomputer on the planet; learned a lot about computing in general; and made a number of professional connections that persist to this day (I currently share an office with the same guy I shared an office with when I worked at Fermilab).

    So why did I leave, you ask? Multiple reasons: Money (if you were

  • by aitala ( 111068 ) on Friday December 23, 2011 @06:23PM (#38477232) Homepage

    I worked at the Tagged Photon Lab - that's my PhD advisor Don Summers loading tapes into the Great Wall of drives. We drove the poor folks at Exabyte nuts.

    Eric Aitala
