How the Tevatron Influenced Computing
New submitter SciComGeek writes "Few laypeople think of computing innovation in connection with the Tevatron particle accelerator, which shut down earlier this year. Mention of the Tevatron inspires images of majestic machinery, or thoughts of immense energies and groundbreaking physics research, not circuit boards, hardware, networks, and software. Yet over the course of more than three decades of planning and operation, a tremendous amount of computing innovation was necessary to keep the data flowing and physics results coming. Those innovations will continue to influence scientific computing and data analysis for years to come."
TRUE innovation will always happen at places ... (Score:4, Insightful)
... where people need something new to fix a problem.
It will never really happen at places where people just want to make a quick buck.
Re:RAIT (Score:3, Insightful)
I worked on a mid-size experiment in the mid-'90s -- we acquired about 2 TB of data -- but that's back when the biggest readily available SCSI disk you could buy had just doubled in size to a whopping 18 GB; a big tape would hold maybe 10 GB compressed. Data volumes always depend on data flow capacities.
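A quick back-of-the-envelope sketch of those numbers in Python (the capacities are the ones quoted above; the variable names are mine):

    import math

    dataset_gb = 2 * 1024   # ~2 TB of acquired data
    disk_gb = 18            # biggest readily available SCSI disk of the day
    tape_gb = 10            # one big tape, compressed

    # Raw media counts needed just to hold the data set once:
    print(math.ceil(dataset_gb / disk_gb))   # 114 disks
    print(math.ceil(dataset_gb / tape_gb))   # 205 tapes

Over a hundred of the largest disks money could buy, for one experiment's data set -- which is why tape libraries (and schemes like RAIT) carried the load.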
Two things hold: you will acquire data faster and faster until you hit a bottleneck, then you'll move the bottleneck a few times; AND you will increase the computation performed on a data set until it is slow -- you pick your tolerance for how long a statistically significant sample should take to analyze, then adjust the analysis to fill that time. This holds in particle physics AND it holds in MR imaging physics, where, say, a new technique gives you 2x the signal per unit time -- a balancing game ensues, and some of that signal is spent on better SNR, some on better spatial resolution, and some on shorter scan times. You will expand what you do until it becomes painful in some manner.
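A minimal sketch of that MR balancing game, assuming the usual rule of thumb that SNR scales with signal rate, voxel volume, and the square root of scan time (the function and the factors are illustrative, not anyone's actual protocol):

    import math

    def relative_snr(signal_gain, voxel_volume_factor=1.0, scan_time_factor=1.0):
        # Rule of thumb: SNR ∝ signal rate × voxel volume × sqrt(scan time).
        return signal_gain * voxel_volume_factor * math.sqrt(scan_time_factor)

    gain = 2.0  # the hypothetical 2x signal-per-unit-time technique
    print(relative_snr(gain))                            # 2.0: spend it all on SNR
    print(relative_snr(gain, scan_time_factor=0.25))     # 1.0: same SNR, 4x faster scan
    print(relative_snr(gain, voxel_volume_factor=0.5))   # 1.0: same SNR, half the voxel volume
    print(relative_snr(gain, 0.71, 0.5))                 # ~1.0: a bit of both

Any split along that curve keeps SNR constant; in practice you spend the gain until the protocol becomes painful again, exactly as described above.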