
 




Fermi Lab's New Particle Discovery in Question

"Back in April physicists at Fermilab speculated that they may have discovered a new force or particle. But now another team has analyzed data from the collider and come to the exact opposite conclusion. From the article: 'But now, a rival team performing an independent analysis of Tevatron data has turned up no sign of the bump. It is using the same amount of data as CDF reported in April, but this data was collected at a different detector at the collider called DZero. "Nope, nothing here – sorry," says Dmitri Denisov, a spokesman for DZero.'"


  • Data sharing (Score:5, Insightful)

    by symes ( 835608 ) on Monday June 13, 2011 @08:22AM (#36423962) Journal

    I think more than anything, this demonstrates why sharing data openly is such a good thing. Sure, not great news for those at Fermi Lab, but if scientists generally (especially those in the behavioural sciences...) were encouraged (or forced?) to allow others free access to their data then I'm sure a few surprising claims might be rewritten and a few interesting blips otherwise missed might be found.

  • Re:Data sharing (Score:2, Insightful)

    by mcelrath ( 8027 ) on Monday June 13, 2011 @09:47AM (#36424574) Homepage

    Bullshit. Look to the astronomy community for counter-examples: WMAP, SDSS, etc.

    The only reason particle physics keeps its data closed is history and turf-protection by its members. Astronomy has a longer history, and realized the benefits of sharing star catalogs hundreds of years ago.

  • Re:Data sharing (Score:5, Insightful)

    by chissg ( 948332 ) on Monday June 13, 2011 @10:48AM (#36425040)
    [Re-post non-AC] Star catalogs aren't data: they're the results of decades of observations, corroborations, corrections and debates over just exactly what that particular black spot on the white plate was. You want the raw telemetry from every telescope that isn't read out with a Mark I eyeball, and every plate ever taken and every scientist's observation note from those that were? You want all the calibration data from WMAP, and all the histograms that were plotted to analyze them and turn them into corrections for the main data so they actually *mean* something?

    In particle physics, "data" is the 1s and 0s from every piece of sensory equipment in the detector hall, the beam area and points between: often millions of readout channels, each of which means something and has its own quirks and problems that need to be measured and understood with more and different types of data (calibration, cosmic rays, etc.). And these readings are taken at frequencies between thousands and millions of times per second. We often have to analyze the data to a preliminary level just to decide whether they're worth keeping to analyze properly later, because there's neither the bandwidth nor the storage space nor the computing power -- even now -- to keep them all. The LHC experiments store petabytes of data per month, and storage, access and transfer costs are significant: you pay for access to those data by contributing to the experiment.

    OK, now let's assume you get the raw data. Now what? Good luck with that. There's a reason groups of scientists and expert contractors spend years and sometimes decades writing the reconstruction and analysis software for particle physics experiments: teasing useful results from the data is hard. If we were to spend our time pointing out the rookie mistakes of every schmo who fiddled with the data for a while and thought he'd found something new, the work would never be done. "Heisenberg's Uncertainty Principle Untenable [icaap.org]," anyone?
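    The data-volume argument in the comment above can be put into a rough back-of-envelope sketch. Every number here (channel count, bytes per reading, readout rate, trigger rejection) is an illustrative assumption, not a figure from any specific experiment:

    ```python
    # Back-of-envelope: why raw collider-detector output can't simply be
    # archived and shared wholesale. All numbers are illustrative assumptions.

    channels = 1_000_000        # readout channels (assumed)
    bytes_per_reading = 2       # bytes per channel per reading (assumed)
    readout_hz = 1_000_000      # readings per second (assumed)

    # Raw, untriggered data rate.
    raw_bytes_per_sec = channels * bytes_per_reading * readout_hz

    # Scale up to a 30-day month, expressed in petabytes (1e15 bytes).
    raw_pb_per_month = raw_bytes_per_sec * 86_400 * 30 / 1e15

    # A hypothetical trigger that keeps 1 reading in 100,000 brings the
    # archived volume down toward the "petabytes per month" the comment cites.
    kept_pb_per_month = raw_pb_per_month / 100_000

    print(f"Raw rate: {raw_bytes_per_sec / 1e12:.1f} TB/s")
    print(f"Untriggered: {raw_pb_per_month:.0f} PB/month")
    print(f"After trigger: {kept_pb_per_month:.2f} PB/month")
    ```

    Even with these deliberately modest assumptions, the untriggered stream is thousands of petabytes per month, which is why preliminary online analysis (triggering) is unavoidable before anything is kept at all.
    
    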
