
CERN Releases 300TB of Large Hadron Collider Data Into Open Access (techcrunch.com) 60

An anonymous reader writes: The European Organization for Nuclear Research, known as CERN, has released 300 terabytes of collider data to the public. "Once we've exhausted our exploration of the data, we see no reason not to make them available publicly," said Kati Lassila-Perini, a physicist who works on the Compact Muon Solenoid detector. "The benefits are numerous, from inspiring high school students to the training of the particle physicists of tomorrow. And personally, as CMS's data preservation coordinator, this is a crucial part of ensuring the long-term availability of our research data," she said in a news release accompanying the data. Most of the data is from 2011, largely from protons colliding at 7 TeV (teraelectronvolts). The 300 terabytes include both raw data from the detectors and "derived" datasets, and CERN is also providing tools to work with the data, which is handy.
  • by Lisandro ( 799651 ) on Saturday April 23, 2016 @02:44AM (#51970461)

I can just visualize a horde of crackpots using this data to fuel fringe theories, find messages from God and prove the existence of aliens.

That being said, this is awfully cool of CERN. The raw data will be really useful in academic environments, and the Linux visualization tools are great.

    • Re: (Score:2, Funny)

      by Anonymous Coward

I can just visualize a horde of crackpots using this data to fuel fringe theories

I heard on good authority that the LHC breached a planar dimension, and one of its red/white striped inhabitants escaped into the LHC data stream.
So now they're releasing the data to the public in the hopes that someone will find this wimpy alien lifeform data object (waldo)...

    • "Once we've exhausted our exploration of the data, we see no reason not to make them available publicly,"

      Actually they're just sifting and patching out the winning lottery numbers first. In these ~300TB dregs you'd be lucky to find a Pick 3. Best suggestion is to make a list of numbers absent from the data and play those.

      There's also a lot of Quantum Space Spam in it, such as embedded 3D jpgs meant to be projected into 4D space showing reproductive attachments for higher dimensional beings.

    • by starless ( 60879 ) on Saturday April 23, 2016 @09:46AM (#51971727)

Data from most NASA astronomy satellites are available after a specified amount of time.
E.g. Hubble Space Telescope data are available after one year, and Fermi Gamma-ray Space Telescope data are available as soon as they're processed (within one day).
      Software tools are also publicly available along with software support.

      Nice to see particle physicists catching up with astronomers on data release!

      • by Anonymous Coward

CERN has been releasing a lot of data since the 90s. The hard part is not deciding to release it and just copy-pasting the data to some webserver, but the process of documenting it and making sure there are tools to work with the data. Both the particle physics and astronomy fields have put a lot of time and money into developing these tools for decades. It is now at the point where other smaller, data-heavy projects can take advantage of the same tools with a lot less investment of manpower.

    • by Altrag ( 195300 )

      horde of crackpots using this data to fuel fringe theories

      To be fair, most crackpots would manage to fuel their fringe theories just as well without this data.

But this saves them the delay of writing a random-number generator to invent their supporting data with.

        I'm still expecting to see the crackpots doing the logical equivalent of reading from a book held upside down.

  • by x0ra ( 1249540 ) on Saturday April 23, 2016 @03:01AM (#51970519)
    If I'm not mistaken, the LHC has been publicly funded, so these data should have been public to start with. Anything else is bs.
It wasn't publicly funded by the entire world, though, so it makes sense to restrict the data sharing to the scientists of the countries that helped fund it.
      • You mean the taxpayers of the countries funding it. After all, not only scientists were paying for it out of their taxes.
    • by BitterOak ( 537666 ) on Saturday April 23, 2016 @03:20AM (#51970601)

      If I'm not mistaken, the LHC has been publicly funded, so these data should have been public to start with. Anything else is bs.

      It's standard practice in experimental particle physics to give those who put the time and effort into designing, building, and running the experiment the first chance to analyze the data and publish results. After that, it's not unusual to release the raw data publicly. Otherwise, there'd really be no incentive to do the work, since someone else could swoop in and publish results without having contributed to producing the data.

It was available to all scientists of the funding and visiting countries. Now that the scientists are through with it, you can have a look too.

      • by x0ra ( 1249540 )
        It should have been available to the whole population...
        • It is now. Before that the people who developed the experiments got first access. I personally understand that perfectly. They invested decades of their lives.

Why? What interest does the general population have in access to the LHC data? They've already released a subset of the data [opendata.cern.ch] for educational purposes, in addition to this considerable data dump. It serves no public interest to make the whole data set available to everyone, and in fact would run contrary to the public interest: the data set is absolutely massive (the LHC produces petabytes of data per day), and the costs associated with making that data available to the public would be non-negligible.


      • by AmiMoJo ( 196126 )

Cool. Where's the torrent? It's not on TPB yet.

Unfortunately it's not that simple. Scientists and the organizations they work for are judged on their publications, so a lab that spent a lot of money to build a new experiment needs to show a lot of publications from that experiment or it won't get future funding.

It's not a great system, but it's what's in place, and it's not obvious how to do better.

  • by symes ( 835608 )

I understand there are tools to work with these data, but even so, 300TB is a lot. Wouldn't it be better, assuming they want to encourage future generations of particle physicists, to open-source the tools and provide better instruction on how one should manage these data? That seems like half the problem. No way will anyone in high school download 300TB to play with. Even if they could, what would they use to play with it?

I assume you don't need all the data if you just want it for educational purposes.
Nobody will use it in high school, where people have problems with calculus. It might be helpful in college and university.

Yeah, certainly, building the largest particle collider in the world is way easier than copying 300 TB of data. And it will be way more fun, too!

    • You manage it on a thumb drive, DUH! Make sure it's USB 3.0! Load the data into Excel, and you can make pretty graphs.

Congratulations on not following TFLinks. They did open-source the tools and provide instructions.
You also don't need to download the entire 300 TB; the data is divided into batches.

Available on the CERN Open Data Portal - which is built in collaboration with members of CERN's IT Department and Scientific Information Service - the collision data are released into the public domain under the CC0 waiver and come in two types: The so-called 'primary datasets' are in the same format used by the CMS Collaboration to perform research. The 'derived datasets' on the other hand require a lot less computing power and can be readily analysed by university or high-school students, and CMS has provided a limited number of datasets in this format.

      Notably, CMS is also providing the simulated data generated with the same software version that should be used to analyse the primary datasets. Simulations play a crucial role in particle-physics research and CMS is also making available the protocols for generating the simulations that are provided. The data release is accompanied by analysis tools and code examples tailored to the datasets. A virtual-machine image based on CernVM, which comes preloaded with the software environment needed to analyse the CMS data, can also be downloaded from the portal.
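[Editor's note: to give a sense of how approachable the derived datasets are meant to be, here is a minimal sketch in Python, assuming a derived dataset exported as a CSV of dimuon events. The filename and column names (pt1, eta1, phi1, ...) are hypothetical; check the actual dataset documentation on opendata.cern.ch before running.]

    # Minimal sketch: histogram the dimuon invariant mass from a "derived"
    # CSV dataset. Column names and filename are hypothetical -- consult
    # the dataset's documentation on opendata.cern.ch for the real schema.
    import numpy as np
    import pandas as pd
    import matplotlib.pyplot as plt

    events = pd.read_csv("dimuon_events.csv")  # hypothetical filename

    # Invariant mass of two muons in the standard massless approximation:
    # M^2 = 2 * pT1 * pT2 * (cosh(eta1 - eta2) - cos(phi1 - phi2))
    mass = np.sqrt(
        2.0 * events.pt1 * events.pt2
        * (np.cosh(events.eta1 - events.eta2)
           - np.cos(events.phi1 - events.phi2))
    )

    plt.hist(mass, bins=200, range=(0, 120))
    plt.xlabel("Dimuon invariant mass [GeV]")
    plt.ylabel("Events")
    plt.yscale("log")  # resonances stand out better on a log scale
    plt.show()

[On real CMS dimuon data, resonances such as the J/psi, Upsilon and Z should appear as peaks in this histogram.]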

  • 300 TB?
    How many Libraries of Congress is that?

Maybe now, we can unlock the mysteries of Steins;Gate! Mwahaha!

  • by SkyratesPlayer ( 1320895 ) on Saturday April 23, 2016 @10:01AM (#51971795)
    Before this, the largest collection of collision data was the Russian dash-cam footage on YouTube
Just curious: how many floppy disks would it take to store 300 TB?

    • Let's assume 3.5" form factor. Using HD floppies, you need 20 million. The one on my desk is 3.25 mm thick, so they'd make a stack 65 km tall. In the spirit of Randall Munroe's What If, it would of course collapse and kill you long before it got that high.
      • by HiThere ( 15173 )

        Why not use the 8 inch hard sectored floppy disks? Of course there's nowhere you could either read or write them... but they were a bit thinner, and I think they stored all of 100KB.

        • Why not use the 8 inch hard sectored floppy disks? Of course there's nowhere you could either read or write them... but they were a bit thinner, and I think they stored all of 100KB.

          Because I am willing to bet there were more 3.5" floppies made than all other types of removable media put together. I have a DEC RX-02 somewhere, but haven't had to use it since 1998... ISTR they held about 500kB.
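[Editor's note: the arithmetic above checks out as a back-of-the-envelope script, assuming the nominal 1.44 MB HD capacity, decimal terabytes, and the 3.25 mm per-disk thickness quoted in the thread.]

    # Back-of-the-envelope: floppies needed for 300 TB and the resulting stack.
    DATA_BYTES = 300e12        # 300 TB, decimal units
    FLOPPY_BYTES = 1.44e6      # nominal HD 3.5" floppy capacity
    DISK_THICKNESS_MM = 3.25   # thickness quoted in the parent comment

    disks = DATA_BYTES / FLOPPY_BYTES
    stack_km = disks * DISK_THICKNESS_MM / 1e6  # mm -> km

    print(f"{disks:,.0f} disks, stack {stack_km:,.0f} km tall")
    # -> 208,333,333 disks, stack 677 km tall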

  • Sure are a lot of articles about the Large Hardon Collider lately.
  • by Anonymous Coward

    http://opendata.cern.ch/about/CMS
