Data Storage Earth Networking Science

How a Frozen Neutrino Observatory Grapples With Staggering Amounts of Data (vice.com) 49

citadrianne writes: Deep beneath the Antarctic ice sheet, sensors buried in a billion tons of ice (a cubic kilometer of frozen H2O) are searching for neutrinos. "We collect...one neutrino from the atmosphere every ~10 minutes that we sort of care about, and one neutrino per month that comes from an astrophysical source that we care about a very great deal," researcher Nathan Whitehorn said. "Each particle interaction takes about 4 microseconds, so we have to sift through data to find the 50 microseconds a year of data we actually care about." Computing facilities manager Gonzalo Merino added, "If the filtered data from the Pole amounts to ~36TB/year, the processed data amounts to near 100TB/year." Because IceCube can't see satellites in geosynchronous orbit from the pole, internet coverage lasts only six hours a day, Whitehorn explained. The raw data is stored on tape at the pole, and a 400-core cluster there makes a first pass at the data, cutting it down to around 100GB/day; a dedicated 4,000-CPU cluster then crunches the numbers. Their storage system has to handle typical loads of "1-5GB/sec of sustained transfer levels, with thousands of connections in parallel," Merino explained.
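
As a sanity check on those rates, here is a minimal back-of-envelope sketch in Python, using only the rounded figures quoted above (approximations from the quotes, not official IceCube specifications):

    # Back-of-envelope check of the rates quoted in the summary.
    MINUTES_PER_YEAR = 60 * 24 * 365         # 525,600
    atmospheric = MINUTES_PER_YEAR / 10      # ~52,560 events/yr (1 per ~10 min)
    astrophysical = 12                       # ~1 per month
    EVENT_US = 4                             # ~4 microseconds per interaction

    print(atmospheric * EVENT_US / 1e6)      # ~0.21 s/yr of atmospheric signal
    print(astrophysical * EVENT_US)          # ~48 us/yr of astrophysical signal

The astrophysical total lands right at Whitehorn's "50 microseconds a year" figure.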

Comments Filter:
  • by jfdavis668 ( 1414919 ) on Friday October 16, 2015 @02:05PM (#50745405)
    We had a professor from F&M who works on the project give a talk to our local astronomy club. The amount of work required to build that thing was amazing. They use the Earth itself as a filter: it blocks local interference and man-made particles, so what comes up through the planet is far more likely to be a genuine neutrino interaction. They can spot neutrinos from supernovas coming through the Earth.
  • by xxxJonBoyxxx ( 565205 ) on Friday October 16, 2015 @02:14PM (#50745479)

    Perhaps they could buy a station wagon, load it up with tapes and send it with the next dogsled. (I kid.)

    It's not like they are using real-time data from this thing - it's more like a traditional particle-smashing experiment where most of the analysis is done months and years after the data is collected.

    • by JoshuaZ ( 1134087 ) on Friday October 16, 2015 @02:30PM (#50745603) Homepage

      Yes and no. There is some advantage to getting close to real-time data: there's a Supernova Early Warning System http://snews.bnl.gov/ [bnl.gov]. This isn't a safety issue, but rather an astronomy issue.

      Detectors like IceCube can be used to actually detect the neutrinos from a supernova before the supernova's light reaches Earth. This isn't due to the erroneous claim from a few years ago that neutrinos travel faster than light, but rather because when a supernova occurs, the light takes hours to work its way out through all the mass of the star, while the neutrinos are barely slowed down at all. The neutrinos thus get a few hours' head start on the light, and since they travel so close to the speed of light, they keep almost all of that head start by the time they reach Earth. In the case of SN 1987A https://en.wikipedia.org/wiki/SN_1987A [wikipedia.org], a supernova in 1987 which was close enough that we could detect the flood of neutrinos, the neutrinos did, as predicted, arrive a few hours before the light. This means that if we detect a neutrino burst and can get its directional data (which IceCube can do approximately), we can point our telescopes at the supernova *before the light arrives at Earth*, see the very beginning of the event, and hopefully come away with a much better understanding.
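
      To see why the head start survives the trip, here is a rough sketch; the 10 MeV energy and 0.1 eV mass are illustrative assumptions, not measured values:

          # How much of its head start does a massive neutrino lose to light
          # over the ~168,000 light-year trip from SN 1987A?
          D_LY = 168_000                  # approximate distance, light-years
          SEC_PER_YEAR = 3.156e7
          E_eV, m_eV = 10e6, 0.1          # assumed neutrino energy and mass (eV)

          travel_s = D_LY * SEC_PER_YEAR               # light travel time, seconds
          lag_s = 0.5 * travel_s * (m_eV / E_eV) ** 2  # relativistic lag vs. light
          print(f"{lag_s * 1e3:.2f} ms")               # ~0.27 ms

      A sub-millisecond lag after 168,000 years in transit, so essentially the whole multi-hour head start is preserved.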

      In order to do this you have to do at least some of your processing as close to real time as you can. This is especially important because it isn't actually easy to figure out from the neutrino burst what direction the supernova is coming from, and IceCube is one of the few detectors which gets any good directional data at all, so if this happens we want to process the data rapidly enough to get a good idea of where to look.

      • by buchner.johannes ( 1139593 ) on Friday October 16, 2015 @04:11PM (#50746329) Homepage Journal

        In the case of SN 1987A https://en.wikipedia.org/wiki/SN_1987A [wikipedia.org], a supernova in 1987 which was close enough that we could detect the flood of neutrinos, the neutrinos did as predicted arrive a few hours before the light.

        That claim is disputed (as you can see on the page you link to), because the equipment did not reliably record the arrival times of the neutrino detections. What you say about supernova models predicting neutrinos escaping before light is true, but the observational proof is yet to come... making IceCube even more important.

      • by PPH ( 736903 )

        So the real-time processing at the pole would have to be done to the point where it can indicate (with some level of confidence) the presence of a candidate supernova. Otherwise the 18-hour latency (if you just missed the daily 6-hour window) would render the data useless. Given that level of processing, information sufficient to aim telescopes could easily be sent on an HF band via Morse code (celestial coordinates, time of event, number of neutrinos counted, etc.). The detailed information could follow later over the satellite link.
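
        Such an alert would be tiny. A minimal sketch of what the payload might look like; the format, field names, and values are entirely hypothetical, not any real IceCube alert protocol:

            # Hypothetical compact supernova alert for a low-bandwidth HF link.
            def encode_alert(ra_deg, dec_deg, utc, n_events):
                return (f"SNALERT RA={ra_deg:06.2f} DEC={dec_deg:+06.2f} "
                        f"T={utc} N={n_events}")

            print(encode_alert(83.63, 22.01, "2015-10-16T14:05Z", 12))
            # SNALERT RA=083.63 DEC=+22.01 T=2015-10-16T14:05Z N=12

        A few dozen characters: trivial even for Morse code.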

    • by starless ( 60879 )

      It's not like they are using real-time data from this thing - it's more like a traditional particle-smashing experiment where most of the analysis is done months and years after the data is collected.

      Well, in some cases the detection of a high-energy neutrino has triggered a search for a counterpart at other wavelengths (X-ray, optical, etc.).
      So near-real-time detection of a neutrino can be important for determining its astrophysical origin.
      http://www.astronomerstelegram... [astronomerstelegram.org]
      http://www.astronomerstelegram... [astronomerstelegram.org]

      And from the ANTARES neutrino telescope:
      http://www.astronomerstelegram... [astronomerstelegram.org]

    • Perhaps they could buy a station wagon, load it up with tapes and send it with the next dogsled. (I kid.)

      Replace the station wagon with a transport plane and that's not an inaccurate picture.

      It's not like they are using real-time data from this thing

      There is a need for real-time data. First, you want to know that your detector is working: finding out 6 months later, when you are doing the detailed analysis, that there is something seriously wrong with e.g. the trigger would be a very bad thing. Second, there are astronomical events which occur rapidly, like supernovae. If IceCube picks up a SN signal then you want to let the astronomers know quickly, not several months later.

    • Dogsleds aren't used in the Antarctic, just the Arctic. Likewise, no penguins in the Arctic. The Antarctic is pretty barren and inhospitable.

      But, yes, never underestimate the bandwidth of a snowmobile-towed sledge full of tapes; that works just fine here.

      • Dogsleds aren't used in the Antarctic, just the Arctic. Likewise, no penguins in the Arctic. The Antarctic is pretty barren and inhospitable.

        Dog sleds in the Antarctic seemed to work well for Roald Amundsen.

        • by dargaud ( 518470 )
          Yes, but they are now outlawed by the Antarctic Treaty (a pretty good read in itself, and not too long).
  • If there are 6 events every minute, and each lasts 4 microseconds, then that is 131,400 events to review per year. If you multiply all those microseconds and events you get 525,600 microseconds of data, or about 0.5 seconds' worth of neutrinos to review per year. What the heck is this guy so upset about? They must get really bored down there in the Antarctic.
    • by cdrudge ( 68377 )

      If there are 6 events every minute, and each last 4 microseconds, then that is 131,400 events to review per year

      You have all sorts of issues with your calculation. 131,400 events per year would be 1 every 4 minutes. 6 events per minute as you state would be 6 per minute * 60 minutes * 24 hours * 365 days equaling 3,153,600 events per year.

      However, that doesn't really matter, because 6 events don't happen a minute. They state in the summary and article that they detect 1 neutrino about every ten minutes, or 6 an hour.
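
      A quick sketch of the corrected arithmetic, using the summary's rounded figures:

          # ~1 atmospheric neutrino every 10 minutes, ~4 microseconds each.
          MIN_PER_YEAR = 60 * 24 * 365    # 525,600 minutes in a non-leap year
          events = MIN_PER_YEAR / 10      # ~52,560 events per year
          print(events, events * 4e-6)    # 52560.0 events, ~0.21 s of signal

      Still only a fraction of a second of interesting data per year.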

  • "We collect [...] one neutrino [...] every ~10 minutes," researcher Nathan Whitehorn said.

    How did he pronounce "~"?

  • by OzPeter ( 195038 ) on Friday October 16, 2015 @02:39PM (#50745677)

    A tonne is the metric unit for 1,000 kilograms.
    A ton (US) is a funny unit of measure for 2,000 lbs (907 kg).
    A ton (Imperial) is a funny unit of measure for 2,240 lbs (1,016 kg).
    Thus a tonne is about 1.1 tons (US), and 0.98 tons (Imperial).
    A cubic kilometer of water is 1 billion (1E9) tonnes.
    But water expands by about 9% when it freezes,
    so a cubic kilometer of ice would be about 1E9 tons (US).

    Thus the statement in TFS

    a billion tons of ice—a cubic kilometer of frozen H2O

    while numerically about correct, is a hell of a mess of mixed units.
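
    The numbers do check out, as a quick sketch shows (the ~0.917 t/m^3 ice density is an assumed round value):

        # A cubic kilometer of ice, expressed in US tons.
        ICE_T_PER_M3 = 0.917                       # approx. density of ice
        US_TON_KG = 907.18
        ice_tonnes = 1e9 * ICE_T_PER_M3            # ~9.17e8 tonnes per km^3
        print(ice_tonnes * 1000 / US_TON_KG / 1e9) # ~1.01 billion US tons

    So "a billion tons" is right if you read it as US tons.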

  • Why are they processing it in place if the processed data requires more bandwidth and they have a transmission bandwidth problem?

    What did I miss?

    David

    • by uigin ( 985341 )

      I read the article. The summary is out of sequence with respect to the article. They transmit ~36TB/yr back to Wisconsin, whereupon processing expands it roughly threefold. The bandwidth issue is with the 36TB.
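
      And the link budget for that 36TB roughly works out, as this sketch shows (it assumes the ~6-hour daily satellite window from the summary; actual link capacity isn't stated in the thread):

          # ~36 TB/yr uplinked during a ~6-hour daily satellite window.
          bytes_per_day = 36e12 / 365           # ~98.6 GB/day (~100 GB/day quoted)
          window_s = 6 * 3600                   # daily visibility window, seconds
          print(bytes_per_day / window_s / 1e6) # ~4.6 MB/s sustained while visible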

  • Not having read the article, and not knowing anything about how an event is detected... it rather sounds as if CPUs are not the best tool for the job. FPGAs should be able to run data acquisition and filtering in real time, doing most of the heavy lifting. A single large FPGA (like the Virtex range from Xilinx) can do thousands of multiply-accumulates in parallel. GPUs like the Tesla or similar may also be a better fit.
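
    For illustration, the core operation would be a streamed multiply-accumulate filter. A minimal software sketch of the idea (the kernel, threshold, and data are made up, not IceCube's actual trigger logic):

        import numpy as np

        # Toy trigger: correlate digitized sensor samples with a matched filter
        # and keep the indices where the response crosses a threshold.
        def trigger(samples, kernel, threshold):
            scores = np.convolve(samples, kernel, mode="same")  # multiply-accumulate
            return np.flatnonzero(scores > threshold)           # candidate hits

        hits = trigger(np.random.randn(1_000_000), np.ones(16), 12.0)

    An FPGA pipelines exactly this kind of convolution in hardware, one output per clock tick.
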
    • by dargaud ( 518470 )
      I don't know the specifics of this project, but I work with scientific data processing, where there are multiple software updates daily while the software is still in development (but already 'in production'). That is much harder to do with an FPGA.
  • Aren't room-temperature neutrinos good enough?
