Government Lab Uses Smartphones To Measure Gamma Ray Exposure

Posted by samzenpus
from the hulk-phone-is-strongest-there-is dept.
KentuckyFC writes "Back in 2008, Slashdot reported that researchers were developing ways of turning cellphones into radiation detectors. Since then, a few apps have even appeared that claim to do this, but convincing evidence that they work as advertised is hard to come by. Now government researchers at Idaho National Labs have created their own app that uses an ordinary smartphone as a gamma ray detector, put it through its paces in the lab, and published the results. The pixels in smartphone cameras can detect gamma rays in the same way they pick up visible light. So when the lens is covered, the image should reveal evidence of gamma ray exposure once other noise, such as that from heat and current leakage, has been removed. These guys have tested several types of Android smartphone with a variety of gamma ray sources at various doses. The researchers say the phones give a reasonable measure of radiation dose, can detect the direction of the source (by comparing the measurements from the front and back cameras), and can even estimate the energy of the gamma rays by measuring the length of the tracks that appear in the image. While the results do not match the quality of bespoke detectors, that may not matter, since in many circumstances cellphones are likely to be the only sensors available. That could be useful for emergency services, air travelers wanting to monitor their extra radiation dose on routes over the Arctic, and people who live in areas with higher-than-average background radiation."
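The covered-lens technique the summary describes boils down to dark-frame subtraction: take a baseline exposure, subtract it from a new exposure, and count pixels that light up anyway. A minimal sketch of that idea, assuming the function name, the synthetic frames, and the `threshold` calibration constant are all invented for illustration (the INL app's actual processing is not published here):

```python
import numpy as np

def estimate_gamma_hits(frame, dark_frame, threshold=25):
    """Count pixels brighter than the dark-frame baseline.

    frame, dark_frame: 2-D uint8 arrays from a covered camera sensor.
    threshold: ADU level above baseline treated as a gamma hit
    (a made-up calibration constant for this sketch).
    """
    # Subtract in a signed dtype so baseline noise can't wrap around.
    residual = frame.astype(np.int16) - dark_frame.astype(np.int16)
    return int((residual > threshold).sum())

# Synthetic demo: a flat noisy dark frame plus one bright "track".
rng = np.random.default_rng(0)
dark = rng.integers(0, 5, size=(480, 640)).astype(np.uint8)
exposed = dark.copy()
exposed[100, 100:110] = 200   # a simulated 10-pixel gamma track
print(estimate_gamma_hits(exposed, dark))  # → 10
```

Per the summary, the real app goes further: comparing hit counts between the front and back cameras gives a rough source direction, and track length gives a rough energy estimate.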
  • by hubie (108345) on Monday January 13, 2014 @02:08PM (#45942565)

    Digital cameras are largely insensitive to IR. Silicon, which is what all commercial camera sensors are made of, loses its sensitivity around 1000 nm, so photons with longer wavelengths generally pass through undetected (sensors are most sensitive around 600-ish nm, which is something like orange light). On the other hand, if you look at the spectrum of light coming from the Sun, you get the most photons around that same 600 nm wavelength (how's that for coincidence?), but you also still have a whole lot of photons flying around with wavelengths between there and 1000 nm. Camera makers add IR-blocking filters because the optics are optimized for visible wavelengths, so IR wavelengths will not come to a nice focus and add image blur. Some people pop their IR filters off to make their cameras more sensitive, which is technically true, but the pictures end up blurry unless something else (e.g., filters) restricts the wavelengths passing through the optics.

    You also have to be careful when you talk about the IR that these cameras can detect. What you're really talking about is very deep red, or the first parts of the NIR (near infrared) region. Most people, when they hear IR, think heat signatures, but that is not what you're dealing with here. The thermal IR is much longer wavelengths, and you'll never see that with a silicon-based camera. In fact, pure silicon is very useful as a window material for IR sensors because it is very transparent to photons at those wavelengths.
