High-Speed Video Free With High-Def Photography 75
bugzappy notes a development out of the University of Oxford, where scientists have developed a technology capable of capturing a high-resolution still image alongside very high-speed video. The researchers started out trying to capture images of biological processes, such as the behavior of heart tissue under various conditions. They combined off-the-shelf technologies found in standard cameras and digital movie projectors. What's new is that the picture and the video are captured at the same time on the same sensor. This is done by allowing the camera's pixels to act as if they were part of tens, or even hundreds, of individual cameras taking pictures in rapid succession during a single normal exposure. The trick is that the pattern of pixel exposures preserves the high-resolution content of the overall image, which can then either be used as-is to form a regular high-res picture or be decoded into a high-speed movie. The research is detailed in the journal Nature Methods (abstract only without subscription).
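The staggered-exposure idea in the summary can be sketched in a few lines of NumPy. This is a minimal illustrative simulation, not the paper's actual method: it assumes a 4x4 tiling in which each pixel within a block exposes during a different 1/16th of the overall exposure, and all the names and parameters below are made up for the sketch.

```python
import numpy as np

def simulate_capture(scene_frames, tile=4):
    """Simulate one multiplexed exposure.
    scene_frames: array (T, H, W) of the scene at T = tile*tile instants.
    Each pixel records the scene only during its assigned time slot."""
    T, H, W = scene_frames.shape
    assert T == tile * tile
    sensor = np.zeros((H, W))
    for t in range(T):
        # Pixels at offset (r, c) within each tile x tile block expose at time t.
        r, c = divmod(t, tile)
        mask = np.zeros((H, W), dtype=bool)
        mask[r::tile, c::tile] = True
        sensor[mask] = scene_frames[t][mask]
    return sensor

def decode_high_speed(sensor, tile=4):
    """Extract tile*tile low-resolution frames, one per exposure slot."""
    frames = []
    for t in range(tile * tile):
        r, c = divmod(t, tile)
        frames.append(sensor[r::tile, c::tile])  # shape (H/tile, W/tile)
    return frames

# For a static scene the multiplexed exposure is simply the full-resolution
# still; for a moving scene the decoded sub-frames form a high-speed,
# low-resolution movie -- both recovered from the same single exposure.
```

The point of the sketch is that the two outputs come from one readout: summing over the exposure pattern yields the high-res still, while slicing by time slot yields the fast movie.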
How long (Score:2, Funny)
I read the title as "High-Def Pornography"... (Score:4, Funny)
I think it's past my bedtime.
Representative sample (Score:4, Funny)
Re:Representative sample (Score:2, Funny)
As I read this, there are three comments. Two are about porn. Slashdot in a nutshell.
Actually, only one of those three was about porn. The other two (and this one) are just offtopic. So that makes Slashdot 25% horny, 25% pedantic, and 75%-100% offtopic.
Re:I've actually thought about this... (Score:4, Funny)
And what about the lens? If the sensors are omnidirectional and can simply keep reporting their state at a high frequency, the "lens" (its optical purpose) can be done in software. You just need a high density of sensors and the ability to process the information fast enough.
Obviously, the individual sensors can't be truly omnidirectional, but rather their visibility angle would depend on the geometry of the surface they're placed on -- which could be a hemisphere, or even an almost complete sphere. As you mentioned, the angle of light would still be relevant, but this would be done on an individual sensor basis -- rather than one lens orchestrating the entire image.
There, we solved it. Engineers, get to work!