The Real Reason why Spirit Only Sees Red 273
use_compress writes: "To produce a color photograph, the rover's panoramic camera takes three black-and-white images of a scene: once with a red filter, once with a green filter, and once with a blue filter. Each is then tinted with the color of its filter, and the three are combined into a color image.

In assembling the Spirit photographs, however, the scientists used an image taken with an infrared filter, not the red filter (NYTimes, free registration required). Some blue pigments, like the cobalt in the rover's color chip, also emit this longer-wavelength light, which is not visible to the human eye."
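The tint-and-combine step the article describes can be sketched in a few lines. This is an illustrative reconstruction with made-up pixel values, not rover software:

```python
import numpy as np

# Hypothetical filtered exposures: each is a 2-D grayscale array
# (values 0.0-1.0) taken through one of the PanCam filters.
h, w = 4, 4
red_exposure = np.full((h, w), 0.8)    # scene through the red (or IR) filter
green_exposure = np.full((h, w), 0.5)  # scene through the green filter
blue_exposure = np.full((h, w), 0.2)   # scene through the blue filter

# "Tinting" each frame with its filter color and summing is equivalent
# to assigning each grayscale frame to one channel of an RGB image.
color = np.stack([red_exposure, green_exposure, blue_exposure], axis=-1)

print(color.shape)  # (4, 4, 3): one channel per filtered exposure
```

Substituting the infrared exposure for `red_exposure` is exactly the swap the article says produced the overly red panoramas.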
obligatory registration free link... (Score:5, Informative)
Re:Why b/w & filter? (Score:1, Informative)
They can get much higher resolution in grayscale, so they get better pictures but slower, with cheaper and lighter kit.
I would guess the bandwidth is the same, or almost the same.
Re:Why b/w & filter? (Score:4, Informative)
Also, having external color filter masks that can be rotated into place means we are no longer limited to just the visible spectrum: we can see anything the raw silicon sensor allows, from infrared to ultraviolet, and assign "pseudo color" as we see fit.
Short version (Score:5, Informative)
On the panoramic picture: We goofed. It should not have been that red.
The other photographs were taken with the infrared filter instead of the visible-red filter. Iron dominates the visible red spectrum, so to allow better analysis of the compounds found, infrared light is used instead.
<joke>No conspiracy here. Move on.</joke>
Re:Why b/w & filter? (Score:4, Informative)
I can still remember using a NewTek DigiView digitizer with a b/w video camera and filters, so I guess Alzheimer's hasn't gotten to me yet. :-)
Re:Why b/w & filter? (Score:3, Informative)
As a side effect, you can colourise and recombine the images to approximate a colour picture as you might see if you were standing there yourself. Some NASA PR guys stitch these together while the science team goes to work on the black-and-white stuff.
Re:Was in New Scientist a week or so ago (Score:4, Informative)
They have at least 14 filters; carrying 14 separate cameras would be impossible.
Info here. [nasa.gov]
Re:Why don't they release the RGB too? (Score:4, Informative)
The engineers are focusing on the filters that return good science.
Re:Why b/w & filter? (Score:1, Informative)
This might be the same thing you're saying, but I would guess that it's because in a color camera, you have three different color sensors per pixel that are arranged in a bundle like the pixels on your TV set. This causes a certain amount of chromatic distortion because each color really only sees one-third of the whole picture. However, if you use a monochrome camera and filters, each pixel gets completely recorded in each color and you can later stack the color planes on top of each other and blend them which gives you better color and resolution with a LOT less distortion.
Re:Why b/w & filter? (Score:5, Informative)
With scientific imaging, OTOH, you want the raw information coming off the CCD. They are interested in everything, not just what the human eye can see.
So, with lossless encoding, three greyscale images actually come out to about the same size as one color image. (Look at a color TIFF, for example.) The advantage of the B/W-and-filter approach is that you need only one capture device, which brings many design advantages on a spacecraft. Besides, you now have three captures of the same scene. You never know when one will pick something up that the others missed.
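The byte-count argument is easy to check: a color image stores three samples per pixel, so three grayscale frames of the same dimensions carry exactly the same amount of raw data. A quick sketch (array dimensions are arbitrary):

```python
import numpy as np

# Three 8-bit grayscale frames versus one 8-bit RGB frame of equal size.
h, w = 480, 640
gray_frames = [np.zeros((h, w), dtype=np.uint8) for _ in range(3)]
color_frame = np.zeros((h, w, 3), dtype=np.uint8)

gray_bytes = sum(f.nbytes for f in gray_frames)
print(gray_bytes == color_frame.nbytes)  # True: 3 x 307200 bytes either way
```

Lossless compression then operates on the same underlying samples in both cases, so compressed sizes end up comparable too.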
Re:Was in New Scientist a week or so ago (Score:5, Informative)
Re:Why don't they release the RGB too? (Score:5, Informative)
Can't remember links off the top of my head (search older
AFAIR things are a bit more complicated, though: the cameras have 7 different filters, which overlap quite a bit and don't peak at frequencies of light that could be used directly in an RGB image, so some fiddling is required.
And TBH, I think it's perfectly fine that NASA doesn't focus on producing "correct" images if it doesn't mean better science!
Re:Blue? Infrared? (Score:2, Informative)
It emits both blue and infrared; neither has any effect on the other. We just only see the blue, because the human eye does not detect infrared.
Partner Link (Score:4, Informative)
Re: Compression (Score:2, Informative)
Unfortunately, I can't find any references as to the loss{y|less}ness of the compression used.
Re:Blue? Infrared? (Score:4, Informative)
Re:Blue? Infrared? (Score:4, Informative)
One of the few things I remember from my chemistry degree was that many pigments are far brighter in the UV region since the "normal" colour corresponds to a forbidden transition - i.e. one that involves a change of spin as well as change of orbital.
I do hope that wasn't a rhetorical question...
Re:Why don't they release the RGB too? (Score:5, Informative)
Unfortunately, not all data is released. There is no radiometric data or pointing data for the pictures, spectrometer data, etc.
And NASA puts a hold on images they plan to use later for press conferences-- e.g. the individual PanCam pictures of the parachute and backshell weren't released. This goes directly against the promises they made pre-mission.
Re:Why b/w & filter? (Score:4, Informative)
Essentially, that's what all professional cameras do.
A broadcast television camera (which is really pretty low-resolution, unless it's a true HDTV camera) has three CCD sensors mounted to a prism block that splits the image into the three component colors for television (RGB). Three CCDs are necessary for television because the desired result is a color image without waiting to assemble a color composite from three black-and-whites. Broadcast television yields images that are pretty close to 640x480 (again, pretty low-res).
The MER images are stills. As such, there is time to put together a composite of the separate components taken with the filters. The data desired is high resolution and each of the composite images (irRGB) yields different information. Additionally, JPL is not lacking computer time for assembling the result of the component images. We're not talking live video feeds here.
I note that there has been some discussion of weight here; that is not a factor in this case. Each of the filters, together with the CCD and the precise movement motor, probably weighs about the same as a three-CCD system. But with a single CCD, any defects can be characterized and programmed around, so there are no trade-offs. The issues JPL/NASA are dealing with have more to do with the size of the data sets and the limited time in which the MERs can communicate with Earth.
Re:21st C (Score:3, Informative)
Re:Why b/w & filter? (Score:3, Informative)
Also, the human vision system performs white balancing on its own. If you've ever looked through a window at dusk in winter, you'll notice that the outside appears with a blue tint, while if you're outside, the rooms inside appear to have an orange/yellow tint. Your eyes are trying to make the average colour white.
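The eye's "average colour to white" behaviour is essentially the gray-world assumption used in software white balancing. A minimal sketch (the function name and sample values are made up for illustration):

```python
import numpy as np

def gray_world_balance(img):
    """Scale each channel so the image's average color becomes a neutral
    gray, a rough model of the white balancing the eye does automatically."""
    means = img.reshape(-1, 3).mean(axis=0)  # per-channel averages
    gain = means.mean() / means              # push each average toward gray
    return np.clip(img * gain, 0.0, 1.0)

# A scene with an overall blue cast (e.g. dusk light through a window):
cast = np.full((2, 2, 3), [0.2, 0.3, 0.6])
balanced = gray_world_balance(cast)
# After balancing, the three channel averages are (nearly) equal.
```

Gray-world is a crude heuristic (it fails on scenes dominated by one color), but it captures the adaptation effect described above.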
Re:Why don't they release the RGB too? (Score:3, Informative)
In most cases, the infrared filter is close enough to red that a composite still gives you a good image. They do occasionally take a picture with the red filter instead of the infrared, as the article states, but these aren't as useful for scientific purposes.
If the public's interest can be satisfied with a composite using the IR channel, and you get a lot more science done with it, doesn't it make sense to use it? Their mistake was in releasing color photographs without noting that the color might not be right.
Incidentally, all of the raw images are available on the NASA web site. Instructions for a do-it-yourself composite are available from the previous Slashdot article discussing the color of the images on Mars [slashdot.org].
Re:Why b/w & filter? (Score:5, Informative)
Almost all manufactured sensors are black-and-white; only Foveon's are 3-color, and they're expensive for the resolution, and the first-generation software had color-clipping problems (overexposed areas of images went abruptly to white). This has apparently been fixed.
A monochrome sensor with external filters is much more flexible than the single-duty Foveon, so I guess that's why they chose it. Also, NASA doesn't usually buy space-faring hardware off-the-shelf two weeks before launch, and this full-color sensor simply did not exist a couple of years ago.
Re:Why b/w & filter? (Score:5, Informative)
Re:Why don't they release the RGB too? (Score:3, Informative)
As to positioning data: nope, it's not, but unfortunately it is important for accurately producing anaglyphs/range maps/good stitches. And the radiometric data -is- important for near-color -- I could release a lot more near-color imagery if I had confidence the radiometric data was right. As it is now, I have to inspect each image by hand and compare it to the spectroscopy data I have on hand to make sure things are close to right. As to why those pieces of engineering data associated with the images aren't being distributed -- I don't know. Perhaps NASA assumes no one is interested.
Re:Why b/w & filter? (Score:3, Informative)
Re:Why b/w & filter? (Score:5, Informative)
Close, but not quite...
A broadcast-quality camera is usually capable of recording a substantially higher resolution image than is eventually broadcast. This allows much better editing later on, i.e. cropping and resizing of the recorded images without loss of detail in the final broadcast. Final broadcast (in the UK at least) is around 760x575 pixels (the standard is 625 lines, but several are taken by the vertical blanking pulse, the frame field markers, and teletext data), but the camera definitely records a much higher resolution than that.
For comparison, a standard Hi-8 domestic hand camera records around 540 picture lines (about 720x540), and the picture quality from this kind of camera is much lower than broadcast editing suites need to work effectively; just watch any "home video" programme (such as "You've Been Framed!") for proof!
Also, expensive professional broadcast cameras use dichroic mirrors, not prisms, to do colour separation; prismatic separation would lead to too much signal loss and colour bleed across the image. The first mirror directs the red image to the appropriate sensor and also allows enough light of all wavelengths to pass to the next mirror, where the green image is diverted to its sensor; again, light of all wavelengths passes on to the final sensor in the camera. Blue is never explicitly separated from the incoming image, but is instead inferred from the intensity data from the three individual sensors.
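Taking the parent's description at face value, inferring blue from the intensity data amounts to simple subtraction. This is only a sketch of that arithmetic with invented numbers, not actual camera circuitry:

```python
# Hypothetical sensor readings for one pixel (arbitrary linear units).
full_spectrum = 1.00   # total light reaching the final, unfiltered sensor path
red = 0.45             # portion diverted to the red sensor by the first mirror
green = 0.35           # portion diverted to the green sensor by the second

# Under the parent's description, blue is whatever intensity remains
# once the red and green contributions are accounted for.
blue = round(full_spectrum - red - green, 6)
print(blue)  # 0.2
```

In practice the mirrors' passbands overlap and the sensors have different sensitivities, so a real camera would apply calibrated matrix corrections rather than a bare subtraction.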
I can be very certain of both of these facts because my dad was a Video Electronics Engineer for the BBC for a number of years...
It's actually an old technique. (Score:3, Informative)
I don't know the date of the first use, but Sergei Mikhailovich Prokudin-Gorskii [loc.gov] photographed parts of Russia for the Tsar in 1909.
He used a three-photo technique [loc.gov] where the scene was recorded three times on a glass plate (in a row, not overlaid) with different filters. If you look carefully at the river, there are color distortions from the small waves.
Magic Lantern (Score:3, Informative)
When you take a look at what this old-tech can really do, it's quite astounding.
The Library of Congress has an exhibition of pre-WWI (that's World War I) *color* photos of Russia [loc.gov] shot using the exact same process. Since this was a while before any practical color photo printing process existed, the photographer built a "magic lantern" for "optical color projections."
Props to Bolo Boffin [blogspot.com] for the link.
From the horse's mouth (Score:4, Informative)
Prokudin-Gorskii (Score:3, Informative)
LOC's Explanation (Score:3, Informative)
This reminds me (Score:3, Informative)
Re:I've often wondered (Score:2, Informative)
But, in fact, there are mutated humans who have differences in some of the substances responsible for seeing. Some have altered pigments (this is the most common case; most often the 'red' and 'green' absorption maxima get closer, rendering the individual 'red-green blind' -- usually not much of a hindrance, but I had a friend who couldn't make out mushrooms clearly visible to me on a lawn because of it). There are also conditions where one type of cell is completely missing, which can be really bad for the people affected. Think of not seeing a red traffic light.
Colorblindness usually affects men (more than 9 of 10 cases are men, IIRC) because the red/green color seeing substances are encoded in the X Chromosome. If one is broken in a female, then she has another one as a backup, which men don't have.
I have never heard of "superhumans" who can see ultraviolet or anything like that, besides the constant rumours about "tetrachromats": women (because of the X chromosome) who have four types of "color cells" (all seeing within the normal "visible range", though). But there's not much evidence for this, so take it with a grain of salt.
Color blindness - example images and details [webexhibits.org]
Tetrachromats [utk.edu]
Re:Why not just give NEW pictures! (Score:3, Informative)
Re:Why b/w & filter? (Score:2, Informative)
http://tmo.jpl.nasa.gov/progress_report/42-1
Re:But what is this thing? (Score:3, Informative)
I have put up a crop of the original [ajs.com] which you can feel free to stare at. Yes, it does appear to be some sort of round object with two large protrusions. It could easily be a rock of volcanic origin, but my bet is on its being some piece of the lander itself.
Approximation has been going on for decades (Score:3, Informative)
Re:Why b/w & filter? (Score:5, Informative)
Foveon's sensors don't have really good color separation, and NASA wanted to have more than RGB anyway (they actually have something like 14 different filters). For science you don't want to limit yourself to just the visible spectrum. Hubble works similarly.
Debunking NASA Color Conspiracy (Score:3, Informative)
--
For news, status, updates, scientific info, images, video, and more, check out:
(AXCH) 2004 Mars Exploration Rovers - News, Status, Technical Info, History [axonchisel.net].
Re:Interesting. (Score:3, Informative)
First of all, the calibration strip: yeah, the MERs each have something similar as well.
You can't rely solely on them, though. If the light that filters down to the surface is tinted any color other than whitish-blue (like here on Earth), trying to match the calibration device to the one back here on Earth is going to produce the wrong coloration.
Secondly, it's widely known that the Viking lander images showing the Martian sky as blue were colored incorrectly.
Thirdly, the Martian atmosphere is mostly CO2, which doesn't scatter blue light all that much AFAIK. Throw in a surprising amount of reddish-orange dust that's almost always there, and the probability that the Martian atmosphere is usually reddish (or butterscotch) is pretty good. Now, it isn't always reddish, or so I've read. Hubble took a photo of Mars, IIRC, and near the edges the atmosphere looks bluish. This is probably because the light has much more atmosphere to go through at sunrise and sunset than it does during the day, and so what little oxygen is in the Martian atmosphere has a chance to scatter the blue light. There's even a Mars Pathfinder image to back this up right here [nasa.gov].
The image doesn't prove that the Martian atmosphere is blue, but it does show that there is a lot of dust in the atmosphere over there, and that under the right conditions it can be a little bit blue in some places.
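The "doesn't scatter blue light all that much" point rests on Rayleigh scattering, which scales as 1/wavelength^4. A quick back-of-the-envelope comparison (the wavelengths are typical values for blue and red light, not measurements):

```python
# Rayleigh scattering intensity goes as 1 / wavelength**4, so shorter
# (bluer) wavelengths scatter far more strongly than longer (redder) ones.
blue_nm = 450.0
red_nm = 650.0

ratio = (red_nm / blue_nm) ** 4
print(round(ratio, 2))  # 4.35: blue scatters roughly 4.35x more than red
```

On Mars the gas column is so thin that this effect is mostly swamped by suspended dust, which is why the sky tends toward butterscotch except near the sun at sunrise and sunset, where the longer path length lets the scattering show.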