Space Science

The Real Reason why Spirit Only Sees Red

use_compress writes "To produce a color photograph, the rover's panoramic camera takes three black-and-white images of a scene: once with a red filter, once with a green filter and once with a blue filter. Each is then tinted with the color of its filter, and the three are combined into a color image. In assembling the Spirit photographs, however, the scientists used an image taken with an infrared filter, not the red filter (NYTimes, Free Registration Required). Some blue pigments like the cobalt in the rover color chip also emit this longer-wavelength light, which is not visible to the human eye."
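For readers who want to try the same compositing step on the released raw frames, here is a minimal Python sketch. The filenames are made up, and it assumes three registered grayscale frames of equal size:

```python
# Minimal sketch of the compositing step described above.
# Assumes three registered grayscale exposures of equal size;
# the filenames are hypothetical.
import numpy as np
from PIL import Image

def composite(red_path, green_path, blue_path):
    # Load each filtered exposure as a plain grayscale array.
    r, g, b = (np.asarray(Image.open(p).convert("L"))
               for p in (red_path, green_path, blue_path))
    # Tinting each frame with its filter color and summing them is
    # equivalent to stacking the frames as the R, G and B planes.
    return Image.fromarray(np.dstack([r, g, b]), "RGB")

composite("pancam_red.png", "pancam_green.png",
          "pancam_blue.png").save("pancam_color.png")
```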
  • Re:Why b/w & filter? (Score:1, Informative)

    by EyeSavant ( 725627 ) on Tuesday February 10, 2004 @09:16AM (#8236419)
    I would guess that it is to do with the cost and weight of the high resolution camera.

    They can get much higher resolution in grayscale, so they get better pictures but slower, with cheaper and lighter kit.

    I would guess the bandwidth is the same, or almost the same.
  • Re:Why b/w & filter? (Score:4, Informative)

    by anubi ( 640541 ) on Tuesday February 10, 2004 @09:16AM (#8236422) Journal
    I understand it's because it's a heckuva lot easier to build high resolution cameras as monochrome, as you can place the pixels immediately adjacent to each other and not concern yourself with placement of color filter masks.

    Also, having external color filter masks which can be rotated into place means we are no longer limited to just the visible spectrum we see; we can see anything the raw silicon sensor allows, from infrared to ultraviolet, and assign "pseudo color" as we see fit.

  • Short version (Score:5, Informative)

    by the_crowbar ( 149535 ) on Tuesday February 10, 2004 @09:18AM (#8236438)

    On the panoramic picture: We goofed. It should not have been that red.

    The other photographs were taken with the infrared filter instead of the visible red filter. Iron dominates the visible red spectrum, so to allow better analysis of the compounds found, infrared light is used instead.

    <joke>No conspiracy here. Move on.</joke>

    the_crowbar
  • Re:Why b/w & filter? (Score:4, Informative)

    by richie2000 ( 159732 ) <rickard.olsson@gmail.com> on Tuesday February 10, 2004 @09:21AM (#8236466) Homepage Journal
    My guess is that it's easier to get more resolution out of the camera this way. You can use the full resolution for every colour instead of having 4 sensors (RGB + IR) on-chip per pixel. More on the MER cameras here [space.com]

    I can still remember using a NewTek DigiView digitizer with a B/W video camera and filters, so I guess the Alzheimer's hasn't gotten to me yet. :-)

  • Re:Why b/w & filter? (Score:3, Informative)

    by MartyC ( 85307 ) on Tuesday February 10, 2004 @09:22AM (#8236473) Homepage
    The imaging system used is a monochromatic camera, because they are simpler to operate and calibrate. The science teams aren't particularly interested in colour photography; the filters are there to narrow down the response range of the detector and provide useful information on the surface properties of the things they image, as different minerals reflect/absorb/scatter light differently. By using filters of known transmission characteristics you can infer things about the soil and rocks around you (the sketch below illustrates the idea). A colour CCD like the one in your average digital camera wouldn't be able to do this.
    As a side effect you can colourise and recombine the images to approximate a colour picture as you might see if you were standing there yourself. Some NASA PR guys stitch these together while the science team goes to work on the black and white stuff.
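    A rough Python illustration of the kind of inference described above, using simple band-ratio arithmetic. The filter choice and the threshold are illustrative assumptions, not actual Pancam values:

    ```python
    # Illustrative band-ratio sketch: compare reflectance through two
    # filters of known transmission. The filters and the threshold are
    # made-up examples, not actual Pancam parameters.
    import numpy as np

    def band_ratio(img_a, img_b, eps=1e-6):
        """Per-pixel ratio of two calibrated filter images."""
        return img_a.astype(float) / (img_b.astype(float) + eps)

    # Stand-ins for calibrated frames taken through two filters.
    near_ir = np.random.rand(64, 64)
    red = np.random.rand(64, 64)

    # Pixels where the ratio is unusually high might flag a mineral
    # with a distinctive spectral slope (hypothetical threshold).
    mask = band_ratio(near_ir, red) > 1.2
    print(f"{mask.mean():.0%} of pixels exceed the example threshold")
    ```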
  • by SpinyManiac ( 542071 ) on Tuesday February 10, 2004 @09:22AM (#8236477)
    Not just that, using a B/W camera allows them to use any filter they like.

    They have at least 14 filters; carrying 14 cameras would be impossible.

    Info here. [nasa.gov]
  • by Anonymous Coward on Tuesday February 10, 2004 @09:24AM (#8236491)
    They release all the individual raw pictures on the mars rovers website. You are free to composite them yourself.

    The engineers are focusing on the filters that return good science.
  • Re:Why b/w & filter? (Score:1, Informative)

    by Asprin ( 545477 ) <gsarnoldNO@SPAMyahoo.com> on Tuesday February 10, 2004 @09:25AM (#8236495) Homepage Journal

    This might be the same thing you're saying, but I would guess that it's because in a color camera you have three different color sensors per pixel, arranged in a bundle like the pixels on your TV set. This causes a certain amount of chromatic distortion, because each color really only sees one-third of the whole picture. However, if you use a monochrome camera and filters, each pixel gets completely recorded in each color, and you can later stack the color planes on top of each other and blend them, which gives you better color and resolution with a LOT less distortion.

    ...but that's just a guess.
  • Re:Why b/w & filter? (Score:5, Informative)

    by EvilTwinSkippy ( 112490 ) <yoda AT etoyoc DOT com> on Tuesday February 10, 2004 @09:26AM (#8236500) Homepage Journal
    They are sending the raw picture uncompressed. Well, they might use a run-length encoding, but the result is a lossless image. JPEG is so much smaller because it really cuts corners, exploiting the fact that our eyes are more sensitive to contrast than to the magnitude differences between colors.

    With scientific imaging, OTOH, you want the raw information coming off the CCD. They are interested in everything, not just what the human eye can see.

    So, with lossless encoding, 3 greyscale images actually come out to about the same size as a color image. (Look at a color TIFF, for example.) The advantage of the B/W-and-filter approach is that you need only one capture device, which on a spacecraft has many design advantages. Besides, you now have 3 exposures of the same scene; you never know when one will pick something up that the others missed.
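    A toy run-length encoder, just to make the "lossless" point concrete. Purely an illustration, not the scheme the rovers actually use:

    ```python
    # Toy run-length encoding: lossless, and only a win when the data
    # has long runs. Illustrative only, not the rovers' actual codec.
    def rle_encode(data: bytes):
        runs, i = [], 0
        while i < len(data):
            j = i
            while j < len(data) and data[j] == data[i]:
                j += 1
            runs.append((j - i, data[i]))  # (run length, byte value)
            i = j
        return runs

    def rle_decode(runs):
        return b"".join(bytes([value]) * length for length, value in runs)

    sample = b"\x00" * 100 + b"\xff" * 3
    assert rle_decode(rle_encode(sample)) == sample  # round trip is exact
    print(len(rle_encode(sample)), "runs for", len(sample), "bytes")
    ```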

  • by mcbevin ( 450303 ) on Tuesday February 10, 2004 @09:26AM (#8236503) Homepage
    If you read the whole article, you'd see that they actually used both the infrared AND the red filter for the pictures. So they had their infrared for their science as well as the red for the photos to show the public. However, they mucked up in producing the photos for the public, using the infrared instead of the red. Nothing to do with science vs. public interest; rather, a simple mistake.
  • by Seahawk ( 70898 ) <tts@nOsPAm.image.dk> on Tuesday February 10, 2004 @09:27AM (#8236510)
    ALL data IS actually released.

    Can't remember links off the top of my head (search older /. stories), but several people have taken the raw data and composed their own versions of the colour photos.

    AFAIR things are a bit more complicated, though - the cameras have 7 different filters, which have quite a bit of overlap and don't peak at frequencies of light that could be used directly in an RGB image - so some fiddling is required.

    And TBH - I think it's perfectly fine that NASA doesn't focus on producing "correct" images if that means better science! :)
  • Re:Blue? Infrared? (Score:2, Informative)

    by LordK2002 ( 672528 ) on Tuesday February 10, 2004 @09:33AM (#8236546)
    "Some blue pigments like the cobalt in the rover color chip also emit this longer-wavelength light, which is not visible to the human eye."

    If it's a *blue* pigment, why does it emit a *longer* (i.e. infrared) wavelength?

    Clarification of the original statement: "some materials, such as cobalt, which reflect light that appears blue to the human eye, also reflect light in the infra-red range".

    It reflects both blue and infra-red; neither has any effect on the other. We see only the blue, because the human eye does not detect infra-red.

    K

  • Re: Compression (Score:2, Informative)

    by the real darkskye ( 723822 ) on Tuesday February 10, 2004 @09:36AM (#8236561) Homepage
    To blockquoth the source [cornell.edu]
    When each twin-lens CCD (charge-coupled device) camera takes pictures, the electronic images will be sent to the rover's onboard computer for a number of onboard image processing steps, including compression, before the data are sent to Earth


    Unfortunately I can't find any references as to the loss{y|less}ness of the compression used
  • Re:Blue? Infrared? (Score:4, Informative)

    by dwm ( 151474 ) on Tuesday February 10, 2004 @09:39AM (#8236581)
    A material can emit light at various wavelengths, including wavelengths quite different from those it reflects (reflection being what you most commonly see in the visible range). It's quite possible for something to reflect blue light and emit light at wavelengths longer than the visible range.
  • Re:Blue? Infrared? (Score:4, Informative)

    by Bill_Mische ( 253534 ) on Tuesday February 10, 2004 @09:39AM (#8236582)
    I'd imagine because the blue colour corresponds to an electron transition in the d-orbital whereas the IR corresponds to a different transition or more likely to a change in mode of molecular vibration.

    One of the few things I remember from my chemistry degree is that many pigments are far brighter in the UV region, since the "normal" colour corresponds to a forbidden transition - i.e. one that involves a change of spin as well as a change of orbital.

    I do hope that wasn't a rhetorical question...
  • by mlyle ( 148697 ) on Tuesday February 10, 2004 @09:45AM (#8236617)
    My site [lyle.org] was one of the past ones featured on Slashdot.

    Unfortunately, not all the data is released. There is no radiometric or pointing data for the pictures, no spectrometer data, etc.

    And NASA puts a hold on images they plan to use later for press conferences-- e.g. the individual PanCam pictures of the parachute and backshell weren't released. This goes directly against the promises they made pre-mission.
  • Re:Why b/w & filter? (Score:4, Informative)

    by mhollis ( 727905 ) on Tuesday February 10, 2004 @09:46AM (#8236618) Journal

    Essentially, that's what all professional cameras do.

    A broadcast television camera (which is really pretty low-resolution, unless it's a true HDTV camera) has three CCD sensors mounted to a prism block that splits the image into the three component colors for television (RGB). The use of three CCDs for television is necessitated by the fact that the desired result is a color image without waiting to assemble a color composite from three black-and-whites. Broadcast television results in images that are pretty close to 640x480 (again, pretty low-res).

    The MER images are stills. As such, there is time to put together a composite of the separate components taken with the filters. The data desired is high resolution and each of the composite images (irRGB) yields different information. Additionally, JPL is not lacking computer time for assembling the result of the component images. We're not talking live video feeds here.

    I note that there has been some discussion of weight here. That is not a factor in this case. Each of the filters, together with the CCD and the precise movement motor, probably weighs about the same as a three-CCD system, but in this case it is one CCD, so any defects can be known and programmed around, and there are no trade-offs. The issues JPL/NASA are dealing with have more to do with the size of the data sets and the available time in which the MERs can communicate with Earth.

  • Re:21st C (Score:3, Informative)

    by RMH101 ( 636144 ) on Tuesday February 10, 2004 @09:48AM (#8236632)
    They probably wanted better than 5MP resolution - you can get higher res with a high-quality scientific B&W camera. If you take 3 still photos through RGB filters, it's functionally identical to a colour camera - i.e. it IS a colour camera - and there would be nothing gained by sending up a "colour" camera that took a single shot and ended up with a poorer-quality (but, by your definition, a real colour picture) result.
  • Re:Why b/w & filter? (Score:3, Informative)

    by SmackCrackandPot ( 641205 ) on Tuesday February 10, 2004 @09:54AM (#8236671)
    We're working with an industrial colour camera sensor - basically your typical digital CCD array with the rest of the camera removed (no auto-focus, white balance, flash, etc.). Pixels are arranged in groups of 2x2 (Red, Green, Green, Blue). In bright scenes, the signal can bleed between the individual colour cells, which is extremely tricky to compensate for. However, if you take individual frames at each light wavelength you are interested in, using a monochrome camera with colour filters of your choice, you not only get higher resolution, but you also know exactly the sensitivity of the CCD at that frequency.

    Also, the human vision system performs white-balancing on its own. If you've ever looked through a window at dusk in winter, you'll notice that the outside appears to have a blue tint, while if you're outside, the rooms inside appear to have an orange/yellow tint. Your eyes are trying to bring the average colour to white.
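    The "gray world" heuristic is a rough software analogue of that adaptation; a minimal sketch:

    ```python
    # "Gray world" white balance: scale each channel so the scene's
    # average color comes out neutral, a crude analogue of the
    # adaptation described above.
    import numpy as np

    def gray_world(rgb):
        img = rgb.astype(np.float64)
        channel_means = img.reshape(-1, 3).mean(axis=0)
        gain = channel_means.mean() / channel_means  # per-channel gain
        return np.clip(img * gain, 0, 255).astype(np.uint8)

    # A uniformly orange-tinted frame comes back neutral gray.
    tinted = np.full((4, 4, 3), (200, 120, 60), dtype=np.uint8)
    print(gray_world(tinted)[0, 0])  # roughly equal R, G, B
    ```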
  • by Fastolfe ( 1470 ) on Tuesday February 10, 2004 @10:03AM (#8236750)
    They don't necessarily have images from all filters. In many pictures it's more valuable from a scientific point of view to use the infrared filter instead of the red filter. They may only command the rover to take those three pictures.

    In most cases, the infrared filter is close enough to red that a composite still gives you a good image. They do occasionally take a picture with the red filter instead of the infrared, as the article states, but these aren't as useful for scientific purposes.

    If the public's interest can be satisfied with a composite using the IR channel, and you get a lot more science done with it, doesn't it make sense to use it? Their mistake was in releasing color photographs without noting that the color might not be right.

    Incidentally, all of the raw images are available on the NASA web site. Instructions for a do-it-yourself composite are available from the previous Slashdot article discussing the color of the images on Mars [slashdot.org].
  • Re:Why b/w & filter? (Score:5, Informative)

    by herko_cl ( 533936 ) on Tuesday February 10, 2004 @10:10AM (#8236821)
    Sorry to say this, but the parent is NOT "Informative". The only sensors with 3 photosites per pixel are Foveon's [foveon.com]. The vast majority of digital cameras have ONE photosite per pixel, with a Bayer mask (RGB filter) layered on top. The color of each pixel in the final image is then interpolated from the measured intensities of the adjacent photosites. Yes, this means that digital cameras have higher Luma resolution than Chroma. No, it does not matter much, because the eye is much more attracted to Luminance detail.
    Almost all manufactured sensors are black and white; only Foveon's are 3-color, and they're expensive for the resolution, and the first-generation software had color clipping problems (overexposed areas of images went abruptly to white). This has apparently been fixed.
    A monochrome sensor with external filters is much more flexible than the single-duty Foveon, so I guess that's why they chose it. Also, NASA doesn't usually buy space-faring hardware off-the-shelf two weeks before launch, and this full-color sensor simply did not exist a couple of years ago.
  • Re:Why b/w & filter? (Score:5, Informative)

    by mdielmann ( 514750 ) on Tuesday February 10, 2004 @10:19AM (#8236876) Homepage Journal
    Note also that color CCDs aren't actually color, but a B/W CCD (there really isn't such a thing as a color CCD) with 3 different color filters applied pixel-by-pixel. This has the drawbacks of interpolation and of collecting only 1/3 of the raw data. The only way to get true color is to use either 3 (or more, depending on what you want to see) CCDs with an image splitter so they all see the same thing, or a series of filters, with each color taken in turn. Guess which one is smaller.
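    A minimal sketch of the interpolation step the last few comments describe, assuming an RGGB layout. Real demosaicing algorithms are considerably more sophisticated, and np.roll's wraparound at the image edges is ignored here:

    ```python
    # Bayer demosaicing sketch: every photosite records one color;
    # the two missing values are averaged from known neighbors.
    import numpy as np

    def fill_missing(plane, known):
        """Average each missing site from its known 8-neighborhood."""
        acc = np.zeros_like(plane)
        cnt = np.zeros_like(plane)
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                acc += np.roll(plane * known, (dy, dx), axis=(0, 1))
                cnt += np.roll(known, (dy, dx), axis=(0, 1))
        out = plane.copy()
        gaps = known == 0
        out[gaps] = acc[gaps] / np.maximum(cnt[gaps], 1)
        return out

    def demosaic_rggb(mosaic):
        h, w = mosaic.shape
        rgb = np.zeros((h, w, 3))
        known = np.zeros((h, w, 3))
        # Scatter each photosite's value into its own color plane.
        rgb[0::2, 0::2, 0] = mosaic[0::2, 0::2]; known[0::2, 0::2, 0] = 1  # R
        rgb[0::2, 1::2, 1] = mosaic[0::2, 1::2]; known[0::2, 1::2, 1] = 1  # G
        rgb[1::2, 0::2, 1] = mosaic[1::2, 0::2]; known[1::2, 0::2, 1] = 1  # G
        rgb[1::2, 1::2, 2] = mosaic[1::2, 1::2]; known[1::2, 1::2, 2] = 1  # B
        for c in range(3):
            rgb[:, :, c] = fill_missing(rgb[:, :, c], known[:, :, c])
        return rgb

    mosaic = np.random.rand(8, 8)  # stand-in sensor readout
    print(demosaic_rggb(mosaic).shape)  # (8, 8, 3)
    ```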
  • by mlyle ( 148697 ) on Tuesday February 10, 2004 @10:29AM (#8236970)
    Hehe. Thank you.

    As to positioning data: nope, it's not, but it is unfortunately important for accurately producing anaglyphs/range maps/good stitches. And the radiometric data -is- important for nearcolor -- I could release a lot more nearcolor imagery if I had confidence the radiometric data was right. As it is now, I have to inspect each image by hand and compare it to the spectroscopy data I have on hand to make sure things are close to right. As to why those pieces of engineering data associated with the images aren't being distributed -- I don't know. Perhaps NASA assumes no one is interested.

  • Re:Why b/w & filter? (Score:3, Informative)

    by gujju ( 626201 ) on Tuesday February 10, 2004 @10:38AM (#8237039)
    Just for the parent's information: this technique only works when the camera and the subject being photographed are not in motion. If they are in motion then, in the time it takes to switch filters, the scene will have changed and you wouldn't be able to composite the frames together accurately.
    Gujju
  • Re:Why b/w & filter? (Score:5, Informative)

    by vofka ( 572268 ) on Tuesday February 10, 2004 @10:47AM (#8237089) Journal
    A broadcast television camera (which is really pretty low-resolution, unless it's a true HDTV camera) has three CCD sensors mounted to a prism block that splits the image into the three component colors for television (RGB). The use of three CCDs for television is necessitated by the fact that the desired result is a color image without waiting to assemble a color composite from three black-and-whites. Broadcast television results in images that are pretty close to 640x480 (again, pretty low-res).

    Close, but not quite...

    A broadcast-quality camera is usually capable of recording at a substantially higher resolution than is eventually broadcast. This allows for much better editing later on - i.e. cropping and resizing of the recorded images without loss of detail in the final broadcast. The final broadcast (in the UK at least) is around 760x575 pixels (the standard is 625 lines, but several are taken by the vertical blanking pulse, the frame/field markers and teletext data) - but the camera definitely records at a much higher resolution than that.

    For comparison, a standard Hi-8 domestic hand camera records around 540 picture lines (about 720x540), and the picture quality from this kind of camera is much lower than what broadcast editing suites need to work effectively - just watch any "home video" programme (such as "You've Been Framed!") for proof!

    Also, expensive professional broadcast cameras use dichroic mirrors, not simple prisms, to do colour separation. Prismatic separation would lead to too much signal loss and colour bleed across the image. The first mirror directs the red image to the appropriate sensor and allows enough light of all wavelengths to pass to the next mirror, where the green image is diverted to its sensor, and again light of all wavelengths passes on to the final sensor in the camera. Blue is never explicitly separated from the incoming image, but is instead inferred from the intensity data from the three individual sensors.

    I can be very certain of both of these facts because my dad was a Video Electronics Engineer for the BBC for a number of years...
  • by Eevee ( 535658 ) on Tuesday February 10, 2004 @10:59AM (#8237262)

    I don't know the date of the first use, but Sergei Mikhailovich Prokudin-Gorskii [loc.gov] photographed parts of Russia for the Tsar in 1909.

    He used a three-photo technique [loc.gov] where the scene was recorded three times on a glass plate (in a row, not overlaid) with different filters. If you look carefully at the river, there are color distortions from the small waves.
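    Aligning the three plate exposures before stacking them is a classic exercise; here is a minimal sketch using an exhaustive shift search (the moving water in the example above is exactly what no static shift can fix):

    ```python
    # Minimal sketch of aligning three scanned plate exposures by
    # exhaustive search over small x/y shifts, then stacking as RGB.
    import numpy as np

    def best_shift(ref, moving, radius=15):
        """Find the (dy, dx) displacement minimizing squared error."""
        best, best_err = (0, 0), np.inf
        for dy in range(-radius, radius + 1):
            for dx in range(-radius, radius + 1):
                shifted = np.roll(moving, (dy, dx), axis=(0, 1))
                err = np.sum((shifted - ref) ** 2)
                if err < best_err:
                    best, best_err = (dy, dx), err
        return best

    def align_plates(blue, green, red):
        g = np.roll(green, best_shift(blue, green), axis=(0, 1))
        r = np.roll(red, best_shift(blue, red), axis=(0, 1))
        return np.dstack([r, g, blue])  # stack as an RGB image
    ```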

  • Magic Lantern (Score:3, Informative)

    by KaiBeezy ( 618237 ) <kaibeezy.gmail@com> on Tuesday February 10, 2004 @11:06AM (#8237357)

    When you take a look at what this old-tech can really do, it's quite astounding.

    The Library of Congress has an exhibition of pre-WWI (that's World War I) *color* photos of Russia [loc.gov] shot using the exact same process. Since this was a while before any practical color photo printing process existed, the photographer built a "magic lantern" for "optical color projections."

    Props to Bolo Boffin [blogspot.com] for the link.

  • by RetroGeek ( 206522 ) on Tuesday February 10, 2004 @11:12AM (#8237421) Homepage
    Read this [nasa.gov] and this [nasa.gov]. These are from the NASA Rover site and they explain it all.
  • Prokudin-Gorskii (Score:3, Informative)

    by rkenski ( 212876 ) on Tuesday February 10, 2004 @11:35AM (#8237712)
    Prokudin-Gorskii [loc.gov] travelled Russia taking color photos about a hundred years ago, at a time when there was no color film. He would take 3 pictures, one after another, each with a different filter. He then projected the three together to get a color picture. Similar to Spirit, but in a very, very old-fashioned way.
  • LOC's Explanation (Score:3, Informative)

    by rsmith-mac ( 639075 ) on Tuesday February 10, 2004 @11:45AM (#8237844)
    The Library of Congress has an interesting exhibit [loc.gov] up devoted to an early 20th century Russian photographer who used this exact technique. The site includes a very detailed description of how this filter system works, along with dozens of color pictures from the photographer's travels. It's definitely worth taking a look at, if not for the description, then for some very cool pictures.
  • This reminds me (Score:3, Informative)

    by IRNI ( 5906 ) <irni@NospaM.irni.net> on Tuesday February 10, 2004 @11:46AM (#8237858) Homepage
    of some turn-of-the-century Russian color photos [loc.gov] made using the same or a similar technique. I like that people thought of doing stuff like this back then. It is amazing to see in color what things looked like. I think there are some from early-20th-century New York online somewhere as well.
  • by dirt_puppy ( 740185 ) on Tuesday February 10, 2004 @12:56PM (#8238925)
    This 'rule' is pretty much fixed, as the photochemical reactions in the eye responsible for it have fixed wavelength sensitivities. But as others have pointed out, the structures in the eye and the lens filter a bit of the light, too.
    But, in fact, there are humans with mutations in some of the substances responsible for seeing. Some have altered pigments (this is the most common case; most commonly the 'red' and 'green' absorption maxima get closer together, rendering the individual 'red-green blind' - usually not much of a hindrance, but I had a friend who couldn't make out mushrooms on a lawn that were clearly visible to me because of it). There are also conditions where one type of cell is completely missing, which can be really bad for the people affected. Think of not seeing a red traffic light.
    Colorblindness usually affects men (more than 9 in 10 cases are men, IIRC) because the red/green pigments are encoded on the X chromosome. If one is broken in a female, she has another as a backup, which men don't have.
    I have never heard of "superhumans" who can see ultraviolet or anything like that, though, besides the constant rumours about "tetrachromats" - women (because of the double X chromosome) who have four types of "color cells" (all seeing within the normal "visible range", though). But there's not much evidence for this, so take it with a grain of salt.

    Color blindness - example images and details [webexhibits.org]
    Tetrachromats [utk.edu]

  • by joshv ( 13017 ) on Tuesday February 10, 2004 @12:59PM (#8238964)
    This is way off, you've somehow killed the little red, white, and blue flag on the arm.
  • Re:Why b/w & filter? (Score:2, Informative)

    by Anonymous Coward on Tuesday February 10, 2004 @01:00PM (#8238980)
    The images are compressed. It sounds like they are doing wavelet compression similar to JPEG2000. For details take a look at:
    http://tmo.jpl.nasa.gov/progress_report/42-155/155J.pdf
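    For a feel of what a wavelet coder exploits, here is one level of a 2-D Haar transform; the rovers' actual coder, per the linked report, is far more elaborate:

    ```python
    # One level of a 2-D Haar wavelet transform, the building block
    # behind wavelet coders like JPEG2000. Most of the image energy
    # lands in the coarse band; the detail bands are sparse and
    # therefore compress well.
    import numpy as np

    def haar2d(img):
        a = img.astype(np.float64)
        # Average/difference pairs of columns, then pairs of rows.
        lo_c = (a[:, 0::2] + a[:, 1::2]) / 2
        hi_c = (a[:, 0::2] - a[:, 1::2]) / 2
        ll = (lo_c[0::2] + lo_c[1::2]) / 2   # coarse approximation
        d1 = (lo_c[0::2] - lo_c[1::2]) / 2   # detail bands: mostly
        d2 = (hi_c[0::2] + hi_c[1::2]) / 2   # near zero wherever the
        d3 = (hi_c[0::2] - hi_c[1::2]) / 2   # image is smooth
        return ll, d1, d2, d3

    img = np.tile(np.linspace(0, 255, 16), (16, 1))  # smooth test ramp
    ll, d1, d2, d3 = haar2d(img)
    print(abs(ll).mean(), abs(d1).mean(), abs(d2).mean(), abs(d3).mean())
    ```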
  • by ajs ( 35943 ) <{ajs} {at} {ajs.com}> on Tuesday February 10, 2004 @01:23PM (#8239270) Homepage Journal
    It looks like you took your image from the JPEG that NASA put up on the Web. Bad idea, of course. At first, I just wrote it off as an artifact, but it does exist in the original image [nasa.gov] (a 48MB TIFF file from the Mars gallery [nasa.gov]).

    I have put up a crop of the original [ajs.com] which you can feel free to stare at. Yes, it does appear to be some sort of round object with two large protrusions. It could easily be a rock of volcanic origin, but my bet is on its being some piece of the lander itself.
  • by Tablizer ( 95088 ) on Tuesday February 10, 2004 @01:33PM (#8239416) Journal
    Pioneer 10 and 11, which both flew past Jupiter in the mid-'70s, used only a red and a blue-green filter, IIRC. They used these two channels to approximate full-color images based on Earth-bound observations. I don't remember anybody fussing about it back then. A disclaimer of sorts was usually in the more technical articles, but many articles said nothing about it.
  • Re:Why b/w & filter? (Score:5, Informative)

    by SnowZero ( 92219 ) on Tuesday February 10, 2004 @01:45PM (#8239562)
    Well, they did say "bundle like a TV", which is correct, but they did mistakenly call that one pixel. For all practical purposes Bayer filters overstate the resolution, and you might as well call the 2x2 block a single color pixel.

    Foveon's sensors don't have really good color separation, and NASA wanted to have more than RGB anyway (they actually have something like 14 different filters). For science you don't want to limit yourself to just the visible spectrum. Hubble works similarly.
  • by dekashizl ( 663505 ) on Tuesday February 10, 2004 @04:03PM (#8241118) Journal
    A very good technical article making the point that NASA is not altering colors on Mars [atsnn.com] (beyond the normal minimal adjustments needed to generate color images, of course).

    --
    For news, status, updates, scientific info, images, video, and more, check out:
    (AXCH) 2004 Mars Exploration Rovers - News, Status, Technical Info, History [axonchisel.net].
  • Re:Interesting. (Score:3, Informative)

    by KewlPC ( 245768 ) on Wednesday February 11, 2004 @06:13AM (#8247080) Homepage Journal
    I hate to be the one to tell you this, but no. No.

    First of all, the calibration strip: yeah, the MERs each have something similar as well.

    You can't rely solely on them, though. If the light that filters down to the surface is tinted any color other than whitish-blue (like here on Earth), trying to match the calibration target to the way it looks back here on Earth is going to produce the wrong coloration.

    Secondly, it's widely known that the Viking lander images showing the Martian sky as blue were colored incorrectly.

    Thirdly, the Martian atmosphere is mostly CO2, which doesn't scatter blue light all that much, AFAIK. Throw in a surprising amount of reddish-orange dust that's almost always there, and the probability that the Martian sky is usually reddish (or butterscotch) is pretty good. It isn't always reddish, though, or so I've read. Hubble took a photo of Mars, IIRC, and near the edges the atmosphere looks bluish. This is probably because the light has much more atmosphere to pass through at sunrise and sunset than it does during the day, so what little oxygen there is in the Martian atmosphere has a chance to scatter the blue light. There's even a Mars Pathfinder image to back this up right here [nasa.gov].

    The image doesn't prove that the Martian atmosphere is blue, but it does show that there is a lot of dust in the atmosphere over there, and that under the right conditions it can be a little bit blue in some places.
