Space Science

The Real Reason why Spirit Only Sees Red 273

use_compress writes "To produce a color photograph, the rover's panoramic camera takes three black-and-white images of a scene: once with a red filter, once with a green filter and once with a blue filter. Each is then tinted with the color of the filter, and the three are combined into a color image. In assembling the Spirit photographs, however, the scientists used an image taken with an infrared filter, not the red filter (NYTimes, Free Registration Required). Some blue pigments like the cobalt in the rover color chip also emit this longer-wavelength light, which is not visible to the human eye."
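The compositing step described in the summary can be sketched in a few lines of numpy. The frame values below are illustrative stand-ins, not real Pancam data:

```python
import numpy as np

def composite_rgb(red_frame, green_frame, blue_frame):
    """Stack three filtered grayscale exposures into one RGB image.

    Each input is a 2-D uint8 array: the same scene shot through a
    red, green, or blue filter. Tinting each frame with its filter's
    color and summing is equivalent to using each frame directly as
    that color channel.
    """
    return np.stack([red_frame, green_frame, blue_frame], axis=-1)

# Hypothetical 4x4 "exposures" standing in for Pancam frames.
h, w = 4, 4
red = np.full((h, w), 200, dtype=np.uint8)
green = np.full((h, w), 120, dtype=np.uint8)
blue = np.full((h, w), 40, dtype=np.uint8)

color = composite_rgb(red, green, blue)
print(color.shape)   # (4, 4, 3)
print(color[0, 0])   # [200 120  40]
```

Swapping a different frame (say, an infrared exposure) in as the first argument is exactly the substitution the article describes.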
This discussion has been archived. No new comments can be posted.

  • by Space cowboy ( 13680 ) on Tuesday February 10, 2004 @09:08AM (#8236379) Journal
    The reason being that the science gets better results using the IR filter than if the red filter were used... At the moment, despite great public interest, the science is more important... that IS what it's there for....

    Simon
    • by SpinyManiac ( 542071 ) on Tuesday February 10, 2004 @09:22AM (#8236477)
      Not just that, using a B/W camera allows them to use any filter they like.

      They have at least 14 filters, taking 14 cameras would be impossible.

      Info here. [nasa.gov]
      • As I understand it, the explanation is simply that the public was given pictures using filters intended for scientific research. This alters the printed colors. At this point NASA should have released more pictures that produce colors closely matching what the human eye sees. With the color chips and Photoshop(tm), along with a picture taken on Earth before the mission, even I could come up with a presentable picture.
        • so, i threw this one together the other day, is it anywhere close do you guys reckon?

          Spirit-pano-rgb-compose.jpg [sunsite.dk]
        • by daina ( 651638 ) on Tuesday February 10, 2004 @02:50PM (#8240300)
          It is a misconception that you can use Photoshop or some other image processing program to produce a "true colour" or RGB image when one channel represents infrared data. Here's why:

          If you use an infrared filter like the L2 filter on Spirit's Pancam, you get data that represents only things which reflect or emit light in that particular region of the spectrum. Anything that emits light ONLY in the red will be absent from the data set. It is possible for something that appears as a fairly monochromatic red to be entirely invisible. How can you use Photoshop to put back something that is invisible? You cannot.

          You can adjust an individual colour in the image using a reference image taken with the appropriate filters, and that colour will then appear correct. Other colours, however, will remain distorted.

          Worse, you cannot possibly know the emission/reflectivity spectrum of things on Mars, so any image you produce that appears to show the sundial colour chips correctly may badly distort the Mars components of the image. It is not really very interesting to see a colour-corrected photo of the sundial, is it? We could have achieved that without sending the rover all the way to Mars.

          Nope, using a relatively narrow-band-pass infrared filter like the L2 simply leaves out information about the red part of the spectrum, and extrapolation only goes so far in recreating that data. Non-linear data - discontinuities within the missing portion of the spectrum - are simply gone, never to be retrieved.

          Also, NASA is lying. Perhaps 'lying' is too strong a word, but they are either deceiving us or they are operating under a serious misconception.

          "We just made a mistake," said Dr. James F. Bell III, the lead scientist for the camera. "It's really just a mess-up." Well, NASA claims to be releasing the raw data from Spirit on its web site, but the raw data does not contain any image sets for the panoramas taken with the L4, L5, L6 filters. They have almost never used the L4 filter.

          So either the "mess up" is that they have forgotten to use the L4 filter from day one (unlikely, since each photograph taken presents another opportunity to switch to the L4) or that they have L4 images but they are not releasing them, in which case they really are not releasing the raw data.

          The argument about the L2 being better for science is bogus. There's no way that NASA scientists are doing serious mineral analysis with a pretty, stitched-together wide-view panorama. That's just rubbish. They would be looking at detail images, and possibly comparing between detail-level images. The panoramas are strictly for public consumption, and maybe office posters at JPL.

          It's probably not a conspiracy, but it is a mystery.
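The point above, that a feature reflecting only in the red can vanish when the near-IR frame is substituted for the red channel, can be illustrated with a toy reflectance table. All numbers here are made up for illustration, not real Pancam calibration data:

```python
# Toy per-band reflectance (fractions) for three illustrative materials.
# These values are invented for the example, not measured data.
#                      red   green  blue  near-IR
materials = {
    "rusty dust":     (0.80, 0.40, 0.20, 0.75),  # reflects strongly in IR too
    "pure-red paint": (0.90, 0.05, 0.05, 0.02),  # almost no IR return
    "cobalt chip":    (0.10, 0.20, 0.85, 0.60),  # blue pigment, strong in IR
}

ir_composites = {}
for name, (r, g, b, ir) in materials.items():
    # The "mistaken" composite uses the near-IR frame as the red channel.
    ir_composites[name] = (ir, g, b)
    print(f"{name:15s} true=({r}, {g}, {b})  ir-composite={ir_composites[name]}")
```

In the IR composite the "pure-red paint" drops to near-black (its only strong band was replaced), while the cobalt chip gains a spurious red component, which matches the pinkish sundial chips people noticed.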

    • by mcbevin ( 450303 ) on Tuesday February 10, 2004 @09:26AM (#8236503) Homepage
      If you read the whole article, you'd see that they actually used both the infrared AND the red filter for the pictures. So they had their infrared for their science as well as the red for the photos to show the public. However they mucked up in producing the photos for the public, using the infrared instead of the red. Nothing to do with science vs public interest, rather a simple mistake.
    • Was in my sig a week ago too. :^P Nasa denies 'sexing up' mars images [ananova.com] which references the New Scientist story.

      Will Slashdot cover oxygen discovered on extra-solar planet Osiris next week? Stay tuned!

    • by SnowZero ( 92219 ) on Tuesday February 10, 2004 @01:13PM (#8239141)
      And it was on NASA's site almost two weeks ago:
      Revealing Mars' True Colors: Part One [nasa.gov]
      Revealing Mars' True Colors: Part Two [nasa.gov]
      Nothing to see here, take off the tinfoil hat.
  • by Channard ( 693317 ) on Tuesday February 10, 2004 @09:08AM (#8236384) Journal
    'Aha! So that's why they don't see little green men...' - at last, the dream of aliens living on Mars is alive again.
  • by mobiux ( 118006 ) on Tuesday February 10, 2004 @09:10AM (#8236389)
    They mention slashdot.org by name.

    Could this be some sort of revenge?
  • Why b/w & filter? (Score:5, Interesting)

    by Lolaine ( 262966 ) on Tuesday February 10, 2004 @09:11AM (#8236394)
    Can anyone explain why 3 separate B/W images are taken? If it is because of bandwidth... 3 grayscale images weigh (more or less) the same as one color image... so why B/W and filters?
    • Re:Why b/w & filter? (Score:4, Informative)

      by anubi ( 640541 ) on Tuesday February 10, 2004 @09:16AM (#8236422) Journal
      I understand it's because it's a heckuva lot easier to build high-resolution cameras as monochrome, since you can place the pixels immediately adjacent to each other and not concern yourself with the placement of color filter masks.

      Also, having external color filter masks which can be rotated into place means we are no longer limited to just the visible spectrum; we can see anything the raw silicon sensor allows, from infrared to ultraviolet, and assign "pseudo color" as we see fit.

      • Re:Why b/w & filter? (Score:4, Informative)

        by mhollis ( 727905 ) on Tuesday February 10, 2004 @09:46AM (#8236618) Journal

        Essentially, that's what all professional cameras do.

        A broadcast television camera (which is really pretty low-resolution, unless it's a true HDTV camera) has three CCD sensors mounted to a prism block that splits the image into the three component colors for television (RGB). The use of three CCDs for television is necessitated by the fact that the desired result is a color image without waiting to assemble a color composite from three black and whites. Broadcast television results in images that are pretty close to 640x480 (again, pretty low res).

        The MER images are stills. As such, there is time to put together a composite of the separate components taken with the filters. The data desired is high resolution and each of the composite images (irRGB) yields different information. Additionally, JPL is not lacking computer time for assembling the result of the component images. We're not talking live video feeds here.

        I note that there has been some discussion of weight here. That is not a factor in this case. Each of the filters, together with the CCD and the precise movement motor probably weighs about the same as a three CCD system, but in this case, it is one CCD, so any defects can be known and programmed around so there are no trade-offs. The issues JPL/NASA are dealing with have more to do with the size of the data sets and the available time in which the MERs can communicate with Earth.

        • Re:Why b/w & filter? (Score:5, Informative)

          by vofka ( 572268 ) on Tuesday February 10, 2004 @10:47AM (#8237089) Journal
          A broadcast television camera (which is really pretty low-resolution, unless it's a true HDTV camera) has three CCD sensors mounted to a prism block that splits the image into the three component colors for television (RGB). The use of three CCDs for television is necessitated by the fact that the desired result is a color image without waiting to assemble a color composite from three black and whites. Broadcast television results in images that are pretty close to 640x480 (again, pretty low res).

          Close, but not quite...

          A Broadcast Quality camera is usually capable of recording a substantially higher resolution image than is eventually broadcast. This allows for much better editing later on, i.e. cropping and resizing of the recorded images without loss of detail in the later broadcast. Final broadcast (in the UK at least) is around 760x575 pixels (the actual broadcast standard is 625 lines, but several are taken by the vertical blanking pulse, the frame field markers and teletext data), but the camera definitely records a much higher resolution than that.

          For comparison, a standard Hi-8 Domestic Hand Camera records around 540 picture lines (about 720x540), and the picture quality from this kind of camera is much lower than that needed by the broadcast editing suites to work effectively - just watch any "home video" programme (such as "You've been Framed!") for proof!

          Also, expensive professional broadcast cameras use "dichroic mirrors", not prisms, to do colour separation. Prismatic separation would lead to too much signal loss and colour bleed across the image. The first mirror directs the red image to the appropriate sensor, and also allows enough light of all wavelengths to pass to the next mirror, where the green image is diverted to the appropriate sensor, and again, light of all wavelengths passes to the final sensor in the camera. Blue is never explicitly separated from the incoming image, but is instead inferred from the intensity data from the three individual sensors.

          I can be very certain of both of these facts because my dad was a Video Electronics Engineer for the BBC for a number of years...
    • Re:Why b/w & filter? (Score:4, Informative)

      by richie2000 ( 159732 ) <rickard.olsson@gmail.com> on Tuesday February 10, 2004 @09:21AM (#8236466) Homepage Journal
      My guess is that it's easier to get more resolution out of the camera this way. You can use the full resolution for every colour instead of having 4 sensors (RGB + IR) on-chip per pixel. More on the MER cameras here [space.com]

      I can still remember using a NewTek DigiView digitizer with a b/w video camera and filters so I guess the Alzheimer hasn't gotten to me yet. :-)

    • The rover just doesn't have a color camera. Everything on the rover had to be justified, and a color camera would have no real value as they can fake it with the filters.

      • by MCZapf ( 218870 )
        Fake it?? It's still a real picture! The landscape isn't moving, so it doesn't really matter if the camera captures each color in succession, rather than all at once, as in most cameras. It's a tradeoff; it takes longer to capture all the data, but you get a higher resolution full-color image as a result.
    • Re:Why b/w & filter? (Score:3, Informative)

      by MartyC ( 85307 )
      The imaging system used is a monochromatic camera, because monochromatic cameras are simpler to operate and calibrate. The science teams aren't particularly interested in colour photography; the filters are there to narrow down the response range of the detector and provide useful information on the surface properties of the things they image, as different minerals reflect/absorb/scatter light differently. By using filters of known transmission characteristics you can infer things about the soil and rocks around you. A
    • Re:Why b/w & filter? (Score:5, Informative)

      by EvilTwinSkippy ( 112490 ) <{yoda} {at} {etoyoc.com}> on Tuesday February 10, 2004 @09:26AM (#8236500) Homepage Journal
      They are sending the raw pictures uncompressed. Well, they might use run-length encoding, but the result is a lossless image. JPEG is so much smaller because it really cuts corners, exploiting the fact that our eyes are more sensitive to contrast than to the magnitude of differences between colors.

      With scientific imaging, OTOH, you want the raw information coming off the CCD. They are interested in everything, not just what the human eye can see.

      So, with lossless encoding, 3 greyscale images actually come out to be the same size as a color image. (Look at a color TIFF, for example.) The advantage of the B/W and filter approach is that you need only one capture device. On a spacecraft there are many design advantages. Besides, you now have 3 exposures of the same scene. You never know when one will pick something up that the others missed.
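The size claim above is easy to check: three losslessly stored grayscale frames occupy exactly the same number of raw bytes as one RGB image of the same dimensions. A quick numpy sketch with hypothetical 1024x1024 frames:

```python
import numpy as np

h, w = 1024, 1024
# Three filtered black-and-white exposures (one per color filter)...
frames = [np.zeros((h, w), dtype=np.uint8) for _ in range(3)]
# ...versus a single full-color image of the same scene.
rgb = np.zeros((h, w, 3), dtype=np.uint8)

grayscale_bytes = sum(f.nbytes for f in frames)
# Identical raw sizes: 3 bytes per pixel either way.
print(grayscale_bytes, rgb.nbytes)
```

Either way it is 3 bytes per pixel before any compression, so the filter approach costs nothing in raw downlink volume.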

      • To blockquoth the source [cornell.edu]
        When each twin-lens CCD (charge-coupled device) camera takes pictures, the electronic images will be sent to the rover's onboard computer for a number of onboard image processing steps, including compression, before the data are sent to Earth


        Unfortunately I can't find any references as to the loss{y|less}ness of the compression used
      • Re:Why b/w & filter? (Score:5, Informative)

        by mdielmann ( 514750 ) on Tuesday February 10, 2004 @10:19AM (#8236876) Homepage Journal
        Note also that color CCDs aren't actually color, but a B/W CCD (there really isn't such a thing as a color CCD) with 3 different color filters applied pixel-by-pixel. This has the drawbacks of interpolation and of collecting only 1/3 of the raw data. The only way to get true color is to use either 3 (or more, depending on what you want to see) CCDs with an image splitter so they all see the same thing, or a series of filters, with each color taken in turn. Guess which one is smaller.
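The per-pixel filter mosaic described above can be sketched as an RGGB Bayer sampling. This is a minimal illustration of the sampling step only; real sensors and demosaicing algorithms are more involved:

```python
import numpy as np

def bayer_mosaic(rgb):
    """Sample a full RGB image through an RGGB Bayer pattern.

    Each sensor pixel keeps exactly one color sample; the other two
    must later be interpolated from neighbors (demosaicing).
    """
    h, w, _ = rgb.shape
    mosaic = np.zeros((h, w), dtype=rgb.dtype)
    mosaic[0::2, 0::2] = rgb[0::2, 0::2, 0]   # R at even rows, even cols
    mosaic[0::2, 1::2] = rgb[0::2, 1::2, 1]   # G at even rows, odd cols
    mosaic[1::2, 0::2] = rgb[1::2, 0::2, 1]   # G at odd rows, even cols
    mosaic[1::2, 1::2] = rgb[1::2, 1::2, 2]   # B at odd rows, odd cols
    return mosaic

rgb = np.random.randint(0, 256, (4, 4, 3), dtype=np.uint8)
m = bayer_mosaic(rgb)
# One sample per pixel instead of three: 1/4 R, 1/2 G, 1/4 B survive.
print(m.shape)   # (4, 4)
```

Compare with the rover's approach, where each filtered exposure keeps a full-resolution frame for its band and no interpolation is needed.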
        • There is a color CCD; it's made by Foveon. The New York Times just had an article on it this week.

          Of course, the Times could have gotten it wrong, they do that sometimes.

          http://nytimes.com/2004/02/09/technology/09camera.html
    • Versatility (Score:5, Insightful)

      by Detritus ( 11846 ) on Tuesday February 10, 2004 @09:30AM (#8236525) Homepage
      Instead of being limited to some fixed approximations of red, green, and blue, they can use a larger set of filters that are tailored for various science objectives.

      The human eye's color vision is a poor scientific instrument. It can be easily fooled.

    • We're working with an industrial-use colour camera sensor - basically your typical digital CCD array with the rest of the camera removed (no auto-focus, white balance, flash etc...). Pixels are arranged in groups of 2x2 (Red, Green, Green, Blue). In bright scenes, the signal strength can bleed between the individual colour cells, which is extremely tricky to compensate for. However, if you take individual frames of each light wavelength that you are interested in using a monochrome camera, by using colour
    • Re:Why b/w & filter? (Score:3, Interesting)

      by Sahib! ( 11033 )

      On a slightly related note, a Russian photographer named Sergei Mikhailovich Prokudin-Gorskii pioneered this technique of obtaining color images using colored filters and monochromatic film in the early 1900's. He actually built his own camera with three vertically-oriented lenses, each with a red, green or blue filter. The camera took the three pictures at the same time, but some interesting distortions come through because of the slight differences in parallax.

      http://www.loc.gov/exhibits/empire/ [loc.gov]

      This

  • by kinnell ( 607819 ) on Tuesday February 10, 2004 @09:12AM (#8236395)
    It's a good job the pictures aren't coming back with a blue tint, or lynch mobs would be turning up at NASA HQ.
  • I tried showing them to my pet bull and he immediately became bad-tempered and generally unpleasant to be around. He's much fonder of the Neptune shots from Voyager, really...
  • Short version (Score:5, Informative)

    by the_crowbar ( 149535 ) on Tuesday February 10, 2004 @09:18AM (#8236438)

    On the panoramic picture: We goofed. It should not have been that red.

    The other photographs are taken with the infrared instead of the visible red filter. Iron dominates the visible red spectrum. To allow a better analysis of the compounds found, infrared light is used instead.

    <joke>No conspiracy here. Move on.</joke>

    the_crowbar
  • They're taking images through blue, green, red and infrared filters. The color shift problem in the publicly released images is because they're blending in the infrared shot instead of the red shot, right? Why don't they just release the RGB images as well as the iRGB? They have all the images after all--why waste press conferences explaining the differences or lack thereof when they could just give us the pictures?
    • by Anonymous Coward on Tuesday February 10, 2004 @09:24AM (#8236491)
      They release all the individual raw pictures on the mars rovers website. You are free to composite them yourself.

      The engineers are focusing on the filters that return good science.
    • by Seahawk ( 70898 ) <tts&image,dk> on Tuesday February 10, 2004 @09:27AM (#8236510)
      ALL data IS actually released.

      Can't remember links off the top of my head (search older /. stories), but several people have taken the raw data and composed their own versions of the colour photos.

      AFAIR things are a bit more complicated though - the cameras have 7 different filters, which have quite a bit of overlap and don't peak at frequencies of light that could directly be used in an RGB image - so some fiddling is required.

      And TBH - I think it's perfectly fine that NASA doesn't focus on producing "correct" images if it doesn't mean better science! :)
      • by mlyle ( 148697 ) on Tuesday February 10, 2004 @09:45AM (#8236617)
        My site [lyle.org] was one of the past ones featured on Slashdot.

        Unfortunately, all the data isn't released. There is no radiometric data or pointing data for pictures, no spectrometer data, etc.

        And NASA puts a hold on images they plan to use later for press conferences-- e.g. the individual PanCam pictures of the parachute and backshell weren't released. This goes directly against the promises they made pre-mission.
        • Haven't seen your site before - but VERY nice pictures! :)

          I was just under the impression all imaging data was released...

          Positioning data isn't really that important for creating truecolor images... ;)
          • Hehe. Thank you.

            As to positioning data: nope, it's not, but it is important for accurately producing anaglyphs/range maps/good stitches unfortunately. And the radiometric data -is- important for nearcolor-- I could release a lot more nearcolor imagery if I had confidence the radiometric data was right. As it is now, I have to inspect each image by hand and compare to the spectroscopy data I have on hand to make sure things are close to right. As to why those pieces of engineering data associated with
        • by robsimmon ( 462689 ) on Tuesday February 10, 2004 @11:27AM (#8237604)
          I would guess (based on my experience with other NASA data archives) that the full scientific data are not being released until they've been calibrated, at which point they'll probably end up in the Planetary Data System [nasa.gov]. It's also possible that the Principal Investigators (who are affiliated with Cornell, not NASA) have exclusive use of the data for some period of time. Scientists are often very reluctant to share data until they're happy with it. Whether this is good public policy (since the data was all paid for by the US public) or good science is open to debate, but it's certainly not a conspiracy.

          In the case of the more dramatic images, Public Affairs is almost certainly embargoing the images so the press release will (in theory) have more impact. If you really want the data you can always try a Freedom of Information Act [nasa.gov] request.
          • No conspiracy theory here; I just think it's bad PR for the PAO to embargo. Me putting together some images on my site is not going to lessen the impact when they hold their press conference. But my inability to get the imagery annoys me and the rest of the hobbyist community.

            Sure, PDS is the authentic source for mission scientific data, but would it really be hard to throw us a bone with a few technical numbers? It's getting pushed occasionally for some of the imagery with Maestro updates-- why can't they just
            • in principle I agree about embargoing data to fit a scheduled press release, but on the other hand NASA would be justifiably upset if a major newspaper or network ran a mars image and gave credit to a hobbyist. (I've seen the AP, Reuters, and AFP frequently get credit for NASA images)

              it may also be an issue of infrastructure--archiving and distributing scientific data is non-trivial. you want to make sure all the data packets have been received, ensure data integrity, make sure the metadata is connected to t
    • They don't necessarily have images from all filters. In many pictures it's more valuable from a scientific point of view to use the infrared filter instead of the red filter. They may only command the rover to take those three pictures.

      In most cases, the infrared filter is close enough to red that a composite still gives you a good image. They do occasionally take a picture with the red filter instead of the infrared, as the article states, but these aren't as useful for scientific purposes.

      If the publ
  • by drfishy ( 634081 ) on Tuesday February 10, 2004 @09:19AM (#8236445)
    Reminds me of the Gameboy Camera Color Project: http://www.ruleofthirds.com/gameboy/
  • Blue? Infrared? (Score:2, Interesting)

    by Rufus88 ( 748752 )

    "Some blue pigments like the cobalt in the rover color chip also emit this longer-wavelength light, which is not visible to the human eye."

    If it's a *blue* pigment, why does it emit a *longer* (i.e. infrared) wavelength?
    • Re:Blue? Infrared? (Score:2, Informative)

      by LordK2002 ( 672528 )

      "Some blue pigments like the cobalt in the rover color chip also emit this longer-wavelength light, which is not visible to the human eye."

      If it's a *blue* pigment, why does it emit a *longer* (i.e. infrared) wavelength?

      Clarification of the original statement: "some materials, such as cobalt, which reflect light that appears blue to the human eye, also reflect light in the infra-red range".

      It emits both blue and infra-red, neither has any effect on the other - we just only see the blue because the hum

    • Re:Blue? Infrared? (Score:4, Informative)

      by dwm ( 151474 ) on Tuesday February 10, 2004 @09:39AM (#8236581)
      A material can emit light at various wavelengths, and at wavelengths quite different from those it reflects, which is what you most commonly see in the visible range. It's quite possible for something to reflect blue light and emit light at wavelengths longer than the visible range.
    • Re:Blue? Infrared? (Score:4, Informative)

      by Bill_Mische ( 253534 ) on Tuesday February 10, 2004 @09:39AM (#8236582)
      I'd imagine because the blue colour corresponds to an electron transition in the d-orbital whereas the IR corresponds to a different transition or more likely to a change in mode of molecular vibration.

      One of the few things I remember from my chemistry degree was that many pigments are far brighter in the UV region since the "normal" colour corresponds to a forbidden transition - i.e. one that involves a change of spin as well as change of orbital.

      I do hope that wasn't a rhetorical question...
    • It's not emitting much of anything near visible wavelengths, since the temperature maxes out at about 5 degrees C. For significant emissions you need to be in the thermal infrared (measured by the Mini-Thermal Emission Spectrometer [cornell.edu], not the pan camera). The pan camera is seeing reflected light.
  • maybe you should try taking infrared photos?

    Most of the digital cameras on the market don't have countermeasures to prevent IR exposures, so feel free to experiment with various infrared-transmitting, deep red and light red filters.

    From my non-scientific experience, ultraviolet photos of rocks are more interesting than infrared ones.
  • by tjmcgee ( 749076 ) on Tuesday February 10, 2004 @09:49AM (#8236640) Homepage
    The martian crab http://homepage.mac.com/thomasmcgee/ I know, I know, go ahead, mod me off topic. The truth is out there. Would anyone like to start a petition that requests NASA to try to get one more photo of this thing before they drive away?
  • Art Buchwald has the whole scoop here [washingtonpost.com].
  • AHA!! That's why it had that problem. It stopped and was waiting for the traffic light to turn green.
  • I've often wondered (Score:3, Interesting)

    by ajs318 ( 655362 ) <sd_resp2@@@earthshod...co...uk> on Tuesday February 10, 2004 @10:25AM (#8236932)
    I've often wondered exactly how rigidly the "400-700nm is visible" rule applies. We know that some animals can see infra-red and ultra-violet. But just how well-defined is the wavelength range for human beings? I mean, our bodies are different shapes and sizes, our voices have different pitches, our ears have varying ranges, some of us are allergic to certain substances that others are not ..... but has anyone ever investigated what wavelengths humans can see? Is it a person-to-person variable, or is it constant for everyone? Can some people see IR, red and green, for instance, instead of red, green and blue? Or green, blue and UV, for that matter ..... and what would it look like?
    • by sugar and acid ( 88555 ) on Tuesday February 10, 2004 @11:18AM (#8237486)
      A way of dealing with blindness from cataracts (the lens of the eye turning opaque) is to remove the lens of the eye and replace it with an artificial lens. An interesting side effect is that without the lens, people can see further into the UV region of light.

      Interestingly, the work of Claude Monet demonstrates this. His early work is clear and in the normal colour range; after he developed cataracts, his work became undefined swirls of colour, often dark and dim. Then he had cataract surgery, and the new work is bright and vibrant, but with a deep purple/blue hue to many things because of the now-increased presence of UV light in his vision.
    • This 'rule' is pretty much fixed, since the photochemical reactions in the eye that are responsible for it have fixed wavelength sensitivities. But as others have pointed out, the structures in the eye and the lens filter a bit of the light, too.
      But, in fact, there are mutated humans who have differences in some of the substances responsible for seeing. Some have altered pigments (this is the most common case, and most commonly, this results in 'red' and 'green' absorption maxima getting closer, rendering the individual
  • by jms ( 11418 ) on Tuesday February 10, 2004 @10:38AM (#8237040)
    This problem is not unique to the Mars rovers.

    As a hobby and as income, I make borosilicate lampwork beads and sell them on ebay. This requires me to take digital pictures of my beads, which I do with a Nikon Coolpix 885.

    Every once in a while I run into a color combination that simply cannot be photographed correctly. One bead set I have looks brown/butterscotch/caramel to the eye, but when photographed using that particular camera, some of the brown features in the bead come out electric red.

  • They've taken several pictures of the lander platform [nasa.gov] in true color in order to calibrate the camera.
  • Given that we're having so much trouble figuring out what the human eye would see (w.r.t. color), I probably shouldn't even bother to ask, but does anyone know how bright Martian daylight would appear to the naked eye? Insufficiently bright for sunglasses, for example? How (un)comfortable would it be looking at the sun?

    I know the human eye is fairly adaptive in this regard, but I'm curious about the qualitative answer to this question. (Quantitative answers expressed in lumens or whatever won't quite do
  • by Eevee ( 535658 ) on Tuesday February 10, 2004 @10:59AM (#8237262)

    I don't know the date of the first use, but Sergei Mikhailovich Prokudin-Gorskii [loc.gov] photographed parts of Russia for the Tsar in 1909.

    He used a three-photo technique [loc.gov] where the scene was recorded three times on a glass plate (in a row, not overlaid) with different filters. If you look carefully at the river, there are color distortions from the small waves.

  • Magic Lantern (Score:3, Informative)

    by KaiBeezy ( 618237 ) <kaibeezy&gmail,com> on Tuesday February 10, 2004 @11:06AM (#8237357)

    When you take a look at what this old-tech can really do, it's quite astounding.

    The Library of Congress has an exhibition of pre-WWI (that's World War I) *color* photos of Russia [loc.gov] shot using the exact same process. Since this was a while before any practical color photo printing process, the photographer built a "magic lantern" for "optical color projections."

    Props to Bolo Boffin [blogspot.com] for the link.

  • by RetroGeek ( 206522 ) on Tuesday February 10, 2004 @11:12AM (#8237421) Homepage
    Read this [nasa.gov] and this [nasa.gov]. These are from the NASA Rover site and they explain it all.
    • Oh, come on. We're not interested in facts here. It's much more interesting to claim that a bunch of rocket scientists don't know what they're doing and that "we" are smarter than they are! Didn't you get the memo?
  • JPL says (Score:3, Interesting)

    by __aagmrb7289 ( 652113 ) on Tuesday February 10, 2004 @11:17AM (#8237471) Journal
    This isn't what JPL said. They said they were using a full color, basic digital camera. Damn, where's that link?
  • Prokudin-Gorskii (Score:3, Informative)

    by rkenski ( 212876 ) on Tuesday February 10, 2004 @11:35AM (#8237712)
    Prokudin-Gorskii [loc.gov] travelled Russia taking color photos about a hundred years ago, a time when there was no color film. He would take 3 pictures, one after another, each with a different filter. He then projected the three together to get a color picture. Similar to Spirit, but in a very, very old-fashioned way.
  • LOC's Explaination (Score:3, Informative)

    by rsmith-mac ( 639075 ) on Tuesday February 10, 2004 @11:45AM (#8237844)
    The Library of Congress has an interesting exhibit [loc.gov] up devoted to an early 20th century Russian photographer who used this exact technique. The site includes a very detailed description of how this filter system works, along with dozens of color pictures from the photographer's travels. It's definitely worth taking a look at, if not for the description, then for some very cool pictures.
  • This reminds me (Score:3, Informative)

    by IRNI ( 5906 ) <irniNO@SPAMirni.net> on Tuesday February 10, 2004 @11:46AM (#8237858) Homepage
    of some turn-of-the-century Russian color photos [loc.gov] taken using the same or a similar technique. I like that people thought of doing stuff like this back then. It is amazing to see in color what things looked like back then. I think there are some from early in the 20th century in New York online somewhere as well.
  • Now the world has gone to bed,
    Darkness won't engulf my head,
    I can see by infra-red,
    How I hate the night.
  • by lildogie ( 54998 ) on Tuesday February 10, 2004 @12:50PM (#8238823)
    Well, since they're only using one camera, and switching the filters, that means we can never know the true color of Martians.
  • by Tablizer ( 95088 ) on Tuesday February 10, 2004 @01:33PM (#8239416) Journal
    Pioneer 10 and 11, which both flew past Jupiter in the mid-70s, used only a red and a blue-green filter IIRC. They used these two colors to approximate full-color images based on earth-bound images. I don't remember anybody fussing about it back then. A disclaimer of sorts was usually in the more technical articles, but many articles said nothing about it.
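    One way to picture that two-filter approximation: synthesize the missing channel as a weighted mix of the two you have. This is a crude illustrative sketch, not NASA's actual calibration pipeline, and the 50/50 weight is purely an assumption:

    ```python
    import numpy as np

    def approximate_rgb(red, blue_green, green_weight=0.5):
        """Build an RGB image from only red and blue-green exposures.
        The missing green channel is estimated as a weighted mix of
        the two available channels -- a stand-in, not Pioneer's method."""
        green_est = green_weight * red + (1.0 - green_weight) * blue_green
        return np.stack([red, green_est, blue_green], axis=-1)

    r = np.full((2, 2), 0.8)   # bright in the red exposure
    bg = np.full((2, 2), 0.2)  # dim in the blue-green exposure
    rgb = approximate_rgb(r, bg)
    print(rgb[0, 0])  # [0.8 0.5 0.2]
    ```

    Any scene color that the interpolation can't distinguish ends up painted with the "expected" hue, which is why the earth-bound reference images mattered.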
  • Umm... (Score:3, Interesting)

    by KewlPC ( 245768 ) on Tuesday February 10, 2004 @01:35PM (#8239444) Homepage Journal
    I think people are also forgetting that, on Mars, white light is probably not reaching the surface. The dust in Mars' atmosphere is probably tinting the sunlight a little bit red, which certainly doesn't make getting the "correct" color easy.

    But a comparison of the Mars Pathfinder images against the MER images shows that the colors in the MER images are too red. In the MPF images the rocks aren't all the same color.

    It's pretty obvious that NASA's been doing a lot of Photoshopping on these images. While some Photoshopping is necessary (to merge the 3 grayscale images and to eliminate the seams in the panoramic images), I think they're overdoing it this time. I can't find the link right now, but there's one image in particular where it's blatantly obvious that they've replaced the sky with a single, solid color (you can see jaggies along the horizon in the high-resolution version).

    I'm not trying to be all conspiracy theorist or anything. I certainly don't think they're faking the landings, nor do I think the Martian sky is bright blue as some have suggested.
    • Dust? (Score:3, Interesting)

      Mars has an atmosphere thick and windy enough to kick up dust to prevent sunlight from reaching the surface?

      Hm. I'm no meteorologist, but I wasn't seeing any evidence of a dusty atmosphere in any of those rover images. --Details at distance seemed as clear as near objects. There's WAY more crap in Earth's much more robust atmosphere, and we get plenty of white light.


      -FL

  • Pancam Details/Specs (Score:3, Interesting)

    by dekashizl ( 663505 ) on Tuesday February 10, 2004 @03:44PM (#8240914) Journal
    Here is a collection of links from the MER2004 Rovers and their Instruments Technical Info [axonchisel.net] section of the page listed below, with specs and details of the Pancam and its filters. Interesting reading:

    --
    For news, status, updates, scientific info, images, video, and more, check out:
    (AXCH) 2004 Mars Exploration Rovers - News, Status, Technical Info, History [axonchisel.net].
  • by dekashizl ( 663505 ) on Tuesday February 10, 2004 @04:03PM (#8241118) Journal
    Very good technical article making the point that NASA is not altering colors on Mars [atsnn.com] (beyond normal minimal adjustments to generate color images, of course).

    --
    For news, status, updates, scientific info, images, video, and more, check out:
    (AXCH) 2004 Mars Exploration Rovers - News, Status, Technical Info, History [axonchisel.net].
