Colorization of Mars Images?

ares2003 writes "There is no scientific reason why JPL is colorizing Mars in that dull red tint in their press release images. In the latest panorama image there is a hint that they deliberately altered the colors, as the blue and green spots on the color calibration target (the sundial) suddenly turned bright red and brown. Source of original images: 1, 2 - (for highres replace "br" with "med"). Under normal weather conditions, as we have at the moment, there should be a blue sky on Mars and earthlike colors. Furthermore, the sky looks overcast in the pictures, which it cannot be considering the sharp shadows on the sundial. If the sky were overcast, the diffuse lighting would cast no shadows. A few years ago, I did an investigation into this very same topic for the Viking and Pathfinder missions."
This discussion has been archived. No new comments can be posted.

  • No Secret (Score:5, Informative)

    by eean ( 177028 ) <.slashdot. .at. .monroe.nu.> on Friday January 09, 2004 @03:17PM (#7931144) Homepage
    It's no secret that they doctor the images for press release. They also have the originals available. Check out Maestro; it was mentioned on Slashdot a few days ago. It's almost the same software JPL uses, and the images in its data set are the originals.
  • by Lispy ( 136512 ) on Friday January 09, 2004 @03:18PM (#7931151) Homepage
    Not sure if this could be the reason, but the MER-A pictures aren't taken at one specific time but rather over a whole day.

    That means that the colors you see on the sundial don't match all frames of the final picture you get.

    NASA therefore alters the colors to match the pictures as closely as possible. Maybe this disturbs the color? Not sure though. What do you think?

  • by Anonymous Coward on Friday January 09, 2004 @03:18PM (#7931155)
    This is one of his pet theories, along with the idea that the reason for the color change is to hide the Earth-like appearance of Mars so we don't figure out that humans once lived there.
  • Filters (Score:5, Informative)

    by paul248 ( 536459 ) on Friday January 09, 2004 @03:21PM (#7931207) Homepage
    The images they took are shot through near-infrared filters, and then digitally adjusted to compensate. The pan-cams each have about 16 different types of filters on a rotating wheel, but this near-infrared filter is the only color that's common to both lenses. Therefore, when they're taking stereo images, that's the best one to use. It's not a conspiracy, and they'll probably release images taken through the other filters eventually.
  • by MooCows ( 718367 ) on Friday January 09, 2004 @03:21PM (#7931217)
    Actually, (AFAIK, IANAS, correct me if I'm wrong) the Hubble images are correct, but they're just using pretty colours to represent different kinds of radiation, not just the normal light.
  • by mark-t ( 151149 ) <markt.nerdflat@com> on Friday January 09, 2004 @03:22PM (#7931220) Journal
    With enough dust in the air, yes... Mars would have a red sky.

    But the same light refraction phenomenon that gives Earth a blue sky as seen from the ground should give Mars a blue sky as seen from the ground as well. Enough dust in the atmosphere could interfere with that sufficiently to create a red hue, but this should not be the norm in calm weather conditions.

  • by GabeK ( 701376 ) * on Friday January 09, 2004 @03:22PM (#7931229) Homepage
    They do that so that different elements of the image can be more easily identified, not to make things prettier. It does make for some very impressive images, but that isn't the point.
  • by Jugalator ( 259273 ) on Friday January 09, 2004 @03:24PM (#7931251) Journal
    as the blue and green spots on the color calibration target (the sundial) suddenly converted to bright red and brown.

    The "sudden" change happened when NASA "suddenly" applied another filter to the camera. They do this to better detect certain things in the picture, I suppose. They spoke about it at a press conference when they were asked this question.

    From Mozilla guru Asa Dotzler's weblog [mozillazine.org]:

    Q. Then what we're seeing that's in that Pancam image doesn't correspond to what we'd see if we were standing there?

    Jim: we have a pair of red filters that give us stereo. The red you're asking about is the infrared filter which is different from the red humans see. We can convert that red easily. We also have a red filter that matches human sight red but we prefer to use the infrared filter to get matchup with both cameras. Two cameras each have 8 filters. One filter on one eye is a dense welder-like filter to look at the sun. On the left camera is low frequency and the right camera is higher frequencies. Total of 11 unique wavelengths.
  • by aenea ( 34844 ) on Friday January 09, 2004 @03:24PM (#7931255)
    My keyboard is obviously a part of the conspiracy. Butterscotch martian sky [nasa.gov]
  • by Eyah....TIMMY ( 642050 ) * on Friday January 09, 2004 @03:24PM (#7931261)
    Unfortunately, it seems the primary motivation behind Mars coverage for the general population is now sensationalism. I'm sure the Slashdot audience has a different view on Mars, though.
    USA Today has a good article [usatoday.com] about how Mars is shifting from science to politics.
    The Washington Post explains better the goals of the current US gov [washingtonpost.com].

    I'm not saying it's necessarily a bad thing because that's usually how space projects get more funding but it might explain why the photos are looking more "nice to the user" than "scientifically realistic".
  • by marsvin ( 84268 ) on Friday January 09, 2004 @03:27PM (#7931297)
    I don't think this has been addressed here yet, but a good reference for these sorts of claims is Bad Astronomy [badastronomy.com].
  • by legoleg ( 514805 ) on Friday January 09, 2004 @03:28PM (#7931321)
    Read here [nwsource.com]

    The sundial from a little while ago helps find tint and all. The pics need calibration.... doesn't sound like a conspiracy to me.
  • by Royster ( 16042 ) on Friday January 09, 2004 @03:30PM (#7931342) Homepage
    When Hubble uses false color, that fact is *always* noted at the official site. If other people use the images and drop NASA's text, NASA can't be held responsible.

    And, yes, NASA has to color correct just about every image one of their probes or landers takes. It's necessary because of how the images are taken. That ain't no cheap digital camera up there.
  • by Jesrad ( 716567 ) on Friday January 09, 2004 @03:31PM (#7931348) Journal
    You are wrong. The sky's color comes mainly from the scattering of light, which has to do with the wavelength of light. That's why the sky is blue on virtually every planet.

    Check this panoramic photo [nasa.gov] (warning, 4.1 MB). Here's a small example [namu.free.fr] of what it should look like to human eyes, without the stupid NASA red tint. See the rainbow around the sun? It's because of ice in the upper atmosphere.
  • by b-baggins ( 610215 ) on Friday January 09, 2004 @03:32PM (#7931375) Journal
    The Earth's sky is blue because Nitrogen scatters blue light. Last I checked, there ain't a whole lot of Nitrogen in the Martian atmosphere.

    Mars' atmosphere is pinkish because of the dust suspended in it.
  • by djh101010 ( 656795 ) on Friday January 09, 2004 @03:33PM (#7931392) Homepage Journal
    Actually, I was just at rednova.com yesterday looking at archives of NASA images, and not only is this explicitly mentioned, but for many of the false-color images, they specify the method by which they were constructed (shot through this filter, that filter, and the other filter, and recombined, that sort of thing).

    The scientists understand the real colors, the public (who funds it, after all) expects it to be red. They want red, we'll give 'em red. I'm not saying I agree with that, but I understand where they're coming from.

    The veracity of the person who brought this up (Mr. Martian Pyramids and such) isn't something I'll do much commenting on.
  • by overunderunderdone ( 521462 ) on Friday January 09, 2004 @03:42PM (#7931481)
    I was watching a press conference on C-SPAN and the guys at JPL actually brought this up themselves. The thing is, the cameras have filters for a wide variety of wavelengths, many of which aren't visible light at all. Each camera has a different array of filters, and they actually share only two filters in common for stereo vision.

    I got the impression that many of the filters that ARE within the visual portion of the spectrum were only letting in narrow bands of the spectrum. Exactly what color SHOULD infrared images be? For obvious reasons, keeping them in their "original" spectrum would be fairly useless - though "red" would be as close as we can come.

    For just pretty pictures rather than scientific data, NASA is color-correcting the images - I think it is more involved than simply colorizing a black and white image. They mentioned compositing together several images from different filters to get a fair approximation of what the human eye would perceive if it was there.
  • by orac2 ( 88688 ) on Friday January 09, 2004 @03:42PM (#7931484)
    The Rovers are solar powered. Taking pictures would suck a lot of power from the batteries otherwise needed to make it through the night.
  • Re:Filters (Score:4, Informative)

    by Anonymous Coward on Friday January 09, 2004 @03:46PM (#7931526)
    I was looking to see a response like this, so I wouldn't post something similar.

    The MER-A people gave a very detailed account of the filters in yesterday's press conference, and of why the coloured spots on the calibration targets on the image from Mars really didn't appear to match up with the identical version they had in front of them.

    Apparently, they know the response to light of lots of different frequencies for each of the coloured tabs - the blue one, for instance, also reflects strongly in the near infra-red, which is why it appears bright red in the image from Mars and blue to human eyes. They know this, and calibrate accordingly - in fact, the blue target was chosen specifically for this behaviour.

    The rest of the colours in the image are as good an approximation to the real colours as they can get, based both on the calibration targets and on the results from other landers and from what astronomers can see with the naked eye through telescopes.

    And as I write this, I see that Jugulator has already posted something very similar, and which goes into more depth. Never mind, I'll submit this anyway. :-)
  • by FrostedWheat ( 172733 ) on Friday January 09, 2004 @03:48PM (#7931550)
    Mars does have a blue atmosphere, but there is normally enough dust to give it its pinkish colour.

    During sunrise/sunset, however, the air around the sun becomes blue. The light is travelling through much more atmosphere, so it takes on a deeper blue colour, and the dust particles are reflecting the light away from the viewer (you're seeing the dark side of the particles), so the blue has a better chance of getting through.

    Here's a good example from the pathfinder lander. [nasa.gov]
  • by kindbud ( 90044 ) on Friday January 09, 2004 @03:48PM (#7931557) Homepage
    No device "sees" colors the way humans see color. Heck, no two humans see color the same way. All images, especially science images, whether they are photographic prints or digital images, are colorized and manipulated and stretched and bent and filtered and modified to emphasize the details the investigator is interested in.

    You think Jupiter is a really garish ball of swirling colorful gasses? Think again. All the Galileo and Voyager images have saturation boosted a great deal, and the contrast is stretched mightily. Furthermore, the luminance layer is deconvolved to bring subtle spatial details into sharper relief. To the human eye, Jupiter is a rather bland beige-ish ball with some hint of subtle color here and there, and not much obvious detail. The same goes for Io, which is usually depicted as a bright yellow/orange maelstrom. Its "real" colors - what a human in orbit would see - are also rather bland.
  • by science_gone_bad ( 730182 ) on Friday January 09, 2004 @03:49PM (#7931570)
    "They do that so that different elements of the image can be more easily identified"

    There's another even more important reason...most of the colors are for wavelengths of light that could not be seen anyway.

    The last time I checked, I could not see ultraviolet, infrared, or X-rays.

    Anyway, the color dots on the lander SHOULD look different, as the lighting conditions are different on Mars due to the scattering properties of that atmosphere. Colors under the fluorescent lights we all sit under are very different from those out in the sunlight. If the images from Mars had the color corrected to pure colors, it would not be a true representation of what we would see if we were standing there.
  • by Anonymous Coward on Friday January 09, 2004 @03:50PM (#7931577)
    The colour of the Martian ground doesn't determine the colour of the sky due to airborne dust until there's so much dust that the sky is opaque ("optically thick").

    The colour of the ground is determined by the reflective properties of the material, which vary with the chemistry. So: iron in the ground gives the red colour of an opaque, slightly reflective surface.

    The colour of the sky is due to scattering. When you look away from the sun, the light from the sky has all been scattered in the atmosphere; otherwise it wouldn't be lit. Gas molecules tend to scatter blue light (esp. N2 molecules), so a dustless sky looks blue. Dust grains tend to scatter redder light, so dust storms look reddened. It doesn't matter much what the dust is made of, only how big the dust grains are.

    This is true on all planets with atmospheres dominated by light gases. Probably not so true for gas giants etc.
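The scattering argument above can be made quantitative: Rayleigh scattering off gas molecules scales as 1/wavelength^4, so shorter (bluer) wavelengths scatter far more strongly. A quick sketch; the wavelengths are nominal values for blue and red light that I'm assuming, not figures from the thread:

```python
def rayleigh_ratio(lambda_a_nm: float, lambda_b_nm: float) -> float:
    """How much more strongly wavelength a scatters than wavelength b,
    using the 1/wavelength^4 Rayleigh law."""
    return (lambda_b_nm / lambda_a_nm) ** 4

# Blue (~450 nm) vs. red (~650 nm): blue scatters roughly 4.4x more.
print(rayleigh_ratio(450, 650))
```

That factor of roughly four is why a dust-free sky is dominated by scattered blue light; suspended dust grains, being much larger than a wavelength, don't follow this law, which is why they redden the Martian sky instead.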
  • by C10H14N2 ( 640033 ) on Friday January 09, 2004 @03:51PM (#7931594)
    Ok, here's a little experiment for 'ya.

    Procure a color chart. If you cannot, procure a box of crayons and make several large marks of relatively uniform saturation using the colors "Red" "Green" and "Blue." If you're truly adventurous, you may try a nice burnt umber or perhaps attempt various gradations from black to white.

    Place this color chart on the ground.

    Using the exact same settings on your camera, photograph this chart at sunrise, high noon and sunset. Do this on days of varying weather conditions.

    If possible, start a large brush fire. Wait for large reddish clouds to filter the sunlight. Photograph your chart again. This is probably illegal, so wait until someone else does this for you.

    Now wait until midnight. Photograph your chart using a flash.

    In Photoshop, adjust the color balance of all of your photos to match the last image.

    Voila, all of your images are now completely indistinguishable from each other and you have lost all of the information you recorded by making photographs in varying lighting conditions.

    DUH.
  • by entrager ( 567758 ) on Friday January 09, 2004 @03:57PM (#7931653)
    Possibly because they aren't actually visible from the surface. They are pretty dang small.

    For geek's sake:

    Our moon has an apparent size in the sky of about 1800 arcseconds. This is found by arctan(radius of the moon/distance to the moon) * 2 [google.com].

    By comparison, Phobos would appear to be about 900 [google.com] arcseconds from the surface of Mars. Deimos would be about 200 [google.com] arcseconds.

    So actually Phobos would appear to be about half the diameter of our moon and Deimos would appear to be about 1/9 the diameter. I suppose that's not terribly small, but you also need to recognize that far less light will be hitting them and then reflecting off. Phobos would be much dimmer than our moon, and Deimos is dark in color, so it may not be easy to see even with the naked eye.

    I imagine capturing an image of the moons with the camera on board a rover would be difficult.
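entrager's arithmetic is easy to reproduce with the formula quoted above. The radii and distances below are rough round figures I'm assuming, not values from the comment:

```python
import math

ARCSEC_PER_RAD = 180 / math.pi * 3600  # about 206265

def apparent_size_arcsec(radius_km: float, distance_km: float) -> float:
    """Angular diameter in arcseconds: arctan(radius / distance) * 2."""
    return 2 * math.atan(radius_km / distance_km) * ARCSEC_PER_RAD

print(round(apparent_size_arcsec(1737, 384400)))  # Earth's Moon: ~1860
print(round(apparent_size_arcsec(11, 6000)))      # Phobos from Mars' surface
print(round(apparent_size_arcsec(6, 20000)))      # Deimos from Mars' surface
```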
  • by barfy ( 256323 ) on Friday January 09, 2004 @03:59PM (#7931686)
    This is a "Bill Nye" project. [astrobio.net]
  • by siskbc ( 598067 ) on Friday January 09, 2004 @04:00PM (#7931692) Homepage
    Exactly, and since the Martian atmosphere is considerably thinner than Earth's, there will be far less scattering. Without any dust in Mars' atmosphere, i.e., on a clear day, Mars' sky should still look redder than Earth's, or at least whiter, kind of like Earth's sky would look through a lens filtering out the blue.

    Thinner would make it blacker, not bluer - in other words, less scattering in total, but the frequency range won't change. Outside of dust, Mars' atmosphere won't look red. I'm not sure what wavelength CO2 scatters up to, though, so you could get some greener light there. But not red, and not white.

  • by UPAAntilles ( 693635 ) on Friday January 09, 2004 @04:02PM (#7931711)
    No, the sky is blue on Earth due to the exact conditions we have here. If our atmosphere were less dense, the sky would be darker (less diffused light). Our atmosphere is so dense and made up of the right stuff (nitrogen, oxygen, carbon dioxide) that our sky is actually violet. However, because our sun puts off more yellow and green light than any other colors, our eyes have adapted to seeing those colors better, and the sky appears to be "sky blue". As the atmosphere gets less dense, the sky shifts toward the violet end of the EM scale (roygbiv) and gets darker overall. As it gets more dense, it shifts toward the red end (that's why sunsets are red: the sunlight passes through more air at sunset and sunrise). It's actually very complex to determine what color a sky will be. It depends on these factors:
    Incoming light colors
    atmosphere make-up
    atmosphere density
    angle of incidence
    the eye of the observer

    That's why Mars has a butterscotch sky- very low density atmosphere made up almost entirely of CO2
  • by Anonymous Coward on Friday January 09, 2004 @04:02PM (#7931714)
    I would disagree somewhat. These are quite likely to be used by the scientists. The colours on the images were not done on Earth; this is an effect of the lens filters placed on top of the camera lens (photographers use filters to achieve various effects and to change the way the light reaches the film, i.e. polarizers, UV filters, and colour filters for black and white photography).

    NASA has limited bandwidth to the rover, and a small window to communicate with it. They don't have the resources to waste on making separate images for the public. There are probably legitimate reasons, in fact very good scientific reasons, for having tinted these images using the filters. What these reasons are, I do not know, but they have likely been explained somewhere.

    The pancam (that is the camera which takes these photos) has 15 filters, 8 on one camera, and 7 on the other I think. The filters are different, giving NASA plenty of choices based on what they are capturing. The second camera has one filter missing so they can capture an image using no filter should they need to.

    This whole subject was covered on NASA TV during the previous press conference. Does anyone know where to find recordings of these conferences? That would be neat; I wouldn't mind having a copy, as I missed a portion of the last one and would simply like to watch it again. They were very informative.

  • by entrager ( 567758 ) on Friday January 09, 2004 @04:03PM (#7931728)
    For further comparison, when it is closest (as it was recently), Mars itself appears to be about 18 [google.com] arcseconds in diameter when viewed from Earth.
  • Bullshit (Score:4, Informative)

    by Royster ( 16042 ) on Friday January 09, 2004 @04:05PM (#7931785) Homepage
    First, there is no loss of information. The original data streams are maintained and kept available.

    Second, the images *need* processing. They are taken in ambient light which does not contain the same distribution of frequencies as "white" light on Earth. The cameras are designed to be calibrated with the ambient light actually found when they land for later postprocessing.
  • by pyropaul ( 571423 ) on Friday January 09, 2004 @04:06PM (#7931808)
    You're wrong - it's red light that has a low frequency. Blue light has a shorter wavelength, which is why it gets scattered more.
  • Here's how it works (Score:3, Informative)

    by starsong ( 624646 ) on Friday January 09, 2004 @04:23PM (#7932090)
    This story pissed me off so much I almost had a seizure... it's complete unadulterated bullsh*t. Here's how it works: the two cameras on the rover are BLACK AND WHITE CAMERAS. They don't see color. They're not designed to see color. They take GRAYSCALE images, through a series of COLOR filters. So what NASA ends up with are a series of black and white images with little tags on them that say "600nm" or "700nm". To give you an impression as to what it would look like "to us", they convert the black and white images to solid color; e.g. the B&W photo with a "red" tag is now just different shades of red. They take a series of these "color-grayscale" images in different regions of the spectrum, overlay them, and voila... a full-color image.

    Once again.... THERE ARE NO "ORIGINAL" COLOR IMAGES, just black & whites shot through filter wheels. The best we can do is color transformations and approximations, to give you the best sense possible. As for the paranoid nonsense about the sundial/calibration target changing color, THAT'S SUPPOSED TO HAPPEN! What do you think a "calibration" target is??? You certainly wouldn't expect to see a bright blue spot if you looked at it through a red filter, would you? It will look different depending on what particular filters they used that day, and what color transforms they used to put it on the Internet.

    Lastly, that bullcr*p about how the "sky should be blue" is just that---bullcr*p. Mars has almost no atmosphere, and what there is is filled with reddish dust. In the first horizon image we got from Mars (Viking), which the poster referenced, they screwed up the color transformation... it looked too red to be real so they fiddled with the data to make it "look right" [1]. They admitted it right away and all subsequent, peer-reviewed images have shown the correct, reddish sky.

    [1] On Mars: Exploration of the Red Planet 1958-1978, p.384 (NASA History Series).
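A minimal sketch of the compositing starsong describes: grayscale exposures shot through red, green, and blue filters are stacked into the channels of one color image. NumPy is assumed here, and a real pipeline would also apply per-filter calibration:

```python
import numpy as np

def composite_rgb(red_frame, green_frame, blue_frame):
    """Stack three grayscale exposures (2-D arrays, values 0..1)
    into an (H, W, 3) RGB image."""
    return np.stack([red_frame, green_frame, blue_frame], axis=-1)

# Toy 2x2 frames standing in for shots through three filters:
r = np.array([[1.0, 0.5], [0.0, 0.2]])
g = np.array([[0.0, 0.5], [1.0, 0.2]])
b = np.array([[0.0, 0.0], [0.0, 0.9]])

img = composite_rgb(r, g, b)
print(img.shape)  # (2, 2, 3): the top-left pixel comes out pure red
```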
  • by srleffler ( 721400 ) on Friday January 09, 2004 @04:27PM (#7932147)
    They mentioned compositing together several images from different filters to get a fair approximation of what the human eye would perceive if it was there.

    That's just it. The camera captures separate images through various filters (possibly red, green, and blue), which are then merged back on earth to produce a color photo. With only a finite number of filters, this always involves some "color correction". The colored spots on the sundial act as a calibration guide for this process, since they have known spectral characteristics.

    Keep in mind too that they haven't had time yet to take pictures of everything with every filter. Obviously the first "big" photo to take is the high-res panoramic view of the surroundings, captured with whichever filters give the best scientific information (for identification of rock types, etc.) This doesn't necessarily give you the most accurate depiction of what a human would see, although one can try to correct for the filters after the fact.
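One way the sundial patches can drive that correction is a per-channel gain fitted so the imaged patches match their known colors. This is a hypothetical sketch: the patch values below are invented, not MER data, and the real calibration is spectral rather than a simple RGB scale:

```python
import numpy as np

def channel_gains(measured, known):
    """Least-squares per-channel gains g so that g * measured ~ known.
    measured, known: (n_patches, 3) RGB values of the calibration
    patches as imaged and as they are known to be."""
    measured = np.asarray(measured, float)
    known = np.asarray(known, float)
    # For a pure per-channel scale, the least-squares gain is
    # sum(m * k) / sum(m * m), computed channel by channel.
    return (measured * known).sum(axis=0) / (measured ** 2).sum(axis=0)

# Invented readings: the scene reads too red, so the fitted red
# gain comes out below 1 and the blue gain well above 1.
measured = [[0.9, 0.4, 0.2],   # "blue" patch as imaged
            [0.8, 0.6, 0.1]]   # "green" patch as imaged
known    = [[0.2, 0.3, 0.8],
            [0.2, 0.7, 0.2]]

print(channel_gains(measured, known))
```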

  • by whovian ( 107062 ) on Friday January 09, 2004 @04:27PM (#7932150)
    IANANE (N.A.S.A. engineer).

    This guy's web page provides the description (http://world.std.com/~mmcirvin/bluesky.html#sky):

    The color pictures from Mars Pathfinder are a spectacular reminder that the sky is not blue on Mars. Instead, it has colors that have been described as everything from "orange-pink" to "gray-tan", as was discovered in the 1970s by the Viking landers. This is because the atmosphere of Mars is very thin and dusty, and atmospheric light scattering is dominated not by the molecules of gas (in the case of Mars, mostly carbon dioxide) but by suspended dust particles. These are larger than the wavelengths of visible light, and they are reddened by iron oxide, like Martian soil. It's not just Rayleigh scattering, so the power spectrum is different.

  • by slinted ( 374 ) on Friday January 09, 2004 @04:31PM (#7932211)
    Seems like they're working pretty quickly over at JPL to get the colorized versions of the images out to the general public, since this week they've been releasing them between 6 and 18 hours after receiving them. But if you're not happy with their coloration, then I invite those among the slashdot community who know such things to do it themselves.

    The pan cam is black and white, and uses filters to pick out certain colors in the images it takes. If you want, you can read more about what filters are on which half of the pancam (l and r). There are 8 on a side, each with its own particular wavelength and bandpasses. The description of each as well as the numbering scheme is available from the Athena instruments website at Cornell University [cornell.edu]

    The raw images are being freely distributed from the JPL MER website [nasa.gov]. You'll notice camera (l or r) and filter (1-8) used is described from the naming of the pancam files (eg. 2P126471535EDN0000P2303L6M1.JPG)
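As an illustration, the eye and filter could be pulled out of such a filename like this. The pattern is guessed purely from the single example above, so treat it as an assumption rather than the documented naming scheme:

```python
import re

# Guess from the one example: the eye (L/R) and filter number (1-8)
# sit just before the trailing "M<n>" in the filename.
PANCAM_RE = re.compile(r"([LR])([1-8])M\d+\.JPG$", re.IGNORECASE)

def eye_and_filter(name: str):
    m = PANCAM_RE.search(name)
    if not m:
        raise ValueError(f"unrecognized pancam filename: {name}")
    return m.group(1).upper(), int(m.group(2))

print(eye_and_filter("2P126471535EDN0000P2303L6M1.JPG"))  # ('L', 6)
```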

    Just from this last day's images, they have quite a few images in different filters, of the color wheel itself, for calibration. For a better description of the filters themselves, and of the way they plan to (and have *BEGUN* to) calibrate the images, check out several [usra.edu] different [usra.edu] publications [nasa.gov]. (thanks to JPL-Gene and doug_ellison of #maestro irc.freenode.net for the links).

    I, for one, am thankful that they're releasing the raw data/images at all, considering the scale of the global-slashdotting currently going on. The speedy data turnaround, and amazing openness with which they are conducting this mission is really impressive compared to anything else of this scale. Thanks to everyone at JPL, Cornell, and NASA as a whole for all the incredible work from this meager enthusiast.
  • by KewlPC ( 245768 ) on Friday January 09, 2004 @04:34PM (#7932252) Homepage Journal
    The images aren't "tinted" using filters. They use the filters to cut out specific wavelengths, since the CCDs themselves are grayscale IIRC. For example, the red filter blocks out all light except red, so that only red light reaches the CCD. Scientists will take 3 photos, each one using a different filter: one for red, one for green, and one for blue. Then, they combine these 3 images into one, using the image taken with the red filter as the red channel, the one taken with the green filter as the green channel, and the one taken with the blue filter as the blue channel to create a true-color image. The color on these images then needs to be adjusted to reflect what it would look like to a human standing there on Mars.

    And this practice is done practically everywhere in spacecraft imaging systems AFAIK. It's easier to have one CCD that is sensitive to a wide variety of wavelengths (some of which may be outside the visible spectrum) and a bunch of filters so you can control which wavelengths reach the CCD than to have a bunch of CCDs, each of which is only sensitive to a specific wavelength.

    Complaining about this and calling it a conspiracy is like complaining about the common practice of taking many images in quick succession of an object (such as, say, Jupiter) and then averaging them together to cut down on noise. All it does is show the complainer's lack of knowledge.
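The frame-averaging practice mentioned above is easy to demonstrate with synthetic data. NumPy is assumed, with Gaussian noise as a stand-in for sensor noise:

```python
import numpy as np

rng = np.random.default_rng(0)

truth = np.full((64, 64), 0.5)                 # the "real" scene
frames = [truth + rng.normal(0, 0.1, truth.shape) for _ in range(16)]

single_err = np.abs(frames[0] - truth).mean()
stacked_err = np.abs(np.mean(frames, axis=0) - truth).mean()

# Averaging N frames shrinks Gaussian noise by roughly sqrt(N),
# so with N = 16 the error ratio should come out near 4.
print(single_err / stacked_err)
```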
  • Re:HST Images (Score:5, Informative)

    by mbrother ( 739193 ) <mbrother@uwyoWELTY.edu minus author> on Friday January 09, 2004 @04:35PM (#7932272) Homepage
    OK, I do spend part of my life processing HST images (and Chandra images, VLA images, etc.). cynicalmoose is sort of on the right track but the explanation is muddled, confusing spectroscopy with imaging. HST takes no true color images as you would get with color film, for instance. Yes, images are digital with an array of numbers, but so what? An individual image is a simple intensity map *taken through a single color filter*. HST has a pile of filters, some colors like blue, red, etc., even infrared and ultraviolet (so you do need false color for these). Some are narrow-band filters centered on particular emission lines to pick out particular elemental emission (e.g., useful when studying nebulas). You can make a so-called "true-color" image by mixing together several of the individual images taken in different filters, and this can be pretty close to true. The emission-line filters highlight colors in a false but useful way. UV and IR do require false color (and Hubble cannot see X-rays). Sometimes "black and white" single-color images are rendered with a color map that permits subtle detail to be more easily seen (this is pretty common actually, and I have done it myself for press releases, since you rarely pick out filters for the creation of true-color images as there isn't a lot of science in that).
  • by Xolotl ( 675282 ) on Friday January 09, 2004 @04:49PM (#7932493) Journal
    This depends on which images. The famous Hubble image of the Orion nebula [seds.org] was colour corrected by Professor O'Dell [rice.edu] of Rice University to match what he saw visually a long time ago through a very large telescope (possibly the Palomar 100-inch, but I can't remember), back in the days when you could still look through large telescopes. (In order to see colour you need a lot of light, which means either a very bright object or a very large telescope.)

    However, in general you are right, the colour corrections are arbitrary and don't match the "real" colours. Moreover, the brightness stretching and image processing often changes the colour in strange ways. There's a recent paper [soton.ac.uk] which discusses the problem and presents some solutions.

  • by elendel ( 229983 ) on Friday January 09, 2004 @04:51PM (#7932534)
    True, but the JPL images webpage [nasa.gov] has a couple of pictures of the color calibrator [nasa.gov] while _on_ Mars, clearly showing the blue and green.

    So the images are clearly color-doctored. Whether this is part of some grand martian conspiracy I leave as an exercise to the reader...
  • by valmont ( 3573 ) on Friday January 09, 2004 @04:58PM (#7932623) Homepage Journal

    Holger Isenberg, the guy behind mars-news.de, is one of many [earthlink.net] kooks [badastronomy.com] out there who are too ugly and interpersonally incompetent to ever hope to get laid in this life time. He must therefore resort to enclosing himself into his imaginary universe of in-bred conspiracy theories [earthlink.net]. enjoy.

    NASA has always made raw data available to the public, which is what you can leverage through the Maestro software. The red tint observed in the composite pictures made available to the public is, in fact, a fairly accurate representation of the truth [nasa.gov]. Pictures MUST be composited to be available in a JPEG format Joe Six Pack can look at in his browser, hence some level of alteration is necessary. There is no lie. There is no conspiracy. Even your average Joe Six Pack can grok the fact that some basic alterations are necessary to represent flat images. Otherwise Joe Six Pack can always download Maestro.

  • by gyges ( 79472 ) <jasonpjdc@gmail.com> on Friday January 09, 2004 @05:19PM (#7932905)
    NASA's explanation [nasa.gov] for the changes and the need for image processing. I am still not sure they get it exactly right, but that's OK; neither does anyone else [universetoday.com].
  • by merlin_jim ( 302773 ) <{James.McCracken} {at} {stratapult.com}> on Friday January 09, 2004 @05:31PM (#7933043)
    Actually, (AFAIK, IANAS, correct me if I'm wrong) the Hubble images are correct, but they're just using pretty colours to represent different kinds of radiation, not just normal visible light.

    Or to rephrase: the pretty pictures from the Hubble that are accused of being doctored would not be visible to humans at all if the colorization were not tampered with.
  • by rarose ( 36450 ) <`rob' `at' `robamy.com'> on Friday January 09, 2004 @05:35PM (#7933088)
    It appears that due to limited downlink bandwidth (since the HGA isn't fully up yet) they've been making the mosaics from a mix of left and right camera images.
    Due to the different viewpoints (it looks like they're a couple of feet apart) the mosaics have issues... but I suspect that once they downlink a full set of either left or right images the panorama will instantly get much much better.
  • by Ben Jackson ( 30284 ) on Friday January 09, 2004 @06:11PM (#7933517) Homepage
    Several people have explained what's going on, and even quoted the press conference where this was discussed. One of the other points from that same press conference was that the pigments of the calibration target were carefully chosen so that each is useful for multiple filters. That sounds strange if you think about the pancams like a pocket digital, but they're not. They use a filter wheel, so each wavelength images all of the calibration target. By making each "color" on the target cover multiple wavelengths they get more information. I think the specific example was that the blue target shows up as bright white to the near-IR filter they were using. The result is that in the *composite* they are wacky colors, since the aggregate of the calibrations doesn't "make sense".
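    To make the parent's point concrete, here is a rough sketch of how per-filter calibration against a target of known reflectance works, and why a pigment chosen for its response in one band can come out a "wacky" color in a composite. All filter names, reflectances, and pixel values below are invented for illustration; they are not real Pancam numbers.

    ```python
    # Known reflectance of one calibration-target patch, per filter
    # (hypothetical values -- in reality these are measured in the lab).
    known_target_reflectance = {"F_600nm": 0.30, "F_530nm": 0.45, "F_480nm": 0.60}

    # Raw mean pixel value measured on that patch in each filter's image:
    measured_target_value = {"F_600nm": 90.0, "F_530nm": 110.0, "F_480nm": 130.0}

    def filter_gain(filter_name):
        """Per-filter gain that maps raw counts back to reflectance."""
        return known_target_reflectance[filter_name] / measured_target_value[filter_name]

    def calibrate(raw_pixels, filter_name):
        """Scale raw pixel counts from one filter into reflectance units."""
        g = filter_gain(filter_name)
        return [v * g for v in raw_pixels]

    # A composite "color" is then the separately calibrated values from
    # three filters mapped onto R, G, B. A pigment that is bright in
    # near-IR gets a large value in that channel, so it can look wildly
    # different in the composite than it does to the eye.
    red_channel = calibrate([90.0, 180.0], "F_600nm")
    ```

    The point is that each filter is calibrated independently, so the aggregate of the per-filter calibrations has no obligation to "make sense" as a natural-looking color.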

    In other exciting news, this morning they showed some of the mini-TES (thermal emission spectrometer) images. That data is very hard to interpret, so it is ripe for crackpot articles that can be posted on /. with no editorial review.
  • Check your facts. (Score:2, Informative)

    by Anonymous Coward on Friday January 09, 2004 @08:28PM (#7934576)
    For those of you who studied chemistry, remember that air molecules (mostly nitrogen and oxygen) scatter blue light, hence Earth's blue sky.

    Mars has less than 1 percent oxygen in its atmosphere, and Mars's atmosphere is about 1 percent as dense as ours.
    Hmmmm.... maybe the sky on Mars ISN'T blue, except in Total Recall.

    http://calspace.ucsd.edu/marsnow/library/science/climate_history/general_circulation_of_the_atmosphere1.html

    As for the different collage shades, the f-stop changes with different light conditions at different angles, and NASA (sloppily) put the thing together, or they were being rigorously truthful.

    Or it's all a government plot.

    Excuse my spelling, I'm a scientist.
  • by JabberWokky ( 19442 ) <slashdot.com@timewarp.org> on Friday January 09, 2004 @08:29PM (#7934579) Homepage Journal
    I have never seen this. Most composite and colorised photos I've seen match the raw telescope images pretty accurately with the exception of color and contrast. Features (like stars and such) are identical. Nebulae are... well... nebulous, so they up the contrast so you can see them. Stars and galaxies are pretty fine on their own and stand out nicely.

    If you use KDE, fire up KStars - you can do raw database transactions and pull DSS images by right clicking anywhere. Nifty. Then click on a nebula and compare the original to the HST image. It's pretty obvious they are clarifying and not adding anything to the original.

    --
    Evan

  • by fenix down ( 206580 ) on Friday January 09, 2004 @09:22PM (#7934844)
    The crazy German guy complains about those too. In the real world, those weren't actually adjusted that way for any real reason. They were just the very first pictures they had from Viking, and the engineers assembled them and tossed them into the press room at 4 in the morning, after not sleeping for 2 days, before any of the scientists could tell them that they'd screwed it up. Carl Sagan went off into a big thing about our chauvinism for trying to make Mars look like Earth and stuff.
  • by Mr2cents ( 323101 ) on Friday January 09, 2004 @09:33PM (#7934900)
    I watched a press meeting at NASA TV [nasa.gov]. Actually, the rover has 8 filters on each camera, with only a few in common (also, one of them is a sun filter, so the rover can figure out its orientation and direct its antenna to Earth). The blue pigment on the sundial is specially selected because it also has a strong infrared signature. So if you watch the blue spot through the infrared filter, the "blue" spot turns out red. Another mystery solved.
  • by S_Dub ( 739327 ) on Friday January 09, 2004 @09:45PM (#7934963)
    If you use Maestro [telascience.org], you can download the actual original images as first seen by Spirit and the scientists at NASA.
  • This is not true. (Score:5, Informative)

    by mbone ( 558574 ) on Friday January 09, 2004 @10:32PM (#7935201)
    I worked (from MIT) with Viking Lander data, not camera data, but I followed all of this closely at the time and had lots of discussions with people at JPL about this and other topics.

    The Viking landers used a scanning (spot) camera, which was slow but which was also one of the first really good scientific cameras sent on a space probe. It was designed to provide a very repeatable color readout of what it saw, but, like most such cameras, was subject to drift, so color calibration targets were included on top of each lander.

    When Viking Lander 1 landed, the first color pictures released had a blue sky. These were done with the color balance adjusted "by eye" at JPL. When they had time to analyze the color targets, they realized that they had made a mistake, and that the sky was red.
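    The kind of correction those calibration targets make possible can be sketched in a few lines: the camera's response drifts, but the targets' true reflectances are known, so a per-channel gain can be recovered and applied to the whole image. The patch values below are invented for illustration; the real Viking pipeline was of course far more involved.

    ```python
    # Known true reflectances of two gray patches on the target (R, G, B):
    true_patches = [(0.20, 0.20, 0.20), (0.60, 0.60, 0.60)]
    # What the drifted camera actually measured for those same patches:
    measured_patches = [(0.30, 0.18, 0.16), (0.90, 0.54, 0.48)]

    def recover_gains(true_vals, measured_vals):
        """Least-squares gain per channel (no offset term, for simplicity)."""
        gains = []
        for c in range(3):
            num = sum(t[c] * m[c] for t, m in zip(true_vals, measured_vals))
            den = sum(m[c] * m[c] for m in measured_vals)
            gains.append(num / den)
        return gains

    gains = recover_gains(true_patches, measured_patches)

    def correct(pixel):
        """Apply the recovered per-channel gains to one pixel."""
        return tuple(v * g for v, g in zip(pixel, gains))

    # A pixel that looked too red on the drifted camera is pulled back
    # toward its true neutral gray:
    balanced = correct((0.90, 0.54, 0.48))
    ```

    Adjusting "by eye" skips this step entirely, which is exactly the mistake described above: the first release looked plausible to humans but disagreed with what the targets implied.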

    I specifically remember hearing that they had adjusted the color balance in the first release image, and had to adjust it back to get true color.

    They had no reason to lie and were a little embarrassed to have made the initial mistake.

    So I regard this article as being without merit.
  • by theolein ( 316044 ) on Saturday January 10, 2004 @03:25AM (#7936341) Journal
    I grew up in Southern Africa at an altitude of around 1500 meters (somewhere near 5000 feet) above sea level. I remember the sky of my childhood being a dark deep blue. Take a look at the pictures taken at the top of K2 or Everest, or, even better, if you can find them, colour images of the X-15 experimental planes of the 60s. At the altitude the X-15 reached soon after launch, close to 30'000 meters (100'000 feet), the sky is almost black.

    That is, as most of us know, because the very low air density at higher altitudes scatters far less light.

    The average surface air density on Mars is more or less the same as it is on Earth at 30'000 meters. That means that the sky on Mars will probably be almost black with a small band of colour on the horizon.

    That band of colour will be due to so-called Rayleigh scattering, by which air molecules scatter the light passing through them. On Earth, oxygen and nitrogen, being small molecules, preferentially scatter light of shorter wavelength (blue); on Mars the atmosphere is mostly carbon dioxide. The light thus produced on Mars will be NOT red and NOT blue but somewhere in the middle (yellow/brown), as the larger carbon dioxide molecules scatter somewhat longer wavelengths than on Earth, but not enough to make the light seem red; that would require a gas of larger molecules such as methane, which is a major component of the atmosphere of Titan, Saturn's moon, and lo and behold, we get a deep orange light there.
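    The wavelength dependence of Rayleigh scattering behind the parent's argument is easy to put in numbers: scattered intensity goes as 1/wavelength^4, and the total amount of scattering scales roughly with how much gas the light traverses. The figures below are illustrative only.

    ```python
    def rayleigh_ratio(lambda_a_nm, lambda_b_nm):
        """Relative scattered intensity of wavelength a vs wavelength b,
        using the Rayleigh 1/wavelength**4 law."""
        return (lambda_b_nm / lambda_a_nm) ** 4

    # Blue (~450 nm) is scattered several times more strongly than red (~650 nm),
    # which is why a thick atmosphere gives a blue sky.
    blue_vs_red = rayleigh_ratio(450.0, 650.0)

    # To first order, total scattering scales with column density, so an
    # atmosphere with ~1% of Earth's surface density scatters roughly ~1%
    # as much light overall: a dark sky with only a faint tint.
    mars_relative_scattering = 0.01 * blue_vs_red
    ```

    This is why the thinness of the Martian atmosphere matters as much as its composition: even if the scattering favored blue, there is so little of it that the sky stays dark, with most of the visible tint coming from suspended dust rather than the gas itself.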
