NASA Switches Curiosity Rover To Backup Computer Following Glitch (extremetech.com)

NASA has switched its Curiosity rover over to its backup computer system after the main system started experiencing errors last month. "Many NASA spacecraft and surface missions have redundant systems built in," reports ExtremeTech. "Once they've launched from Earth, there's no way to repair damage to critical systems, so it makes sense to double up on the vital components. That includes Curiosity's computers, which were designed specifically for the harsh environment on Mars." From the report: The rover has a pair of identical brains, each running a 5-watt RAD750 CPU. This chip is part of the PowerPC 750 family, but it has been custom designed to survive high-radiation environments such as you'd find on Mars or in deep space. These radiation-hardened CPUs cost $200,000 each, and NASA equipped the rover with two of them. Each computer also has 256 kB of EEPROM, 256 MB of DRAM, and 2 GB of flash memory. They run identical VxWorks real-time operating systems. When Curiosity landed on Mars in 2012, it used the "Side-A" computer. However, just a year later in 2013 (Sol 200), that computer failed due to corrupted memory. The rover got stuck in a boot loop, which prevented it from processing commands and drained the batteries. NASA executed a swap to Side-B so engineers could perform remote diagnostics on Side-A. In the following months, NASA confirmed that part of Side-A's memory was unusable and quarantined it. They kept Curiosity on Side-B, though. With Side-B now experiencing problems that prevent the rover from storing key science and engineering data, NASA has switched Curiosity back to Side-A while it investigates the problem, which it can only do when the other computer is active. "NASA hasn't said how much of Side-A's RAM is bad, and it only had 256MB to start, but the team does intend to move Curiosity operations back to Side-B if possible," the report adds. "For now, the mission is functioning normally on Side-A."
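
How a rover swaps computers with no one around to press the button is worth sketching. Below is a minimal C illustration of retry-then-failover logic, assuming an independent hardware watchdog; the function names (reset_computer, power_cycle_to) and the retry limit are invented for the example, not taken from NASA's flight software.

    /* Hypothetical sketch of watchdog-driven failover to a redundant
     * computer. Not flight code; names and thresholds are invented. */
    #include <stdio.h>
    #include <stdint.h>

    enum side { SIDE_A, SIDE_B };

    static enum side active_side  = SIDE_A;
    static uint32_t  failed_boots = 0;

    #define MAX_BOOT_ATTEMPTS 3u   /* give the prime side a few chances */

    /* Stubs standing in for real power-control hardware. */
    static void reset_computer(enum side s) { printf("reset side %c\n",   s == SIDE_A ? 'A' : 'B'); }
    static void power_cycle_to(enum side s) { printf("swap to side %c\n", s == SIDE_A ? 'A' : 'B'); }

    /* Called by an independent hardware watchdog when the active computer
     * fails to check in before its deadline (e.g. stuck in a boot loop). */
    static void watchdog_expired(void)
    {
        if (++failed_boots >= MAX_BOOT_ATTEMPTS) {
            active_side  = (active_side == SIDE_A) ? SIDE_B : SIDE_A;
            failed_boots = 0;
            power_cycle_to(active_side);  /* bring up the healthy twin    */
        } else {
            reset_computer(active_side);  /* retry on the same side first */
        }
    }

    int main(void)
    {
        for (int i = 0; i < 4; i++)
            watchdog_expired();           /* simulate repeated boot failures */
        return 0;
    }
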
  • Only? (Score:5, Informative)

    by xonen ( 774419 ) on Saturday October 06, 2018 @02:38AM (#57436398) Journal

    it only had 256MB

    What a strange use of the word `only`. 256MB is a lot, really a lot.

    We could run our desktop computers with 256MB of RAM at ease if we wanted to. As a comparison: when Windows 2000 was released (around 2000...) most computers only had around 64MB of memory, and that was already a reasonably beefy machine. Windows 2000 liked a bit more, around 128MB, but would run on 64MB just fine, albeit a bit slow. And this same Windows 2000 offered a desktop experience not much different from the (Windows) desktops we are using today, including proper plug and play support, multimedia and whatever fancy features you liked. When XP came a couple of years later, most people still hadn't upgraded to 256MB of memory.

    Now, compare that to an embedded computer that does not have to waste any memory on fancy graphics, user interfaces and whatnot, and you will notice that 256MB is a lot. Really a lot. Try writing code to fill that up - a single human couldn't, and even a whole team can't. Obviously it'll need some memory to store images etc., but the 2GB of flash is also a lot, comparable to what the first digital cameras came with.

    And for those saying 'long time ago, long time ago' - even today it's pretty common to write embedded software that has no more than a single kilobyte of RAM available. With all the modern webcrap using gigabytes of memory for trivial tasks, people seem to have lost their feeling for quantity. 256MB is more bits than I could count in my lifetime.
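
    To make that concrete: a one-kilobyte RAM budget looks something like this hypothetical C sketch - everything statically allocated, no heap, no recursion, with a compile-time check that the budget holds. The buffer sizes are invented for illustration.

        /* Hypothetical 1 KB embedded RAM budget: every buffer statically
         * allocated, sizes invented for illustration. */
        #include <stdint.h>

        #define RAM_BUDGET 1024u

        static uint8_t  uart_rx_buf[64];      /* serial input buffer         */
        static uint8_t  uart_tx_buf[64];      /* serial output buffer        */
        static int16_t  sensor_samples[128];  /* 256 bytes of sample history */
        static uint32_t counters[16];         /* 64 bytes of event counters  */
        static uint8_t  fsm_state[8];         /* protocol state machine      */

        /* Fail the build if the static budget is exceeded. */
        _Static_assert(sizeof uart_rx_buf + sizeof uart_tx_buf +
                       sizeof sensor_samples + sizeof counters +
                       sizeof fsm_state <= RAM_BUDGET,
                       "static RAM budget exceeded");

        int main(void) { return 0; }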

    • by Mal-2 ( 675116 )

      It's a data collection instrument. That data has to sit somewhere; it doesn't have a continuous uplink to the MRO. Photos take space.

    • by Njovich ( 553857 )

      Yes, I guess most people who programmed in the '80s or '90s frowned when reading that; there is very little you can't do with 256MB. Having said that, they need to buffer 2-megapixel images before sending - and I imagine they don't use lossy compression. While bandwidth may still be the limiting factor, that buffering may eat into what's left of the 256MB.

      • Re:Only? (Score:4, Informative)

        by ls671 ( 1122017 ) on Saturday October 06, 2018 @04:48AM (#57436630) Homepage

        Seriously?

        You didn't work too hard on that one :)

        https://en.wikipedia.org/wiki/... [wikipedia.org]

        Each camera has eight gigabytes of flash memory, which is capable of storing over 5,500 raw images, and can apply real time lossless data compression.[62] The cameras have an autofocus capability that allows them to focus on objects from 2.1 m (6 ft 11 in) to infinity.[65] In addition to the fixed RGBG Bayer pattern filter, each camera has an eight-position filter wheel. While the Bayer filter reduces visible light throughput, all three colors are mostly transparent at wavelengths longer than 700 nm, and have minimal effect on such infrared observations.[62]
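
        Quick sanity check on those numbers (my arithmetic; the 12 bits/pixel and the compression ratio are inferences, not from the article):

            /* 8 GB across ~5,500 raw images is about 1.45 MB each, which
             * fits a 2-megapixel Bayer sensor at ~12 bits/pixel after
             * roughly 2:1 lossless compression. */
            #include <stdio.h>

            int main(void)
            {
                double per_image = 8e9 / 5500.0;      /* bytes per stored image */
                double raw_2mp   = 2e6 * 12.0 / 8.0;  /* 2 MP at 12 bits/pixel  */
                printf("stored: %.2f MB  raw: %.2f MB  ratio: %.1f:1\n",
                       per_image / 1e6, raw_2mp / 1e6, raw_2mp / per_image);
                return 0;
            }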

    • Re:Only? (Score:5, Interesting)

      by FaxeTheCat ( 1394763 ) on Saturday October 06, 2018 @05:10AM (#57436688)
      My thoughts as well.
      I used to have a Dell CPx laptop with 256 MB in the early 2000s, and that worked quite well. Not to even mention Windows 3.1 on 4 MB.

      Just out of cutiosity, I googled the RAM requirements for VxWorks; Version 6, from 2004, has the following requirements:
      VxWorks CISC processors require 1 MB of RAM for a development system that includes the standard VxWorks features, such as the shell, network, file system, loader, and others.
      RISC processors typically require more RAM space: 2 MB of RAM is the minimum; 4 MB is encouraged. For a scaled-down production system, the amount of RAM required depends on the application size and the options selected.

      So 256 MB leaves most of the RAM available for the applications.
      • by ls671 ( 1122017 )

        There are almost no applications that require memory from that 256MB. Each unit is pretty much independent. For example, the Mast Camera has 16GB of flash memory and can do hardware compression on the fly.

        So yes, 256MB is plenty. It is almost like 256MB dedicated to the kernel.

      • by kackle ( 910159 )

        Just out of cutiosity,

        Typo aside, I see what you did there!

      • Voyagers 1 and 2 had less than 70 kB of RAM [wired.com]. Data was stored on tape drives.

        You don't need a lot of RAM to provide lots of functionality. You only need a lot to allow for lazy/sloppy programming. We've just moved towards the lazy/sloppy end because transistor fab technology has driven the price of RAM into the dirt, meaning it's cheaper to put 4-16 GB into every modern computer than it is to pay programmers more to write small and concise code.
    • by Kjella ( 173770 )

      It's really easy to overwhelm a computer with sensor data, though, which is kinda the point of the rover being there. CERN produces data at about 25GB/s, and even after specialized selection and compression they now have >200 PB stored. Considering the very limited bandwidth back to Earth of only 100-250 Mbit/day (= 12-32 MB), I'm pretty sure they wish they had more CPU and more memory to pre-process it more and maybe derive its own results like say a 3D map of the environment both for itself and to rel
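
      For reference, the Mbit-to-MB conversion above checks out; a trivial check in C, taking the 100-250 Mbit/day figure as given:

          /* 100-250 megabits per day, converted to megabytes per day. */
          #include <stdio.h>

          int main(void)
          {
              printf("%.1f-%.2f MB/day\n",
                     100.0 / 8.0,    /* 100 Mbit = 12.5 MB  */
                     250.0 / 8.0);   /* 250 Mbit = 31.25 MB */
              return 0;
          }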

      • by ls671 ( 1122017 )

        Would you guys please stop this? :)

        256MB is like 640MB when it used to be 640KB for the job it has to do. Do you figure the guys who sent it there were on some kind of budget or that they were imbeciles?

        I already posted with arguments and links and I don't want to repeat myself but the rover has many gigabytes of memory.

        Then again, we could blame click-baiting summaries, but that's another story.

        • I already posted with arguments and links and I don't want to repeat myself but the rover has many gigabytes of memory.

          How is having gigabytes of flash relevant to the amount of DRAM? They have different purposes. You have not posted a single argument about the RAM requirements.

          • by ls671 ( 1122017 )

            The Mast Camera has its own self-sufficient "DRAM" buffers, so in the end flash vs DRAM is irrelevant. It isn't like the mast cam is going to use the main board's memory as a buffer if its flash memory isn't fast enough. Same for sending pictures back to Earth: the data probably bypasses the main computer and goes directly from the camera to the transmitter.

            So it goes for the rest of that rover.

            It's a different architecture from your typical running Linux kernel; it does a lot less, letting the other hardware do the job by itself. It

    • by Aereus ( 1042228 )

      Win2K on a 64MB system would be running really lean. From what I remember, my pared-down boot config was something like 52MB while still having a fully functional system. That doesn't leave much room for running applications.

      • I've run Windows 2000 on 64 MB. It runs but it's pretty painful. Windows 2000 really needs 128 MB, and really isn't happy until you have 256 MB.

        With Windows XP double those numbers. For XP SP3, double them again.

  • by Required Snark ( 1702878 ) on Saturday October 06, 2018 @02:58AM (#57436426)
    Boeing to develop next-generation radiation-hardened space processor based on the ARM architecture [slashdot.org]

    It will be used by NASA and the military and should be available in 2020.

    • by DanDD ( 1857066 ) on Saturday October 06, 2018 @03:08AM (#57436442)

      Your link circles back to this same Slashdot article, but a Google search of your link title finds this:

      https://www.militaryaerospace.... [militaryaerospace.com]

      This is cool, thanks!

      Also interesting:

      An alternate approach [rankred.com] that doesn't require expensive radiation hardening also seems to have worked with a half-day transition through the Van Allen belt [wikipedia.org]. It will be interesting to see if their approach can stand up over time on a long mission.

      • Using COTS processors for short-lived missions in space is not a new approach and has been done numerous times in the past. The radiation environment on Mars is actually not very stringent for electronics. The main advantage of using a qualified RAD750 for this mission could be the extended temperature range over which it has been demonstrated to operate. The rover can experience quite extreme temperatures on the Martian surface...
        • by religionofpeas ( 4511805 ) on Saturday October 06, 2018 @04:55AM (#57436656)

          Even if radiation on the Martian surface is limited, the computers still have to survive the trip from Earth to Mars. The problem is that cosmic radiation causes permanent damage to the crystalline structure of the semiconductors, building up over time until they stop working completely. COTS processors simply wouldn't survive, and redundancy doesn't help much if all the units are getting damaged at the same rate.

          • by ls671 ( 1122017 )

            That's why they have so called "military-grade" versions of the COTS. Now soon to be called "cyber-space-grade" I guess...

            So in the end, they are not supposed to be COTS processors.

            https://en.wikipedia.org/wiki/... [wikipedia.org]

            • Most military applications deal with the same radiation environment as consumer electronics, so I wouldn't expect them to be hardened against cosmic radiation.

              • by ls671 ( 1122017 )

                There is a lot of bullshit going on with so-called "military-grade" CPUs.

                But if you care to read the link in my GP post, this seems to be the real deal, it is called "radiation-hardened":
                https://en.wikipedia.org/wiki/... [wikipedia.org]

                Do you think the guys who sent the rover there were imbeciles or something?

          • by Kjella ( 173770 )

            Even if radiation on Mars surface is limited, the computers still have to survive the trip from Earth to Mars.

            Well if we're assuming it's a manned mission it'll have to have a radiation shielded human habitat that can keep it under the 1 Sv career limit. Since the weighting factor is always >1, that's less than 1 Gy of radiation. That's there and back again so less than 0.5 Gy on the way to Mars. From what I understand that's not a whole lot, particularly since you don't have to worry about transient errors or jumping currents during operation just structural damage. You take them from the habitat, bring them to

      • Comment removed based on user account deletion
        • by religionofpeas ( 4511805 ) on Saturday October 06, 2018 @06:51AM (#57436928)

          Also, triple redundancy only helps to protect against SEEs (Single Event Effects): cases where an ionizing particle changes some charge and flips a bit in memory. These are recoverable errors; the processor can reset the faulty bit and continue normally.

          The problem is that part of the cosmic radiation consists of heavy-element nuclei flying at near the speed of light. These don't just flip some bits; they have enough energy to permanently dislocate atoms in the crystalline silicon lattice. Due to this damage the processor develops higher leakage currents, and it will eventually stop working altogether as the damage accumulates.

          Shielding is impractical, because a thin layer of metal (good enough to block gamma radiation) can't stop these highly energetic particles. Even when a particle hits the shielding it isn't stopped; instead you get a shower of secondary particles, still energetic enough to cause damage. Radiation-hardened processors use a different process technology that is less sensitive to this damage (at the cost of lower logic density).
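
          For reference, the SEE masking described above is typically implemented as triple modular redundancy with a majority vote; a generic C sketch of the technique (an illustration, not any mission's actual code):

              /* Triple modular redundancy: keep three copies of each value,
               * take a bitwise majority vote to mask a single upset, and
               * write the voted result back to scrub the damaged copy. */
              #include <stdint.h>
              #include <stdio.h>

              typedef struct { uint32_t copy[3]; } tmr_u32;

              static uint32_t tmr_read(tmr_u32 *v)
              {
                  uint32_t a = v->copy[0], b = v->copy[1], c = v->copy[2];
                  /* A bit is 1 iff at least two of the three copies say 1. */
                  uint32_t voted = (a & b) | (a & c) | (b & c);
                  v->copy[0] = v->copy[1] = v->copy[2] = voted;  /* scrub */
                  return voted;
              }

              int main(void)
              {
                  tmr_u32 x = { { 0xDEADBEEF, 0xDEADBEEF, 0xDEADBEEF } };
                  x.copy[1] ^= 0x00000400u;   /* simulate a single-event bit flip */
                  printf("voted value: 0x%08X\n", (unsigned)tmr_read(&x));  /* 0xDEADBEEF */
                  return 0;
              }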

        • SpaceX's solution of using triple redundancy COTS has a higher power requirement - something you don't want on a solar powered rover being dropped on a planet with less light than Earth.

          Fortunately Curiosity is not such a rover. It's powered by an RTG, not solar panels. That's not to say it doesn't have a power budget, but it certainly could run another computer or two.

  • So somehow, the MMRTG [wikipedia.org] that they installed on this beastie, which was supposed to steadily provide over 2kwh of electricity for at least 14 years, magically decayed its plutonium fuel source in less than 6?

    I think that this rover has a messed up idea of what "half life" means.

    • They're not talking about the MMRTG but about the Li-Ion batteries that are also on the rover.

    • by K. S. Kyosuke ( 729550 ) on Saturday October 06, 2018 @05:08AM (#57436684)
      The RTG on the rover is "right-sized" to provide the necessary *average* consumption, as anything more would be wasting plutonium. However, since momentary consumption fluctuates, there's a battery buffer to smooth over the load profile.
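
      Roughly how that buffering works out, as a sketch - the ~110 W figure matches the MMRTG's published electrical output early in the mission, while the load numbers here are invented for illustration:

          /* Illustrative RTG + battery energy balance for one sol.
           * Only the ~110 W RTG output reflects published specs. */
          #include <stdio.h>

          int main(void)
          {
              double rtg_w   = 110.0;  /* continuous RTG electrical output    */
              double drive_w = 250.0;  /* assumed draw while driving/sampling */
              double drive_h = 2.0;    /* assumed high-load hours per sol     */
              double idle_w  = 60.0;   /* assumed baseline load otherwise     */

              double deficit_wh = (drive_w - rtg_w) * drive_h;  /* from battery  */
              double surplus_w  = rtg_w - idle_w;               /* recharge rate */
              printf("battery covers %.0f Wh during the drive,\n", deficit_wh);
              printf("recharged in about %.1f h of idle surplus\n",
                     deficit_wh / surplus_w);
              return 0;
          }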
    • From the linked Wikipedia article

      "The MMRTG is designed to produce 125 W electrical power at the start of mission, falling to about 100 W after 14 years".

      Only a fraction of the 2 kW of thermal energy is converted to electricity.
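
      Worked out (my arithmetic, using the 87.7-year half-life of Pu-238 and the spec figures above):

          /* Thermoelectric efficiency and fuel decay, back of the envelope. */
          #include <math.h>
          #include <stdio.h>

          int main(void)
          {
              double half_life = 87.7;    /* Pu-238, years                     */
              double p_thermal = 2000.0;  /* MMRTG thermal output, watts       */
              double p_elec    = 125.0;   /* electrical output at start, watts */

              printf("thermal-to-electric efficiency: %.1f%%\n",
                     100.0 * p_elec / p_thermal);             /* ~6.3% */
              printf("fuel output after  6 years: %.1f%%\n",
                     100.0 * pow(0.5,  6.0 / half_life));     /* ~95%  */
              printf("fuel output after 14 years: %.1f%%\n",
                     100.0 * pow(0.5, 14.0 / half_life));     /* ~90%  */
              /* Decay alone leaves ~112 W at year 14; the spec's 100 W also
               * reflects thermocouple degradation. So a struggling rover at
               * year 6 can't be blamed on the plutonium. */
              return 0;
          }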
      • by mark-t ( 151149 )

        I said 2kwh, not 2kw.... over a day, 100W is 2.4kwh.

        Although I realize that I missed the words "per day"....

        Nonetheless, running out of power only 6 years into a mission that is supposed to last at least a decade and a half does not bode well.

        • The MMRTG isn't running out of power. The problem was that the Li-Ion batteries were getting drained by the computer using too much.

      • by MrKaos ( 858439 )

        Only a fraction of the 2 kW of thermal energy is converted to electricity.

        Some of the thermal energy is used to keep the rover's computers at operating temperature [nasa.gov], the way Voyager does. It's a common technique used to keep spacecraft functioning.

        The MMRTG [nasa.gov] looks interesting.

        • by jandrese ( 485 )
          The initial mission was for two years [space.com], but that's kind of misleading because it's mostly the point at which the mission managers can write "mission successful" on their end-of-year performance reviews. In practice most people expected it to last a decade or more.
          • by MrKaos ( 858439 )

            The initial mission was for two years [space.com]. ... In practice most people expected it to last a decade or more.

            Thanks for the article. It looks like a case of setting expectations very low. If the stated mission length is one fifth of what's expected in practice, that doesn't suggest a lot of confidence on the part of the people running the project.

    • was supposed to steadily provide over 2kwh of electricity for at least 14 years

      Nothing about this mission was supposed to last for 14 years.

      • by mark-t ( 151149 )
        Citation? Because that's not what I read... I had heard that it was going to go for at least 14 years, and probably longer than that, depending on performance at the time.
        • The initial mission was for two years. It has since been extended to be "as long as it's scientifically viable".
        • Citation?

          NASA. It's not hard to find information about a big public mission like this. The goals of the Curiosity mission were multi-fold, a lot of which involved testing a new landing system. It was primarily an experiment to test a hybrid powered descent system. A bit of science was included as well. The rover had instruments to determine if the conditions for microbial life exist, and it answered that question within a few months of landing. Done. Mission over.... except....

          The mission included a hardware test the ro

          • by mark-t ( 151149 )

            NASA.

            More specifically?

            Because as I said, all the documentation I've read says that its power source was good for at least 14 years of steadily providing 100W of power or more.

            • Documentation of the power source and the actual mission life are two completely different things. You heard that it was capable of going for 14 years; good story. That was never a consideration during design.

  • "Many NASA spacecraft and surface missions have redundant systems built-in," reports ExtremeTech. "Once they've launched from Earth, there's no way to repair damage to critical systems, so it makes sense to double-up on the vital components. That includes Curiosity's computers, which were designed specifically for the harsh environment on Mars."

    I guess nobody at NASA watches Star Trek...

    GILORA: Starfleet code requires a second backup?
    O'BRIEN: In case the first backup fails.
    GILORA: What are the chances that

  • Apple should have purchased VxWorks and made it into the new Mac OS, instead of being taken over by NeXT and taking on all that baggage. This was distinctly possible at the time when Apple was desperately seeking out a new OS to run on their PowerPC architecture.

    If they had done so then today they could be crowing about MacOS-run machines trundling around on Mars. Instead they've got billions of land-apes using their software to share selfies.

    • VxWorks is a real time OS, that's not what you want in a consumer OS.

      • A couple decades ago VxWorks was shipping out a demo boot floppy diskette. You could boot it on an x86 system with a floppy drive and it brought up a whole OS desktop that included networking and a Web browser.

        • I can't find any reference for that. Are you sure you're not thinking of QNX ?

          Either way, it's not impossible to use a real time OS for desktop use. It's just not very practical. A real time OS is usually slower, and less feature-rich, because it is required to be completely predictable in its worst case timing.

          • Right. Sorry about the mixup. QNX has been marketed in the past as a 'real time' solution, so I doubled down on my error.
