The Perseverance Rover CPU Has Similar Specs To a Clamshell Ibook From 2001 (baesystems.com)

An anonymous reader writes: NASA's Perseverance rover, which is currently exploring Mars, has as its CPU a BAE Systems RAD750 running at 200 MHz and featuring 256 megabytes of RAM with 2 gigabytes of storage. This is a radiation-hardened version of the PowerPC G3, with specs roughly equivalent to the clamshell iBook that Reese Witherspoon used in Legally Blonde back in 2001. This follows a tradition of old tech on space rovers — the Sojourner rover, which explored Mars in 1997, used an Intel 80C85 running at 2 MHz, similar to what could have been found in the classic Radio Shack TRS-80 Model 100 portable from 1983.
In a comment on the original submission, long-time Slashdot reader Mal-2 argues, "There's not as much distance between the actual capabilities of a CPU now and twenty years ago as there would be if you made the same comparison a decade ago." In the last 12 years or so, CPUs have gotten more efficient and cooler-running (and thus suitable for portable devices) to a much greater degree than they've actually gained new functionality. Retro computing is either going to stay stuck in the 1990s, or it's not going to be very interesting in the future.
  • Retro computing is either going to stay stuck in the 1990s, or it's not going to be very interesting in the future.

    Maybe this time around we'll learn to be more reliable and efficient with our hardware.

    • It costs more to write software that is efficient, makes effective use of available hardware, and meets functional requirements.

      I'd be happy to work on a team that replaces Windows and Linux with something good. I don't think anyone else will be happy with it when it takes us 5 years to write it and we'd have to charge 10x what Windows 10 costs.

        The biggest problem you'll face is that people will need your new OS to run their Windows and/or Linux software... and reimplementing enough of either of those OSes to accomplish that would take much longer than 5 years (and also you'd end up with an OS that inherited most of the design flaws of the OS it was compatible with).

        Alternatively you can write your ideal non-compatible OS and then watch nobody use it, because there is no software to run on it, and watch nobody write soft

          Linux syscall emulation has been done before. Not that there is actually a very large library of commercial Linux software worth supporting.

          One alternative is to not support pre-existing software at all. An opportunity to sell everyone new software. Now I've turned a technical problem into a marketing opportunity (and thus not engineering's problem). I have been in the industry for quite some time ...

      • "It costs more to write software that is efficient, makes effective use of available hardware, and meets functional requirements."

        Not necessarily. The cost of a line of code is essentially constant, so yes, once you go into optimizations that actually require more code, that will increase the cost. However, most of our bloat today comes from unsuitable abstractions. For example, using a browser as a GUI toolkit for your desktop app doesn't save any money or code, but it adds huge amounts of bloat.

        • The cost of a line of code is essentially constant,

          That hasn't been my experience in automotive safety. There is quite a bit more process, such as bidirectional traceability, that prevents people from sitting down and banging out some new library routines in an afternoon.

          once you go into optimizations that actually require more code

          Good optimizations hardly require any more code, just more planning, design, experiments, and metrics.

          However most of our bloat today comes from unsuitable abstractions.

          100% !!

          Top to bottom, we screw ourselves in the industry by adopting bad abstractions. In a large project, several components can have a sort of "impedance mismatch" where the abstraction a

      • by tlhIngan ( 30335 )

        It costs more to write software that is efficient, makes effective use of available hardware, and meets functional requirements.

        I'd be happy to work on a team that replaces Windows and Linux with something good. I don't think anyone else will be happy with it when it takes us 5 years to write it and we'd have to charge 10x what Windows 10 costs.

        It also costs a lot of money to write bug-free code. You have software that has to have pretty much 0 bugs because updating it is risky - it cost hundreds of billions o

    • I'd expect a burst of retro computing when the date stamp hits the 32-bit mark from the "epoch" time in 1970. We've already passed one billion seconds, but 4294967296 seconds means an overflow of unsigned 32-bit fields (signed time_t fields overflow even sooner, at 2147483648, in January 2038). It's likely to be a problem then for retrocomputing, or for long-stable operating system emulators.
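
      To make the wraparound concrete, here's a minimal C++ sketch of the signed case (the one that hits first, in January 2038). The constants are standard; note that ctime prints local time, so the output shifts with your timezone:

      #include <cstdint>
      #include <cstdio>
      #include <ctime>

      int main() {
          // Largest value a signed 32-bit timestamp can hold:
          // 2^31 - 1 seconds after 1970-01-01, i.e. 2038-01-19 03:14:07 UTC.
          int32_t t = INT32_MAX;
          std::time_t wide = t;
          std::printf("last 32-bit second: %s", std::ctime(&wide));

          // One tick later the value wraps to -2^31, landing back in 1901.
          // (We cast through uint32_t because signed overflow is undefined.)
          t = static_cast<int32_t>(static_cast<uint32_t>(t) + 1u);
          wide = t;
          std::printf("one second later:   %s", std::ctime(&wide));
      }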

  • It's a tradition of known-to-be-reliable tech. By definition, that rules out the bleeding edge that gets pushed out to consumers. The qualification process takes a while. Then there's radiation-hardening, which is easier to achieve when the individual components on a processor are larger.
    For Perseverance, there's a third factor: it has much in common with Curiosity (including its computers), whose design started in 2004.

    • I suspect Perseverance did what any good power user would do, and that is uninstall all the cruft first.

      Seriously, though, what do we need eight cores of 4GHz processing power for today?
      1. 3D scene processing and rendering
      2. Multi-megapixel 2D graphics bit-bashing for a GUI
      3. Overhead for .NET / Java / Metro and other managed or interpreted languages du jour
      4. 200 background processes for advertising, haptic devices, and several different companies' auto-updates, many of them mutually tangled and intertwin

      • by Gabest ( 852807 )

        The many cores available transformed how programs are written. There is usually a main thread talking to the user and a pool of threads processing everything else, completely transparently to the programmer. I'm thinking of the async/await paradigm and closures that popped up in all major languages. Before that you had to manage and sync every thread yourself; now it's just do this, do that, the more cores the sooner it finishes, and gather the result.
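
        C++ has no async/await keywords, but std::async shows the same shape of the pattern: dispatch work, let the runtime farm it out to cores, gather the results. A minimal sketch (the worker function and chunk sizes are invented for illustration):

        #include <future>
        #include <iostream>
        #include <vector>

        // Worker: sum the half-open range [lo, hi).
        long long sum_range(long long lo, long long hi) {
            long long s = 0;
            for (long long i = lo; i < hi; ++i) s += i;
            return s;
        }

        int main() {
            // "Do this, do that": each chunk becomes a task the runtime
            // may schedule on its own core; no manual thread management.
            std::vector<std::future<long long>> parts;
            for (long long lo = 0; lo < 1000000; lo += 250000)
                parts.push_back(std::async(std::launch::async,
                                           sum_range, lo, lo + 250000));

            // "Gather the result": get() blocks until each task is done.
            long long total = 0;
            for (auto& f : parts) total += f.get();
            std::cout << total << "\n";  // 499999500000
        }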

        A 66MHz 486 ran Windows just fine in 1992, and there is little that Windows today actually does, in any real value-added sense, that it didn't do then.
        And two or three years later it ran Java IDEs just fine on Windows. So no idea about your point 3 above ...

          A 66MHz 486 ran Windows just fine in 1992, and there is little that Windows today actually does, in any real value-added sense, that it didn't do then.
          And two or three years later it ran Java IDEs just fine on Windows. So no idea about your point 3 above ...

          There was no IDE for Java when it was released in 1995. If you remember, it was basically Notepad and a DOS session. The first IDE was Visual Café, which came out around 1997. Microsoft brought out Visual J++ shortly after. Both of these were written in a real language, not in Java. Next came VisualAge, which morphed into Eclipse. That too was written in a real language. There wasn't the processing power to have a performant IDE written in Java for some years after it was released. Netbeans was the f

            I think you are a little bit off :P

            E.g. there was Borland JBuilder, or whatever it was called. And I seriously doubt Visual Café was the first one.

            I started working seriously with Java in 1997, and we already had several IDEs then, all written in Java. I think Visual Café was written in Smalltalk, though.

            There wasn't the processing power to have a performant IDE written in Java for some years after it was released.
            Yes there was, as I wrote my pre-thesis and my diploma thesis on the machine I had in 1995 ...

            What you mea

      • Obstacle avoidance would be the big advantage of having a rover with a respectable brain. If you could trust a rover to navigate on its own, NASA could cover a wider area -- if, for instance, the local geology around the landing site proved not to be as interesting as first thought. Imagine if "go check out this other area 200 km to the north" was a viable strategy. Imagine if the rover could take action to survive dust storms.

        Opportunity lasted 15 years, but it only traveled 45 km.

        The Boston Dynamic artificia

    • Comment removed based on user account deletion
  • I have to wonder what processor SpaceX would use in a rover. Seems to me that every piece of NASA hardware is decades behind current technology. Is it because it takes for-friggin-ever to build the vehicle that they make a design choice using technology that was current at the time and then ten years later they get to build it?

    • Hardening electronics is a thing.

      • by Guspaz ( 556486 )

        SpaceX's approach has been to use off-the-shelf components and to rely on fault tolerance and redundancy rather than special super-expensive hardened gear. It's one of the ways in which they've been able to so significantly undercut everybody else on pricing. At this point, after more than a hundred launches and more than a thousand satellites, I'd say their approach has been pretty proven out.

        NASA has a big problem of doing things because they worked in the past. It ensures that it'll probably work again i

        • by Henriok ( 6762 )
          I'm not a space engineer by any stretch, but there's a significant difference between the radiation environment inside Earth's magnetic field in LEO and going to Mars and staying there for a decade or so.
    • by war4peace ( 1628283 ) on Saturday February 27, 2021 @03:38PM (#61106416)

      No, it's because radiation hardening is easier to achieve when there's more space between transistors, thus reducing the risk of bit flipping and so on.
      Also, there is little need for multitasking, or very fast response to inputs.

      My 3D printer has an ATmega2560 CPU, yet it works very well and is fast enough. The only woe is its USB-serial emulator, which only works at a 250k baud rate. Except for that, it can control multiple motors and process G-code quickly enough.
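
      For a sense of why an 8-bit part keeps up: a G-code word is just a letter plus a number, so per-line parsing is cheap. A toy C++ scanner (not any real firmware's parser, just the shape of the work):

      #include <cctype>
      #include <cstdio>
      #include <cstdlib>

      // Toy G-code scanner: each word is a letter followed by a number,
      // e.g. "G1 X10.5 Y-3 F3000". Real firmware such as Marlin adds
      // checksums, comments, and modal state, but the per-line work is
      // still roughly this cheap.
      void parse_line(const char* p) {
          while (*p) {
              if (std::isalpha(static_cast<unsigned char>(*p))) {
                  char letter = *p++;
                  char* end;
                  double value = std::strtod(p, &end);
                  std::printf("%c = %g\n", letter, value);
                  p = end;
              } else {
                  ++p;  // skip whitespace
              }
          }
      }

      int main() {
          parse_line("G1 X10.5 Y-3 F3000");
      }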

      What NASA/JPL have done is a great example of "using the right tool for the job", no fancy stuff that breaks in unexpected ways.

      • The question to ask is, what do you need the performance for? If all you do is execute exact commands sent from Earth, much like a 3D printer executes G-code, then you have no use for a desktop-scale CPU. On the other hand, if the rover were to have some autonomy in finding its own path, and could maybe move a bit faster thanks to that... that would be something to spend compute power on. But I suspect that's not how the rover is built; it's not so much a robot as a remote-control car.
        Especially when you a
    • by sjames ( 1099 )

      It's more that the CPU has to be radiation hardened and have a well proven reliability. Chips that run faster and are built with a smaller process are harder to radiation harden. Newer CPUs necessarily have a shorter track record.

      These kinds of trade-offs happen all the time in embedded design.

    • by Junta ( 36770 )

      The issue is that radiation-hardened electronics is not a high-investment area. The processor they used was released in 2001 and wasn't used in a mission until 2006. No successor at all would be released until 2013/2014. The current rover began design prior to that timeframe. Therefore, they were using the latest and greatest available to them when they designed it, despite the fact that it was 12-year-old technology.

      Could they have adopted the newer processor? Perhaps but no need to restart the clock to

    • Probably the same thing they're using on Dragon.
    • I have to wonder what processor SpaceX would use in a rover. Seems to me that every piece of NASA hardware is decades behind current technology. Is it because it takes for-friggin-ever to build the vehicle that they make a design choice using technology that was current at the time and then ten years later they get to build it?

      NASA wants to use proven technology that should remain reliable for the duration of the mission or longer, which pretty much eliminates bleeding-edge technology unless they are doing a proof-of-concept mission. For science missions that are years in the planning, the last thing they want is for something to fail and ruin years of work and planned experiments. Add in the lead time to design and build, and by the time a mission is launched the tech is already a decade or more old.

      SpaceX, as a privat

    • The problem is that it takes a long time to test and qualify parts for radiation and other environmental issues. The design has to be frozen quite a while before launch to allow time for it to be integrated into the spacecraft. Right now my group is working on early-stage development of electronics that might be used in a large space telescope in the mid-2030s. It sounds crazy, but we are going to radiation-qualify the parts over the next 2 years (and need to deal with possible issues if they fail qualifica
    • fast, cheap, reliable: pick any two or pick the same one twice

    • by kamakazi ( 74641 )

      An awful lot of culture and knowledge is lost to the "newer is better" mantra and its variants, such as "more power is better" or "faster is better". I had a clamshell iBook. I realize they are making a vague comparison of CPU power, but let's throw out something else that clamshell did: I could shut the lid, it would suspend (not hibernate), and I could open it up a week later and it would still have enough battery left to do things. You think maybe a slower processor that gets them another couple years

  • I've heard of an iBook, but not an IBook. Did Cisco make a laptop?

    • You can blame the stupid USA-style titles for that kind of problem.

      • You can blame the stupid USA-style titles for that kind of problem.

        You mean the font?

        • Obviously, I meant the USA title capitalization rules compared to Canada or Europe.

        • You can blame the stupid USA-style titles for that kind of problem.

          You mean the font?

          I was only teasing, but it occurred to me to check my local newspapers' sites and none of them capitalize every word. Not a single one. Not the paper in the small city that I live in, and not even the bigger papers in the cities.

          I think the blame lies squarely on someone deciding to hold down shift to capitalize that letter, when pretty much everyone knows Apple did not.

          Oh, also the crappy font where you can't tell an 'I' from an 'l'.

          • It is most likely simply an autocorrection error.

            Oh, also the crappy font where you can't tell an 'I' from an 'l'.

            Which is usually not necessary from context, as you can see quite well in this particular example :P

        • by Tablizer ( 95088 )

          I always wondered why capital "I" doesn't have bars in sans-serif fonts. The bars should be treated more like the T's bar instead of serifs. That would make it easier to interpret, without being confused with lower-case "L" or the digit "1". Slap some dead monk?

    • I've heard of an iBook, but not an IBook. Did Cisco make a laptop?

      Maybe they meant a Ukrainian LBook [e-ink-reader.ru]. :-)

  • by david.emery ( 127135 ) on Saturday February 27, 2021 @03:35PM (#61106398)

    All of you talking about how dumb this is should come forward with your own recommendation for a -Radiation Hardened- processor. These are not the same thing you'll find at the liquidation sales at Fry's Electronics.

  • Instead of radiation-hardening the processor, why not harden the enclosure?

    That way you can use COTS stuff in a lead box, which will be substantially less expensive.

    • by hackertourist ( 2202674 ) on Saturday February 27, 2021 @03:56PM (#61106506)

      With the mass budget available for Perseverance, that doesn't work. The radiation we're talking about has so much energy you'd need several tons of lead to shield the processor.

    • That might create secondary particle showers instead.
    • It'll be substantially more expensive because you have to get it to Mars.

      Also, one of the main things that's important is simple mass. On Earth, in addition to the magnetosphere, there's about 10 tons of air per square meter (sea-level pressure, ~101 kPa, divided by g). Even a small electronics package needs a lot of mass to shield it.

    • You can try to put the most radiation sensitive electronics in more shielded parts of the spacecraft, but the processor may not be the most sensitive bit
    • Doesn't work: against the entire spectrum of particles, lead is no better than any other form of mass. You might as well build a concrete bunker around your processor, in which case the rover isn't going anywhere.
  • by fahrbot-bot ( 874524 ) on Saturday February 27, 2021 @04:02PM (#61106526)

    Ibook

    Either this references a Ukrainian LBook [e-ink-reader.ru] or an Apple "iBook" ... but the title capitalization is incorrect for either product name ... Just sayin'.

  • So what you're saying is that it is significantly more powerful than the Arduino (ATmega328 and other AVR chip) based robots we have here on Earth? Makes total sense!

    And as others pointed out, what we need mass computational power for, the rover doesn't do. It is just a remote data-collection device, with all of the computational workloads on that data done here on Earth.

    Next thing you know, you'll be telling me my microwave is using an outdated 386 CPU to show the clock time!

  • by az-saguaro ( 1231754 ) on Saturday February 27, 2021 @04:46PM (#61106664)

    Responses to this post include comments like "Seems to me that every piece of NASA hardware is decades behind current technology." Plenty of people have already responded, pointing out the need for hardening against vibration, thermal stress, and radiation that makes the newest nanometer-process chips less suitable for this. Furthermore, consider what the past 20 years have seen in processor design, and what the requirements of the mission are.

    The speed, bandwidth, and memory requirements are probably not all that demanding on these rovers. They navigate, roll, communicate, use their arm and instruments, and acquire images to beam home, all at a relative snail's pace. The terrain-relative navigation that pinpointed a landing spot is probably the highest-demand routine the processor will ever run, and that is over. It had one minute to image the area, process the info to match a library of prior ground imagery (as I understand the process), then find a landing solution, while simultaneously acquiring multi-channel data, logging that data, calculating and executing rocket firing profiles, unfolding hardware, communicating with overhead satellites, and pinging the home world. Seems to me that the clever programmers who built the system could find a way to do that well within 256 MB at 200MHz.
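
    The matching step itself is conceptually plain pixel arithmetic. A toy C++ sketch of template matching by sum of absolute differences (the map, patch, and sizes here are invented for illustration; real terrain-relative navigation is far more robust than this):

    #include <cstdint>
    #include <cstdio>
    #include <cstdlib>

    // Slide a small descent-camera patch over a stored reference map and
    // keep the offset with the lowest sum of absolute differences (SAD).
    constexpr int MAP = 6, PATCH = 2;

    int sad(const uint8_t map[MAP][MAP], const uint8_t patch[PATCH][PATCH],
            int ox, int oy) {
        int s = 0;
        for (int y = 0; y < PATCH; ++y)
            for (int x = 0; x < PATCH; ++x)
                s += std::abs(int(map[oy + y][ox + x]) - int(patch[y][x]));
        return s;
    }

    int main() {
        uint8_t map[MAP][MAP] = {};
        map[3][4] = 200; map[3][5] = 50; map[4][4] = 50; map[4][5] = 200;
        uint8_t patch[PATCH][PATCH] = {{200, 50}, {50, 200}};

        int best = 1 << 30, bx = 0, by = 0;
        for (int oy = 0; oy + PATCH <= MAP; ++oy)
            for (int ox = 0; ox + PATCH <= MAP; ++ox) {
                int s = sad(map, patch, ox, oy);
                if (s < best) { best = s; bx = ox; by = oy; }
            }
        std::printf("best match at (%d,%d), SAD=%d\n", bx, by, best);  // (4,3), 0
    }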

    On the ground, all of its tasks do not seem to be very memory or cpu intensive, especially because the output response is slow. The rover can do autonomous navigation and travel, but it has ample time, hours or days, to figure it out and implement it all. Image processing it seems is largely done back here at JPL, not on board once the images have been captured.

    More important is that the mobo has to be fault tolerant, capable of safe modes and recovery from them, able to restore its OS on the fly, and fully re-programmable from 100 million miles away over a data signal that is probably about as strong as a firefly's light seen from hundreds or thousands of miles away. Furthermore, all of the sensors and input channels and output control devices mean that it must act more like a high-performance industrial control processor than a whiz-bang video or AI or cryptocurrency processor.
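
    That recovery behavior is essentially a watchdog pattern. A minimal single-process C++ sketch (real flight software uses hardware watchdogs and much richer fault protection; the deadline and the simulated hang are invented for illustration):

    #include <chrono>
    #include <iostream>
    #include <thread>

    // Each healthy cycle "kicks" the watchdog; a cycle that overruns its
    // deadline drops the system into safe mode, where it would sit and
    // wait for commands from the ground.
    int main() {
        using clock = std::chrono::steady_clock;
        const auto deadline = std::chrono::milliseconds(50);
        auto last_kick = clock::now();
        bool safe_mode = false;

        for (int cycle = 0; cycle < 5 && !safe_mode; ++cycle) {
            if (cycle == 3)
                std::this_thread::sleep_for(2 * deadline);  // simulated hang
            else
                last_kick = clock::now();                   // healthy: kick

            if (clock::now() - last_kick > deadline) {
                safe_mode = true;
                std::cout << "watchdog tripped on cycle " << cycle
                          << "; entering safe mode\n";
            }
        }
    }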

    Processor designs and expanded instruction sets over the past 20 years have added multiple cores, multi-threading, predictive execution pipelines, and high-bandwidth instruction sets suited for highly parallel video and 3D rendering. But the core elements of a processor are to access memory and I/O buses, have robust flow control, and have basic arithmetic-logic functions. For those, a circa-2000 processor had everything needed for this mission.

    If the helicopter works, it is perhaps foreseeable that future main rovers and their fleets of drones might need a degree of on-the-fly video processing for navigation, or onboard image and signal processing, so perhaps more up-to-date processor architectures might have some merit. But for what we have now, the latest offerings from Intel or AMD seem irrelevant.

    What might be cool, perhaps worthy of a technology demo, would be to take a modern cell phone off the shelf, harden its packaging, and use that as the cheap "brains" of small expendable drone craft, which would permit a cheap fleet of baby rovers and drones without too much additional development. NASA will not do that, but SpaceX could.

    • OK, what about imaging sensors? Why is NASA's Psyche probe, to be launched in 2026, using a sensor that was released in 2005 (the Psyche main imaging camera will be using a KAI-2020, released by Kodak, now owned by OnSemi)? There is zero justification for that! They proposed the mission back then and can't change the specifications? That seems a bit ridiculous. They didn't even start building the probe until a couple years ago. They are spending $1 billion and using a cheap obsolete camera that isn't

      • You know, you're absolutely right. Why don't you submit your resume to NASA so you can educate them on the error of their ways? I'm sure you'll be hired in no time with a mind that brilliant.

        • What are you, some kind of jerk? Quit acting like there's a scientific reason they are using ancient technology when it all has to do with money and bureaucracy. They proposed the mission 20 years ago and, due to internal processes, they can't change it. Just like how it was found out that Challenger blew up because of "go fever." Another thing is that even geniuses make mistakes. For example, Elon Musk recently admitted to someone on Twitter who suggested it that SpaceX was dumb not to light all three engine

      • It's really frustrating, but think about it. You are building a ~$1B spacecraft with a large number of subsystems which ALL need to work. You have a finite budget. If there is an existing, fully tested solution that will meet your requirements, you are going to spend your money on the parts where there is no existing solution. You have to freeze the design early so that there is time for fully integrated design. Once that is done, any change has the risk of rippling through the rest of the system, causing
        • They need 20 years of testing? Come on, that's ridiculous. The ROCKET it's being launched on didn't even exist at the time they selected the sensor. So you're telling me the imaging sensor needs more testing than the rocket motors?
          The 1960s and 1970s Pioneer/Voyager missions used the latest microprocessor and imaging technology. The Apollo guidance computer was one of the most advanced computers of its time. Now we have to do 20+ years of testing for an image sensor?

          • Not 20 years of testing. We are building electronics, a readout system for a new type of X-ray detector. The project started in Jan 2020. We have our first prototype board in hand. If all goes well it will be assembled and ready for initial radiation testing in late 2021. Then, if it passes, a realistic prototype in early '22, with radiation testing at a calibrated facility by the end of '22. (This mission is going to L2, not low Earth orbit, so radiation is a real concern.) OK, now we have a technology demonstration (i
        • I think people who think you can simply swap out an old device for a new one haven't done systems engineering on complex systems before. Doing so is basically the polar opposite of reliability.
      • by nadador ( 3747 )

        Sensors used for science need ridiculously low noise budgets. You can't use fancy software to fake it the way you can with a picture on a phone; the photons that show up on the sensor are the science. They also have to be radiation-hardened so that they actually work when they get to where they're going, as total dose radiation is no joke for interplanetary projects. They also need rad-safe glass so that the lenses aren't cloudy when they get where they're going.

        Every person who designs space electronics dearl

    • by redled ( 10595 )

      The comments on this type of article are always frustrating speculation, parent comment included. There are more modern radiation-tolerant processors available, but JPL also has 20 years of hardware and software heritage. If it works, don't fix it. It could take tens of person-years for software updates alone, which will always add some risk too. For the payloads and non-mission-critical components on the rover, there is a ton of stuff used that is not particularly radiation tolerant.

      Good idea to use modern ce

  • An iBook processor would probably last days to weeks before it started having single-event upsets (once a bit gets flipped and the operating system loses control of execution, you have to reset). The problem is you might be right in the middle of taking a picture or drilling, and then the teams on Earth have to spend a few days figuring out what went wrong and deciding whether it is actually a problem. It's better if you don't have to reset at all, so you use a processor that resets less under radiation.

    • by Agripa ( 139780 )

      An iBook processor would probably last days to weeks before it started having single-event upsets (once a bit gets flipped and the operating system loses control of execution, you have to reset) ...

      I suspect any bulk CMOS digital logic, like practically all processors made for the consumer and industrial markets, would fail from single event upsets almost immediately.
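
      One software-side mitigation when flying commercial parts (as opposed to hardening the silicon the way the RAD750 does) is to keep redundant copies of critical state and majority-vote them. A minimal C++ sketch of triple modular redundancy (the stored value and flipped bit are invented for illustration):

      #include <cstdint>
      #include <iostream>

      // Bitwise majority vote across three redundant copies: a bit
      // survives if at least two copies agree, so a single-event upset
      // that flips a bit in one copy is outvoted by the other two.
      uint32_t tmr_vote(uint32_t a, uint32_t b, uint32_t c) {
          return (a & b) | (a & c) | (b & c);
      }

      int main() {
          uint32_t c1 = 0xDEADBEEF, c2 = 0xDEADBEEF, c3 = 0xDEADBEEF;
          c2 ^= 1u << 7;  // simulate an SEU flipping bit 7 of one copy

          std::cout << std::hex << tmr_vote(c1, c2, c3) << "\n";  // deadbeef
      }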

  • That $284K (current price) BAE Systems RAD750 can function reliably for years on the surface of Mars, which at 24-40 RAD per hour would give a human a lethal dose in less than an Earth day. Your PC or iPhone couldn't function; it would be glitching as charged particles caused soft errors from switching, until charge buildup in gate oxides made transistors stay permanently open (N-type) or shut off (P-type), and then it would be a brick.

    Making functioning spacecraft and planetary probes is hard and expensive, your consum

    • You do not get a lethal dose on Mars in less than an Earth day.
      How did you come up with that idea?

      Or do you just want to compare the resilience of the camera with a human?

      • You can get a lethal dose on Mars in much less than a day with strong solar flares. All those nice gentle few tens of REM per year (cough, that's actually brutal) ignore those bad times. Really, it will kill you unless you dig into rock like a robot mole.

        • with strong solar flares.
          Yes ... perhaps. And you have days of warning and can stay out of the flare.

          Sorry, aren't you the guy who thinks neither Chernobyl nor Fukushima killed anyone? And now a little bit of sun is deadly ... :P

    • https://www.universetoday.com/... [universetoday.com]

      Based on the data from the Martian Radiation Experiment in 2002-2003:

      Average exposure levels would be about 8 RAD per year, but solar events were measured generating 2 RAD in a single day, and one even damaged the sensor, oh the irony. 24-40 RAD is likely the per-year value, with the variance based on solar activity, although it looks like we still have limited data due to instrument damage and none of the instruments surviving a full solar cycle.

      • Oh yes, that's the nice gentle dosage rate enthusiasts like to use until a solar flare comes along, for which Mars has nearly zero protection. But when the sun acts up, it's either burrow underground or die.

        I do believe all Mars colony issues have potential engineering solutions, but that rad one is a doozy. Maybe we'll have to send robotic heavy construction equipment first to build rad-shielded shelters. Maybe water will have to be used for shielding on the trip there.

  • Can it run Crysis?

  • But missions somehow cost billions of dollars. For example, the Psyche probe is launching in the year 2026 with an off-the-shelf OnSemi sensor (KAI-2020 CCD) from the year 2014 .. it has been marked obsolete on OnSemi's own sensor website since 2016! Sorry, but that is ridiculous; there is zero legit reason for not using a better sensor. Where does the money go? Why are they cheaping out on the sensor?

    • Sorry, I meant released in 2005 .. not 2014 .. 2014 is too cutting-edge for NASA.

      http://www.stargazing.net/davi... [stargazing.net]

    • "Obsolete" does not mean it does not work. When those devices were built and sold, they were top-of-the-line. "Obsolete" on a company's webpage is more apt to mean "no longer made or supported", because they have now tooled up to make later models. Just like the rest of the hardware on the rover and other satellites, it needs to be highly dependable and fault tolerant, and have a stable response profile that can give dependable data over a long time in a harsh environment.

      For the Mars rover, sensor size

  • by guacamole ( 24270 ) on Sunday February 28, 2021 @06:44AM (#61107900)

    The "Clamshell" iBook (aka, the iBook no straight man would be caught dead using) was the first generation iBook and was produced from 1999 until May of 2001. White there were some updates to specs since the release in 1999, it's incorrect to call it "from 2001". It's from 1999 or at best from 2000. In 2001 this was a lame duck product and the production was winding down.

    The 2nd-generation iBook was introduced sometime in 2001, and it indeed can be called "from" 2001. BTW, I recall buying one just to fool around with Mac OS. It had, I believe, a 600MHz G3 CPU, an 18GB hard drive, and 600+ MB of RAM.

  • Non-radiation-hardened variants of the 750 were also used in the Nintendo GameCube and later Wii consoles.
