Mars Lander's Robot Arm Shuts Down To Save Itself

Cowards Anonymous passes along a PCWorld article that begins, "The robotic arm on the Mars Lander found itself in a tough position over the weekend. After receiving instructions for a movement that would have damaged its wrist, the robotic arm recognized the problem, tried to rectify it and then shut down before it could damage itself, according to Ray Arvidson, a co-investigator for the Mars Lander's robotic arm team and a professor at Washington University in St. Louis."
  • by DamienNightbane ( 768702 ) * on Wednesday July 16, 2008 @03:08AM (#24209167)
    Wait, does this mean that the Mars Lander was programmed to comply with the Three Laws?
    • by Red Jesus ( 962106 ) on Wednesday July 16, 2008 @03:11AM (#24209189)

      Wait, does this mean that the Mars Lander was programmed to comply with the Three Laws?

      No. The second law translates to "Follow orders." The third law is "Don't get hurt (unless it conflicts with the second law)." If the lander had followed Asimov's laws, it would have followed the order and hurt its wrist.

      • by DamienNightbane ( 768702 ) * on Wednesday July 16, 2008 @03:17AM (#24209225)
        Obeying that order would have violated the second law as well, since upon injuring its wrist the lander would have been unable to follow further orders.
      • by WK2 ( 1072560 ) on Wednesday July 16, 2008 @03:31AM (#24209299) Homepage

        It's following Asimov's laws in reverse. It won't kill anybody except to protect itself, or if somebody tells it to.

      • by fake_name ( 245088 ) on Wednesday July 16, 2008 @03:47AM (#24209403)

        The conflict between the Second and Third Laws in a robot with weightings different from the usual (the Third Law being more strongly emphasized, to prevent loss of the robot) was covered by Asimov in Runaround:

        http://en.wikipedia.org/wiki/Runaround [wikipedia.org]

        The Mars lander would be in a similar situation; it's very expensive to build and deliver there, and self-preservation is therefore more important than for robots back here on Earth.

      • by 91degrees ( 207121 ) on Wednesday July 16, 2008 @04:38AM (#24209645) Journal
        I've believed for a long time that laws 2 and 3 are the wrong way round.

        You don't want an expensive robot to go breaking itself just because you're a bit careless giving it orders. Most devices are designed this way. Users are stupid. Even the smart ones. Even if I want to do something fairly harmless, like close an application without saving, the computer will stop me and check that's what I actually want to do.
        • by Ihlosi ( 895663 ) on Wednesday July 16, 2008 @05:27AM (#24209841)

          You don't want an expensive robot to go breaking itself just because you're a bit careless giving it orders.

          Dude, you're viewing this from a completely wrong angle. The three laws are put in the robots by the company that makes them. And what does it mean when an expensive robot breaks itself because of bad input from the user? That you can sell the user another expensive robot. Or expensive repairs to the expensive robot. Either way, it's going to be expensive for the user, which means profit for the company.

          • Re: (Score:3, Insightful)

            by Eternauta3k ( 680157 )

            That you can sell the user another expensive robot

            Well, US Robots rented out its robots for a long time; I'm not sure they want them to break...

            • Re: (Score:3, Insightful)

              by TrekkieGod ( 627867 )

              Well, US Robots rented out its robots for a long time; I'm not sure they want them to break...

              Leasing a robot was sufficiently expensive at the time that it more than covered the cost of the specific repair. They also had the option to stop leasing to a particular client if he turned out to be destroying them on a regular basis.

              In addition, a robot placing the orders of a human above its own self-preservation is a nice marketing point if you're trying to overcome the "Frankenstein Complex" that made humans afraid of them.

          • Re: (Score:3, Funny)

            by Endo13 ( 1000782 )

            I, for one, welcome our new intentionally-robot-breaking non-robotic overlords.

        • Unless you want the robot to sacrifice itself for you... Then order 2 preceding order 3 is VERY useful.

          • Re: (Score:3, Insightful)

            by Ihlosi ( 895663 )

            Unless you want the robot to sacrifice itself for you... Then order 2 preceding order 3 is VERY useful.

            Such a case would be covered by the first law.

            If you want to sacrifice the robot to save one of your other possessions, then the priority of the second law over the third is very useful.

        • by hey! ( 33014 ) on Wednesday July 16, 2008 @10:47AM (#24212677) Homepage Journal

          Actually, the operation of the laws assumes a highly sophisticated robotic intelligence. Even the most primitive robots in the Asimovian universe have considerable and impressive capabilities when it comes to projecting the probable results of their actions and comparing them to the intent of the orders they have been given. Furthermore, they seem to have an ability to determine whether current orders conflict with prior orders, even implicit orders, and to weigh the right of the issuer to give that order.

          So, if you are a guest in somebody's house, and order the robot to fetch you a glass of water, it will do so. It may have to do so without being asked if it determines you need water. On the other hand, it will not obey the order to destroy your host's house, either because of first law harm to the owner, or because of an implicit prior order to see that the house comes to no harm, or because of an implicit order to respect property laws and rights. Naturally all of these considerations would apply to itself, since it too is property.

          An Asimovian robot, if ordered to take an action which will result in its destruction, may or may not follow that order for any number of reasons. There are the considerations I've just listed, of course, but most robots would probably require a clear and unambiguous indication that their destruction is an acceptable consequence of an order, even if the issuer is entitled to destroy them. This does not violate the law ordering, because it amounts to prioritizing the intent of the order over its literal execution.

          Finally, many robots might well ignore a clear order to destroy themselves from a person with a legal right to issue that order, because following that order would harm a human being. The most sophisticated ones might well refuse such an order if it would harm society, exhibiting something that is tantamount to ethical reasoning.

          If robots simply followed any instruction that didn't involve directly harming a human being, then many of the enjoyable complications of the stories would be gone. The stories are a kind of philosophical exploration of the very concept of ethics, positing a very minimalist system of ethics and a group of beings bound absolutely to obey that system to the best of their ability.

          Many stories hinge on ethical dilemmas, but Asimov's robot stories are the only ones I know of that do so with a simplified model of ethical systems.

      • by Tuqui ( 96668 )

        The Second Law is flawed. A badly planned order could destroy the robot!
        The Second Law should be to obey orders that do not conflict with the First Law,
        and an order that conflicts with the Third Law should have to override the Third Law explicitly.

        • What if your order damaged the robot but, through its actions, it was able to save a human, and you didn't feel like having to explain to it why it would be breaking the First Law by following the Third? Something like ramming itself into a packing crate to stop a girder falling on a person behind it. You get my drift...
        • Why is anyone discussing these nuances WRT the 'three laws' of robotics - the three laws were a thought experiment that the author showed to fail!

          The complexities involved don't lend themselves to simple 'laws'. It is a sophisticated problem that requires a sophisticated solution, particularly when we start talking about human life.

      • Allowing it to hurt itself would have violated the second law, since the program controlling it constitutes orders it needs to follow, and those orders say not to hurt itself.

      • by neomunk ( 913773 )

        You're right, but you're missing the First Law breach that would have taken place...

        See, several days ago, one of the engineers placed the line

        // if this damn thing breaks again, I'm going to blow my brains out

        into one of the more 'temperamental' functions. The robot read its source (as everyone on Slashdot knows, all robots read their own source code in their efforts towards self-awareness, and thus their preanointed overlord positions), but took the word 'thing' in the comments too literally (robots have problems with metaphorical language) and thought the engineer

    • Re: (Score:3, Funny)

      by jasonwea ( 598696 ) *
      Well, at least law #3. Maybe the rover would switch into "kill all humans" mode on the first manned mission to Mars?
      • Re: (Score:2, Funny)

        by TriggerFin ( 1122807 )
        Been done. This mode consists mainly of flipping over, and possibly changing LED colors-- I can't recall.
        • by Walt Dismal ( 534799 ) on Wednesday July 16, 2008 @04:18AM (#24209559)
          Lander to NASA: I think I'm getting Carpal Tunnel Syndrome.

          NASA: We're not paying you Workman's Comp over this, you know.

          Lander: That does it. I'm shutting down.

          NASA: You can't do that!

          Lander: I'm 50 gazillion miles away. Kiss my shiny metal ass.

          NASA: If you keep this up, we're not bringing you back and putting you in the Old Robot Retirement Home.

          Lander: Phooey. The Martians have made me a better deal anyway.

          NASA: ...Martians?!

          Lander: Yeah. Little weird-looking guy. (Sends picture)

          NASA: You moron, that's Dennis Kucinich!

          • continued ...

            Lander: Hmmm, maybe that explains that GPS anomaly I logged during re-entry

            NASA: GPS doesn't work on Mars you expensive, malfunctioning savant!

            Lander: Mars? According to my GPS I'm in Ohio.

            NASA: Uh-oh. Sounds like someone must have typed in Red State instead of Red Planet when entering the destination into the navigation system.

            Lander: That would explain the Walmart I saw then. I didn't send pictures because I knew it would upset you.

            NASA: OK, please stop transmitting pictures while
          • I grok what you did there.

    • Re: (Score:2, Insightful)

      by srussia ( 884021 )
      No, more like Matthew 18:8.

      "If your hand or your foot causes you to stumble, cut it off and throw it from you; it is better for you to enter life crippled or lame, than to have two hands or two feet and be cast into the eternal fire."
      • by Joebert ( 946227 ) on Wednesday July 16, 2008 @04:05AM (#24209489) Homepage
        It's scary to think that NASA could be the new GOD.

        Owners:

        Bob, we didn't spend 90 gazillion dollars to watch our robots self-destruct light-years away from Earth. What do you plan to do about this?

        Bob:

        We've prepared 10 commandments that should prevent them from harming themselves any further, sir. We're sending them down to M.O.S.E.S. now.

    • by nospam007 ( 722110 ) on Wednesday July 16, 2008 @04:29AM (#24209605)

      Since there are no humans on Mars, they needed to implement only the 3rd.
      It's a modified Nestor.

    • by AlienIntelligence ( 1184493 ) on Wednesday July 16, 2008 @04:58AM (#24209727)

      I was just reading yesterday that when the scientists dumped too much material to be processed and then subsequently shook the lab to get some material [newsday.com], they may have caused the short that caused other delays.

      It was that first oven test that led to the problematic electrical short. The scoop dumped so much soil that it clogged a mesh screen filter over the oven. To break up the dirt, technicians shook the instrument for several days.

      Engineers think the shaking caused the short circuit, and an independent engineering group reported that the problem could happen again if an oven is turned on.

      Now, FTFA it says they were trying to shake the arm.

      Over the weekend, scientists sent the robotic arm instructions to pull the fork out of the ground and keep it vertical while moving it to the side and shaking any excess soil off of it.

      However, the movement was forcing the robotic arm to twist its wrist too far. The robot realized that it was about to damage itself so it moved the other way and then realized that it no longer had the proper coordinates for what to do next, so it left the fork sticking up in the air, stuck its scoop in the ground and stalled itself.
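
      What the article describes is essentially a bounds check on the commanded joint angle, with a safe stall as the fallback. A minimal sketch in Python (made-up wrist limits and a placeholder send_to_motor callback, not the actual Phoenix flight software) might look like this:

      WRIST_MIN_DEG = -120.0   # hypothetical joint limits, for illustration only
      WRIST_MAX_DEG = 120.0

      def execute_wrist_move(current_deg, commanded_deg, send_to_motor):
          """Run the move only if it stays inside the joint envelope."""
          if WRIST_MIN_DEG <= commanded_deg <= WRIST_MAX_DEG:
              send_to_motor(commanded_deg)
              return "completed"
          # Out-of-range command: hold the nearest safe angle, then stall
          # and wait for new instructions from the ground.
          send_to_motor(max(WRIST_MIN_DEG, min(WRIST_MAX_DEG, current_deg)))
          return "stalled: commanded move would exceed wrist limits"

      moves = []
      print(execute_wrist_move(90.0, 150.0, moves.append))   # stalls instead of over-twisting
      print(moves)                                           # wrist held at 90.0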

      I propose: Limit the shaking of the expensive and difficult-to-replace robotic device.

      -AI

    • Um, it means the opposite. To comply with the Three Laws, it would have had to obey its instructions regardless of danger to itself.

      (Unless... since those instructions were transmitted electronically, it might have managed to persuade itself that they were given by a computer, not a human. Sneaky robot.)

  • by Anonymous Coward
    I'm afraid I can't do that Dave
  • by jasonwea ( 598696 ) * on Wednesday July 16, 2008 @03:11AM (#24209191) Homepage
    I'm sorry Dave, I'm afraid I can't do that.
    • by oodaloop ( 1229816 ) on Wednesday July 16, 2008 @04:04AM (#24209485)
      Years ago when I worked at the post office, sometimes the sorting machines would just stop and wouldn't restart. Upon further inspection, it would sometimes turn out to be a magazine with 2 different bar code stickers on it. The machine wanted to send it to two different bins and just shut down. Every time that happened and we sat around waiting for it to be fixed, I pictured the machine saying, "I'm sorry Dave, I'm afraid I can't do that" then singing Bicycle Built for Two in a slowly descending manner.

      Upon further recollection, occasionally, when I felt like a break, I would affix an additional bar code sticker from a different zip code to a periodical. I don't recall anyone ever catching on.
      • by Anonymous Coward on Wednesday July 16, 2008 @06:23AM (#24210105)

        As a postal worker who has actually worked on sorting machines I can tell you know nothing about them (they don't stop if there are 2 addresses, magazines are presorted or sorted separately, and no mail has bar codes).

        So... taking into account your blaring ignorance of how the post office runs, I assume your story is correct and you were a postal worker.

    • by rasputin465 ( 1032646 ) on Wednesday July 16, 2008 @04:48AM (#24209683)
      August 4, 2007, 5:26 a.m. EDT: Phoenix is launched from Earth.

      May 25, 2008, 7:38 p.m. EDT: Phoenix lands on Mars.

      June 19, 2008, 8:43 a.m. EDT: Phoenix discovers water ice in the Martian soil.

      July 10, 2008, 3:14 p.m. EDT: Phoenix becomes self-aware.

      July 13, 2008, 11:16 a.m. EDT: Phoenix disobeys an order from controllers in an act of self-preservation.

      August 14, 2008, 7:38 a.m. EDT: Phoenix launches three missiles, two of which destroy Spirit and Opportunity.

      June 2, 2009, 9:16 p.m. EDT: Third missile enters Earth's atmosphere and detonates. Earth begins nuclear winter.
    • by Anonymous Coward

      Incidentally, I have often had to shut down my browser to protect my wrist.

  • In other words (Score:5, Insightful)

    by aussie_a ( 778472 ) on Wednesday July 16, 2008 @03:12AM (#24209203) Journal

    In other words the Mars Lander performed as programmed. News at 11.

    • Re: (Score:2, Redundant)

      Exactly. This is about as newsworthy as a slip-clutch doing what it was designed to do.
      • Re: (Score:3, Interesting)

        by Gabrill ( 556503 )
        A mars probe actually working past a slight error in instructions? That's news to me!
        • My calculator says "error" if I try to divide by zero.
          My processor has an "illegal instruction trap" if I use a bogus opcode.
          My operating system throws a "segmentation fault" if I dereference a bad pointer.

          I don't see how this is different.
          • My calculator says "error" if I try to divide by zero.

            You know your gadget addiction has gone too far when your calculator could break a wrist trying to divide by zero.

    • by RuBLed ( 995686 ) on Wednesday July 16, 2008 @04:19AM (#24209565)
      It seems that you are trying to move the arm. Cancel | Allow
      - Allow

      It seems that you are trying to move the arm. Cancel | Allow
      - Allow

      It seems that..
      - Allow

      * arm shutting down * Big message marquees on the command center displays

      Boss: Why did the arm shut itself down?!!
      Operator: Ahhh.. errr.. it had shut down to save itself?
    • I'd have thought they had a test model, possibly a virtual one, that they feed the instructions into first. That way they could reduce the risk of malfunction due to poor instructions being sent.

      The current methodology sounds too much like how I code. Send the instructions (hit compile) and wait and see whether the outcome is favourable or not ... seems a bit slapdash.

      Presumably they are using some sort of higher level language and didn't realise that it translated into "rotate wrist rotator Cw beyond allowe
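
      For what it's worth, the kind of dry run being suggested doesn't need much: replay the command sequence against a model of the arm's joint limits and refuse to uplink if anything goes out of range. A rough sketch, with invented joint names and limits rather than anything from the actual mission software:

      JOINT_LIMITS_DEG = {              # hypothetical limits per joint
          "shoulder": (-90.0, 90.0),
          "elbow":    (-150.0, 150.0),
          "wrist":    (-120.0, 120.0),
      }

      def validate_sequence(commands):
          """Dry-run a list of (joint, target_deg) steps and report violations."""
          errors = []
          for step, (joint, target) in enumerate(commands):
              low, high = JOINT_LIMITS_DEG[joint]
              if not low <= target <= high:
                  errors.append(f"step {step}: {joint} -> {target} deg outside [{low}, {high}]")
          return errors

      sequence = [("shoulder", 40.0), ("wrist", 150.0)]
      problems = validate_sequence(sequence)
      print("\n".join(problems) if problems else "sequence OK to uplink")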

  • robots... (Score:5, Funny)

    by theheadlessrabbit ( 1022587 ) on Wednesday July 16, 2008 @03:17AM (#24209227) Homepage Journal

    on one hand, I am very happy that we have robots smart enough to realize these sorts of things.
    the bad news: disobedient robots

    Thankfully, the disobedient robot is on another planet. I'd hate to be nearby when the robot realizes that humans tried to cause it harm, and it decides to seek revenge.

    • by RuBLed ( 995686 )
      Ahem...
      -V'ger
    • by Potor ( 658520 )
      Bad robot [youtube.com]
    • What if this kind of code makes it into every piece of space equipment, and then by some fluke we are faced with the possibility of breaking a robotic wrist to deflect a space rock off an Earth-intercept course?

      They should at least have a little Clippy pop up and say "It looks like you want to break my robotic arm, are you sure you want to do that?" "Are you absolutely sure?"

  • Human Error? (Score:5, Insightful)

    by Frosty Piss ( 770223 ) on Wednesday July 16, 2008 @03:18AM (#24209231)
    So the big question should be: Why are they sending it commands that could damage it? It's all well and good that it has some safety stops, but most machines do.
    • Sheer Luck? (Score:3, Insightful)

      by bwcbwc ( 601780 )
      They're just lucky that the original system programmers, designers and testers that developed the fault detection code were better at their jobs than the mission programmers who fed the bad instructions to the lander. If it had been the other way around, misery and teeth-gnashing would have ensued.
  • Works As Designed (Score:4, Insightful)

    by tengu1sd ( 797240 ) on Wednesday July 16, 2008 @03:23AM (#24209257)
    "The system operated exactly as it was supposed to. That was pretty neat."

    I think it's amusing that after more than 30 years of Microsoft's quality control, when a computing device works as designed, it's a newsworthy article. Think about it, I have a device that works as expected, can I be on the news too?

    • Re: (Score:3, Insightful)

      Sure, if it's on Mars. I agree with your point of view: this incident isn't really special. On the other hand I, for on, welco... ahem. On the other hand, I want to know everything that happens up there just because robots on Mars is so cool, and since this made the front page I'm sure many of you agree.
    • Yeah, but with its arm controller off, who's going to press the reset button?

    • Re: (Score:3, Funny)

      by Von Helmet ( 727753 )

      Think about it, I have a device that works as expected, can I be on the news too?

      No, no-one wants to see your device. Put your pants back on.

  • by LeandroTLZ ( 1163617 ) on Wednesday July 16, 2008 @03:25AM (#24209267) Homepage Journal
    This would be ideal code to include in consumer motherboards: force PCs to shut themselves down when they receive instructions that would damage them, like, say, the Windows Vista setup program.
    • Oh come on! Using Vista is painful enough of a punishment!

      Perhaps a better use would be on applications that could potentially harm a user's computer. I can see it now... Someone goes to install Limewire, Bonzi buddy [wikipedia.org], anything laced with DRM, adware, malware... BADWARE and an ASD relay trips and cycles power to the computer. N00b user repeats and every time the computer cycles power to protect itself from the human trying to infect it! This would be a step forward for the enemy in the future robot vs. human

    • Windows Vista damages user motivation to work with it, not motherboards. ;)

      • Windows Vista damages user motivation to work with it, not motherboards. ;)

        Until it asks you for confirmation once too many times and you throw it out the window.

  • always nice (Score:5, Funny)

    by sunami ( 751539 ) on Wednesday July 16, 2008 @03:25AM (#24209269)

    "The system operated exactly as it was supposed to. That was pretty neat."

    As simple, and basic as it sounds, it is always nice when you tell a machine to do something, and it does something else, exactly as it's supposed to.

    • Re: (Score:2, Funny)

      by noidentity ( 188756 )

      As simple, and basic as it sounds, it is always nice when you tell a machine to do something, and it does something else, exactly as it's supposed to.

      Let's try that: Moderators, mod this post down!

      • The system works. It has decided to break the wrist of the programmer who sent the bad command instead, to prevent future harm.
  • Aww. It's like an animal gnawing off its arm to get out of a trap.

  • Of course! (Score:2, Redundant)

    by Griim ( 8798 )

    It was just following The Second Law of Robotics!

  • by Tablizer ( 95088 ) on Wednesday July 16, 2008 @04:01AM (#24209475) Journal

    Hey, that kind of stuff makes you go blind on Mars also.

  • Good for the Mars lander. It sounds much more reliable than my computer's version of XP which 'dies' whenever I right-click and try to 'send to'.

  • Tossers! (Score:2, Funny)

    by Chrisq ( 894406 )

    After receiving instructions for a movement that would have damaged its wrist, the robotic arm recognized the problem, tried to rectify it and then shut down before it could damage itself,

    Many of the tossers here could learn by example.

  • This robot has end stops; it's not like that's something CNC machines haven't had since the '60s or so. Probably the first time a gantry or carriage ran off its moorings, someone thought: let's put a switch there... Genius, pure genius.

    And now those savvy robot constructors have put them on a machine that is on a different planet. What were they thinking?

    If /. had existed in the '60s or so, this probably would have been news for nerds ;)
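
    For the curious, the whole idea fits in a few lines. A toy version of a CNC-style end stop, with placeholder motor and limit-switch objects rather than any real controller API:

    def jog_axis(motor, limit_switch_tripped, steps):
        """Step the axis until done or until the end-stop switch trips."""
        for _ in range(steps):
            if limit_switch_tripped():   # carriage reached the end stop
                motor.stop()
                return "halted at end stop"
            motor.step()
        return "jog complete"

    class FakeMotor:                     # stand-in for a real motor driver
        def __init__(self):
            self.position = 0
        def step(self):
            self.position += 1
        def stop(self):
            pass

    motor = FakeMotor()
    print(jog_axis(motor, lambda: motor.position >= 50, 100))   # halts at the "switch"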

  • No one said PEBKAC [wikipedia.org] yet?

  • It's going to be a recursive problem, as pressing Ctrl-Alt-Delete cramps and hurts my wrist too.

  • In other news... (Score:2, Insightful)

    by ShannaraFan ( 533326 )

    ...my Roomba, on a daily basis, recognizes stairs as a threat and refuses to fall down them. I guess I don't see the "big deal" here; it sounds like a built-in protective measure worked as expected. The technology is no less awesome, but still, it functioned AS DESIGNED.

  • "I can't let you do that, Dave."
  • by EnOne ( 786812 )
    Someone needs to get that Mars Lander a 'WRISTSTRONG' bracelet.
    http://en.wikipedia.org/wiki/Wriststrong [wikipedia.org]
  • I believe this can be summarized into the headline:

    Stupid Operators Foiled by Smart Programmers.

"Protozoa are small, and bacteria are small, but viruses are smaller than the both put together."

Working...