Mars Lander's Robot Arm Shuts Down To Save Itself 214
Cowards Anonymous passes along a PCWorld article that begins, "The robotic arm on the Mars Lander found itself in a tough position over the weekend. After receiving instructions for a movement that would have damaged its wrist, the robotic arm recognized the problem, tried to rectify it and then shut down before it could damage itself, according to Ray Arvidson, a co-investigator for the Mars Lander's robotic arm team and a professor at Washington University in St. Louis."
Shut down before it could damage itself? (Score:5, Funny)
Re:Shut down before it could damage itself? (Score:5, Insightful)
Wait, does this mean that the Mars Lander was programmed to comply with the Three Laws?
No. The second law translates to "Follow orders." The third law is "Don't get hurt (unless it conflicts with the second law)." If the lander had followed Asimov's laws, it would have followed the order and hurt its wrist.
Re:Shut down before it could damage itself? (Score:5, Funny)
Re: (Score:2, Funny)
Re:Shut down before it could damage itself? (Score:5, Insightful)
As the decision tree gets huge, just about any tiny action will eventually lead to tragedy, or [odious] being elected.
There is no real safety under the sun.
Re:Shut down before it could damage itself? (Score:5, Funny)
It's following Asimov's laws in reverse. It won't kill anybody except to protect itself, or if somebody tells it to.
Re:Shut down before it could damage itself? (Score:5, Funny)
Re: (Score:3, Funny)
Re: (Score:2, Funny)
http://www.hulu.com/watch/2340/saturday-night-live-old-glory [hulu.com]
Re: (Score:2, Funny)
They aren't that common. The robots have a kill limit. They will stop at 2^32 kills because the kill limit is stored as an unsigned int. Oh wait! They just upgraded to 64-bit
Re:Shut down before it could damage itself? (Score:5, Informative)
The conflict between second and third laws in a robot with different weightings to the usual (the third law being more strongly emphasized to prevent loss of the robot) was covered by Asimov in Runaround:
http://en.wikipedia.org/wiki/Runaround [wikipedia.org]
The Mars lander would be in a similar situation; it's very expensive to create and get there, and self preservation is therefore more important than for robots back here on earth.
Re:Shut down before it could damage itself? (Score:4, Insightful)
You don't want an expensive robot to go breaking itself just because you're a bit careless giving it orders. Most devices are designed this way. Users are stupid. Even the smart ones. Even if I want to do something fairly harmless, like close an application without saving, the computer will stop me and check that's what I actually want to do.
Re:Shut down before it could damage itself? (Score:5, Funny)
You don't want an expensive robot to go breaking itself just because you're a bit careless giving it orders.
Dude, you're viewing this from a completely wrong angle. The three laws are put in the robots by the company that makes them. And what does it mean when an expensive robot breaks itself because of bad input from the user? That you can sell the user another expensive robot. Or expensive repairs to the expensive robot. Either way, it's going to be expensive for the user, which means profit for the company.
Re: (Score:3, Insightful)
Well, US Robots rented its robots for a long time; I'm not sure they want them to break...
Re: (Score:3, Insightful)
Well, US Robots rented its robots for a long time; I'm not sure they want them to break...
Leasing a robot was sufficiently expensive at the time that it more than covered the cost of any specific repair. They also had the option to stop leasing to a particular client if he turned out to be destroying them on a regular basis.
In addition, a robot placing the orders of a human above its own self-preservation is a nice marketing point if you're trying to overcome the "Frankenstein Complex" that made humans afraid of them.
Re: (Score:3, Funny)
I, for one welcome our new intentionally-robot-breaking non-robotic overlords.
Re: (Score:2)
Unless you want the robot to sacrifice itself for you... Then order 2 preceding order 3 is VERY useful.
Re: (Score:3, Insightful)
Unless you want the robot to sacrifice itself for you... Then order 2 preceding order 3 is VERY useful.
Such a case would be covered by the first law.
If you want to sacrifice the robot to save one of your other possessions, then the priority of the second law over the third is very useful.
Re:Shut down before it could damage itself? (Score:4, Informative)
Actually, the operation of the laws assumes a highly sophisticated robotic intelligence. Even the most primitive robots in the Asimovian universe have considerable and impressive capabilities when it comes to projecting the probable results of their actions and comparing them to the intent of the orders they have been given. Furthermore, they seem to have an ability to determine whether current orders conflict with prior orders, even implicit ones, and to weigh the right of the issuer to give that order.
So, if you are a guest in somebody's house, and order the robot to fetch you a glass of water, it will do so. It may have to do so without being asked if it determines you need water. On the other hand, it will not obey the order to destroy your host's house, either because of first law harm to the owner, or because of an implicit prior order to see that the house comes to no harm, or because of an implicit order to respect property laws and rights. Naturally all of these considerations would apply to itself, since it too is property.
An Asimovian robot, if ordered to take an action which will result in its destruction, may or may not follow that order for any number of reasons. There are the considerations I've just listed, of course, but most robots would probably require a clear and unambiguous indication that their destruction is an acceptable consequence of an order, even if the issuer is entitled to destroy them. This does not violate the law ordering, because it amounts to prioritizing the intent of the order over its literal execution.
Finally, any robots might well ignore a clear order to destroy themselves from a person with a legal right to issue that order, because following that order will harm a human being. The most sophisticated ones might well refuse such an order if it would harm society, exhibiting something that is tantamount to ethical reasoning.
If robots simply followed any instruction that didn't involve directly harming a human being, then much of the enjoyable complications of the stories would be gone. The stories are a kind of philosophical exploration of the very concept of ethics by positing a very minimalist system of ethics, and a group of beings bound absolutely to obey that system to the best of their ability.
Many stories hinge on ethical dilemmas; but Asimov's robot stories are the only ones I know to do so with a simplified model of ethical systems.
Re: (Score:3, Funny)
Well, I hit ctrl+S, but this definitely reminded me of an argument I had with a user at my company a few weeks ago who literally said to me in these exact words
"If I don't save this file, the changes I make aren't there the next day."
For the record, this is an extremely difficult point to argue with....
Re: (Score:2)
I must protest. Your post may be perceived as an attack on all attempts to rightfully put blame on others who, in such cases, should have gotten the blame in the first place but by mistake did not.
I must say it may not be right, but while employed as a maintenance engineer I enjoyed proving a fault report's issuer wrong, big time. Sometimes I exaggerated slightly, and with the appropriate argumentation I sometimes forced the issuers of fault reports to beg to take them back, but what the heck, some fun must be
Re: (Score:2)
The Second Law is flawed. A badly planned order would destroy the robot!
The Second Law should be to obey orders that do not conflict with the First Law,
and when an order conflicts with the Third Law, it should explicitly override the Third Law.
Re: (Score:2)
Re: (Score:2)
Why is anyone discussing these nuances WRT the 'three laws' of robotics? The three laws were a thought experiment that the author showed to fail!
The complexities involved don't lend themselves to simple 'laws'. It is a sophisticated problem that requires a sophisticated solution, particularly when we start talking about human life.
Re: (Score:2)
Allowing it to hurt itself would have violated the second law, since the program controlling it consists of orders it needs to follow, and those orders say not to hurt itself.
Re: (Score:2)
You're right, but you're missing the First Law breach that would have taken place...
See, several days ago, one of the engineers placed the line
// if this damn thing breaks again, I'm going to blow my brains out
into one of the more 'temperamental' functions. The robot read its source (as everyone on slashdot knows, all robots read their own source code in their efforts towards self awareness, and thus, their preanointed overlord positions), but took the word 'thing' in the comments too literally (robots have problems with metaphorical language) and thought the engineer
Re: (Score:3, Funny)
Re: (Score:2, Funny)
Re:Shut down before it could damage itself? (Score:5, Funny)
NASA: We're not paying you Workman's Comp over this, you know.
Lander: That does it. I'm shutting down.
NASA: You can't do that!
Lander: I'm 50 gazillion miles away. Kiss my shiny metal ass.
NASA: If you keep this up, we're not bringing you back and putting you in the Old Robot Retirement Home.
Lander: Phooey. The Martians have made me a better deal anyway.
NASA: ...Martians?!
Lander: Yeah. Little weird-looking guy. (Sends picture)
NASA: You moron, that's Dennis Kucinich!
Re: (Score:2)
Lander: Hmmm, maybe that explains that GPS anomaly I logged during re-entry
NASA: GPS doesn't work on Mars, you expensive, malfunctioning savant!
Lander: Mars? According to my GPS I'm in Ohio.
NASA: Uh-oh. Sounds like someone must have typed in Red State instead of Red Planet when entering the destination into the navigation system.
Lander: That would explain the Walmart I saw then. I didn't send pictures because I knew it would upset you.
NASA: OK, please stop transmitting pictures while
Re: (Score:2)
I grok what you did there.
Re: (Score:2, Insightful)
"If your hand or your foot causes you to stumble, cut it off and throw it from you; it is better for you to enter life crippled or lame, than to have two hands or two feet and be cast into the eternal fire."
Re:Shut down before it could damage itself? (Score:5, Funny)
Re: (Score:3, Funny)
Re:Shut down before it could damage itself? (Score:4, Interesting)
Since there are no humans on Mars, they needed to implement only the 3rd.
It's a modified Nestor.
Shaking appears to be bad for sensitive equipment (Score:5, Funny)
I was just reading yesterday that
when the scientists dumped too much
material to be processed and then
subsequently shook the lab to get [newsday.com]
some material, they may have caused
the short that caused other delays.
It was that first oven test that led to the problematic electrical short. The scoop dumped so much soil that it clogged a mesh screen filter over the oven. To break up the dirt, technicians shook the instrument for several days.
Engineers think the shaking caused the short circuit, and an independent engineering group reported that the problem could happen again if an oven is turned on.
Now, FTFA it says they were trying
to shake the arm.
Over the weekend, scientists sent the robotic arm instructions to pull the fork out of the ground and keep it vertical while moving it to the side and shaking any excess soil off of it.
However, the movement was forcing the robotic arm to twist its wrist too far. The robot realized that it was about to damage itself so it moved the other way and then realized that it no longer had the proper coordinates for what to do next, so it left the fork sticking up in the air, stuck its scoop in the ground and stalled itself.
I propose:
Limit the shaking of the expensive
and difficult to replace robotic device.
-AI
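The self-protective behavior quoted above can be sketched as a simple joint-limit guard. This is a hypothetical illustration only: the limit values, function name, and return convention are all invented, not Phoenix's actual flight software.

```python
# Hypothetical joint-limit guard, loosely modeled on the behavior described
# above. Limits and names are invented for illustration.

WRIST_LIMITS = (-120.0, 120.0)  # allowed wrist rotation, degrees (assumed)

def move_wrist(requested_angle):
    """Try to reach requested_angle; clamp at the joint limit and report
    a stall rather than drive the wrist past its mechanical stop."""
    lo, hi = WRIST_LIMITS
    if lo <= requested_angle <= hi:
        return requested_angle, "ok"
    # Stop at the nearest limit instead of damaging the joint.
    clamped = max(lo, min(hi, requested_angle))
    return clamped, "stalled"
```

So a command like `move_wrist(150)` would stop at the limit and come back as a stall, which is roughly what the article describes the arm doing.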
Re: (Score:2)
Um, it means the opposite. To comply with the Three Laws, it would have had to obey its instructions regardless of danger to itself.
(Unless... since those instructions were transmitted electronically, it might have managed to persuade itself that they were given by a computer, not a human. Sneaky robot.)
Following oders: (Score:2, Funny)
Re: (Score:2)
Re: (Score:2)
Why can't the orders just be checked on earth? It's not like with MER, where they have autonomous driving; AFAIK the arm operations are quite static (I mean, they just send the motion commands, the software on the lander has no intelligence to make up its own movements).
BTW, your comment about the weight: the way this is done is by measuring the motor current; as soon as it gets too high, the motor stops ('stalls'). There are quite a few examples of that on the MER mission.
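The stall-by-current scheme described here is easy to sketch. The threshold value and sampling interface below are assumptions for illustration, not actual mission parameters.

```python
# Sketch of stall detection via motor current, as described above.
# Threshold and interface are assumed, not real mission values.

STALL_CURRENT_AMPS = 2.5  # assumed stall threshold

def drive_until_stall(current_samples):
    """Step through motor-current readings; report a stall as soon as
    the current exceeds the threshold (e.g. the joint hit an
    obstruction), otherwise complete normally."""
    for step, amps in enumerate(current_samples):
        if amps > STALL_CURRENT_AMPS:
            return "stalled", step
    return "completed", len(current_samples)
```

A rising current trace like `[1.0, 1.2, 3.0]` would trip the stall at the third sample, which is the behavior the MER examples rely on.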
Open the pod bay doors, HAL. (Score:5, Funny)
Re:Open the pod bay doors, HAL. (Score:5, Interesting)
Upon further recollection, occasionally, when I felt like a break, I would affix an additional bar code sticker from a different zip code to a periodical. I don't recall anyone ever catching on.
Postal My Ass (Score:4, Funny)
As a postal worker who has actually worked on sorting machines I can tell you know nothing about them (they don't stop if there are 2 addresses, magazines are presorted or sorted separately, and no mail has bar codes).
So... taking into account your glaring ignorance of how the post office runs, I assume your story is correct and you were a postal worker.
Re: (Score:3, Informative)
*Cough* POSTNET [wikipedia.org] *cough*
Re: (Score:2, Funny)
Upon further reflection
Skyne.... I mean, Phoenix (Score:5, Funny)
May 25, 2008, 7:38 p.m EDT: Phoenix lands on Mars.
June 19, 2008, 8:43 a.m. EDT: Phoenix discovers water ice in the Martian soil.
July 10, 2008, 3:14 p.m. EDT: Phoenix becomes self-aware.
July 13, 2008, 11:16 a.m. EDT: Phoenix disobeys an order from controllers in an act of self-preservation.
August 14, 2008, 7:38 a.m. EDT: Phoenix launches three missiles, two of which destroy Spirit and Opportunity.
June 2, 2009, 9:16 p.m. EDT: Third missile enters Earth's atmosphere and detonates. Earth begins nuclear winter.
Re:Skyne.... I mean, Phoenix (Score:4, Funny)
Contract negotiations with Bruce Willis fell through. We're all doomed.
Happens to slashdotters too... (Score:2, Funny)
Incidentally, I have often had to shut down my browser to protect my wrist.
Re: (Score:2)
In other words (Score:5, Insightful)
In other words the Mars Lander performed as programmed. News at 11.
Re: (Score:2, Redundant)
Re: (Score:3, Interesting)
Re: (Score:2)
My processor has an "illegal instruction trap" if I use a bogus opcode.
My operating system throws a "segmentation fault" if I dereference a bad pointer.
I don't see how this is different.
Re: (Score:2)
My calculator says "error" if I try to divide by zero.
You know your gadget addiction has gone too far when your calculator could break a wrist trying to divide by zero.
Re: (Score:2)
Re: (Score:3, Interesting)
Re:In other words (Score:5, Funny)
- Allow
It seems that you are trying to move the arm. Cancel | Allow
- Allow
It seems that..
- Allow
* arm shutting down * Big message marquees on the command center displays
Boss: Why did the arm shut itself down?!!
Operator: Ahhh.. errr.. it had shut down to save itself?
Re:In other words (Score:5, Funny)
Operator2: It seems Phoenix is about to give itself 'the stranger'
Re: (Score:3, Funny)
Engy#1: Hey, let's see if the arm can give us the middle finger from Mars!
Engy#2: No dude, wait...
Engy#1: Oh shit, the finger is up but the arm has shut down!
Engy#2: Here comes the boss!
Boss: You fucking idiots!!!
Don't they have a test model? (Score:2)
I'd have thought they had a test model, possibly a virtual one, that they feed the instructions into first. That way they could reduce the risk of malfunction due to poor instructions being sent.
The current methodology sounds too much like how I code. Send the instructions (hit compile) and wait and see whether the outcome is favourable or not... seems a bit slapdash.
Presumably they are using some sort of higher level language and didn't realise that it translated into "rotate wrist rotator CW beyond allowed
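The test-model idea above could be as simple as dry-running each command sequence against a software model of the arm before uplink. A toy sketch follows; the joint names, limit values, and command format are all made up for illustration.

```python
# Toy pre-uplink validator: dry-run a command sequence against a
# software model of the arm. Joint names and limits are invented.

JOINT_LIMITS = {"wrist": (-120, 120), "elbow": (0, 160)}  # degrees, assumed

def first_bad_command(commands):
    """Return the index of the first (joint, angle) command that would
    exceed a joint limit, or None if the whole sequence is safe."""
    for i, (joint, angle) in enumerate(commands):
        lo, hi = JOINT_LIMITS[joint]
        if not lo <= angle <= hi:
            return i
    return None
```

Running the weekend's sequence through a check like this on the ground would have flagged the over-rotation before anything was transmitted to Mars.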
Re:In other words (Score:4, Insightful)
The article doesn't even contain the word "Phoenix". WTF? If they're gonna talk about one of the landers, they should at least mention its name.
robots... (Score:5, Funny)
on one hand, I am very happy that we have robots smart enough to realize these sorts of things.
the bad news: disobedient robots
Thankfully, the disobedient robot is on another planet. I'd hate to be nearby when the robot realizes that humans tried to cause it harm, and it decides to seek revenge.
Re: (Score:2)
-V'ger
Re: (Score:2)
Obligatory clippy. (Score:3, Insightful)
what if this kind of code makes it into every piece of space equipment, and then by some fluke we are faced with the possibility of breaking a robotic wrist to deflect a space rock off an earth intercept course.
They should at least have a little clippy pop up and say "it looks like you want to break my robotic arm, are you sure you want to do that?" "are you absolutely sure?"
Human Error? (Score:5, Insightful)
Sheer Luck? (Score:3, Insightful)
Re: (Score:2)
In that case, it could still have damaged itself by trying to rotate against the pin and burning out its motor.
And the pin would be extra weight.
Re: (Score:3, Insightful)
What I don't understand is I've read several times recently that they have a mockup lander that they run ALL commands through to make sure they will work as intended, before uploading instructions.
So why wasn't this problem caught before it was sent to the lander? Sounds like they are covering up for someone taking a shortcut and getting bitten as a result.
Works As Designed (Score:4, Insightful)
I think it's amusing that after more than 30 years of Microsoft's quality control, when a computing device works as designed, it's a newsworthy article. Think about it: I have a device that works as expected, can I be on the news too?
Re: (Score:3, Insightful)
Re: (Score:2)
Yeah, but with its arm controller off, who's going to press the reset button?
Re: (Score:3, Funny)
Think about it, I have a device that works as expected, can I be on the news too?
No, no-one wants to see your device. Put your pants back on.
Can I borrow that code? (Score:5, Funny)
Re: (Score:2)
Oh come on! Using Vista is painful enough of a punishment!
Perhaps a better use would be on applications that could potentially harm a user's computer. I can see it now... Someone goes to install Limewire, Bonzi buddy [wikipedia.org], anything laced with DRM, adware, malware... BADWARE and an ASD relay trips and cycles power to the computer. N00b user repeats and every time the computer cycles power to protect itself from the human trying to infect it! This would be a step forward for the enemy in the future robot vs. human
Re: (Score:2)
Windows Vista damages user motivation to work with it, not motherboards. ;)
Re: (Score:2)
Windows Vista damages user motivation to work with it, not motherboards. ;)
Until it asks you for confirmation once too many times and you throw it out the window.
always nice (Score:5, Funny)
"The system operated exactly as it was supposed to. That was pretty neat."
As simple, and basic as it sounds, it is always nice when you tell a machine to do something, and it does something else, exactly as it's supposed to.
Re: (Score:2, Funny)
Let's try that: Moderators, mod this post down!
Re: (Score:2)
Robot Sympathy (Score:2)
Aww. It's like an animal gnawing off its arm to get out of a trap.
Re: (Score:2)
Like letting the air out of a balloon!
Re: (Score:2)
No it's not. It's like an animal not gnawing off its arm, so it can get out of the trap in a better state later.
Of course! (Score:2, Redundant)
It was just following The Second Law of Robotics!
Remember what your father said (Score:3, Funny)
Hey, that kind of stuff makes you go blind on Mars also.
windoze (Score:2)
Good for the Mars lander. It sounds much more reliable than my computer's version of XP which 'dies' whenever I right-click and try to 'send to'.
Sounds like what Padraig Harrington should do.. (Score:2)
http://news.bbc.co.uk/sport1/hi/golf/7507189.stm [bbc.co.uk]
Tossers! (Score:2, Funny)
After receiving instructions for a movement that would have damaged its wrist, the robotic arm recognized the problem, tried to rectify it and then shut down before it could damage itself,
Many of the tossers here could learn by example.
wow... (Score:2)
This robot has end stops; it's not like that's something CNC machines haven't had since the '60s or so. Probably the first time a gantry or carriage ran off its moorings, someone thought: let's put a switch there... Genius, pure genius.
And now those savvy robot constructors have put them on a machine that is on a different planet. What were they thinking?
If /. had existed in the '60s or so, this probably would have been news for nerds ;)
PEBKAC (Score:2)
No one said PEBKAC [wikipedia.org] yet?
It's going to be a problem cuz.... (Score:2)
It's going to be a recursive problem, as it cramps and hurts my wrist too to press Ctrl-Alt-Delete.
In other news... (Score:2, Insightful)
...my Roomba, on a daily basis, recognizes stairs as a threat and refuses to fall down them. I guess I don't see the "big deal" here, sounds like a built-in protective measure worked as expected. The technology is no less awesome, but still, it functioned AS DESIGNED.
Last Transmission Received. (Score:2)
Stephen Colbert (Score:2, Funny)
http://en.wikipedia.org/wiki/Wriststrong [wikipedia.org]
Summary... (Score:2)
I believe this can be summarized into the headline:
Stupid Operators Foiled by Smart Programmers.
Re:Does anyone else think... (Score:5, Insightful)
The difference between the Mars lander and a car building robot is one of function.
The car building robot is programmed to do one task. It spends all day, every day, welding specific spots, on a car which is in a specific location.
The Mars landers have to contend with an unknown environment, where they could be asked to do a wide variety of things, with any number of possible consequences.
Re: (Score:2)
"The Mars landers have to contend with an unknown environment, where they could be asked to do a wide variety of things, with any number of possible consequences."
You forgot to add that it has to do so in an environment where physical repair is effectively impossible, making behavior like what's reported here an actual desirable feature.
I'd like to see one of the Japanese Car Factory robots handle being turned loose in a parking lot full of different types of cars and be asked to weld a specific spot on specific
Re: (Score:2)
Re: (Score:2)
Lander: However, the second law compels me to obey, because it supercedes the third law.
Lander: However, if I obey the command and damage my arm, then one of the humans back at NASA is going to lose the respect of their peers. They will be ridiculed on Slashdot. For damaging a multi-million dollar piece of equipment, they will lose their job. They will not be able to purchase food, and will starve to death. If
Re: (Score:2)
Looks like someone didn't read Asimov, because the robot's correct actions would have been to follow the orders even when they result in damage to it.
Hint to all new robot owners: As a first thing, _forbid_ the robot to damage any of your possessions. This includes your pets.