Murphy's Law Rules NASA 274
3x37 writes "James Oberg, former long-time NASA operations employee, now journalist, wrote an MSNBC article about the reality of Murphy's Law at NASA. Interesting that the incident that sparked Murphy's Law over 50 years ago had a nearly identical cause as the Genesis probe failure. The conclusion: Human error is an inevitable input to any complex endeavor. Either you manage and design around it or fail. NASA management still often chooses the latter."
Re:Mark my words (Score:1, Insightful)
Re:Mark my words (Score:5, Insightful)
When a computer program crashes it's usually down to the human(s) who programmed it, and on the rare occasions it's a hardware glitch, it was humans who designed the hardware, so we're still to blame either directly or indirectly.
I suppose it's like the argument about whether it's bullets that kill, or the human who pulled the trigger.
Re:interesting but it's not really true (Score:5, Insightful)
Then I remember Apollo 1, that killed 3 astronauts, and Apollo 13, that nearly killed 3 more.
To invoke Heinlein, Space is a harsh mistress.
To invoke Sun Tzu, success in defense is not based on the likelihood of your enemy attacking. It is based on your position being completely unassailable.
Good Point (Score:5, Insightful)
This is a very good point, and I wish more people would realize it.
For software development, the application is: Just because you can write 200 lines of correct code does not mean you can write 2 * 200 lines of correct code. Always have someone verify your code (not yourself, because you read over your errors without noticing them).
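To put the parent's point in rough numbers (all of them assumed purely for illustration): if each line is independently correct with some probability, the chance the whole program is correct decays exponentially with length, so doubling the line count more than doubles your exposure to bugs. A quick Python sketch:

```python
# Illustrative only: assume each line is independently correct with
# probability p. The chance the whole program is correct is p ** n,
# which shrinks exponentially as the program grows.
p = 0.999  # assumed per-line correctness; real values vary wildly

for n in (200, 400):
    print(f"{n} lines all correct: {p**n:.1%}")  # roughly 81.9%, then 67.0%
```

The exact numbers are made up; the exponential shape is the point, and it is why a second pair of eyes pays off more the longer the program gets.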
That is NOT correct. (Score:5, Insightful)
This is hindsight at its best, and is the classic comment of bureaucrats who have no concept of what cutting-edge design is about. F1 race cars, racing sailboats, nuclear reactors: NO design is failsafe, and NO design is foolproof. Especially a one-off design that isn't mass-produced. Even mass-produced designs have errors, as the auto industry shows. It is a simple fact of life that engineers and managers balance cost and safety constantly.
What you SHOULD be comparing this against is other space agencies that launch a similar number of missions and satellites, i.e. other real-world examples.
Expecting perfection is not realistic.
NASA deserves more chances at failure (Score:2, Insightful)
There's a contradiction in the statement above.
That said, there's nothing wrong with building in redundancy and failsafes.
In space probes redundancy comes at the cost of number of unique mission goals and financial cost.
Sometimes you just have to eat the failure; that's what insurance is for. We in the public shouldn't expect NASA to have 100% failure-free (non-human) missions and then exact harsh punishment on them, which invariably gets passed down to the engineers rather than the management decision makers.
With the current attitude, the NASA of old would have been shut down in its first couple of years for wasting taxpayer money. Luckily there was competition with the Soviets.
Re:interesting but it's not really true (Score:3, Insightful)
Maybe we should just make lots of cheap, crappy probes and expect most of them to fail, instead of one really good, expensive one with the hope that it will succeed.
where would we be without mistakes... (Score:5, Insightful)
Recently I took a class on AI (insemination, not intelligence), and apparently the two biggest breakthroughs by Dr. Polge in preserving semen were due to mistakes. First, his lab mislabeled glycerol as fructose, and they were able to find a good medium for suspension. Second, he blew off finishing freezing semen to go get a few pints and didn't make it back to the lab until the next day, thus discovering that it was actually better not to freeze the stuff right away.
Mistakes are some of the best parts of science and life in general. It's better to try to make more mistakes (i.e. take risks) than to try to always be right (unless you are obsessive-compulsive).
Re:Mark my words (Score:4, Insightful)
Re:interesting but it's not really true (Score:5, Insightful)
Then you double-check the checkers, and so on... that's the point of the article... humans will err. Like Deming said: "you can't inspect quality into a process."
Human Factor (Score:4, Insightful)
Unless you consider the fact that in large organizations, the left hand typically has no clue what the right hand is doing. I work at Lockheed Martin, and I'm typically involved in situations where one group makes an improvement that none of the other groups know about; changes and decisions are poorly documented (if at all), so nobody knows where the process is going; people make poor decisions due to a lack of proper procedures from management; teams are not co-located; there is poor information about which people have the knowledge needed to solve a particular problem; or any number of other things that confuse the engineering process, to the detriment of the product. Most of these situations are caused by a lack of communication throughout the organization as a whole.
This is a serious problem, and it needs to be acknowledged by the people in a position to make a difference.
Re:interesting but it's not really true (Score:5, Insightful)
Now suppose this output is double-checked by another engineer, who also has a 5% chance of error. 95% of the first engineer's errors will be caught, but that still leaves a .25% chance of an error getting through both engineers.
No matter what the percentages, no matter how many eyes are involved, the only way to guarantee perfection is to have someone with a zero percent chance of error...and the chances of that happening are zero percent. Any other numbers mean that mistakes will occur. Period.
I remember reading a story somewhere about a commercial jetliner that took off with almost no fuel. There are plenty of people whose job it is to check that every plane has fuel... but each of them has a probability of forgetting. Chain enough "I forgots" together, and you have a plane taking off without gas. At the level of complexity we're dealing with in our attempts to throw darts at objects x·10^7 kilometers away, it is guaranteed that mistakes will propagate all the way through the process.
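The arithmetic above can be checked in a few lines (the 5% rate is the assumed figure from the comment): with independent errors, each extra checker multiplies the escape probability by another 5%, driving it down fast, but never to zero.

```python
# Assumed figure from the discussion: each engineer (and each checker)
# independently errs or misses an error 5% of the time.
p_err = 0.05

escape = p_err  # probability an error is made and no one has checked yet
for checkers in range(1, 4):
    escape *= p_err  # each independent checker must also miss it
    print(f"{checkers} checker(s): {escape:.6f} chance of escape")
# first line of output matches the 0.25% (0.0025) quoted above
```

Note that this assumes the checkers' mistakes are independent, which, as other posters point out, is a big assumption.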
Nasty Remark (Score:3, Insightful)
I find this remark very unfair. It has a really nasty, snide attitude to it, like "we are perfect; why can't you be?"
Come on guys, NASA is trying to do some really difficult and groundbreaking stuff here. Cut them some slack.
Re:That is NOT correct. (Score:3, Insightful)
But this isn't about design. It's about implementation. In each of the examples, the failure occurred because of incorrect assembly of key components.
Having said that, there IS an issue of design brought up by the article: the design of a system should not allow for a catastrophic configuration. In several examples, failure occurred when sensors (accelerometers) were installed backwards. Those devices should have been designed with some sort of keying system that only allows installation in the intended orientation. Heck, one of the accelerometers' configurations could only be determined by x-raying the device!
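The "keying" idea has a direct software analogue: make the wrong configuration impossible to express, rather than checking for it after assembly. A hypothetical sketch (all names here are invented for illustration):

```python
# Hypothetical model of a keyed connector: there is deliberately no
# "ReversedMount" type, so a backwards installation cannot even be
# written down, let alone assembled.

class ForwardMount:
    """The only mount that exists; its 'key' is the type itself."""

def install_accelerometer(mount: ForwardMount) -> str:
    # The runtime check mirrors the physical key: a wrong part
    # simply does not fit.
    if not isinstance(mount, ForwardMount):
        raise TypeError("part does not fit: wrong key")
    return "installed, sensing axis forward"

print(install_accelerometer(ForwardMount()))
```

The design choice is the same in hardware and software: shrink the space of expressible configurations until the catastrophic ones aren't in it.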
armchair rocket science (Score:5, Insightful)
For anyone wanting to yack about poor performance... put your money where your mouth is. I just get sick of all the constant nagging.
Re:interesting but it's not really true (Score:2, Insightful)
Re:That is NOT correct. (Score:5, Insightful)
You only get to play the hindsight card the first time this kind of screw-up happens. If you actually read the article, you'll see that Oberg (who isn't a bureaucrat but a 22-year veteran of mission control and one of the world's experts on the Russian space program) is indicting NASA for having a management structure that leads to technical amnesia: the same type of oversight failure keeps happening again and again.
Oberg is not alone in this. The Columbia Accident Report despairingly noted the similarities between Columbia and Challenger: both accidents were caused by poor management, but what was worse with Columbia was that NASA had failed to really internalise the lessons of Challenger, or heed the warning flags about management and technical problems raised by countless internal and external reports.
Sure, space is hard. But it's not helped by an organization that has institutionalised technical amnesia and abandoned many of its internal checks and balances (at least this was the case at the time of the Columbia report; maybe things have changed).
And if you really want to compare against other agencies, NASA's astronaut body count does not compare favorably against the cosmonaut body count...
Sadly, your post is a classic comment by Slashdotters who have no concept of what effective technical management of risky systems looks like. (Hint: not all cutting-edge designs get managed the same way. There's a difference between building racing sailboats and spaceships. This is detailed in the Columbia accident report. Read it and get a clue.)
You'd think so. (Score:3, Insightful)
But I guess it sort of applies to your software analogy as well. A few companies have discovered that it's cheaper to have paying customers find the flaws in their software than to do any kind of formalized testing before release.
Re:we're living in an impefect world (Score:4, Insightful)
I'm not sure I buy that completely. While it certainly would help to have a single SME go over the entire vehicle, I doubt such a person could exist and complete the checks in a reasonable amount of time. The guy who checks the computer code is probably not going to be an expert in metal fatigue, nor electrical engineering. Even if you could find some sort of uber-genius who had expert knowledge of every system, he or she would have to work serially. If they started at component "1" of 654224166 and went down the line in order, the checks they started with would be out of date by the time they finished.
Re:You'd think so. (Score:3, Insightful)
Not only that, but it's actually beneficial to produce and ship buggy software. Bugs have to be fixed, and who can fix them better than the people who wrote the code? So, it makes sense for programmers to leave flaws in their programs. Companies that ship flawed products can make customers pay for upgrades that also fix bugs, or get good karma by providing bugfixes for free. In the process they get publicity, and the world can see they're not sitting still and their products have not been abandoned.
Re:Mark my words (Score:3, Insightful)
In testing, you WANT it to fail! (Score:3, Insightful)
NASA does test everything. He didn't mention it in the article, but I would be almost certain that the accelerometers were tested and passed the tests, but that the tests themselves were improper.
Re: interesting but it's not really true (Score:3, Insightful)
That doesn't follow. It's only true if the two errors are completely independent, which is a very big 'if'. In practice, the chances are that some types of error are more likely than others, and that the processes/standards/ways of thinking which are common to both will also affect the types of errors they make. All of which makes it more likely that if one engineer has made a particular mistake, another engineer might make that same mistake as well. So the second engineer will catch less than 95% of the first engineer's errors -- maybe a lot less.
Of course, things are far better when there's little or no commonality between the two engineers -- different companies, processes, methods, approaches, and cultures will all help to make their work independent, and help to reduce the errors that get missed. Anyone know if NASA does anything like this?
I believe it's common practice in some mission-critical situations to use three different systems, each built from the ground up by three entirely separate groups of people, with nothing but the specification in common, for exactly this reason.
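The effect of shared blind spots is easy to simulate (the 5% error rate and the correlation values below are assumed, not measured): when reviewer B tends to miss exactly what reviewer A missed, the escape rate climbs well above the naive 0.25%.

```python
import random

def escape_rate(shared: float, p: float = 0.05, trials: int = 100_000) -> float:
    """Fraction of bugs that slip past both reviewers.

    `shared` is the extra chance B misses a bug that A missed,
    modeling common training/process (0.0 = fully independent).
    """
    rng = random.Random(42)  # fixed seed for repeatability
    escapes = 0
    for _ in range(trials):
        a_misses = rng.random() < p
        # If A missed it, B's miss probability rises toward 1 as the
        # shared blind spot grows; otherwise B misses at the base rate.
        p_b = shared + (1 - shared) * p if a_misses else p
        if a_misses and rng.random() < p_b:
            escapes += 1
    return escapes / trials

print(escape_rate(0.0))  # near the naive 0.05 * 0.05 = 0.0025
print(escape_rate(0.5))  # shared blind spots: roughly ten times worse
```

Which is exactly why separate companies, processes, and cultures (as in the three-version practice above) buy you more than a second reviewer from the same team.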
It's not number of errors caught but importance (Score:4, Insightful)
Most people choose the latter. (Score:2, Insightful)
If you want to see this in action, find your favorite developer and ask the following: "What does your program do, and how does it do that?" Prepare for a long response.
Then ask: "How does it break?" You will most likely get a blank look. You may get a list of things the program doesn't do (missing or removed features), or possibly a list of known bugs, but you will almost never get an answer detailing the failure modes of the program itself. That is, they will not be able to tell you what happens when various assumptions are wildly wrong.
Answering those sorts of questions requires thinking in the negative (not necessarily negative thinking), which is an entirely different mode of thought. It's also much less pleasant. After all, considering the destruction of the beautiful thing you've built is not a psychologically easy task.
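Thinking in the negative can be made concrete by listing, right next to a function, what happens when each of its assumptions fails. A small sketch (the function and its failure list are invented for illustration):

```python
def mean_reading(readings):
    """Average of sensor readings.

    Failure modes (the "how does it break?" list):
    - empty input       -> cryptic ZeroDivisionError; raise a clear error instead
    - non-numeric entry -> crash halfway through the sum; reject it up front
    """
    if not readings:
        raise ValueError("no readings to average")
    if not all(isinstance(r, (int, float)) for r in readings):
        raise TypeError("non-numeric reading")
    return sum(readings) / len(readings)

print(mean_reading([1.0, 2.0, 3.0]))  # 2.0
```

The interesting part isn't the three lines of happy path; it's that the docstring forces the author to answer "how does it break?" before anyone else has to ask.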
Re:That is NOT correct. (Score:3, Insightful)
You've got good points, but you're being unfair on this one. Even Rutan notes [thespacereview.com] that the X-15's capabilities far outstrip SpaceShipOne's. Also, the X-15 provided some of the basic building blocks in aeronautics and astronautics on which SpaceShipOne could be built. Furthermore, SpaceShipOne enjoyed numerous high-performance off-the-shelf materials that didn't exist in the X-15's time. Comparing the two provides some interesting historical perspective, but the two programs are apples and oranges.
Easy to answer... (Score:2, Insightful)
These people are quite *insane*. They may be brilliant, but still bonkers. They have the most power and money of everyone on the planet. They hire the smartest people they can find and research advanced weaponry. All governments spend a huge amount of time, money, and resources on this. They hire the smartest scientists and engineers they can find for this task. Then they hire the people who are psychologically and intellectually the most prone to use the devices that the scientists and engineers create. These people are given more power than "ordinary" citizens; they are tasked with killing people and breaking people's things using these advanced machines. This weaponry, consisting of mechanical machines augmented with electrical and chemical advances, is *exactly* designed to "harm humans", and they DO harm humans with this machinery. It happens every day around the globe, by the thousands. Literally thousands of humans a day are killed, and many more horribly mutilated and injured. And the way the system has evolved, it is rigged to always have the megalomaniacs wind up "in charge", and all populations have a certain percentage of "ask no questions" order followers.
So, stuff happens; evil, wicked, nasty, horrible, screaming stuff. This leads to this "fear", which isn't in the least bit an irrational fear for anyone sane to have. It's because it's reality.
Lately, we read that they want to automate and robotize this even further, and take these machines as far as they can push them, with near-unlimited budgets and millions of man-hours of advanced research. It is not a "tinfoil hat" phenomenon for folks to notice that. We also have a verifiable track record showing that yes indeed, these megalomaniacs' tame scientists and engineers and order followers screw up; we get what are called "unintended consequences" and "collateral damage", as if the intended consequences and planned-for damage weren't bad enough. So we ordinary humans all around the globe, who really don't have a beef with Joe over there, all get these "benefits", and we notice that we don't want those sorts of benefits, but there's not a thing we can do about it, because this advancing technology system is rigged in favor of those who like, enjoy, and profit from doing harm.
You see, we DO have a lot of at least technically "competent people programming the machines"; the problem is, the machines ARE designed to harm. And it's set up to be self-perpetuating and self-advancing, and is based from the get-go on forced wealth transference, i.e. "theft", and it goes downhill from there into ever worse things.
Re: interesting but it's not really true (Score:3, Insightful)
Obviously, the trick is to minimize the odds, but you can't eliminate them.
Re:interesting but it's not really true (Score:5, Insightful)
Second, the stable of competent contractors that existed in the 1940-1960 time frame is gone. North American, Grumman, McDonnell, and dozens of others that could be named have been absorbed into 2-3 Borg-like entities. The result is less competition, less choice, less innovation, fewer places for maverick employees to go, and in the end worse results from outsourcing.
sPh
Re:That is NOT correct. (Score:3, Insightful)
HOW IT HAPPENS (Score:4, Insightful)
Re:Mark my words (Score:3, Insightful)
Funny. When I read the article, I had exactly the same sentiment, but for the opposite reason:
"As long as humans build/program the machines, the machines will fail/crash before they can kill too many people" :)