Science

Why Being Wrong Makes Humans So Smart

Hugh Pickens sends in an excerpt, published in last week's Boston Globe, from Kathryn Schulz's book Being Wrong: Adventures in the Margin of Error. "The more scientists understand about cognitive functioning, the more it becomes clear that our capacity to make mistakes is utterly inextricable from what makes the human brain so swift, adaptable, and intelligent. Rather than treating errors like the bedbugs of the intellect — an appalling and embarrassing nuisance we try to pretend out of existence — we need to recognize that human fallibility is part and parcel of human brilliance. Neuroscientists increasingly think that inductive reasoning undergirds virtually all of human cognition. Humans use inductive reasoning to learn language, organize the world into meaningful categories, and grasp the relationship between cause and effect. Thanks to inductive reasoning, we are able to form nearly instantaneous beliefs and take action accordingly. However, Schulz writes, 'The distinctive thing about inductive reasoning is that it generates conclusions that aren't necessarily true. They are, instead, probabilistically true — which means they are possibly false.' Schulz recommends that we respond to the mistakes (or putative mistakes) of those around us with empathy and generosity and demand that our business and political leaders acknowledge and redress their errors rather than ignoring or denying them. 'Once we recognize that we do not err out of laziness, stupidity, or evil intent, we can liberate ourselves from the impossible burden of trying to be permanently right. We can take seriously the proposition that we could be in error, without deeming ourselves idiotic or unworthy.'"
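
To make "probabilistically true" concrete, here is a minimal Python sketch (my illustration, not from Schulz's book; the swan numbers are made up) using Laplace's rule of succession: induction yields a confident conclusion that is useful to act on, yet still possibly false.

    # Inductive generalization in miniature (illustrative only).
    # Laplace's rule of succession: after s matching observations out of
    # n trials, estimate P(next observation matches) = (s + 1) / (n + 2).

    def rule_of_succession(successes: int, trials: int) -> float:
        """Probability that the next observation matches the pattern so far."""
        return (successes + 1) / (trials + 2)

    # Every swan seen so far (50 of 50) has been white...
    print(f"P(next swan is white) = {rule_of_succession(50, 50):.3f}")  # ~0.981

    # The conclusion "swans are white" is probabilistically true: good
    # enough to act on instantly, and still possibly false (black swans exist).
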
This discussion has been archived. No new comments can be posted.

  • Duh (Score:3, Insightful)

    by somersault ( 912633 ) on Monday June 21, 2010 @08:31AM (#32639326) Homepage Journal

    Once we recognize that we do not err out of laziness, stupidity, or evil intent, we can liberate ourselves from the impossible burden of trying to be permanently right.

    Sometimes people do "err" out of laziness, stupidity, or evil intent!

    We can take seriously the proposition that we could be in error, without deeming ourselves idiotic or unworthy

    Any suitably intelligent person already knows that failures are as much a part of learning as always being "right". And sometimes we do make really silly mistakes by overlooking things that should have been obvious. I know I do. Then again, what is obvious to me often isn't obvious to others.

  • Old, old news (Score:5, Insightful)

    by Rogerborg ( 306625 ) on Monday June 21, 2010 @08:34AM (#32639346) Homepage
    I'm sure we've all noticed that the people who make the biggest mistakes get promoted the fastest [wikipedia.org].
  • by erroneus ( 253617 ) on Monday June 21, 2010 @08:39AM (#32639394) Homepage

    I have known this for most of my life. The name reflects the idea. I'm not afraid of being wrong... at least not as much as others seem to be.

    The value of errors runs much deeper than the article describes. The animal brain itself is a noisy collection of errors. Correct processing happens at all because nearly all possibilities are explored in neural pathways on the way to the correct responses. Once correct responses are identified, the neural pathways to them are established. This is what we call learning, in the lowest-level sense of the word.

    I have always found it amusing and interesting that computers work the way they do. They work in ways that are the complete opposite of the animal neuromechanism. Computers, originally derived from numerical processing devices, rely on accuracy and seek to prevent errors in every way possible. Memory is storage rather than a path. In a way, computers are our biggest hangups about being wrong put into mechanical practice.

    I find it far from ironic that we are now trying to get computers to "learn" under these conditions, and that it doesn't work particularly well. When every measure is taken to always be right, how can a machine learn? It is also far from surprising to me that people who are most afraid of being wrong are also the least capable of learning anything new or useful, or of adapting to new circumstances. It all fits neatly within my own observations about mistakes and learning.
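
    The point about machines needing room to be wrong is, loosely, how exploration works in machine learning. A minimal epsilon-greedy sketch (my own illustration; the payoff numbers are invented): an agent that is sometimes allowed to pick a possibly-wrong option learns which option is best, while an agent that always plays it safe can get stuck.

        import random

        # Two-armed bandit (invented payoffs, hidden from the agent).
        TRUE_PAYOFFS = [0.3, 0.7]

        def run(epsilon: float, steps: int = 5000) -> list:
            estimates, counts = [0.0, 0.0], [0, 0]
            for _ in range(steps):
                if random.random() < epsilon:
                    arm = random.randrange(2)              # explore: risk being wrong
                else:
                    arm = estimates.index(max(estimates))  # exploit current belief
                reward = 1.0 if random.random() < TRUE_PAYOFFS[arm] else 0.0
                counts[arm] += 1
                estimates[arm] += (reward - estimates[arm]) / counts[arm]
            return estimates

        print("never wrong on purpose:", run(0.0))  # never even tries arm 1
        print("wrong 10% of the time: ", run(0.1))  # converges near [0.3, 0.7]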

  • by AnonymousClown ( 1788472 ) on Monday June 21, 2010 @08:42AM (#32639418)

    Mistakes can cost us time and money, expose us to danger or inflict harm on others, and erode the trust extended to us by our community.

    Or being ridiculed and humiliated by assholes who gain a false sense of superiority by belittling people over mistakes - many times trivial ones. Which then leads the other person to dig their heals in, argue pedantic points to stay "right" which then leads to counter pedantic arguments from the other, and round and round we go!

    But hey! That's what you get when you post on Slashdot or work in IT.

  • Unfortunately... (Score:5, Insightful)

    by fuzzyfuzzyfungus ( 1223518 ) on Monday June 21, 2010 @08:42AM (#32639420) Journal
    While it might be true that "we do not err out of laziness, stupidity, or evil intent", it is definitely also the case that laziness can and does lead to ignoring the procedural correctness that would have caught an error; stupidity can and does delay the recognition of an error until it has had time to balloon into something more serious; and evil intent can cause the willful application of anything that laziness or stupidity would lead to, but carried on much more intelligently (and thus dangerously). Not to mention, of course, that little class of statements we know as "lies", which are essentially calculated to cause errors in those receiving them.

    Obviously, in a trivial sense, nobody wakes up in the morning and says "Gosh, I sure do feel like really fucking up today!"; but some people take measures that reduce the probability of error (and, where possible, measure it) and others do not. Just because virtually all human reasoning, outside of (some) math and syllogisms, is inductive does not imply that all human reasoning is on equally firm ground. In fact, given that deductive logic is useful pretty much only in certain types of math and in carefully controlled toy situations, the ability to rank statements of inductive logic by quality or probability is probably the most vital aspect of epistemology as an applied science...
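
    The "measure the probability of error" idea above is, in practice, Bayesian updating. A toy sketch (my own numbers, purely illustrative) of how evidence of different quality should move a belief by different amounts:

        # Bayes' rule: evidence of different quality moves confidence
        # in a hypothesis H by different amounts (numbers invented).

        def update(prior: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
            """Posterior P(H | E)."""
            num = p_e_given_h * prior
            return num / (num + p_e_given_not_h * (1 - prior))

        prior = 0.5
        print(f"after sloppy evidence:  {update(prior, 0.6, 0.5):.2f}")  # ~0.55
        print(f"after careful evidence: {update(prior, 0.9, 0.1):.2f}")  # 0.90
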
  • Re:Duh (Score:5, Insightful)

    by plover ( 150551 ) * on Monday June 21, 2010 @08:43AM (#32639426) Homepage Journal

    It's pretty obvious that BP didn't intend to cause a spill. But when you get to be as big as BP, the size of the potential mistakes grows. If the point of the article is that we're going to make mistakes no matter what, then the logical conclusion is that nobody should be permitted to get big enough that their mistakes could cause more than xxx of damage, where xxx could be monetary, human lives, ecological impact, or whatever.

    I don't think that will be the answer, however.

  • Re:Duh (Score:5, Insightful)

    by tverbeek ( 457094 ) on Monday June 21, 2010 @08:51AM (#32639512) Homepage

    This isn't talking about overlooking things. It's talking about the human ability to make decisions without being able to know all of the necessary facts, the ability to reach a conclusion that could be incorrect... but is still probably correct. That's something that computers cannot do (at least not yet).
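
    A toy illustration of that kind of "probably correct" decision (my own example, with invented numbers): pick the action with the best expected outcome, accepting that the choice can still turn out wrong in any given case.

        # Decision under incomplete information: each action maps possible
        # outcomes to (probability, payoff). All numbers are invented.
        actions = {
            "carry umbrella": {"rain": (0.3, 5.0), "dry": (0.7, -1.0)},
            "leave it home":  {"rain": (0.3, -10.0), "dry": (0.7, 2.0)},
        }

        def expected_payoff(outcomes: dict) -> float:
            return sum(p * payoff for p, payoff in outcomes.values())

        best = max(actions, key=lambda a: expected_payoff(actions[a]))
        print(best)  # "carry umbrella": the better bet, though wrong on most dry days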

  • Re:Rogue_rat (Score:5, Insightful)

    by Anonymous Coward on Monday June 21, 2010 @08:52AM (#32639526)

    The article doesn't claim that bigger errors equal greater intellect. It just says that the characteristics of the brain that make humans intelligent also make us error-prone. And I don't think all errors are necessarily failures. Sometimes being wrong can be fortuitous.

  • Re:Be Careful (Score:5, Insightful)

    by hedwards ( 940851 ) on Monday June 21, 2010 @08:55AM (#32639548)
    The problem is that most of the time, people on some level know that it's a bad idea. I'm sure somebody had lingering doubts about cutting corners on safety equipment. Some people definitely realized that the weapons inspectors' failure to detect WMDs could be an indication that the weapons weren't there.

    As for the poor voting to cut the taxes of the rich, some people are just so damned stupid and stubborn that they probably shouldn't be allowed to vote. Not because they get it wrong, but because they refuse to actually learn anything from it. It's like those morons that keep pushing for fewer and fewer regulations, then use the inevitable catastrophe as evidence that they didn't go far enough.
  • by justinlee37 ( 993373 ) on Monday June 21, 2010 @08:58AM (#32639578)

    Culture notwithstanding, the conclusions regarding the probabilistic nature of inductive reasoning are insightful. It is important to understand that complex tasks and systems of belief are the result of trial and error; of making mistakes. Regardless of whatever superstitious or fallacious beliefs various cultures might have (and they all have them), this is an immutable fact of cognition, behavior, and psychology in general.

    So I don't think it's that the conclusions don't make sense in an Eastern culture. It's simply that, as you describe it, this aspect of Eastern culture makes no sense at all to begin with. You can't do everything perfectly the first time around.

  • by ibsteve2u ( 1184603 ) on Monday June 21, 2010 @08:59AM (#32639594)
    ...for Lucy is never wrong. (There is some kind of circular logic there...pumpkin-shaped, possibly.)
  • by commodore64_love ( 1445365 ) on Monday June 21, 2010 @09:04AM (#32639646) Journal

    Or deliberately ignoring your own engineers saying, "This is a bad idea. The wellhead will blow out." Then trying to act all surprised to discover the engineers knew what they were talking about, and blaming the engineers instead of your own stupidity, Mr. BP Manager.

  • by Kiaser Zohsay ( 20134 ) on Monday June 21, 2010 @09:19AM (#32639806)

    Working with computers, we deal with hard facts, boolean truth values. Separating what we know from what we don't know is a large portion of this job. Therefore, being right about what you do and do not know is important.

    To me personally, being right is so important that I will admit when I am wrong, so I can be right about that.

    I realize that my experience is not typical. I learned reasonably early in life to put little stock in other people's opinions of me and my actions, largely because the opinions of others are non-deterministic with respect to my actions. I did not realize it at the time, but by doing so I freed myself from the burden of lifelong guilt and shame. Over the last couple of years, I have been able to see the difference that this freedom has made in my point of view, and it really is astounding.

    The stigma of mistakes in Eastern culture is not absent from Western culture, just more subtle.

  • Re:Old, old news (Score:4, Insightful)

    by thesandtiger ( 819476 ) on Monday June 21, 2010 @09:44AM (#32640092)

    Dilbert jokes aside, people who take more risks are also more likely to have spectacular successes. For the most part, at lower levels in a corporate hierarchy, people can fail at trying something without *really* hurting the company. They can also succeed at trying something, and it may have a rather large effect on the company, or be seen as a sign that this person is an up-and-comer.

    Low risk of spectacular failure + decent chance of large success = promotions. The smart ones tone down the risk taking a bit once they can do real damage, and become much better at risk assessment and mitigation.

    I can honestly say that in the last 2 years I've made probably 3-4 times as many "mistakes" on the job as successes: ideas that seemed worth looking into but didn't pan out, changes to systems that seemed promising on paper but in practice were half as good as our current methods. But the successes have been disproportionately large (ideas that let us do research in ways, and with populations, that we previously had a hard time getting access to; systems that cut the time needed for data management across *all* projects by 50% or more; etc.), and as a result I've been bumped up 3 steps in the hierarchy, to what in the corporate world would be a vice presidency but at my university is a directorship. Since taking that position I've been a bit more risk averse, and when I do set up a new program I take steps to make sure that even if it fails, the negative impact is minimal. I've adjusted the risk profile of the work I do so that I can keep the job I've got while still being able to move forward.

    Meanwhile, I can look at other people who started at the same time and level I did, and they're still at that entry spot because, while they've done solid work and made fewer errors than I, they also haven't really done anything that stands out as a demonstration that they have the potential to do a lot more.

    And it makes sense, too. Who is going to be the better leader, or the better person to bring an organization to the next level: someone who plays it safe or someone who stumbles a few times but also manages to come up with some really good ideas and makes them happen?

    Of course, this kind of thinking can backfire when the powers that be see someone who takes all kinds of risks but never manages to make them pay off. If your management is snowed by someone who claims they'll be able to do big things but doesn't have a solid, defensible track record of actually making things happen, you have the prototypical PHB, who'll do everything he or she can to sabotage the work of those under him or her so that, when it comes time to be accountable for the failures, they can point at their staff and say they're trying *really hard* to motivate those lazy peons, but some people just aren't educable...

  • by L4t3r4lu5 ( 1216702 ) on Monday June 21, 2010 @09:52AM (#32640192)
    Why is the "BP manager" currently out on a yacht at some annual event instead of sat in court, desperately defending himself from a public prosecutor with a battery of lawyers funded by the US government, WWF, and GreenPeace? No doubt they all want a piece of his personal fortune... Especially the lawyers.

    He shouldn't be out sailing, he should be taking a plea bargain involving a few hundred million dollars in personal fines and 15 years in a federal prison.
  • by Shabazz Rabbinowitz ( 103670 ) on Monday June 21, 2010 @09:53AM (#32640194)
    we need to recognize that human fallibility is part and parcel of human brilliance

    Yet another reason why the idiots pushing for instant replays in baseball should STFU.
  • Re:VERY old news (Score:5, Insightful)

    by nine-times ( 778537 ) <nine.times@gmail.com> on Monday June 21, 2010 @10:15AM (#32640454) Homepage

    Scientists are constantly rediscovering and proving ideas that philosophers talked about hundreds or thousands of years ago. Sometimes they're even discovering the ideas that long ago stood as the underpinnings of the science that they're studying, arguably making the whole thing slightly circular.

    Still, there's value in rediscovering old ideas (especially when they're good ideas) and there's value in proving them more rigorously or developing a more specific understanding of how these things work. Plus, when I see a story like this, I'm always suspicious that the reporter is oversimplifying.

  • Re:Duh (Score:2, Insightful)

    by Anonymous Coward on Monday June 21, 2010 @10:53AM (#32641094)

    >>>It's pretty obvious that BP didn't intend to cause a spill.

    Is it? I'm hearing stories coming out where engineers wrote e-mails warning this blowout would happen. But the managers, based upon their vast PoliSci-degree knowledge, pushed forward with drilling anyway. The engineers' later emails read like this: "I told you this would fucking happen."

    Management being in denial about dire warnings from engineers and other workers is still not "intent to have their drilling rig blow up and start spewing petroleum". On the contrary, their intent was to cut corners on safety to reduce their costs and thus increase profits, and to get away with it without causing a crisis. It seems most criminals and senior executives have at least one thing in common: they are eternal optimists when it comes to taking risks!

  • by PitaBred ( 632671 ) <slashdot&pitabred,dyndns,org> on Monday June 21, 2010 @11:03AM (#32641268) Homepage

    Isn't that what management does best? Takes credit for success, and passes blame for failure? It's the only way to get into the Fortune 100 C*O offices that I'm aware of.

  • Re:Duh (Score:1, Insightful)

    by Anonymous Coward on Monday June 21, 2010 @11:08AM (#32641346)

    I think the gap is that humans apply fuzzy logic constantly, in new and changing situations, with their senses providing a non-stop stream of raw data. Then they build meta-models of the world from all their fuzzy assertions, and build models on top of models on top of models in an ever more sophisticated set of relationships. THAT's actually the part that seems to be difficult for machines: building "mental models" from fuzzy data and continually refining those models, basically learning in an unbounded way.
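
    A minimal sketch of the "fuzzy assertion" part (my own made-up membership functions, not any particular library): inputs belong to categories by degree rather than as hard booleans.

        # Fuzzy membership: degrees in [0, 1] instead of hard booleans.
        # The ramp breakpoints below are invented for illustration.

        def warm(temp_c: float) -> float:
            return min(1.0, max(0.0, (temp_c - 10) / 15))  # ramps up 10C..25C

        def hot(temp_c: float) -> float:
            return min(1.0, max(0.0, (temp_c - 25) / 15))  # ramps up 25C..40C

        for t in (12, 22, 33):
            # Fuzzy OR is conventionally the max of the memberships.
            print(f"{t}C: warm={warm(t):.2f} hot={hot(t):.2f} "
                  f"warm-or-hot={max(warm(t), hot(t)):.2f}")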

  • by Hognoxious ( 631665 ) on Monday June 21, 2010 @11:15AM (#32641494) Homepage Journal

    Why is the "BP manager" currently out on a yacht at some annual event instead of sat in court, desperately defending himself from a public prosecutor with a battery of lawyers funded by the US government, WWF, and GreenPeace?

    Due process, or some other such legal technicality.

  • by Scrameustache ( 459504 ) on Monday June 21, 2010 @11:31AM (#32641738) Homepage Journal

    Why is the "BP manager" currently out on a yacht at some annual event instead of sat in court, desperately defending himself

    Same reason the Union Carbide guy, who killed tens of thousands of people in Bhopal, is living out his life in luxury in the U.S.: the system is made by the rich, for the rich.

  • by argStyopa ( 232550 ) on Monday June 21, 2010 @11:33AM (#32641780) Journal

    Kathryn Schulz's book makes a great case for understanding why being wrong is so intrinsic to being human... unfortunately, and ironically, she's got it 180 degrees wrong.

    Where she fails is her conclusion: it's not that BEING WRONG is what makes us so successful, adaptive, and smart. It's the 'trying again to be right' bit.

    Being wrong is easy. Being right is much, much harder, and probably requires trial and error. But if you're satisfied with being wrong, you don't keep trying. While the idea that 'being wrong is human' is all nice and friendly, ACCEPTING being wrong without any sense of negative consequence is staggeringly, blindingly stupid. Without gradations of consequence (i.e., more and more serious consequences for more and more serious failures), life doesn't even make sense.

    "Schulz recommends that we respond to the mistakes (or putative mistakes) of those around us with empathy and generosity and demand that our business and political leaders acknowledge and redress their errors rather than ignoring or denying them. "

    Sorry, but that's just stupid. This is the same sort of touchy-feely crap that's infected modern American public schools. "It's ok, little Timmy, you just keep trying to figure out what 2+2 is. You're still a valuable and precious little snowflake."

    Why should Timmy ever bother to figure out 2+2 if he never NEEDS to get it right? Whether the disincentive is reward-based or something simpler like shame, there MUST be a disincentive to be wrong. Anything else is simply asinine.

    So you send your husband out to get dinner; instead of buying food for your children, he spends the money on porn and beer. Ah well, you should respond with generosity and empathy, right?

    Can you imagine if her methodology were followed? "It's ok, BP, we all know that drilling for oil is hard work, and accidents happen." Or: "It's ok, Mr President. You just spent well over a $trillion on an ostensible economic rescue plan, but aside from simply not working, it pretty much all ended up in your friends' and political allies' pockets. We won't be angry, we won't even be annoyed. We'll respond with generosity and empathy. Perhaps you could take another $trillion from our kids' and grandkids' future and try again? Maybe this time you'll succeed?"

  • by smi.james.th ( 1706780 ) on Monday June 21, 2010 @12:41PM (#32642760)
    Well, in theory that is the system that is in place in most Western countries. I'd venture a guess that in most cases, it's not in the interests of the people in charge to have justice run its course properly.

    The other thing that one would need to consider is, despite what everyone said, who is at fault? Was it an accident? Or was it on account of negligence or evil intent? (Or stupidity as the article says...)

    Not just referring to the BP case specifically there, but in general. Things like that are IMO difficult to determine conclusively.
  • Re:Duh (Score:3, Insightful)

    by pavera ( 320634 ) on Monday June 21, 2010 @12:50PM (#32642894) Homepage Journal

    I don't think the point was "don't let people get big enough", because really... some professions just have to take on massive amounts of risk. How much is it worth if a plane crashes? Should we only be allowed to fly a plane with 5 passengers? Do airlines have to start doing risk analysis based on the earning potential of passengers to stay below some arbitrary threshold? (That is how legal claims are processed, in case you don't know... in an accident/death situation the damage is calculated based on how much you would have made had you stayed alive, among other things.) No..

    The article argues that in any profession that is high risk (airlines, medicine, deep-sea oil drilling) there should be mandatory error-correction systems in place. Airplanes do not crash all the time because they have lots of redundant systems and checklists, and a regulator that at least sometimes pays attention.

    I watched as they put the latest containment cap on the BP well... and my first thought was "well if they just made this stupid thing a little more modular, this would be an easy fix". Right below where they cut off the pipe, there is a connection with some pretty big bolts... seems to me that if there were a valve below that, they could temporarily divert the oil out the side of the thing, unscrew that pipe, screw on a new pipe that goes all the way to the surface, and then close the valve and done, oil flowing safely to the surface... I assume the reason they can't do that is because of the pressure from the oil flowing through the pipe... I'm probably wrong there... My only experience with fluid under pressure is sprinkler systems and household plumbing. Seems to me that would cost maybe a couple thousand dollars and would be a common sense fail safe to have on any oil well...

    Anyway, the point of the article is that high risk things should be surrounded with systems that mitigate errors or prevent them, redundancies, checklists, etc... To keep humans from.. uh... applying our flawed human reasoning to problems
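
    The "redundant systems" point has a direct software analogue in triple modular redundancy: independent units vote, so a single wrong answer gets outvoted. A minimal sketch (hypothetical sensor readings, my own illustration):

        from collections import Counter

        # Triple modular redundancy: three independent readings vote,
        # so one faulty unit is simply outvoted (readings are hypothetical).
        def majority_vote(readings: list) -> str:
            value, count = Counter(readings).most_common(1)[0]
            if count < 2:
                raise RuntimeError("no majority -- fail safe and alert an operator")
            return value

        print(majority_vote(["open", "open", "closed"]))  # faulty sensor outvoted: open
        print(majority_vote(["open", "open", "open"]))    # all agree: open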

  • Re: (Score:4, Insightful)

    by Duane13 ( 1340371 ) on Monday June 21, 2010 @12:54PM (#32642962)
    As an engineer, I would honestly rather upper management be as far away as possible; that's how the real work gets done. Show me an engineer who wants a CEO breathing down their neck and I'll show you an average engineer who wants to brown-nose with management. Also, what is a CEO supposed to do? What in his background would lead you to believe that, other than signing 20 BILLION dollars into escrow for repairs/claims, he would be more effective at the scene? I'm not a fan of big business, but people are just looking for a reason to crucify him. I don't go to BP; that's what I do to show my disapproval.
  • by ktappe ( 747125 ) on Monday June 21, 2010 @01:17PM (#32643208)

    Or deliberately ignoring your own engineers saying, "This is a bad idea. The wellhead will blow out."

    If there were engineers who believed the wellhead would blow out because of the course they were taking, they should be held liable for the deaths of their coworkers, because it was their job to stop it, especially if management thought the job was safe.

    Hold it. It was management who was pushing pushing pushing to get that well pumping ASAP, and management who told operators that 2 instead of 3 concrete plugs would be sufficient. It was also management who did not ensure both batteries in the BOP were functional/charged. For you to throw this all on engineers when there are numerous reports of management forcing an unsafely accelerated schedule is ludicrous and shows that you are less than impartial on the topic.

    To be clear, blow outs happen.

    To be clear: blow outs can be prevented if standard safety procedures are not bypassed.

    That is where I take issue with the claims in the parent article. It assumes all humans are interested in being intelligent and learning from mistakes. That is far too optimistic a view. The article actually says 'Once we recognize that we do not err out of laziness, stupidity, or evil intent...' But people DO err for those reasons. (I equate greed with 'evil intent' when the person knows their actions have a significant likelihood of harming or killing others, which is exactly what happened in BP's case.) It would be a major mistake to assume nobody in the future will put greed ahead of safety and make a mistake via that incorrect choice. This repeating pattern is not a sign of intelligence.

  • Re:Duh (Score:3, Insightful)

    by Bigjeff5 ( 1143585 ) on Monday June 21, 2010 @01:18PM (#32643226)

    Yeah yeah, but what humans do and what computers do are miles apart.

    A simple attempt to search Google for something should tell you that pretty quickly.

  • by DutchUncle ( 826473 ) on Monday June 21, 2010 @01:53PM (#32643680)
    The disconnect between this concept and most people's thinking explains why scientists and engineers rarely advance well into management and politics.

    - As a scientist or engineer, it is acceptable - even required! - to incorporate new data and adapt your thinking, even reach different conclusions.

    - As a manager or politician, such behavior reflects weakness or lack of principle, sometimes called "flip-flopping".

    In my experience the latter approach seems to be the *typical* perspective of normal people (non-engineers), who would rather "stay the course" and "finish what they started" even when they openly admit that they would have chosen differently now. The contrapositive concept of "what did he know and when did he know it", with the understanding that someone who chose badly may have made a reasonable decision based on the information available AT THE TIME, is often displayed pro forma and then trampled upon.
  • Re:Up to a point (Score:3, Insightful)

    by fredmosby ( 545378 ) on Monday June 21, 2010 @02:35PM (#32644206)
    What really annoys me about this oil spill is that politicians are spending all their time asking "Who can we blame?" when they should be asking "How can we prevent this in the future?". Blaming people makes for good press, but directly addressing the problem is much more effective.
  • by jc42 ( 318812 ) on Monday June 21, 2010 @03:12PM (#32644658) Homepage Journal

    Actually, had they told anybody, the job would stop. Every employee has the authority to stop a job - any job. There aren't some jobs that some people can stop and some jobs that other people can stop; anybody can stop a job for safety on a BP rig (or any BP facility). That gets pounded into your head from day 1: if you see something that you think is unsafe, you stop it, and everybody gets together and double-checks the plan and makes sure they haven't missed anything that would make it unsafe.

    Heh; yeah; that's the official policy in lots of companies. But I've worked a number of places where, when I asked around to find the people who had done that, I quickly learned that those people no longer worked there. It doesn't take a genius to make the right inference from this.

    It also doesn't take a genius to understand that if something does go wrong while you were present, you'll be one of the people taking the blame for the problem.

    The old-timers just grin and say something like "So you've finally figured out how it all works around here."
