Science

Why Being Wrong Makes Humans So Smart

Posted by kdawson
from the something-so-right dept.
Hugh Pickens sends in an excerpt, published in last week's Boston Globe, from Kathryn Schulz's book Being Wrong: Adventures in the Margin of Error. "The more scientists understand about cognitive functioning, the more it becomes clear that our capacity to make mistakes is utterly inextricable from what makes the human brain so swift, adaptable, and intelligent. Rather than treating errors like the bedbugs of the intellect — an appalling and embarrassing nuisance we try to pretend out of existence — we need to recognize that human fallibility is part and parcel of human brilliance. Neuroscientists increasingly think that inductive reasoning undergirds virtually all of human cognition. Humans use inductive reasoning to learn language, organize the world into meaningful categories, and grasp the relationship between cause and effect. Thanks to inductive reasoning, we are able to form nearly instantaneous beliefs and take action accordingly. However, Schulz writes, 'The distinctive thing about inductive reasoning is that it generates conclusions that aren't necessarily true. They are, instead, probabilistically true — which means they are possibly false.' Schulz recommends that we respond to the mistakes (or putative mistakes) of those around us with empathy and generosity and demand that our business and political leaders acknowledge and redress their errors rather than ignoring or denying them. 'Once we recognize that we do not err out of laziness, stupidity, or evil intent, we can liberate ourselves from the impossible burden of trying to be permanently right. We can take seriously the proposition that we could be in error, without deeming ourselves idiotic or unworthy.'"

Comments Filter:
  • Rogue_rat (Score:4, Funny)

    by RogueRat (1710322) on Monday June 21, 2010 @08:24AM (#32639272)
    Interesting way of looking at our failures. So... let's see if BP uses this to prove their genius.
    • Don't worry. They probably learned a lot from it.
    • Re:Rogue_rat (Score:5, Insightful)

      by Anonymous Coward on Monday June 21, 2010 @08:52AM (#32639526)

      The article doesn't claim that bigger errors equal greater intellect. It just says that the characteristics of the brain that make humans intelligent also make us error-prone. And I don't think all errors are necessarily failures. Sometimes being wrong can be fortuitous.

      • VERY old news (Score:5, Interesting)

        by Brain-Fu (1274756) on Monday June 21, 2010 @09:38AM (#32640018) Homepage Journal

        David Hume [wikipedia.org] pointed all of this out hundreds of years ago. And he backed up all his claims with plenty of evidence that was readily available at the time.

        I wonder if Kathryn Schulz is aware of this?

        • Re:VERY old news (Score:5, Insightful)

          by nine-times (778537) <nine.times@gmail.com> on Monday June 21, 2010 @10:15AM (#32640454) Homepage

          Scientists are constantly rediscovering and proving ideas that philosophers talked about hundreds or thousands of years ago. Sometimes they're even discovering the ideas that long ago stood as the underpinnings of the science that they're studying, arguably making the whole thing slightly circular.

          Still, there's value in rediscovering old ideas (especially when they're good ideas) and there's value in proving them more rigorously or developing a more specific understanding of how these things work. Plus, when I see a story like this, I'm always suspicious that the reporter is oversimplifying.

    • It's not up to BP to use this. It's up to us to make damn sure that, no matter how big you are, skirting around laws and safety procedures is not possible.
  • Duh (Score:3, Insightful)

    by somersault (912633) on Monday June 21, 2010 @08:31AM (#32639326) Homepage Journal

    Once we recognize that we do not err out of laziness, stupidity, or evil intent, we can liberate ourselves from the impossible burden of trying to be permanently right.

    Sometimes people do "err" out of laziness, stupidity of evil intent!

    We can take seriously the proposition that we could be in error, without deeming ourselves idiotic or unworthy

    Any suitably intelligent person already knows that failures are as much a part of learning as always being "right". And sometimes we do make really silly mistakes by overlooking things that should have been obvious. I know I do. Then again, often what is obvious to me, isn't to others..

    • stupidity of evil intent!

      *cough* that wasn't a typo.. it was a moral judgement. Yes.

    • Re:Duh (Score:5, Insightful)

      by plover (150551) * on Monday June 21, 2010 @08:43AM (#32639426) Homepage Journal

      It's pretty obvious that BP didn't intend to cause a spill. But when you get to be as big as BP, the size of the potential mistakes grows. If the point of the article is that we're going to make mistakes no matter what, then the logical conclusion is that nobody should be permitted to get big enough that their mistakes could cause more than xxx of damage, where xxx could be monetary, human lives, ecological impact, or whatever.

      I don't think that will be the answer, however.

      • Re: (Score:3, Informative)

        >>>It's pretty obvious that BP didn't intend to cause a spill.

        Is it? I'm hearing stories coming-out where engineers wrote e-mails warning this blowout would happen. But the managers, based-upon their vast PoliSci degree knowledge, pushed forward anyway with drilling. Later engineers' emails read like this: "I told you this would fucking happen."

        • Re: (Score:2, Insightful)

          by Anonymous Coward

          >>>It's pretty obvious that BP didn't intend to cause a spill.

          Is it? I'm hearing stories coming-out where engineers wrote e-mails warning this blowout would happen. But the managers, based-upon their vast PoliSci degree knowledge, pushed forward anyway with drilling. Later engineers' emails read like this: "I told you this would fucking happen."

          Management being in denial about dire warnings from engineers and other workers is still not "intent to have their drilling rig blow up and start spewing petroleum". On the contrary, their intent was to cut corners on safety to reduce their costs and thus increase profits, getting away with it without causing a crisis. It seems most criminals and senior executives have at least one thing in common: they are eternal optimists when it comes to taking risks!

      • Re: (Score:3, Insightful)

        by pavera (320634)

        I don't think the point was "don't let people get big enough" 'cause really... some professions just have to take on massive amounts of risk. How much is it worth if a plane crashes? Should we only be allowed to fly a plane with 5 passengers? Do airlines have to start doing risk analysis based on the earning potential of passengers to stay below some arbitrary threshold (that is how legal claims are processed, in case you don't know... in an accident/death situation the damage is calculated based on how muc

    • Re:Duh (Score:5, Insightful)

      by tverbeek (457094) on Monday June 21, 2010 @08:51AM (#32639512) Homepage

      This isn't talking about overlooking things. It's talking about the human ability to make decisions without being able to know all of the necessary facts, the ability to reach a conclusion that could be incorrect... but is still probably correct. That's something that computers cannot do (at least not yet).

      • Re:Duh (Score:4, Interesting)

        by somersault (912633) on Monday June 21, 2010 @09:01AM (#32639608) Homepage Journal

        the ability to reach a conclusion that could be incorrect... but is still probably correct.

        That sounds a lot like fuzzy logic [wikipedia.org] to me..

      • Re:Duh (Score:4, Informative)

        by dominious (1077089) on Monday June 21, 2010 @09:44AM (#32640090)

        That's something that computers cannot do (at least not yet).

        Wake up and smell the coffee:
        http://en.wikipedia.org/wiki/Naive_Bayes_classifier [wikipedia.org]

        Also, search for Machine Learning, Statistical Learning Theory, Artificial Intelligence, Neural Networks, Fuzzy Logic, Support Vector Machines, etc.
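        For the curious, the "probably correct but possibly wrong" guessing being discussed maps fairly directly onto something like a naive Bayes classifier. Here is a minimal sketch in Python; all the training data and labels are invented for illustration:

```python
from collections import Counter
import math

# Toy training data: (text, label). Entirely made up for illustration.
train = [
    ("win money now", "spam"),
    ("free money offer", "spam"),
    ("meeting notes attached", "ham"),
    ("lunch meeting tomorrow", "ham"),
]

# Per-class word counts and class priors.
word_counts = {"spam": Counter(), "ham": Counter()}
class_counts = Counter()
for text, label in train:
    class_counts[label] += 1
    word_counts[label].update(text.split())

vocab = {w for counts in word_counts.values() for w in counts}

def log_posterior(text, label):
    # log P(label) + sum of log P(word | label), with Laplace smoothing
    # so unseen words don't zero out the whole product.
    total = sum(word_counts[label].values())
    lp = math.log(class_counts[label] / sum(class_counts.values()))
    for w in text.split():
        lp += math.log((word_counts[label][w] + 1) / (total + len(vocab)))
    return lp

def classify(text):
    # Pick the label with the higher posterior: a probabilistic best
    # guess that can be wrong, just like inductive reasoning.
    return max(word_counts, key=lambda lbl: log_posterior(text, lbl))

print(classify("free money"))        # -> spam
print(classify("meeting tomorrow"))  # -> ham
```

        The classifier never *knows* the answer; it commits to whichever conclusion is more probable given what it has seen, which is exactly the trade-off the article describes.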

        • Re: (Score:3, Insightful)

          by Bigjeff5 (1143585)

          Yeah yeah, but what humans do and what computers do are miles apart.

          A simple attempt to search Google for something should tell you that pretty quickly.

      • by TheLink (130905)
        But computers can do that. Especially computers programmed by humans.

        What computers still aren't so good at is automagically making models/simulations of the world, to the extent of including "others" and "self", and using those models to help decide what to do.

        Even many animals can do that - they may not be as good as humans in some ways but they are far better than current AIs.

        I believe much human perception is anticipation, your brain keeps the simulations running, and if the world keeps matching well eno
    • Re:Duh (Score:5, Interesting)

      by edumacator (910819) on Monday June 21, 2010 @09:18AM (#32639790)

      I think the potential benefit isn't for those who are confidently intelligent. They see mistakes as a means of learning. The real benefit is for people who are tremendously insecure. They see mistakes and try to explain them away, or blame them on something else, negating the possible positive benefit of seeing why the mistake happened. For instance, they may have overlooked something. Instead of noticing that and learning to look for it the next time, they shy away from looking at the fault in detail.

      I see this kind of thing all the time with my students. They misread something, and if I comment on it, no matter how nicely, they shut down because they don't like being wrong; they think it makes them seem stupid. In reality, they are trying to use inductive reasoning, which is a huge part of my goal. But... they miss the learning opportunity when they close down.

      This article will make its way into my introductory lessons now. It will supplement the big sign on my door that says, "There is nothing wrong with being wrong."

    • Sometimes people do "err" out of laziness, stupidity of evil intent!

      Well I think there's also another issue: When a mistake is serious enough, it might not matter whether it was made out of laziness, stupidity, or evil intent. It might be a sign that the person who made the mistake isn't capable or qualified. In that case, you may want to remove that person from their position of responsibility and find someone else who can do the job.

      There's another issue: sometimes people don't learn from their mistakes. For various reasons, people sometimes repeat the same mistakes o

  • by Anonymous Coward on Monday June 21, 2010 @08:32AM (#32639328)

    I'm never wrong.
    I thought I was once, but it turns out I wasn't.

  • Old, old news (Score:5, Insightful)

    by Rogerborg (306625) on Monday June 21, 2010 @08:34AM (#32639346) Homepage
    I'm sure we've all noticed that the people who make the biggest mistakes get promoted the fastest [wikipedia.org].
    • by Exitar (809068)

      And CEOs that bankrupt their companies become CEOs of larger companies.

    • Re: (Score:2, Informative)

      by Anonymous Coward

      This was foreseen by Laurence Peter as "in a hierarchy every employee tends to rise to his level of incompetence." [http://en.wikipedia.org/wiki/Peter_Principle]

      • Re:Old, old news (Score:4, Informative)

        by dkleinsc (563838) on Monday June 21, 2010 @10:48AM (#32641004) Homepage

        Well, not exactly. The difference between the Peter Principle and the Dilbert Principle is that the Peter Principle has to do with the promotion of people who are competent at their current job, while the Dilbert Principle is concerned with the promotion of people who are incompetent at their current job.

        If you actually read "The Peter Principle" (which is quite funny as well as insightful), you'll find that Laurence Peter describes this phenomenon as "Percussive Sublimation" (a.k.a. being kicked upstairs). He also describes one case in which the company in question, whose operations were based in LA, created a new "Head Office" in New York and promoted all the useless people to it. As he describes it, the people in the Head Office are busy drafting memos, scheduling meetings, conferring with each other, etc., while everyone back in LA actually gets the work done without having to worry about all the drones.

    • Re:Old, old news (Score:4, Insightful)

      by thesandtiger (819476) on Monday June 21, 2010 @09:44AM (#32640092)

      Dilbert jokes aside, people who take more risks are going to be more likely to have spectacular successes as well. For the most part, at lower levels in a corporate hierarchy, people can fail at trying something but it generally can't *really* hurt the company. They can also succeed at trying something, and it may have a rather large effect on the company, or be seen as signs that this person is an up and comer.

      Low risk of spectacular failure + decent chance of large success = promotions. The smart ones tone down the risk taking a bit once they can do real damage, and become much better at risk assessment and mitigation.

      I can honestly say that in the last 2 years I've made probably 3-4 times as many "mistakes" on the job (ideas that seemed worth looking into but didn't pan out, changes to systems that seemed promising on paper but actually were 1/2 as good as our current methods in practice) for every success I've had. But the successes have been disproportionately large (ideas that allowed us to do research in ways/with populations that we previously had a hard time getting access to, implementation of systems that cut the amount of time needed to do data management across *all* projects by 50% or more, etc.) and as a result I've been bumped up 3 steps in the hierarchy to what in the corporate world would be a vice presidency but at my university is a directorship. And since I've taken on that position I've been a bit more risk averse, and when I do set up a new program I take steps to make sure that even if it fails the negative impact is minimal - I've adjusted the risk profile of the work I do so that I can now keep the job I've got, while still being able to move forward.

      Meanwhile, I can look at other people who started at the same time and level I did, and they're still at that entry spot because, while they've done solid work and made fewer errors than I, they also haven't really done anything that stands out as a demonstration that they have the potential to do a lot more.

      And it makes sense, too. Who is going to be the better leader, or the better person to bring an organization to the next level: someone who plays it safe or someone who stumbles a few times but also manages to come up with some really good ideas and makes them happen?

      Of course, this kind of thinking can backfire when the powers that be see someone who takes all kinds of risks but never manages to make them pay out. If your management is snowed by someone who claims they'll be able to do big things but doesn't have a solid, defensible track-record of actually making things happen, you have the prototypical PHB who'll do everything he or she can to sabotage the work those under him or her do so that when it comes time to be accountable for the failures they can point at their staff and say they're trying *really hard* to motivate those lazy peons, but some people just aren't educable...

      • by Rogerborg (306625)
        Counterpoint: the significant difference between you, and the smarter guys who are still quietly cleaning up your messes, is that your failures got you noticed.
  • Be Careful (Score:3, Interesting)

    by sonicmerlin (1505111) on Monday June 21, 2010 @08:39AM (#32639384)

    I think people focus their criticism more on those who make errors that seem glaringly obvious to everyone else. We tend to call those "stupid" errors. It's true, however, that people tend to become far too critical of others who can't reach, as quickly, the conclusions we have already come to.

    On the other hand, there are obvious mistakes that should not be conflated with probabilistic errors due to inductive reasoning. When the heads of BP cut corners that result in a giant explosion, a several month long oil leak, and billions of dollars in damage to the environment and people's lives, we can attribute that to gross negligence.

    When a politician decides to engage in 2 costly wars while lowering taxes for the rich, or when a majority of society elects politicians who repeatedly punish the poor and middle class while rewarding the rich, and then complain about not having enough money to support their expensive lifestyles, you can attribute that to stupidity.

    • Re:Be Careful (Score:5, Insightful)

      by hedwards (940851) on Monday June 21, 2010 @08:55AM (#32639548)
      The problem is most of the time people on some level know that it's a bad idea. I'm sure somebody had lingering doubts that cutting corners on safety equipment was a bad idea. Some people definitely realized that the absence of WMDs detected by the weapons inspector could be indicative of them not being there.

      As for the poor voting to cut the taxes of the rich, some people are just so damned stupid and stubborn that they probably shouldn't be allowed to vote. Not because they get it wrong, but because they refuse to actually learn anything from it. It's like those morons that keep pushing for fewer and fewer regulations, then use the inevitable catastrophe as evidence that they didn't go far enough.
    • Re: (Score:2, Funny)

      >>>When a politician decides to engage in 2 costly wars while lowering taxes for the rich

      What about a politician that drives the national debt from 10.5 trillion (105,000 per US household, approximately) to 13 trillion (~$130,000 per) after only 1.5 years in office? Never has our debt grown this fast. Not even under Ronnie Raygun.
      .

      >>>or when a majority of society elects politicians who repeatedly punish the poor and middle class while rewarding the rich

      90% of income taxes are paid by the

      • Re:Be Careful (Score:5, Interesting)

        by drinkypoo (153816) <martin.espinoza@gmail.com> on Monday June 21, 2010 @09:45AM (#32640108) Homepage Journal

        90% of income taxes are paid by the 1% richest earners. 99% are paid by the 10% richest. Yes I know - an inconvenient fact but also happens to be true (came direct from the IRS).

        The simple truth [lcurve.org] is that they should pay much more. If you want to hold all the wealth, why shouldn't you pay all the taxes? The idea that a few can make almost all the money and yet accept less than their share of the stewardship (through various tax dodges including ye olde capital gains) is ridiculous no matter how you examine it. The top 10 taxpayers in the year 2000 paid taxes on only 50% of their income, another fact straight from the IRS. Typical wage earners who work for some corporation have to pay taxes on nearly 100% of their income. Now what's fair?

        • Re:Be Careful (Score:4, Informative)

          by Solandri (704621) on Monday June 21, 2010 @01:57PM (#32643708)
          That's a really poor way to look at it, since you're basing the vertical scale on the single wealthiest person, throwing off the scale of the rest of your graph and making it impossible to estimate an integral. Play with the income tax stats yourself [irs.gov]. In 2007:

          The lower half(below $40k) representing 45.8% of taxpayers accounted for 9.1% of taxable income, and 5.6% of income tax revenue.
          The top half ($40k-$1 mil), representing 53.2% of taxpayers accounted for 70.4% of taxable income, and 58.3% of income tax revenue.
          The upper crust (over $1 mil) are 0.9% of taxpayers and accounted for 20.5% of taxable income, and 36% of income tax revenue.

          So the bulk of income tax revenue comes from the moderately wealthy, those making $40k-$1 mil.* Arguing that the wealthiest individual doesn't pay enough, as your l-curve site does, and using that as a reason to raise income taxes on the moderately wealthy doesn't really make a lot of sense since the people you're proposing to raise taxes on aren't the wealthiest individual. Cranking up the tax rate on people with incomes over $10 mil (a "merely" 33-foot tall stack of $100 bills 0.72 inches from the goal line according to your site) may make you feel better, but it won't increase income tax revenue significantly since they only represent 8.2% of taxable income and 9.8% of current income tax revenue. It's very difficult to raise income tax revenue significantly without dipping into the lower-upper class (to $100k as Obama campaigned on) and upper-middle class ($40k-$99k). (And no, arguing that they're using tax dodges so their gross income is much higher than their taxable income doesn't work either. I ran those numbers as well and the people with the biggest ratio of gross to taxable income were in the $4k-$12k range. Those earning $1+ mil had the smallest ratio. Apparently the AMT is working.)

          *(The cutoffs are somewhat arbitrary; I chose them because they broke up taxpayers into roughly 50% blocks. Feel free to pick $30k or $50k or whatever you like from the IRS figures and run the numbers yourself. The median seems to be around $45k.)
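          As a quick sanity check of the bracket shares quoted above (the figures are the parent's, as summarized from the IRS tables they link; I'm only verifying that each column sums to roughly 100%):

```python
# Bracket shares (percent) as quoted in the parent post, 2007 IRS data.
brackets = {
    "under $40k": {"taxpayers": 45.8, "income": 9.1,  "tax": 5.6},
    "$40k-$1M":   {"taxpayers": 53.2, "income": 70.4, "tax": 58.3},
    "over $1M":   {"taxpayers": 0.9,  "income": 20.5, "tax": 36.0},
}

for column in ("taxpayers", "income", "tax"):
    total = sum(b[column] for b in brackets.values())
    # Each column should total roughly 100% (rounding leaves ~0.1% slack).
    print(f"{column}: {total:.1f}%")
```

          The columns come out at 99.9%, 100.0%, and 99.9% respectively, so the quoted shares are at least internally consistent.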
  • by erroneus (253617) on Monday June 21, 2010 @08:39AM (#32639394) Homepage

    I have known this for most of my life. The name reflects the idea. I'm not afraid of being wrong... at least not as much as others seem to be.

    The depth of the value of errors goes much further than the topic describes. The animal brain itself is a noisy collection of errors. The reason correct processing happens at all is because nearly all possibilities are explored in neural pathways to get to the correct responses. Once correct responses are identified, neural pathways to the correct response are established. This is what we call learning in the lowest level sense of the word.

    I have always found it amusing and interesting that computers work the way they do. They work in ways that are the complete opposite of the animal neuromechanism. Computers, originally derived from numerical processing devices, rely on accuracy and seek to prevent errors in every way possible. Memory is storage rather than a path. In a way, computers are our biggest hangups about being wrong put into mechanical practice.

    I find it far from ironic that we are now trying to get computers to "learn" under these conditions, and no surprise that it doesn't work particularly well. When every measure is taken to always be right, how can a machine learn? It is also far from surprising to me that people who are most afraid of being wrong are also the least capable of learning anything new or useful, or of adapting to new circumstances. It all fits neatly within my own observations about mistakes and learning.

    • by Hognoxious (631665) on Monday June 21, 2010 @08:47AM (#32639480) Homepage Journal

      I have known this for most of my life. The name reflects the idea.

      Indeed. In more ways [merriam-webster.com] than one.

    • by commodore64_love (1445365) on Monday June 21, 2010 @09:24AM (#32639852) Journal

      What annoys me is that managers expect perfection from imperfect beings. I remember in my second year as an engineer I was testing an FPGA using a self-designed testbox. By simply drawing a line in the wrong place I had connected 28 volts to 4 of the pins, which then blew out the FPGA.

      Rather than say "Oops. Fix it and try again," the managers totally over-reacted and stopped work on the project. We wasted two weeks on this simple error. Thousands of dollars in man-hours because of a damaged $200 part. Ridiculous. I identified the problem within just a few hours and had it fixed by the next day, but the managers went into panic mode and forbade me from entering the lab until a 2-week review was finished.

      They would not allow for error.

    • I have always found it amusing and interesting that computers work the way they do. They work in ways that are the complete opposite of the animal neuromechanism.

      Well it makes sense. We developed computers to do the things that we're bad at, such as fast error-free calculation and perfect storage of information. If computers worked the way the human mind worked, then it would have the same problems as the human mind, and we'd be better off getting a person to do those things.

  • by AnonymousClown (1788472) on Monday June 21, 2010 @08:42AM (#32639418)

    Mistakes can cost us time and money, expose us to danger or inflict harm on others, and erode the trust extended to us by our community.

    Or being ridiculed and humiliated by assholes who gain a false sense of superiority by belittling people over mistakes - many times trivial ones. This then leads the other person to dig their heels in and argue pedantic points to stay "right", which leads to counter-pedantic arguments from the other, and round and round we go!

    But hey! That's what you get when you post on Slashdot or work in IT.

    • by justinlee37 (993373) on Monday June 21, 2010 @09:07AM (#32639686)

      Man. That totally reminds me of how much I hate this one dude at work. He gets this stupid-ass grin on his face whenever he thinks he's telling you something you don't know, and it makes me want to knock the smug bastard's teeth out of his head.

      At least he's a socially inept moron with a stupid-sounding voice, so the cosmic joke is on him.

      • The next time it happens, mid sentence say as apathetically as you can "I don't care." and walk off to carry on with your work.

        Nothing is more satisfying than letting a smug git know that his audience is not impressed.
      • by brian0918 (638904)

        At least he's a socially inept moron with a stupid-sounding voice, so the cosmic joke is on him.

        Only insofar as he considers social acceptance his ultimate value/goal. If he does not, then the joke is on you.

    • by wurp (51446)

      I agree with you that getting a sense of superiority because you caught someone else's mistake is itself a mistake. Discounting someone's argument because they made an error unrelated to the relevance or efficacy of the argument is likewise a mistake.

      However, when someone points out an error and you take it as an insult, you are doing exactly what this research is telling you not to do. The *point* is that we need a willingness to make mistakes *and a willingness to learn from them*.

      If you're unwilling to

  • Unfortunately... (Score:5, Insightful)

    by fuzzyfuzzyfungus (1223518) on Monday June 21, 2010 @08:42AM (#32639420) Journal
    While it might be true that "we do not err out of laziness, stupidity, or evil intent", it is definitely also the case that laziness can and does lead to ignoring procedural correctness that would have caught the error, stupidity can and does delay the recognition of error until it has had time to balloon into something more serious, and evil intent can cause the willful application of anything that laziness or stupidity would lead to, but carried on much more intelligently (and thus dangerously). Not to mention, of course, that little class of statements we know as "lies", which are essentially calculated to cause errors in those receiving them.

    Obviously, in a trivial sense, nobody wakes up in the morning and says "Gosh, I sure do feel like really fucking up today!"; but some people take measures that reduce the probability of error (and, where possible, measure it) and others do not. Just because virtually all human reasoning, outside of (some) math and syllogisms, is inductive does not imply that all human reasoning is on equally firm ground. In fact, given that deductive logic is useful pretty much only in certain types of math and in carefully controlled toy situations, the ability to distinguish various statements of inductive logic by quality or probability is probably the most vital aspect of epistemology as an applied science...
    • by Inda (580031)
      You say that but a guy I work with had never been involved with a court case in all his twenty years in project management. One day, he woke up and said "I'm going to miss a step, fuck this up and see where it leads" just to gain the experience. It does happen.

      I'm forever "being wrong". I often make an untrue statement to see if I'm corrected or if I'm fishing for information. Yes, I am that cunt.
  • So, it is possibly false... In my opinion it's the best working model you can come up with. I have yet to encounter anything that isn't "possibly false"... There is no such thing as an absolute truth, it only exists if you blind yourself to all other truths. But since humans are apparently built to account for the possibility of failure there should be no problem with a 'good enough' truth...

    It only raises one important question: Why are people fighting, kicking and screaming, every step of the way when t
  • by tverbeek (457094) on Monday June 21, 2010 @08:44AM (#32639450) Homepage

    To #ERR is human, to forgive divine.

  • that we could be in error, without deeming ourselves idiotic or unworthy."

    i guess Schulz has never read a comment board

    • by hedwards (940851)
      Probably doesn't even own a computer. Or watch TV or, well, interact with anybody at all. Fox is in and of itself evidence that there's a huge market for entertainment of morons and idiots.
  • Oh Baby! (Score:4, Funny)

    by Vinegar Joe (998110) on Monday June 21, 2010 @08:46AM (#32639468)

    If loving you is wrong, I don't want to be right!

  • by Rie Beam (632299) on Monday June 21, 2010 @08:46AM (#32639470) Journal

    So that's why I feel smarter after staying at a Holiday Inn.

  • by DNS-and-BIND (461968) on Monday June 21, 2010 @08:47AM (#32639482) Homepage
    None of these conclusions make sense in an Eastern shame culture/honor culture. These conclusions do, however, dovetail nicely with Western guilt culture. Correctly pointing out the mistakes of others can result in massive loss of face for the correctee. This will have real consequences for the finger-pointer. Publicly admitting that you were wrong and redressing your errors is career suicide in many places throughout the world. I see it all the time: Westerners are shocked that their culture of "it's OK to make mistakes and it's a positive thing to admit when you are wrong" doesn't apply everywhere.
    • Up to a point (Score:5, Interesting)

      by SmallFurryCreature (593017) on Monday June 21, 2010 @08:51AM (#32639522) Journal

      Look at what happens in Japan when a major mistake is made, versus in the West. Has anyone from BP taken accountability? Has anyone from Boeing ever laid down their job because they killed a couple of hundred people with their bad decisions? Has any airline director ever left? No.

      But in Japan the higher ups DO feel that they are at fault for mistakes.

      Your explanation of the Western attitude often becomes: a fault is nobody's fault.

      • Re: (Score:3, Interesting)

        by DNS-and-BIND (461968)

        The "fault is nobody's fault" is exactly what we're talking about! Don't resign in disgrace or commit suicide, just go on like nothing has happened. What BP is doing is crass modern Western shamelessness. Why is it that BP is the first thing that pops into mind? Can we have a higher-level discussion without interjecting the crisis of the month?

        Besides, responsibility has been taken already, so if there are any screwups, we already know who to blame: "I ultimately take responsibility for solving this crisi

        • by Kjella (173770)

          Saying you take responsibility for fixing something is entirely different from the blame game of whose fault it is. Particularly those who present it as "you are incompetent fuckups, I'm the knight in shining armor" are extremely frustrating, since 99% of the time they're just looking to kick a man who's down. BP will take a beating at least as bad as their misdeeds already; Obama is just scoring politically. Not unlike corporate politics.

      • Re: (Score:3, Insightful)

        by fredmosby (545378)
        What really annoys me about this oil spill is that politicians are spending all their time asking "Who can we blame?" when they should be asking "How can we prevent this in the future?". Blaming people makes for good press, but directly addressing the problem is much more effective.
    • by justinlee37 (993373) on Monday June 21, 2010 @08:58AM (#32639578)

      Culture notwithstanding, the conclusions regarding the probabilistic nature of inductive reasoning are insightful. It is important to understand that complex tasks and systems of belief are the result of trial and error; of making mistakes. Regardless of whatever superstitious or fallacious beliefs various cultures might have (and they all have them), this is an immutable fact of cognition, behavior, and psychology in general.

      So I don't think it's that the conclusions don't make sense in an Eastern culture. It's simply that, as you describe it, this aspect of Eastern culture makes no sense at all to begin with. You can't do everything perfectly the first time around.

    • I see it all the time, Westerners are shocked that their culture of "it's OK to make mistakes and it's a positive thing to admit when you are wrong" doesn't apply everywhere.

      Surprising really, because such thinking doesn't actually apply in the Western world either. If you make mistakes in the West--even minor ones--you will be hounded out of your position by a feral media. Unless you're in a position of considerable corporate power, obviously.

    • by Yvanhoe (564877)
      That's ok that they do that mistake. They'll learn...
    • Re: (Score:3, Insightful)

      by Kiaser Zohsay (20134)

      Working with computers, we deal with hard facts, boolean truth values. Separating what we know from what we don't know is a large portion of this job. Therefore, being right about what you do and do not know is important.

      To me personally, being right is so important that I will admit when I am wrong, so I can be right about that.

      I realize that my experience is not typical. I learned reasonably early in life to put little stock in other people's opinions of me and my actions, largely because the opinions o

    • it is kind of insulting to talk of eastern culture as this "shame culture/ honor culture", or a western "guilt" culture

      it implies there is no shame/ honor in the west, and no guilt in the east. it also implies motivations in the east, or west, can be understood with simplistic facile concepts

      what your words above really say is that some people, yourself, are simpleminded: that you buy into overly broad brushstrokes, surface level pop psychology ideas about other groups of people you know little about

      i don't

  • I'm almost always open to the possibility that I'm wrong on any subject. The way I look at it is if I'm wrong about something, and someone has given me the correct information, I'm better off for it.
  • by ibsteve2u (1184603) on Monday June 21, 2010 @08:59AM (#32639594)
    ...for Lucy is never wrong. (There is some kind of circular logic there...pumpkin-shaped, possibly.)
  • by antifoidulus (807088) on Monday June 21, 2010 @09:03AM (#32639636) Homepage Journal
    and they were right, just look at how many times he was wrong!
  • Not only are we stupid, we don't even know how stupid we are! [nytimes.com]

    • "Think about this; think about how stupid the average person is, and then realize that half of 'em are stupider than that."
      - George Carlin, Doin' It Again (1990)

      i'm part of that half! :)
  • by Per Wigren (5315) on Monday June 21, 2010 @09:18AM (#32639798) Homepage
    Often, the only way to get answers to your questions on the internet is to claim things about the subject you know are wrong. Then heaps of people will jump on you to tell you what is correct.
  • Heuristics (Score:3, Interesting)

    by DynaSoar (714234) on Monday June 21, 2010 @09:24AM (#32639842) Journal

    Schulz is precise, just not quite accurate in her descriptions, assertions, and conclusions.

    It's not (just) inductive reasoning that produces humans' results, it's heuristics. We create the fastest good-enough result rather than the best possible result more slowly. The former provides conclusions that are correct enough but very fast, which evolution favors over slower but more accurate decision making. You can be right as god, but if you get eaten you're just very right poop.

    Heuristics work in all directions: top-down, bottom-up, and side-to-side. Inductive, deductive, and all the rest are labels we developed much later to try to describe what we could figure out about what's really going on in our heads. We can do those things because they're all part of how we work, but on the fly we never work in only one direction. Heuristics develop chains of thought according to associations, and so can fill in the chain (more often, the tree)

    There are some things that defy logical reasoning, such as language. We can use reasoning to figure out how to talk about the arrangement in memory of the items we can recall and so talk about, but learning to communicate happens far faster than learning can account for. Hence "generative grammar" and the utterly arbitrary nature of language production. Such things are predetermined in the way of species specific behaviors. We are genetically predisposed for these, and no logic could possibly keep up. This could be hardwired heuristics, though nobody can prove that as yet, but it certainly acts like it.

    So, heuristics, not induction, plus hardwired exceptions. Thus, we're never right, but we're right enough (to varying degrees) fast enough to survive.

    Top Schulz's cake with that frosting, and her precision becomes accurate also when it comes to our (neuroscientists') present best picture of how we think.

    It's not in the article above, but thinking that's always completely right has the major failing of being unable to produce novel responses. Heuristics allow the adaptability which novel situations require (another ability favored by evolution as well as Dr. Chandra), and which allows for creativity.

    Sounds like a very good book. Adequately correct too. Must have been written heuristically.
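
    The fast-but-fallible vs. slow-but-exact trade-off the comment describes can be sketched with a toy example (coin change; the example and function names are mine, not from the book or the comment). The greedy heuristic answers almost instantly but can be wrong; the exhaustive search is always right but does far more work.

    ```python
    # Greedy heuristic vs. exhaustive search for making change -- a toy
    # sketch of "fastest good-enough result vs. best possible result".

    def greedy_change(amount, coins):
        """Heuristic: always grab the biggest coin that still fits."""
        used = []
        for c in sorted(coins, reverse=True):
            while amount >= c:
                amount -= c
                used.append(c)
        return used if amount == 0 else None

    def optimal_change(amount, coins):
        """Exhaustive (dynamic programming) search: guaranteed to use
        the minimal number of coins, at the cost of exploring every
        sub-amount up to the target."""
        best = {0: []}
        for a in range(1, amount + 1):
            options = [best[a - c] + [c]
                       for c in coins if a >= c and (a - c) in best]
            if options:
                best[a] = min(options, key=len)
        return best.get(amount)

    coins = [1, 3, 4]
    print(greedy_change(6, coins))   # [4, 1, 1] -- good enough, 3 coins
    print(optimal_change(6, coins))  # [3, 3]    -- truly optimal, 2 coins
    ```

    With most everyday coin systems the greedy answer happens to be optimal too, which is exactly the evolutionary point: the heuristic is usually right, and when it isn't, it is still right enough.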

  • Once upon a time the U.S. Army brass came up with a policy called 'No mistakes, no excuses.'

    'No excuses' we could understand, but 'no mistakes?' On a battlefield? What stupid little Ivy League wonk came up with this idiocy?

    So, we all became liars.

    • It's good that you said the US Army. I was about to ask if it was a quote from Field Marshal Haig.
  • "Once we recognize that we do not err out of laziness, stupidity, or evil intent, we can liberate ourselves from the impossible burden of trying to be permanently right."

    This is almost too self-referential, but the fact that most mistakes are honest does not mean that all mistakes are honest. That would be an error of inductive reasoning. And in fact that inductive reasoning (assumption of honesty/ fair play/ empathy) is exactly the vulnerability that makes sociopath-type behavior rewarding. It is, in short

  • we need to recognize that human fallibility is part and parcel of human brilliance

    Yet another reason why the idiots pushing for instant replays in baseball should STFU.
  • Machine Learning? (Score:2, Interesting)

    by shabtai87 (1715592)

    I think that anyone who has dabbled in machine learning would not be too shocked (whether by Hume's version or this post's). It's the error term in machine learning, adaptive filtering, etc., that really drives the learning. As a stupid but simple example: least mean squares in adaptive filtering (essentially gradient descent over the error surface).
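
    That LMS update fits in a few lines (a hand-rolled toy sketch, not any library's API; the function name, filter size, and constants are invented for illustration). The point is that the error term e = d - y drives every weight update: with no error, nothing is learned.

    ```python
    # Toy LMS (least mean squares) adaptive filter identifying an
    # unknown 2-tap filter by stochastic gradient descent on the
    # per-sample squared error.
    import random

    random.seed(0)

    def lms_identify(true_w, n_steps=2000, mu=0.05):
        """Learn a 2-tap filter from input/desired-output pairs."""
        w = [0.0, 0.0]                                 # start knowing nothing
        for _ in range(n_steps):
            x = [random.uniform(-1, 1), random.uniform(-1, 1)]
            d = true_w[0] * x[0] + true_w[1] * x[1]    # desired response
            y = w[0] * x[0] + w[1] * x[1]              # current estimate
            e = d - y                                  # the error term
            w[0] += mu * e * x[0]                      # step down the error surface
            w[1] += mu * e * x[1]
        return w

    w = lms_identify([0.7, -0.3])
    print(w)  # close to [0.7, -0.3]
    ```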

  • My neighbour must be a fucking genius.

  • Susie: "Yeah, that's it. Your grades are low because you're too _smart_ for the class." Calvin: "Believe it, lady. You know how Einstein got bad grades when he was a kid? Well mine are even _worse_!"
  • by littlewink (996298) on Monday June 21, 2010 @11:10AM (#32641384)
    Induction is reasoning from factual evidence to some conclusion. But the primary mode of human reasoning is called "abduction" and differs from induction. To illustrate, consider that a valid deductive inference has three elements: a rule which when applied to a single case produces a conclusion (the -> means "implies"):

    DEDUCTION: Rule + Case -> Conclusion

    • Rule: All the beans from this bag are white.
    • Case: These beans are from this bag.
    • Conclusion: These beans are white.

    Induction and Abduction use the elements in a different way:

    INDUCTION: Case + Conclusion -> Rule

    • Case: These beans are from this bag.
    • Conclusion: These beans are white.
    • Rule: All the beans from this bag are white.

    ABDUCTION: Conclusion + Rule -> Case

    • Conclusion. These beans are white.
    • Rule. All the beans from this bag are white.
    • Case. These beans are from this bag.

    Only deduction provides a valid inference. But humans default to using abduction and learn induction and deduction only slowly through formal training.
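
    The three patterns can also be rendered as a toy sketch (the bags, samples, and variable names are invented for illustration; only the deductive step is truth-preserving):

    ```python
    # Peirce's bean example as executable toy logic. Deduction is valid;
    # induction and abduction produce plausible guesses that can be wrong.

    # A bag where the rule genuinely holds.
    bag = ["white", "white", "white"]
    rule = all(b == "white" for b in bag)    # all beans from this bag are white

    # DEDUCTION: Rule + Case -> Conclusion (premises force the conclusion).
    case = True                              # these beans are from the bag
    conclusion = rule and case               # therefore these beans are white

    # INDUCTION: Case + Conclusion -> Rule (probabilistic: a handful of
    # white beans does not guarantee the whole bag is white).
    other_bag = ["white", "white", "black"]
    sample = other_bag[:2]                               # happens to be all white
    guessed_rule = all(b == "white" for b in sample)     # we infer "all white"...
    actual_rule = all(b == "white" for b in other_bag)   # ...which is false here

    # ABDUCTION: Conclusion + Rule -> Case (a guess at the best
    # explanation: a white bean is consistent with this bag, but it
    # could equally have come from anywhere else).
    bean_is_white = True
    plausible_case = rule and bean_is_white  # "these beans are from this bag"

    print(conclusion, guessed_rule, actual_rule, plausible_case)
    ```

    The induction branch is the summary's point in miniature: the inferred rule is probabilistically true given the sample, yet false of the bag it came from.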

  • by argStyopa (232550) on Monday June 21, 2010 @11:33AM (#32641780) Journal

    Kathryn Schulz's book makes a great case for understanding why being wrong is so intrinsic to being human...unfortunately, and ironically, she's got it 180-degrees-wrong.

    Where she fails is her conclusion: it's not that BEING WRONG is what makes us so successful, adaptive, and smart. It's the 'trying again to be right' bit.

    Being wrong is easy. Being right is much, much harder, and probably requires trial and error. But if you're satisfied with being wrong, you don't keep trying. While the idea that 'being wrong is human' is all nice and friendly, ACCEPTING being wrong without any sense of negative consequence is staggeringly, blindingly stupid. Without gradations of consequence (i.e., increasingly serious consequences for increasingly serious failures), life doesn't even make sense.

    "Schulz recommends that we respond to the mistakes (or putative mistakes) of those around us with empathy and generosity and demand that our business and political leaders acknowledge and redress their errors rather than ignoring or denying them. "

    Sorry, but that's just stupid. This is the same sort of touchy-feely crap that's infected modern American public schools. "It's ok, little Timmy, you just keep trying to figure out what 2+2 is. You're still a valuable and precious little snowflake."

    Why should Timmy ever bother to figure out 2+2 if he never NEEDS to get it right? Whether it's reward-based or something simpler like shame, there MUST be a disincentive to be wrong. Anything else is simply asinine.

    So you send your husband out to get dinner; instead of buying food for your children, he spends the money on porn and beer. Ah well, you should respond with generosity and empathy, right?

    Can you imagine if her methodology was followed? "It's ok BP, we all know that drilling for oil is hard work..." "It's ok, Mr President. You just spent well over a $trillion on an ostensible economic rescue plan, but aside from simply not working, it pretty much all ended up in your friends' and political allies' pockets. We won't be angry, we won't even be annoyed. We'll respond with generosity and empathy. Perhaps you could take another $trillion from our kids' and grandkids' future and try again? Maybe this time you'll succeed?"

  • by DutchUncle (826473) on Monday June 21, 2010 @01:53PM (#32643680)
    The disconnect between this concept and most people's thinking explains why scientists and engineers rarely advance well into management and politics.

    - As a scientist or engineer, it is acceptable - even required! - to incorporate new data and adapt your thinking, even reach different conclusions.

    - As a manager or politician, such behavior reflects weakness or lack of principle, sometimes called "flip-flopping".

    In my experience the latter approach seems to be the *typical* perspective of normal people (non-engineers), who would rather "stay the course" and "finish what they started" even when they openly admit that they would have chosen differently now. The contrapositive concept of "what did he know and when did he know it", with the understanding that someone who chose badly may have made a reasonable decision based on the information available AT THE TIME, is often displayed pro forma and then trampled upon.
