Science

We Learn Faster When We Aren't Told What Choices to Make (scientificamerican.com) 32

Michele Solis, writing for Scientific American: In a perfect world, we would learn from success and failure alike. Both hold instructive lessons and provide needed reality checks that may safeguard our decisions from bad information or biased advice. But, alas, our brain doesn't work this way. Unlike an impartial outcome-weighing machine an engineer might design, it learns more from some experiences than others. A few of these biases may already sound familiar: A positivity bias causes us to weigh rewards more heavily than punishments. And a confirmation bias makes us take to heart outcomes that confirm what we thought was true to begin with but discount those that show we were wrong. A new study, however, peels away these biases to find a role for choice at their core. A bias related to the choices we make explains all the others, says Stefano Palminteri of the French National Institute for Health and Medical Research (INSERM), who conducted a study published in Nature Human Behaviour in August that examines this tendency. "In a sense we have been perfecting our understanding of this bias," he says.
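
The finding lends itself to a computational reading: a learner that updates its estimate of each option's value more strongly after choice-confirming outcomes will show both the positivity and the confirmation effects described above. Below is a rough, hypothetical sketch of that idea in Python (a plain delta-rule learner with asymmetric learning rates; the parameters and task are invented for illustration and are not the paper's actual model):

```python
import random

# Hypothetical parameters, for illustration only.
ALPHA_CONFIRM = 0.30     # learning rate for outcomes that confirm the choice
ALPHA_DISCONFIRM = 0.10  # learning rate for outcomes that contradict it


def update(value, outcome, was_chosen):
    """Delta-rule update with a choice-confirmation asymmetry.

    An outcome "confirms" the learner when the chosen option does better
    than expected or the unchosen option does worse; those prediction
    errors get the larger learning rate.
    """
    error = outcome - value
    confirming = (was_chosen and error > 0) or (not was_chosen and error < 0)
    rate = ALPHA_CONFIRM if confirming else ALPHA_DISCONFIRM
    return value + rate * error


# Two options: A pays +1 point 75% of the time (else -1), B only 25% of the time.
values = {"A": 0.0, "B": 0.0}
probs = {"A": 0.75, "B": 0.25}

random.seed(0)
for _ in range(1000):
    # Mostly greedy choice, with occasional exploration.
    pick = max(values, key=values.get) if random.random() > 0.1 else random.choice("AB")
    # Full-feedback condition: the outcome of both options is shown every trial.
    for option in values:
        outcome = 1.0 if random.random() < probs[option] else -1.0
        values[option] = update(values[option], outcome, option == pick)

# True expected values are +0.5 (A) and -0.5 (B); with the asymmetric rates the
# mostly-chosen option tends to end up overvalued and the other undervalued.
print(values)
```

Setting the two learning rates equal removes the asymmetry, which loosely mirrors the free-choice versus imposed-choice comparison described in the article.
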
  • No kidding. (Score:2, Insightful)

    by mark-t ( 151149 )
    It has long been widely recognized that people learn faster from failure than from success.

    Why is this new study only now discovering what has been common knowledge for at least as long as I've been alive?

    • Re:No kidding. (Score:5, Informative)

      by Aighearach ( 97333 ) on Thursday October 01, 2020 @03:04PM (#60562082)

      No, you're stating something that is wrong and the article would tell you that if you read it, or if you understood the issue.

      You already believe what you said to be true. So you weigh evidence of it as supporting you, and ignore evidence that doesn't support it. And in this case, you presume that evidence you don't know the details of supports it, even when it disproves it.

      Your confirmation bias is entirely bulletproof. Congratulations?

      • by olau ( 314197 )

        Well, you aren't really talking about the same thing.

        This study is talking about a highly controlled experiment where you cannot get more information out of a failure than a success, on average. It is studying how the mind reacts to information presented to it.

        In the real world, failures may in fact have a higher information content, so even if you are a bit slower at learning from them in general, the overall rate of learning can be higher.
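
        A quick numeric illustration of the information-content point (the probabilities are assumed for illustration, not taken from the study): when an action almost always succeeds, a rare failure carries far more Shannon surprisal than yet another success.

        ```python
        import math

        # Hypothetical numbers, not from the study: an action that succeeds 95% of the time.
        p_success = 0.95

        print(-math.log2(p_success))      # ~0.07 bits of surprisal from a success
        print(-math.log2(1 - p_success))  # ~4.32 bits of surprisal from a failure
        ```

        Whether everyday failures really do carry more usable information is the empirical question being argued here; the controlled task, as described above, is set up so that they don't.
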

        • by sjames ( 1099 )

          That would suggest that at MOST the study showed nothing either way about learning from failure, yet OP concluded the opposite based on exactly the sort of bias the study did provide insight into.

        • No, if you think there is an information differential, you'd need to demonstrate that first.

          You don't start from "maybe an unsupported claim is true." It isn't supported. It isn't fit for evaluation.

          And unsurprisingly, it also is mysteriously consistent with the same bulletproof confirmation bias as above.

          (Not relevant, since your claim was unsupported, but generally in information theory there is a lot more information available after successes, because your success has more similar results to other people

          • by mark-t ( 151149 )

            An information differential can be inferred simply based on the amount of knowledge that is gained trying to perform a new task.

            A rat that is put in a maze and makes lots of mistakes will, obviously, know the maze more thoroughly than a rat that happens to make it through correctly with fewer errors, and the former rat is more likely able to use his past failures as a basis to avoid making similar mistakes in the future with a new maze, while a rat who made fewer mistakes is more likely to continue to make other mistakes when presented with a new task.

            • A rat that is put in a maze and makes lots of mistakes will, obviously, know the maze more thoroughly than a rat that happens to make it through correctly with fewer errors, and the former rat is more likely able to use his past failures as a basis to avoid making similar mistakes in the future with a new maze, while a rat who made fewer mistakes is more likely to continue to make other mistakes when presented with a new task.

              Strange, because I'd conclude exactly the opposite.

              A rat that had (by luck) found t

              • by mark-t ( 151149 )

                There's nothing you've said that was wrong, but the rat that made fewer mistakes has, at the end of the whole event, learned LESS than the rat which made more mistakes along the way.

                There's a tipping point too, where the rate of failure is so high that it outweighs the rate at which one could otherwise be learning and getting better, but I am addressing the general case.

                It's no coincidence that the person with the most home runs also has the most strikeouts.

            • An information differential can be inferred simply based on the amount of knowledge that is gained trying to perform a new task.

              That's right, you just weigh your bias, and if it has weight, it was something real!

              • by mark-t ( 151149 )

                It should be obvious that a rat that has explored more of a maze will know more about the maze, and in turn be better equipped to utilize that knowledge in future mazes.

                And just like rats, we too almost invariably learn faster by failure than we do by success.

                • Anything that you presume because it seems obvious is your bias.

                  Stop assuming. That isn't the way to understand science. There are too many layers of non-intuitive details for that to work.

    • by sjames ( 1099 )

      Case in point: TFA actually said we give more weight to successes and to outcomes that confirm our pre-existing beliefs. That bias disappears when choice is removed, but the overall weight given to the experience is reduced as a result.

    • by gweihir ( 88907 )

      It has long been widely recognized that people learn faster from failure than from success.

      Why is this new study only now discovering what has been common knowledge for at least as long as I've been alive?

      Because "common knowledge" is not so common as you think.

  • Success vs failure (Score:2, Interesting)

    by gurps_npc ( 621217 )

    When you succeed you are confirming what you already believed. When you fail, you are discovering your prior belief to be false and, if you are not too stubborn, you can come up with a new belief.

    This means that any learning after a success is small, while a failure opens up the possibility of learning entirely new concepts.

    Of course, this does not always happen. People have a strong tendency to ignore the failure or come up with an unlikely excuse for it, rather than admit they were wrong. Particularly in

  • I agree that people will learn on their own what works - that's not really anything new.

    I disagree that we "learn slower" when we're forced to make a choice - I think you've just shown a test bias toward authority situations (computer-mandated choice) vs. personal agency - aka the Milgram experiment.

    • I disagree that we "learn slower" when we're forced to make a choice - I think you've just shown a test bias toward authority situations (computer-mandated choice) vs. personal agency - aka the Milgram experiment.

      "The science can't be correct, it didn't confirm my bias!" Well done, excellent level of self awareness you've achieved. /s

      • Ah yes - the appeal to authority fallacy. Well done /s

        • No. But you could look up "appeal to authoritay fallacy" if you were curious about what it means.

          • Sigh. Yes.

            When writers or speakers use an appeal to authority, they are claiming that something must be true because it is believed by someone who is said to be an "authority" on the subject.

            And now you'll hide behind "b-but I didn't really make that assertion"

            And yet, you did.

  • When we win we make a big deal, when we lose, we try again and again and again ...

  • But where does getting burned by a hot pan fall? I've done that exactly once, and I'd call it a failure, not a success. So in some cases that cause physical pain, the failure happens once and the lesson is learned.
    • by aitikin ( 909209 )

      But where does getting burned by a hot pan fall? I've done that exactly once, and I'd call it a failure, not a success. So in some cases that cause physical pain, the failure happens once and the lesson is learned.

      Depends. Were you told to put your hand on the hot pan or did you do it yourself?

      • Well, the abstract posited two things: that we learn better on our own, and that positive reinforcement is stronger than negative. I questioned the second. And I think when it comes to physical pain, possibly life-threatening pain, we learn better from the negative.
  • "In trials that showed the outcomes for both symbols after a choice was made, subjects learned more from their chosen symbol when it gave a higher reward and when the unchosen one would deduct a point. That is, in this free-choice situation, they learned well from obtained gains and avoided losses.

    That result looked like a confirmation bias, with people embracing outcomes - positive or negative - that confirmed they were right."

    So if a wrong answer subtracts one point but a correct answer can award a variable number of points, isn't remembering the answers that give the largest gain more important than remembering the ones that give a small gain or a negative point? The net consequence of choosing wrong between a 10 and a 1 is much larger than choosing wrong between a 1 and a -1 (see the quick arithmetic sketch after this comment).

    And how is learning to play the game well a "confirmation bias"? And how are they getting confirmation bias from a negative outcome that proves they're ri
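
    To put rough numbers on the point above (the 10 / 1 / -1 payoffs are the commenter's hypotheticals, not necessarily the study's actual values), the cost of mis-ranking a pair is just the gap between its payoffs:

    ```python
    # Hypothetical payoff pairs from the comment above, not the study's values.
    pairs = [(10, 1), (1, -1)]
    for better, worse in pairs:
        print(f"picking {worse} over {better} forfeits {better - worse} points")
    # picking 1 over 10 forfeits 9 points; picking -1 over 1 forfeits 2 points,
    # so errors on the high-value pair cost the player far more.
    ```
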

  • You know the cliche "Better to run away from a not-Tiger 100 times than not to run away from a yes-Tiger once." Isn't that pretty much what's going on here, but perhaps with the "loss coefficient" of not running from the real Tiger reduced a bit?

  • Sci Am is but a shadow of its former self. Injecting itself into US presidential politics was the final nail in its coffin. The articles had already become quite poor in content.

  • Quite the generalization. Personally I am obsessive about how things don't work or might not work. I don't claim to be bias free, but I am the "unhappy path" guy at my work. Even when nothing goes wrong, I still note how it may have gone wrong or could have gone even better, and write up stories for system enhancements to investigate. Not only system improvements, but also process improvements.

    One difference in my approach is I make a very detailed and exact mental model, and virtually any deviation stand
