Science

MRI Study Shows We're Wired to Cooperate 42

ibi writes "The NYT reports that humans apparently have an inborn bias towards cooperation. People who cooperated during standard Prisoner's Dilemma tests registered high levels of activity in the pleasure centers of their brains. This result was the opposite of what the researchers were expecting. (But I bet they were testing students rather than their advisers :-)"
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Next up in the MRI testing queue:

    Slashdotters posting intelligent, informative replies vs. Trolls.

    'splain that one...
  • standard Prisoner's Dilemma tests...

    I am not a number, I am a free man!

    I will not be pushed, filed, stamped, indexed, debriefed or numbered!
  • Well yeah. (Score:4, Interesting)

    by pagercam2 ( 533686 ) on Tuesday July 23, 2002 @12:42PM (#3937992)
    How could humans have developed society, the arts, and modern civilization if they wanted to be independent? Of course it feels good to help others, and there is security in working as a group, be it cavemen or street kids forming a gang. Humans are social; it seems crazy to assume otherwise. This is described in "A Beautiful Mind", which was a pretty good movie, though a little too much about mental illness and not enough about the main character's accomplishments. The main character's "big idea" is that if everyone is out for themselves, everyone just ends up fighting each other and everyone loses; if, on the other hand, you work together and cooperate, no one gets the ideal outcome but everyone does well. Evolution is about survival, not being the strongest.
  • stupid researchers (Score:4, Insightful)

    by tps12 ( 105590 ) on Tuesday July 23, 2002 @12:43PM (#3937997) Homepage Journal
    Why is this the opposite of what the researchers were expecting? Game theory was not invented by evil capitalists, it was developed to describe observed situations and quantify rational decisions. It is trivial to demonstrate that cooperation (or "tit for tat") is the winningest strategy in an infinitely repeated Prisoner's Dilemma. It should come as no surprise that humans have evolved to choose the winning strategy in such situations.

    Another Prisoner's Dilemma: if a moderator mods me down, and I am insightful, then we both lose (me right now, and the mod in metamoderation). But if he mods me down and I am trolling, then he wins and I lose. And if he mods me up and I am trolling, then I win and he loses. However, if he mods me up and I am insightful, then we break even again.

    So which is it, punk?
    • by bowronch ( 56911 )
      It would be interesting to know whether the subjects in the test knew how many rounds there were going to be in the games... If I recall from Axelrod [amazon.com], in a finite game the best strategy is to defect at the end...
    • There's not enough information there to determine what the best moderation would be. In all your cases, the amount won is equal. In either case, you either win or lose. One win is as good as another, so it doesn't matter how you moderate.

      Now, if the question had been stated so that if you mod down and win the payoff is $10, but if you mod up and win the payoff is $100, then it's easy to decide which is the right decision.
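The tit-for-tat claim upthread can be checked directly. Below is a minimal sketch of an iterated Prisoner's Dilemma, using the hypothetical payoffs quoted elsewhere in this discussion (+5 mutual cooperation, -10 sucker, +10 temptation, 0 mutual defection); the strategy names and round count are illustrative assumptions.

```python
# Iterated Prisoner's Dilemma sketch. Payoffs follow the hypothetical
# table quoted later in this discussion: (my move, their move) -> my points.
PAYOFF = {
    ("C", "C"): 5,    # mutual cooperation
    ("C", "D"): -10,  # I cooperate, they defect (sucker's payoff)
    ("D", "C"): 10,   # I defect, they cooperate (temptation)
    ("D", "D"): 0,    # mutual defection
}

def tit_for_tat(my_hist, their_hist):
    # Cooperate first, then copy the opponent's previous move.
    return "C" if not their_hist else their_hist[-1]

def always_defect(my_hist, their_hist):
    return "D"

def always_cooperate(my_hist, their_hist):
    return "C"

def play(a, b, rounds=100):
    # Run `rounds` moves and return (score_a, score_b).
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        move_a = a(hist_a, hist_b)
        move_b = b(hist_b, hist_a)
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

print(play(tit_for_tat, always_defect))       # (-10, 10): loses only round 1
print(play(tit_for_tat, tit_for_tat))         # (500, 500): steady cooperation
print(play(always_defect, always_cooperate))  # (1000, -1000)
```

Note that tit-for-tat never beats its opponent head-to-head; it wins Axelrod-style tournaments because pairs of cooperators rack up far more total points than pairs of defectors.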

  • by dh003i ( 203189 ) <`dh003i' `at' `gmail.com'> on Tuesday July 23, 2002 @01:32PM (#3938334) Homepage Journal
    Try doing the same study with lawyers, executives, and politicians. Let's see: put

    Bill Gates
    Steve Jobs
    Hilary Rosen
    Jack Valenti
    Fritz Hollings
    Whalley (from Enron)
    Johnnie Cochran
    Gary Winnick (from Global Crossing)

    in a room together. See if they all manage to cooperate.
  • "If we put some C.E.O.'s in here, I'd like to see how they respond," Dr. Kilts said. "Maybe they wouldn't find a positive social interaction rewarding at all."

    I think this is a bit ignorant, or maybe a bit incomplete.

    Yeah, this test could be a good model of a free-market system. When we all act in self-interest within the parameters of the golden rule, things work out pretty darn well, and I think most CEOs fall safely into this category. They are working with their community (i.e., the corporation) to produce a product that is in demand while employing the people needed to produce it. Hiring, firing, and setting prices are all part of the game that keeps an economy efficient and most productive over the long term. Cooperation at its best. A few CEOs lately have given the position a very bad rap. It's really too bad...
  • If you didn't cooperate from the beginning, it's only logical that you would lose the other person's trust. Then both players would begin to defect to ensure gaining at least something. Why would anyone try to get the most out of it by cheating the other when it's obvious it would just backfire? The test did give results, but I don't think for the right reason.
  • unexpected? (Score:4, Interesting)

    by bob_jenkins ( 144606 ) on Tuesday July 23, 2002 @02:45PM (#3938963) Homepage Journal
    I was skimming "Game Theory Evolving", which walks through various hypotheses for why humans act the way they do. That's the result they came up with, that people are programmed to play tit-for-tat. I'm not sure if the initial bias was towards cooperation, but I think it was.

    I recall that they found that "homo reciprocans", who does to you what you do to them, matched people's behavior best, even in one-time situations where the other guy would never get the chance to do to you what you did to them. They also found that even a small group of such people could survive and prosper in a sea of selfish people by sticking together.

    Another result was that people model every situation as analogous to previous situations, and they treat one-time psychological experiments as "us against them", where "them" is the researchers.
  • by jsimon12 ( 207119 ) on Tuesday July 23, 2002 @02:49PM (#3938988) Homepage
    Uh, does this really say much about men's brains? The study was entirely female; other studies [go.com] have shown that men's and women's brains are very different.
    • Perhaps it shows that women are more willing to cooperate? ;) Or perhaps the test invalidates itself in that nobody who tends towards non-cooperation would have cooperated in taking the test in the first place!
  • It would be most interesting to find at what price level the cooperation would break down. What if defecting brought $1000, mutual cooperation $100, and mutual defection $25?

    Changing the reward structure will completely alter the game. It will also show another dimension of the problem. When the stakes are low, it may be that the pleasure derived from cooperating takes over, but what about when it's more? What about when it's life and death?

    I wonder...
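This comment's question can be made precise. A payoff set forms a true Prisoner's Dilemma only when temptation > reward > punishment > sucker (T > R > P > S), and in the iterated game steady cooperation beats taking turns exploiting each other only when 2R > T + S. A minimal sketch, assuming the cheated cooperator gets $0 (a value the comment doesn't specify):

```python
def is_prisoners_dilemma(t, r, p, s):
    # T = one-sided defection, R = mutual cooperation,
    # P = mutual defection, S = sucker's payoff.
    return t > r > p > s

def iterated_cooperation_pays(t, r, s):
    # In the repeated game, mutual cooperation must beat
    # alternating exploitation: 2R > T + S.
    return 2 * r > t + s

# The hypothetical points from the table quoted elsewhere in this discussion:
print(is_prisoners_dilemma(t=10, r=5, p=0, s=-10))     # True
print(iterated_cooperation_pays(t=10, r=5, s=-10))     # True

# The parent's higher-stakes version: $1000 / $100 / $25, sucker assumed $0.
print(is_prisoners_dilemma(t=1000, r=100, p=25, s=0))  # True
print(iterated_cooperation_pays(t=1000, r=100, s=0))   # False
```

So with the parent's numbers the one-shot game is still a dilemma, but in a repeated game two players who take turns defecting ($1000 + $0 per pair of rounds) out-earn steady cooperators ($200), which suggests cooperation really could break down at those stakes.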
  • by ehud42 ( 314607 ) on Tuesday July 23, 2002 @03:05PM (#3939107) Homepage
    It's obvious that if everyone cooperates, the group as a whole benefits. However, once you understand the behaviour required for the group to succeed, and if the group is large enough and the number of defectors small enough, then the defectors can easily win big.

    Take driving, for example. We have a major road running from downtown to the outskirts of our city that is 4 lanes wide (most of the time). The curb lane allows parking during the day but not during rush hour. So it should be open during rush hour; however, there is always a car stopping or parking, or a city bus lumbering along. The congestion that arises from large numbers of vehicles constantly trying to merge from 4 lanes into 3 would wreak havoc on the overall system, so the most efficient strategy for the masses is to stick to just 3 lanes.

    The speed limit is 60kph. During rush hour, the actual speed in the 3 lanes when volumes get heavy is more like 50kph. With the open lane (parking lane - or as I like to call it, the Express Lane), you can easily do 70 - 80kph (interesting side note: I've been driving this route for years without ever seeing anyone pulled over during the rush hour). However, if too many people 'defect', the average speed in the Express Lane drops to 30-40kph. Do you take the express lane?

    Being a defector, most days I am able to get ahead of the masses, saving many minutes off my travel time. The risks? If too many join me (or if I don't pay attention to slower/stopped traffic ahead of me), there is a dramatic reduction in the average speed. In other words, I can lose big time.

    BTW, before I'm flamed as being an offensive/dangerous driver, allow me to explain my 3 priorities for getting home, in order of descending importance:

    • Get home safely.
    • Do not do anything that causes other drivers to have to react defensively.
    • Get home as fast as possible.
    I'm not aggressive, or defensive. I'm assertive.
  • For those who don't know what it is and, like me, don't feel like registering to get into the NYT, here's the Prisoner's Dilemma [vub.ac.be].

  • Cooperation is usually analysed in game theory by means of a non-zero-sum game called the "Prisoner's Dilemma" (Axelrod, 1984). The two players in the game can choose between two moves, either "cooperate" or "defect". The idea is that each player gains when both cooperate, but if only one of them cooperates, the other one, who defects, will gain more. If both defect, both lose (or gain very little) but not as much as the "cheated" cooperator whose cooperation is not returned. The whole game situation and its different outcomes can be summarized by table 1, where hypothetical "points" are given as an example of how the differences in result might be quantified.

    [ Action of A / Action of B | Cooperate        | Defect       ]
    [ Cooperate                 | Fairly good [+5] | Bad [-10]    ]
    [ Defect                    | Good [+10]       | Mediocre [0] ]

    Table 1: outcomes for actor A (in words, and in hypothetical "points") depending on the combination of A's action and B's action, in the "prisoner's dilemma" game situation. A similar scheme applies to the outcomes for B.

    The game got its name from the following hypothetical situation: imagine two criminals arrested under the suspicion of having committed a crime together. However, the police does not have sufficient proof in order to have them convicted. The two prisoners are isolated from each other, and the police visit each of them and offer a deal: the one who offers evidence against the other one will be freed. If none of them accepts the offer, they are in fact cooperating against the police, and both of them will get only a small punishment because of lack of proof. They both gain. However, if one of them betrays the other one, by confessing to the police, the defector will gain more, since he is freed; the one who remained silent, on the other hand, will receive the full punishment, since he did not help the police, and there is sufficient proof. If both betray, both will be punished, but less severely than if they had refused to talk. The dilemma resides in the fact that each prisoner has a choice between only two options, but cannot make a good decision without knowing what the other one will do.

    Such a distribution of losses and gains seems natural for many situations, since the cooperator whose action is not returned will lose resources to the defector, without either of them being able to collect the additional gain coming from the "synergy" of their cooperation. For simplicity we might consider the Prisoner's dilemma as zero-sum insofar as there is no mutual cooperation: either each gets 0 when both defect, or when one of them cooperates, the defector gets + 10, and the cooperator - 10, in total 0. On the other hand, if both cooperate the resulting synergy creates an additional gain that makes the sum positive: each of them gets 5, in total 10.

    The gain for mutual cooperation (5) in the prisoner's dilemma is kept smaller than the gain for one-sided defection (10), so that there would always be a "temptation" to defect. This assumption is not generally valid. For example, it is easy to imagine that two wolves together would be able to kill an animal that is more than twice as large as the largest one each of them might have killed on his own. Even if an altruistic wolf would kill a rabbit and give it to another wolf, and the other wolf would do nothing in return, the selfish wolf would still have less to eat than if he had helped his companion to kill a deer. Yet we will assume that the synergistic effect is smaller than the gains made by defection (i.e. letting someone help you without doing anything in return).

    This is realistic if we take into account the fact that the synergy usually only gets its full power after a long term process of mutual cooperation (hunting a deer is a quite time-consuming and complicated business). The prisoner's dilemma is meant to study short term decision-making where the actors do not have any specific expectations about future interactions or collaborations (as is the case in the original situation of the jailed criminals). This is the normal situation during blind-variation-and-selective-retention evolution. Long term cooperations can only evolve after short term ones have been selected: evolution is cumulative, adding small improvements upon small improvements, but without blindly making major jumps.

    The problem with the prisoner's dilemma is that if both decision-makers were purely rational, they would never cooperate. Indeed, rational decision-making means that you make the decision which is best for you whatever the other actor chooses. Suppose the other one would defect, then it is rational to defect yourself: you won't gain anything, but if you do not defect you will be stuck with a -10 loss. Suppose the other one would cooperate, then you will gain anyway, but you will gain more if you do not cooperate, so here too the rational choice is to defect. The problem is that if both actors are rational, both will decide to defect, and none of them will gain anything. However, if both would "irrationally" decide to cooperate, both would gain 5 points. This seeming paradox can be formulated more explicitly through the principle of suboptimization.

    blatantly ripped from: http://pespmc1.vub.ac.be/PRISDIL.html
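The "rational players always defect" argument at the end of the quoted text can be sketched in a few lines using its payoff table (a hypothetical reconstruction, not anything from the article):

```python
# Player A's payoffs from the quoted table: (A's move, B's move) -> A's points.
PAYOFF = {("C", "C"): 5, ("C", "D"): -10, ("D", "C"): 10, ("D", "D"): 0}

def best_response(their_move):
    # A's highest-scoring move, given B's move (the game is symmetric).
    return max(["C", "D"], key=lambda mine: PAYOFF[(mine, their_move)])

print(best_response("C"))  # D: 10 beats 5
print(best_response("D"))  # D: 0 beats -10
```

Since defection is the best response to either move, two rational players end up at (D, D) and score 0 each, even though (C, C) would give each of them 5 -- the suboptimization paradox the quote ends on.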
  • From the article: "...scientists have discovered that the small brave act of cooperating with another person, of choosing trust over cynicism, generosity over selfishness, makes the brain light up with quiet joy."

    Several major world religions have taught for centuries that helping other people is a good thing, even if it is apparently detrimental to your own interests -- and the people who make the jump from listening to actually practicing this have noticed that it brings them joy.
  • I ran across a good cartoon at Strange Matter [strange-matter.com], which seems to have gone down (permanently?). Either way, the caption is something along the lines of "The true cause of the extinction of the Neanderthals", and it shows a group standing around with one announcing, "From now on all our survival decisions will be made by committee." :)
  • by Anonymous Coward on Tuesday July 23, 2002 @07:55PM (#3941128)
    In college, we were doing a bit of prisoner's dilemma/game theory in micro-economics. A friend of mine in the class had heard we would be playing a game based on this, and that whichever person had the highest point total at the end would win Girl Scout cookies.

    Well, with that on the line, we set to work. Each time, I would pick cooperate, and he'd choose to screw me over. It was really no surprise that at that point he was in the lead in the class. After this was done, there was a second round using the entire class (and a majority to decide which way things would go), and through a few smart decisions, he cinched it and won the cookies.

    As we were leaving class, a couple said to me (noting my poor performance earlier) "Wow, you're really not very good at that are you?" So, I pulled out my half of the girl scout cookies and laughed and said "I think I did alright."
  • It would be interesting to see how their brains responded (more than what choices they make). But I guess that's what the researchers were talking about with the CEO's.
  • Wow, this supports what I've been thinking for many years -- that excessive personal ambition and competitive spirit are mental aberrations. The very nature of these defects drives those who suffer from them into positions of power, where they make life miserable for the rest of us.
