Philip Zimbardo, the creator of the famed Stanford Prison Experiment (don’t worry, I’ll describe it later), is giving a lecture on terrorism and Abu Ghraib.

Zimbardo notes that he was a high-school classmate of Stanley Milgram, perhaps the best-known social psychologist. Milgram was the one who conducted the classic experiments on obedience to authority. He would invite a subject in and explain that they were helping him study the effects of punishment on memory. A confederate would be strapped into a chair and wired with electrodes in another room. The lab-coat-wearing experimenter would then ask the subject to give the confederate increasingly large electric shocks as punishment for getting the memory questions wrong. In response, the confederate would scream in agony, ask to be let out, shout that he had a heart condition, and finally just stop responding.

At the time, conventional wisdom held that only a few people — the sadists — would go all the way, following the orders to increase the voltage even after the confederate stopped responding. Milgram quickly proved conventional wisdom wrong: 65% followed their orders all the way to the end. As Zimbardo notes, the popular theory of the time was largely dispositional: people do things because that’s their nature. Milgram provided clear evidence for situationism.

Milgram went on to do other pioneering research, including the small-world experiment, where he would give people in Kansas a note addressed to a stranger in Cambridge, MA and ask them to get it there simply by passing it along through friends. Milgram found that, again despite the conventional wisdom of the time, it usually took only six intermediaries to make it, which of course gave rise to the phrase “six degrees of separation”.

Sadly, Milgram died of a heart attack at only 51.

Milgram likely moved on from the obedience experiments because they were highly controversial — many considered them seriously unethical, even though Milgram went to great lengths to debrief his subjects afterwards, explaining the true purpose of the experiment and making sure they were alright. Zimbardo, however, would follow that same path.

Milgram did a number of variants on the obedience experiments — moving subjects closer to their victims, trying the experiments in an office building away from the prestige of Yale, using women instead of men — but most had little or no success in lowering compliance rates. Two things, however, did change compliance rates. First, if the subject saw other subjects resisting, they became willing to resist as well. Second, if the subject did not throw the switches directly, but simply supervised someone who did, they became far more willing to continue.

The two discoveries clearly have larger societal messages (just a few people resisting can help mobilize others, while increasing bureaucratization can increase compliance with evil), which of course have been confirmed by larger societal studies.

From this, Zimbardo draws the concept of the “good guard” — the man who doesn’t hurt anyone himself but simply does his job and doesn’t interfere with the hurting. The good guards, Zimbardo notes, are key to the whole system: if they showed signs of resistance, the bad guards would likely begin to resist too. (Again, it’s not hard to extrapolate this to society.)

Zimbardo continues surveying the research and lays out the ten lessons he’s drawn from it on how to get people to commit evil:

  1. Create an ideology where the ends justify the means
  2. Get a contract from the subjects where they agree to comply
  3. Give participants meaningful roles with clear social value
  4. Have the rules be vague and changing
  5. Relabel actors and actions (“order control”, not guards; “monsters”, not people)
  6. Diffuse responsibility so subjects don’t feel liable
  7. Start small but slowly increase the requirements, step by step
  8. Make the leader seem compassionate at first
  9. Permit verbal dissent (“I don’t want to do this; I feel bad”) as long as subjects continue complying
  10. Make it difficult to exit

Further experiments find that people’s inhibitions are lowered if they or their victims are “deindividuated” (e.g., the perpetrators wear uniforms and masks; the victims wear bags over their heads). In numerous experiments, this doubled the harm participants would voluntarily inflict. (Anthropological studies confirm this, finding that cultures whose warriors wear costumes and masks are more violent.)

Similarly, changing how people think of their victims is key. In one experiment, when the experimenter called the victims “nice guys,” the amount of punishment subjects inflicted went down; when he called them “monsters,” it went up.

Zimbardo put together all that he had learned into one experiment, the Stanford Prison Experiment, to see how far things could go. Volunteer subjects were recruited and randomly assigned, half to be prisoners and half to be guards, so that there would be no systematic differences between the two groups. The prisoners were arrested at their homes and taken to the recently redecorated basement of the Stanford Psychology department, where they were imprisoned.

There were no windows, so the prisoners could not gauge the time. Prisoners were strip-searched and forced to wear dress-like clothes. They were given leg shackles as a constant reminder of their status. Guards were given uniforms and mirrored sunglasses (so no one could read their emotions) but only minimal rules and training.

On only the second day of the experiment, the prisoners tried to rebel. Guards responded by calling in reinforcements, attacking the prisoners with fire extinguishers, placing the leaders in solitary confinement, and harassing the rest. They also created a privileged cell, with special benefits, for the prisoners least involved in the rebellion. The next day, they reversed things, putting some of the leaders in the privileged cell (to imply they had sold out).

Soon enough, prisoners began breaking down. Guards became so evil and violent that the study had to be ended prematurely, after only six days of the planned two weeks.

The relevance to Abu Ghraib should be obvious. And, sure enough, Zimbardo got a chance to testify before the court trying one of the Abu Ghraib guards, arguing that his sentence should be lowered because, as his research had shown, few could have resisted the powerful situational influences, which were surely even more powerful at a real prison with (presumably at least some) real criminals.

He went on to talk a bit about how the administration had weaponized fear with things like the terror alert system. The reason Al-Qaeda hadn’t attacked again, he suggested, was that Bush was doing their job for them, scaring the population with vague threats and no clear solutions.

posted October 31, 2004 11:50 AM (Education)

