Raw Thought

by Aaron Swartz

Causes of Conformance

Institutions require people to do their bidding. A tobacco company must find people willing to get kids addicted to cigarettes, a school must find teachers willing to repeat the same things that they were taught, a government must find public servants willing to enforce the law.

Part of this is simply necessity. To survive, people need money; to get money, people need a job; to get a job, people need to find an existing institution. But the people in these positions don’t usually see themselves as mercenaries, doing the smallest amount to avoid getting fired while retaining their own value system. Instead, they adopt the value system of the institution, pushing it even when it’s not necessary for their own survival. What explains this pattern of conformance?

The most common explanation is an active process of beating people in: politicians get paid campaign contributions (legalized bribes) to meet the needs of the wealthy, employees get bonuses and penalties for meeting the needs of their employers, kids get threatened with time-outs and bad grades if they don’t follow the demands of their teachers. In each case, the people are forced through a series of carrots and sticks to learn the values of the people in charge.

This is a fairly blatant form of conformance, but I suspect it’s by far the least effective. Studies on punishment and rewards show that dealing them out lessens the victim’s identification with the enforcer. Hitting me every time I don’t do my job right may teach me how to do my job, but it’s not going to make me particularly excited about it.

Indeed, punishments and rewards interfere with a much more significant effect: cognitive dissonance. Cognitive dissonance studies have found that simply by getting you to do something, you can be persuaded to agree with it. In a classic study, students asked to write an essay in favor of a certain position were found to agree more with the position than students who could write for either position. Similarly, people who pay more to eat a certain food claim to like it more than people who pay less.

The basic theory is that people work to lessen the disagreement between their beliefs and their actions, and in most cases it’s simply easier to change your beliefs. Quitting your government job is tough and painful, and who knows if you’ll soon find another? So it’s much easier to simply persuade yourself that you agree with the government, that you’re doing the right and noble thing, that your work to earn a paycheck is really a service to mankind.

Of course, it also helps that everyone you’re surrounded by feels the same way. Culture is another important influence on our beliefs. Another raft of social psychology studies finds that people are willing to deny even obvious truths to fit in with a group. In the famous Asch studies on conformance, a group of confederates was seated around a table, with the subject of the experiment at the end. Everyone at the table was given a sheet with three lines, one obviously longer than the others, and was then asked to name the two lines of identical length. All of the confederates gave an obviously wrong answer, and by the time the question got to the guy at the end, he often ended up conforming and giving the wrong answer as well.

Similarly, spend your days in government offices where people simply take it for granted that they’re doing the right thing, and you’re likely to pick up that tacit assumption yourself. Such ideas are not only frequently stated, they’re often the very foundation of the discussion. And foundational ideas are especially hard to question, precisely because they’re so taken for granted.

But perhaps the most important effect for conformance is simply selection. Imagine that nobody was corruptible, that all the carrots and sticks in the world couldn’t get someone to do something they thought morally wrong, that they stood fast in the face of cognitive dissonance, and that their moral fiber was so strong that they were able to resist a less conscientious culture. Even then, it wouldn’t make much difference. As long as there was enough variety in people and their moral values, all an organization would need to do is simply fire (or fail to promote) everyone who didn’t play their game.

Everyone knows you climb the corporate ladder by being a “team player”. Those who make a fuss or don’t quite live up to expectations simply get passed over for a promotion. The result is simply that — without any explicit pressure at all — the people in positions of power happen to be the ones who identify with the organization’s aims.

It’s easy to focus on the flashier techniques: punishing people for failing to follow orders, or immersing them in a culture of conformity. But for those who want obedient employees, sometimes the most effective technique is simply failing to say yes.


December 28, 2006

Comments

“[S]ometimes the most effective technique is simply failing to say yes.”

Indeed. This is organisational Darwinism. As Kevin Kelly wrote in “Out of Control: The Biology of Machines” (1994), we can think of random mutation as an author who only knows the concept “Maybe?” and natural selection as an editor who only knows the word “No”.

posted by Michael Bywater on December 28, 2006 #

Institutions take on a life of their own, then, eh? If the CEO of Exxon/Mobil were to object to his job on moral grounds, he would quickly be replaced with someone who doesn’t. Is he, then, morally responsible for any evils his company does? What about the lobbyist he hires that is just as easily replaced? The senator? The weapons manufacturer?

posted by Andrey Fedorov on December 28, 2006 #

I don’t think that gets him off the hook. If I told you to kill your mother or I would, does that make it OK to kill your mother?

posted by Aaron Swartz on December 29, 2006 #

Killing my mother, no.

But it’s not a black/white situation, as we see in practice - replace “killing” with “maximizing profits, possibly affecting” and “your mother” with “third-world poor people that look different”, and “or I will” with “or your competitor will, who’ll then get that $3 million bonus instead of you” - and you’ll see a lot of intelligent people changing their minds.

If you also consider the systems in place that entice corporate executives by the methods you mentioned (approval of their golfing buddies, fancy stockholder meetings, etc.), then their actions become at least understandable, if not morally justified.

Morality is hard to talk about, though, since free will is probably an illusion.

posted by Andrey Fedorov on December 29, 2006 #

“But for those who want obedient employees, sometimes the most effective technique is simply failing to say yes.”

I find that statement interesting in view of this post http://dirtsimple.org/2006/12/real-reasons-there-are-few-women-in-it.html which says that the best way to get more women in IT is to not say yes to assholes (specifically when hiring, but I suspect in general).

posted by James on December 30, 2006 #

“The basic theory is that people work to lessen the disagreement between their beliefs and their actions, and in most cases it’s simply easier to change your beliefs.”

If I read you correctly, I’d like to point out that cognitive dissonance is not a disagreement between beliefs and actions, but disagreement between sets of cognitions. These cognitions can be beliefs or they can be perceptions of one’s actions.

One example that always reminds me of the power of it is the group that Leon Festinger infiltrated in the 1950s, led by Marian Keech (see also: http://en.wikipedia.org/wiki/Leon_Festinger ).

There, the set of cognitions (that a spaceship sent from God would come to take them away from the evil world and that the collective actions of the group were correct) conflicted with the reality of the situation (no spaceship came). They had to construct a new cognition that would resolve the dissonance without destroying their old cognitions (that God respected their actions so much that he spared the whole planet).

posted by Steve Pomeroy on January 3, 2007 #
