
Right, wrong? In a group, it's harder to tell

In Iraq or at home, social dynamics can make it difficult -- surprisingly so -- for individuals to keep their moral compass.

July 17, 2006|Shari Roan | Times Staff Writer

In his classic tale "Dr. Jekyll and Mr. Hyde," Robert Louis Stevenson writes of what he considered a discomforting fact of human nature: Evil dwells inside every man.

If he's right, it would appear that peers, not potions, are all that's needed to free that evil.

In Iraq, five current or former American soldiers have been charged with raping an Iraqi teenager and killing her and three members of her family. Another has been charged with failing to report the attack.

Closer to home, Fresno police have charged two college-age men -- and are investigating at least six others -- in the alleged gang rape of an 11-year-old.

Murder, rape and other acts of horrifying violence are, of course, as old as humanity. But as a society, we like to think such deeds are the work of vicious, morally bankrupt -- and isolated -- individuals. The recent incidents are shocking precisely because they didn't happen in isolation. They appear to be not the actions of a single madman, but rather groups of people who either participated in the wrongdoing -- or were present and did nothing to stop it.

Though a single perpetrator or ringleader can be dismissed as a psychopath, understanding the bad behavior of a group is considerably more perplexing. It may be difficult to believe that so many people, at one time and in one place, could misplace their ethical compasses, could lose their sense of right and wrong.

But most people are similarly capable of abandoning their principles, say behavior experts and psychologists who study group dynamics. The actions of a group, particularly a group in which members share strong feelings of loyalty, simply overwhelm the individual.

"The research is pretty clear: You put people in a group situation and they tend to do what the group decides," says Donelson Ross Forsyth, an expert in group dynamics and ethical leadership at the University of Richmond in Virginia.

History and psychological studies both bear this out. Germany wasn't chock-full of evil people during the Holocaust. Instead, mostly ordinary, law-abiding Germans followed their leaders in the torture and killing of millions of Jews.

In Rwanda in 1994, once-peaceful neighbors who had never acted with violence turned on each other in horrific acts of brutality.

Even the famous case of Kitty Genovese illustrates the point, Forsyth says. Genovese's 1964 rape and murder was heard or witnessed, to varying degrees, by dozens of New York City neighbors. But the majority neither tried to stop the attack nor called for help.

Psychologists say that most people are unable to act unilaterally -- even when they know a situation is wrong -- if their actions will separate them from the group. The concept is hardly popular, but it helps to explain repeated examples of wrongdoing among groups of people who should know better.

"It's easy to say evil happens because the people who did it are evil rather than asking what made ordinary people evil," says Christopher Browning, a history professor at the University of North Carolina at Chapel Hill who has studied the role of ordinary people in committing atrocities during the Holocaust. "People don't want to look in the mirror and think, 'I could have done that.' But you're not going to explain these things by saying we had an unusual cluster of criminal or evil people."

Ultimate peer pressure

Group dynamics are on especially forceful display in military units, urban police departments, youth gangs, even among sports teams engaged in high-stakes competition. Members of such units are taught to obey authority, protect each other and remain loyal to the group.

But this type of indoctrination can have disastrous consequences when even one member transgresses.

"The need to feel connected to people is very intense" in such groups, says Ervin Staub, a psychology professor at the University of Massachusetts and author of "The Psychology of Good and Evil." "Even when people think 'this is wrong,' to step forward and oppose the whole group -- the people you've been fighting with, the people whose support you depend on for your very own security -- is extremely difficult."

If the group leader is involved, it is even harder for individual members to object, experts say. In a well-known experiment conducted in 1961 by Yale psychologist Stanley Milgram, study volunteers were asked to administer electric shocks to another study participant in the laboratory. Even though the victim shrieked in pain (the "victim" was not actually receiving a shock, but was an actor told to fake pain and fear), the majority of the volunteers, although distressed, obeyed the study supervisor and administered the shocks as ordered.

"Everyone obeyed in that study. Everyone gave some electrical shocks," Forsyth says. "People say, 'Why didn't they stop and think?' Well, they didn't have a chance to stop and think. They didn't think about what is right and wrong."
