Getting Good People to Go Bad

A simple command from an authority figure can be enough for people to lose a sense of responsibility for their actions.

How easily can good people be convinced to do bad things? That depends on how much control a person feels over his or her own choices, and that "sense of agency" affects the way the brain processes the outcomes of actions, finds a new study published in the journal Current Biology.

In 1963, psychologist Stanley Milgram of Yale University published the results of experiments to gauge obedience to authority, a topic he became interested in following the trials of Nazi war criminals, many of whom defended themselves by claiming they were simply following orders.

Milgram sought to examine how people could be persuaded to act in ways that betrayed their own consciences in an experiment that involved not members of the Nazi party, but rather ordinary college students.


At the behest of an authority figure, the scientist running the experiment, student participants asked a series of questions and then delivered increasingly painful electric shocks to an unseen individual for wrong answers. The person receiving the shocks was in fact an actor in on the study. Despite the controversy over the ethics of Milgram's methods, many of the students showed signs of emotional distress even as they followed orders, his experiment showed that the majority of participants were willing to follow a command even when it went against their own judgment.

Taking the Milgram experiment a step further, researchers at University College London and Université Libre de Bruxelles in Belgium have determined that when coerced into taking an action that adversely affects another person, individuals experience reduced agency, altering their perceptions of cause and effect.

For their study, the researchers conducted a series of experiments. First, an "agent" would deliver mild physical pain or financial harm to a "victim," a decision that was either coerced or made freely. The study participants would then trade places, so they would understand what it felt like to play both roles. For the second part of the study, researchers analyzed the effects of the "coercion" and "free-choice" conditions on brain activity.


After analyzing the results, the researchers found that under the coercion condition, "agents" perceived a significantly longer interval between an action and its outcome than under the free-choice condition. This sense of disconnect between an action and its consequences was also reflected in participants' brains: when ordered to perform a task, participants' brain activity more closely resembled the processing of a passive movement than of a voluntary action.

Taken together, these studies suggest that the "following orders" defense isn't merely an attempt to avoid punishment, at least not entirely. Individuals coerced into actions process them differently, and this is reflected in a reduced sense of responsibility.

Being told to do something morally objectionable is certainly no excuse for indefensible behavior. But the latest study at least helps explain why people are seemingly so willing to harm another person simply because an authority figure told them to.
