
carrot or stick
examining human desires and response
by tracey l. kelley (@TraceyLKelley)

In a 2011 episode of the TV show "Curiosity", actor/producer/director Eli Roth hosted a reenactment of Stanley Milgram's obedience experiment from 1961. The goal: to determine if society has changed much in 50 years.

In the experiment, volunteers take on the roles of a learner and a teacher to test the effects of punishment on learning. A test administrator in a white coat straps the learner to a machine designed to dispense electric shocks between 15 and 450 volts. The teacher is seated in another room, away from the learner, in front of a panel. This panel is the shock machine. The teacher asks the learner a series of word association questions, and for every incorrect answer, the teacher announces it is wrong, tells the learner the amount of shock to be delivered, then flips a switch to shock the learner. The learner does not receive the correct answer: the teacher simply moves on to the next question.

The learner is not really being shocked. He is an actor, and the role assignment, though it appears random, is rigged by the administrator.

At 135 volts, after the learner screams out in pain and asks to stop, some volunteer teachers hesitate, question the administrator, and receive the following responses:

1. Please continue.
2. The experiment requires that you continue.
3. It is absolutely essential that you continue.
4. You have no other choice, you must go on.

The result? In 2011, just as in 1961, approximately 65% of the teacher test subjects defied their personal feelings of morality to continue the experiment and issued stronger shocks to the learners, simply because the test administrator said so. There was never any threat of punishment to the teachers if they refused to continue, and the participation waiver indicated they could stop at any time.

But they obeyed for a variety of reasons. Milgram indicated that conformity, including the need to belong to a group and the lack of ability or expertise to make decisions, prompts an individual to follow the hierarchy of the group. Another theory is that if people see themselves as simply carrying out someone else's decision, especially if told it's right, they feel somehow not responsible for the resulting actions, and will obey without further independent thought.

Roth, producer/director of such horror flicks as Hostel, Hostel II, Grindhouse, and Cabin Fever, seemed genuinely surprised in the 2011 reenactment that more participants didn't walk away from the experiment, and at the number of individuals who continued to administer shocks even over the protests of the learners.

Why was he surprised? His successful livelihood depends on tapping into the darker side of human nature. "People forget that that's what moviegoers are paying for: to be terrified and disturbed," Roth has said. "There's not a single instance of a horror movie actually causing any violence. People know it's fake; that's why they allow themselves to enjoy it. It helps them deal with their fears, the fear of things beyond their control."

Yet environment often prompts our actions. If an individual is frequently in an unhealthy, violent, or controlling environment, it reduces his or her capacity for competent, empathic, and freethinking action. The majority of test subjects in the 2011 reenactment of Milgram's obedience experiment relinquished their independent thought to follow authority, regardless of the distress they felt over the situation. Frighteningly, some of the teachers felt the learners, by not knowing the right answers, deserved the punishment. So how quickly does an observer of horror films become desensitized to violence and mayhem? How might empathy for others not only be reduced, but also replaced by the desire for swift and exact justice, regardless of how misguided?

Philip Zimbardo, the creator of the Stanford Prison Experiment and an expert on Abu Ghraib, said while discussing his book, The Lucifer Effect, "The human mind is this exquisite organ which has infinite capacity for any of us to be kind or cruel, selfish or destructive, villains or heroes, and because of that capacity, it really is the situation that moves us in the path to be perpetrators of evil. Most of us do nothing; we're the innocent bystanders. The good thing is some of us are moved to be heroes. So the question is why do good people become evil, and how can we encourage ordinary people to become more heroic."

Our basal survival instinct causes us to follow the pack in the event of threat or confusion. The evolution of our species is what allows us to change that response into something more productive, regardless of the situation.

In his book, Drive, Daniel Pink says the key to motivation isn't punishment, and no longer lies with the promise of rewards, but rather with intrinsic motivation, based on "our innate need to direct our own lives, to learn and create new things, and to do better by ourselves and our world." If you're in an environment that is heavily carrot-and-stick-oriented, first in school, then in relationships, then in a company, and so on, you're less likely to be motivated simply by doing good.

Yet to live in a world that is truly whole, it's not enough for you to be doing better; your whole tribe has to be doing better. For all of our observations and desires to lead lives of promise and fulfillment, it has to begin with independent thought, following that innate need to create a positive environment: When am I going to step up and do something that matters?

Is there a direct correlation between challenging conventional authority and leading a more hopeful, enlightened life? Possibly. When you examine how quickly humanity degrades in the Milgram and Zimbardo experiments, it's easy to see that people's desire to seek approval, belong to a collective, and correct any perceived wrongs far outweighs the purported need for a common good. In both experiments, participants only needed to ask themselves, "Would I want to feel this way? If not, why am I treating others this way?" That shift of power--the power over one's own actions versus an authoritative power--drastically changes the outcome.

Society is the sum of its parts. If each individual strives to act progressively, with positive, non-harmful intent, and faces challenging situations with reflective understanding, perhaps we'd have less need to prove obedience, and more cause to celebrate examples of heroism that don't seem out of the ordinary.

*The Stanley Milgram Obedience Experiment
*The Philip Zimbardo Stanford Prison Experiment
*The Lucifer Effect, by Philip Zimbardo
*Drive, by Daniel Pink


Tracey likes to shake things up and then take the lid off. She also likes to keep the peace, especially in a safe, fuzzy place. Writer, editor, producer, yogini, ('cause yoger or yogor simply doesn't work) by day, rabid WordsWithFriends and DrawSomething! player by night. You can follow her on Twitter: @traceylkelley or @tkyogaforyou





