Can a Bad Deed Lead to a Good One?
By Emiliana R. Simon-Thomas | April 3, 2013
A new study reveals how our frame of mind shapes our sense of right and wrong.
Imagine you are sitting at the train track switch, suddenly faced with a life-or-death decision.
A train is barreling unstoppably toward five people. The good news? There’s a Y junction in the train’s path, and you could divert the train onto another track. The bad news? Doing so will kill one person standing on the other track. If you switch the train’s trajectory, five people will live—but one will die. What should you do?
Scientists call this test the “Trolley Dilemma,” and they believe that how you respond to it reveals your “moral orientation.” If you don’t switch the track, you’re “rule-oriented”—you abide by certain moral rules, such as refusing to deliberately take someone’s life, no matter the cost. If you do switch the track, you’re “consequence-oriented”—you weigh the moral value of an action by its result, like being willing to kill one person in order to prevent the certain deaths of five others.
Encountering different orientations can bring out our prejudices—if the refusal to switch tracks disgusts you, you’re merely revealing your moral bias toward consequences.
But a series of experiments recently published in the journal Psychological Science suggests you should think twice before judging someone’s moral orientation: Results say that on one day you’ll divert the train and save those five lives—but on another you might not. It all depends on how you are thinking about morality, your past behavior, and yourself.
So how does your thinking shape your moral choices? Read on to better understand your moral orientation.
Which is more important to you—rules or consequences?
The researchers, led by Gert Cornelissen at the Universitat Pompeu Fabra, presented college undergraduates with the Trolley Dilemma, as well as tests of fairness and honesty. Their experiments show that we are all capable of making moral decisions based on either consequences or rules, depending on circumstances. What factors can cause such a switch?
First and foremost, their methods underscore how important framing is to our decisions. The researchers provided written definitions of morality to participants. Half of the definitions were oriented toward consequences (morality = doing what you know will lead to the best outcome for the largest number of people) and the other half toward rules (morality = doing what adheres to a key moral rule).
Then they asked participants to recall a past ethical or unethical act from their own lives—and to describe the impact of that past act, either in terms of consequence (how many people did your behavior affect?) or rules (which rule did your behavior concern?).
After defining morality and asking participants to reflect on past behaviors, the researchers presented the Trolley Dilemma. Participants who received the consequence-oriented moral definition at the outset, and who had been encouraged to think about their past behaviors in terms of impact, readily diverted the train, saving the five at the cost of one. On the other hand, those who received the rule-oriented moral definition, and had been encouraged to think about their past act in terms of rules, let the five die.
Thus Cornelissen’s team showed that we are all capable of approaching moral decisions with either orientation, depending on how we define morality in the moment—possibly with life-or-death results.
Do you strive for balance or consistency?
The Trolley Dilemma wasn’t the only moral test faced by participants. Each person was also given $10 and told to split it with a partner in whatever way they wished—a test psychologists call the “Dictator Game.”
Of course, the fairest thing to do is split the money evenly. But again, different factors influenced how participants behaved. Those who remembered a past ethical act split the money differently than those who thought of something unethical—and the difference went in opposite directions for consequence-oriented and rule-oriented thinkers.
For consequence-oriented people, thinking of something unethical they’d done in the past made them split the money more fairly, as if they were making up for a past wrong. But rule-oriented people kept more money for themselves when they thought of an unethical past act, as if they were trying to maintain a consistently low moral standard.
Conversely, when they had to recall a good deed of theirs, consequence-oriented people split the money more selfishly—and rule-oriented people shared more equitably.
The researchers say the consequence-oriented people are engaging in “moral balancing,” permitting themselves to behave poorly to balance out a past good (or vice versa). They say the rule-oriented people, by contrast, demonstrate “moral consistency,” apparently trying to stick by their principles no matter what.
Can doing the right thing allow you to do a wrong?
The same tendencies toward moral balancing or moral consistency showed up in another test researchers put to the participants—but this time, without their knowledge.
Once again, the researchers defined morality for participants, asked them to remember an ethical or unethical past behavior, and presented them with the Trolley Dilemma.
But then, instead of the Dictator Game, participants did a timed mathematical matrix game and kept track of their own scores. Afterward, the researchers compared their self-reported scores with their actual (covertly recorded) scores.
It turned out that consequence-oriented, train-redirecting folks cheated more after remembering a past ethical act, and less after remembering an unethical one. Rule-oriented participants showed the reverse pattern: they cheated more after remembering an unethical act, and less after remembering an ethical one. These results were consistent with the previous test: one group strove for balance, the other for consistency.
Through these experiments, the study revealed two factors that can influence your moral decisions. The first involves how morality has been defined for you, in this case around consequences or rules. The second factor depends on memory and whether your past ethical or unethical behavior is on your mind. It’s the interaction between those two factors that shapes your decision.
How can this insight help you to make fairer, more honest, less hurtful decisions?
If you find yourself generally thinking about the larger consequences of your actions (how many people will be hurt by my action?), it’s good to know that you are prone to “moral balancing.” Therefore, you should probably not dwell too often on your past good deeds, as that could make you think you’ve got a free pass to be immoral. By reflecting on a less-than-ideal action from your past, you’ll probably feel motivated to balance things out by acting more ethically in the present.
If, on the other hand, your thinking is guided by rules (what am I supposed to do?), you're likely to strive for consistency. Thinking of a past ethical act will likely make your current decision more ethical, too, because you'll feel inspired to maintain that high standard of goodness.
In both cases, the trick is to understand your own thinking—and then to think yourself into acting morally.
About The Author
Emiliana R. Simon-Thomas, Ph.D., is the science director of the Greater Good Science Center.