Let’s imagine that you come to our lab at Northwestern University to do a task. You have sets of numbers in front of you, and for each set, you are asked to find the two numbers that add up to exactly 10. For each correct response, you earn 50 cents.
Now, imagine that we tell you that you can score yourself, and then recycle the paper with your responses, which doesn’t have your name on it. All you have to do is turn in a payment slip with your score, and we pay you.
Would you cheat?
When we conducted these types of studies, my research assistant dug through the recycling after participants left and scored everyone’s sheets. We often found that just about everybody cheated a little, earning an extra $2 to $3.
Decades of research point to our moral fallibility: humans are not perfect and regularly fall short of their own moral standards. Everyday people end up violating their own moral values, sometimes unknowingly, and they find numerous ways to rationalize or ignore this behavior. By doing so, they keep their image of themselves as good, honest individuals—so good that the average person thinks they’re more likely to go to heaven than Mother Teresa.
Is there anything to be done about this? First, we need to be aware of all the subtle ways that our moral decisions can be swayed. Then, we can put safeguards in place so we can make better decisions over time—and become better people.
What influences our ethical decisions?
Moral decisions don’t simply come down to a conscious choice to cheat or not. Research has found that certain things consistently influence our choices—whether it’s how we’re feeling or what time of day it is.
In one study, we gave participants the numbers task to complete while listening to anxiety-inducing music from the movie Psycho. In that situation, people were even more likely to exaggerate their performance.
What’s the explanation? Our data suggest that anxiety increases people’s perception of threat, which in turn results in self-interested, unethical behavior. In threatening situations, our brain shifts into a state that facilitates rapid defense mechanisms; our cognitive resources are temporarily diverted so we can quickly respond to the situation and protect ourselves. Because of these self-protective impulses, we are more likely to narrowly focus on our own basic needs and self-interest, rather than being more mindful of ethical principles.
Another factor that matters is time of day. In one study, half of our participants were randomly assigned to do a task in the morning, between 8 a.m. and noon; the other half did it in the afternoon, between 2 and 6 p.m. In this case, we saw more cheating in the afternoon.
This is evidence for people’s inability to regulate their behavior in a tempting situation. The mere experience of everyday living—making decisions, expending physical energy—can reduce our ability to exert self-control as the day progresses. As we become more tired, our morality is compromised.
We’re also heavily influenced by the way people around us behave. We learn vicariously from our peers, our groups, and our leaders. Workplaces can intentionally or unintentionally normalize unethical behavior, which leads to collective corruption. For example, in one paper we showed that the language used by corporations reflects their culture and shapes employees’ behaviors. Specifically, we found that corrupt companies use linguistic obfuscation (language that is difficult to understand) in their values statements, and as a result, employees cheat more.
There is other research pointing to even more factors that affect our moral decisions. For example, if people have ambitious goals or have performance pressures, they are more likely to engage in everyday dishonest behavior. These subtle situational forces can swing our moral compass.
Importantly, we often don’t realize the impact of these factors. If I asked you whether you’re more likely to be unethical in the morning or afternoon, you probably wouldn’t think it makes a difference.
In some ways, our brains may be concealing our own dishonesty from us. In another one of my studies, participants who engaged in a task where they had the opportunity to cheat had a much weaker memory of the experience—when and where it happened, how they felt—compared to those who completed a task without the possibility of cheating. This forgetting seems to be one of the psychological tricks that enable us to engage in questionable behavior over time.
Three steps to moral growth
Based on my research, here are some guidelines to help you make more moral decisions and continue growing and learning as an ethical person.
1. Plan for ethical challenges. Since other people play a significant role in our morality, one place to start is to find an ethics mentor. You can seek guidance from someone inside or outside your organization, someone trustworthy to discuss ethical issues with.
Next, you can also manage other people’s expectations of you—whether directly or indirectly. For example, in one of my studies, participants were less likely to ask someone to lie after receiving an email from them with a moral quotation in the signature (something like “Success without honor is worse than fraud”).
Even just including that type of quotation in your email signature is a type of safeguard, so you are less likely to be asked to do something questionable. In this way, showing your character can help stop moral dilemmas from even arising.
“To be ethical doesn’t mean being perfect all the time, but it does mean being dedicated to learning.”
2. Bring awareness to a moral challenge in the moment. There is a lot of evidence of “moral fading,” where we simply don’t pay attention to the moral implications of our decisions. When dilemmas do arise, we have to explicitly look for these moral implications and not narrowly focus on the costs for ourselves. For example, you may be choosing between two products and one might be cheaper, but at the same time you have information that the company is using questionable labor practices. Do you take that into consideration? Do you think about the harm in this context?
Another key is avoiding rationalization. We can be very creative in justifying questionable behavior when there is self-interest involved. We might tell ourselves, “Oh, everyone does this, I’m just following orders, I’m doing this for the greater good, it’s their own fault, they deserve it.”
If you’re aware of these tendencies, you can try three tests to avoid self-deceptive rationalization:
- The publicity test: How would you feel about your local newspaper publishing your choice and your thought process on the front page?
- The generalizability test: How would you feel about everyone acting in this way?
- The mirror test: If you look in the mirror after making the decision, would you be happy with yourself?
Finally, not rushing the decision is important. In a classic study, Princeton Theological Seminary students who were under time pressure to go deliver a lecture were less likely to stop and help a stranger slumped on the ground.
The traditional advice for making a decision is to sleep on it—and that is helpful to encourage you to think about decisions more carefully. If possible, you can also consult your company’s organizational policies by reading codes of conduct or calling a hotline.
3. Use reflection to learn from moral challenges. To be ethical doesn’t mean being perfect all the time, but it does mean being dedicated to learning. When you make a mistake, you can reflect in order to learn and do better in the future. To adopt an ethical learning orientation, ask yourself, “What can I do to be a better person?”
Sometimes, the problem is that we treat work as a completely separate realm of life. My research suggests that our tendency to separate personal and professional life—what is called “identity segmentation”—leads us to engage in more questionable behavior because we use a different code of conduct at work and at home. When people have an integrated identity across their professional and personal life, that leads to a sense of authenticity and more ethical decisions.
You can also learn by seeking more feedback and getting input on your moral decisions. This is particularly important because at work, managers tend to give much more feedback on performance mistakes than on moral lapses. And we’re less likely to ask for feedback about our own ethics at work.
Ultimately, we may also have to assess whether our work is a moral fit. Is this the type of organization or job that is a good fit for you? Is this the industry you want to be part of?
I like to think about work as a “moral laboratory.” At its best, it provides opportunities for you to learn and grow in your job, and become your better self.
This essay is based on a talk that is part of the Positive Links Speaker Series by the University of Michigan’s Center for Positive Organizations. The Center is dedicated to building a better world by pioneering the science of thriving organizations.