The latest issue of Greater Good includes several essays about "the psychology of the bystander." The issue considers why some people do nothing when they witness a crisis, while others spring to action. Why do we all act like bystanders in some situations, but not in others? Are some people less likely to act like bystanders—and if so, is that because of the way they were raised, their religious background, or just the specifics of the situation they find themselves in?

Included in the issue is an interview I conducted with Philip Gourevitch, editor of The Paris Review and author of We Wish to Inform You That Tomorrow We Will Be Killed with Our Families. Drawing on his reporting in Rwanda and elsewhere, I wanted to ask Gourevitch what he sees as the factors that lead nations to intervene—or, as is more often the case, not intervene—in violent regional conflicts around the world.

He had a pretty realpolitik take on international affairs, noting how lofty humanitarian ideals are often difficult to put into practice. At one point in the interview, he attributed U.S. reluctance to intervene in regional conflicts, even in instances of genocide, to the fact that we feel a stronger emotional connection to people who live in closer proximity to us.

So if you find out that people whose existence you had never previously noticed are raping and axe murdering some other such people on the other side of the planet, do you say, "Let's get in the middle of that. If we don't stop it we're all less safe—they're human beings just like us"? Alas, it just doesn't feel that way to most people. Of course, they're human beings, and it's a terrible thing, but the sense of a shared fate is weakened by distance and difference.

I'd have to agree with him. It's a pretty obvious point: Instinctively, we don't feel as strong a moral obligation to people halfway around the world as we do to people next door to us, not to mention our own friends and family.

But I can't say I feel good about that. And I also wonder whether it has to be so. Why exactly do we feel this way? Is it just a result of social conditioning, biological predispositions, or some combination of the two? And perhaps more importantly, even if we are hard-wired to feel this way, does that make it right?

Neuropsychologist Joshua Greene has some pretty provocative answers to these questions. Greene studies the neurological bases of our moral decision making. In one study, he presented participants with a series of moral dilemmas. In one, you imagine driving along a country road when you see a man by the side of the road, his legs covered in blood. The man will probably lose a leg if he doesn't get to a hospital soon, but if you pull over, the blood will do a few hundred dollars' worth of damage to the leather upholstery in your car. Should you pull over?

In another scenario, participants considered receiving a letter from a reputable international aid organization asking for a donation of two hundred dollars. The letter explains that the donation will allow the organization to provide needed medical attention to some poor people in another part of the world. Would it be morally acceptable not to make the donation?

Greene tested participants' brain activity as they mulled over these dilemmas. He assumed, as would most of us, that most people would find inaction in the first scenario to be morally reprehensible, but not in the second. He wanted to see if this moral distinction was at all reflected in participants' brain activity.

He found a difference in brain activity between "personal" moral dilemmas, in which people come into close contact with someone like the man with the bloody leg, and "impersonal" ones like the request from the aid organization: when participants considered the personal dilemmas, their brains showed greater activity in areas associated with emotion and social cognition.

What does this mean? In a paper in Nature Reviews Neuroscience, Greene offers an evolutionary interpretation.

Consider that our ancestors did not evolve in an environment in which total strangers on opposite sides of the world could save each other's lives by making relatively modest material sacrifices. Consider also that our ancestors did evolve in an environment in which individuals standing face-to-face could save each other's lives, sometimes only through considerable personal sacrifice. Given all of this, it makes sense that we would have evolved altruistic instincts that direct us to help others in dire need, but mostly when the ones in need are presented in an "up-close-and-personal" way.

So, Greene speculates, our evolutionary history has left us with stronger altruistic instincts toward people in close proximity to us. For most of that history, humans had no way of even knowing that people on the other side of the planet existed, so we never evolved a sense of moral responsibility to individuals who seem more like abstract ideas than human beings. And thus our moral judgments are actually driven by the more immediate, instinctive, emotional responses we have to moral dilemmas, not the rational, abstract calculations we try to make. Greene goes on to ask, "What does this mean for ethics?"

We are tempted to assume that there must be "some good reason" why it is monstrous to ignore the needs of someone like the bleeding hiker, but perfectly acceptable to spend our money on unnecessary luxuries while millions starve and die of preventable diseases. Maybe there is "some good reason" for this pair of attitudes, but the evolutionary account given above suggests otherwise: We ignore the plight of the world's poorest people not because we implicitly appreciate the nuanced structure of moral obligation, but because, the way our brains are wired up, needy people who are 'up close and personal' push our emotional buttons, whereas those who are out of sight languish out of mind.

Greene stresses that, so far, this is just a hypothesis. And I think there are a couple of different ways to interpret his (admittedly preliminary) findings. On the one hand, you could be rather fatalistic about the whole thing: This is just the way we're wired, so we can't really expect most people to feel a strong, instinctive moral obligation to others who look very different from them and live halfway around the world. When someone chooses not to care about those people in need, there's no sense in condemning them for selfishness. They're just being true to their nature.

But there's another way to read these findings: Just because a moral judgment is instinctive doesn't make it right. We should scrutinize all our moral decisions, even—perhaps especially—the ones that come to us reflexively. In fact, knowing the possible neurological (and even evolutionary) basis of these judgments might help dispel the notion that they're beyond reproach. Most of us know that even when they have a biological basis, our whims and predispositions don't always—or even often—direct us to moral and ethical choices, whether they concern our diet, sex lives, or relationships. So it would certainly be a mistake to take our instinctive moral judgments for objective truths.

Greene sort of gets at this in the last paragraph of his Nature Reviews Neuroscience paper:

The maturation of human morality will, in many ways, resemble the maturation of an individual person. As we come to understand ourselves better — who we are, and why we are the way we are — we will inevitably change ourselves in the process. Some of our beliefs and values will survive this process of self-discovery and reflection, whereas others will not. The course of our moral maturation will not be entirely predictable, but I am confident that the scientific study of human nature will have an increasingly important role in nature's grand experiment with moral animals.

And so where does all this leave Philip Gourevitch and the future of American interventionism? I don't think it's entirely clear. But I think this research can, at the least, offer some hope for a heightened American sense of responsibility for the problems of those far less fortunate than us. How we should act on that sense of responsibility is a different story.
