In February, a large section of California’s Oroville Dam spillway collapsed due to heavy winter rains, threatening communities below with potentially devastating floods. News agencies later reported that government officials had been aware of weaknesses in the dam but never took steps to correct them. They simply ignored the potential risks, hoping they’d never face a scenario like the one that unfolded.

Why do so many of us hide our heads in the sand when faced with the possibility of a catastrophic future event?

Mount Sinabung volcano in Indonesia

It all comes down to our psychological biases, according to The Ostrich Paradox: Why We Underprepare for Disasters, a new book written by Wharton School professors Robert Meyer and Howard Kunreuther. When considering issues like climate change, the effects of automation on jobs, or how to save for retirement, we tend to focus on the wrong things, use the wrong kind of reasoning, and ultimately make ourselves vulnerable to disaster.


Meyer and Kunreuther’s book is a quick and easy read distilling what science has uncovered about the ways we make decisions, especially under risky circumstances, and how we can make better ones.

Six biases that put us in danger

Thanks to the well-publicized work of Daniel Kahneman and Amos Tversky, many of us know by now that we are governed by two cognitive systems: one that is more automated and drives instinctual decisions—like quickly moving away from danger—and another that involves more considered thought and drives deliberate decisions—like deciding what house to buy. These work in concert so that we can move through everyday life without having to deliberate every single thing we do.

But these systems also have their downsides.

One is that they are less useful when we are confronted with unpredictable problems that are unfamiliar to us, require complex analyses, or are unlikely to occur except in the distant future. In these cases, we tend to rely on six biases that can lead us to misperceive our situation and potentially take wrong action (or avoid action altogether). These are nicely described in The Ostrich Paradox, as follows:

  • Myopia: The tendency to focus more on the short-term costs than the future potential benefits of investments.
  • Amnesia: The tendency to forget the lessons of past disasters.
  • Optimism: The tendency to underestimate the likelihood of future hazards.
  • Inertia: The tendency to maintain the status quo or adopt the default option when uncertain about the future benefits of investing now.
  • Simplification: The tendency to selectively consider only certain factors when making choices involving risk.
  • Herding: The tendency to follow the “wisdom” of the crowd.

In their book, Meyer and Kunreuther describe in more detail how each of these biases might function in the real world. For example, myopia can lead a politician to decide against using current funds to strengthen a dam that is in danger of collapsing under a catastrophic amount of rain. It’s easy in hindsight to see that repairs to the Oroville Dam should have been made; but myopia—and perhaps optimism and inertia—led planners to forgo preparations that could have headed off the disaster. Something similar happens in decisions about climate change, and it may result in a future calamity.

Why would we tend to be myopic in this regard? Part of it has to do with how our brains work. When we are faced with immediate versus long-term rewards, most of us go for the immediate rewards, because doing so releases a cascade of feel-good hormones. Similarly, if taking action now to prevent a future disaster involves a difficult hurdle—such as bureaucratic red tape or high financial, social, or political costs—we tend to focus on the immediate (avoiding public resistance to increased taxes) rather than the future (preventing a catastrophe).

The authors also point to research uncovering how these biases unfold in the lab and in the world. For example, they cite a study finding that residents of Queensland, Australia, did not base their decisions about buying insurance on the apparent risks of disaster, but on the social norms around buying it—a seemingly irrational choice reflecting the herding bias. Similarly, in a lab simulation, the authors found that how much participants were willing to invest in structural improvements designed to protect against earthquake damage was mostly related to how much their “neighbors” in the simulation spent.

How to overcome your biases (and stay safe)

Luckily, the authors also have ideas about how people can manage their biases and make better decisions. One, detailed in the book, is what they call a “behavioral risk audit”—a tool for anticipating the biases that may arise when individuals or organizations need to think about the risks that disasters and hazards pose to them and their community. With these biases in mind, we can better anticipate what might get in the way of interventions and tweak them accordingly so they’re more likely to succeed.

The Ostrich Paradox: Why We Underprepare for Disasters (Wharton Digital Press, 2017, 132 pages)

In other words, it’s best if decision makers don’t prepare for future consequences only by looking at objective risks and vulnerabilities. Instead, planners should be encouraged “to think first about how individuals in hazard-prone areas are likely to perceive risks and why they might not adopt different preparedness measures.” In that way, planners can better nudge people toward more acceptable remedies for their situation.

For example, consider people who live in flood-prone areas but don’t want to buy high-cost flood insurance. The authors suggest fighting myopia by spreading out the costs of insurance over time through long-term loans, and reducing amnesia by rewarding insurance buyers with yearly rebates for having no flood claims. They also recommend communicating risk in ways that fight the optimism bias—like letting people know there is a 1 in 5 risk of high flooding in their community rather than saying there’s a 1 in 100 chance of their house being damaged in a flood.
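
To see why those two framings can describe the same underlying hazard, it helps to compound the annual probability over a longer planning horizon. The sketch below assumes a 25-year horizon (roughly the length of a typical mortgage); that horizon is an illustrative assumption, not a figure stated in the book or this article.

```python
# Minimal sketch: converting an annual flood probability into the chance of
# at least one damaging flood over a longer horizon. The 25-year horizon is
# an illustrative assumption, not a number taken from the book or the article.

annual_probability = 1 / 100   # "1 in 100 chance" of flood damage in any given year
years = 25                     # assumed planning horizon (e.g., a typical mortgage)

# Chance of escaping damage every single year, compounded over the horizon
p_no_flood = (1 - annual_probability) ** years

# Chance of at least one damaging flood during the horizon
p_at_least_one = 1 - p_no_flood

print(f"Chance of at least one flood in {years} years: {p_at_least_one:.0%}")
# Prints about 22%, roughly the "1 in 5" framing described above.
```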

Of course, biases aren’t the only things to consider when planning for long-term risks. The authors suggest that policymakers and the public adopt other guiding principles that may lead to better preparation for our future: committing to long-term protective planning as a major priority, committing to policies that discourage individuals and communities from actions that increase their exposure to long-term risks, and committing to addressing problems equitably.

The challenge is how to get the public to adopt these principles, and the authors don’t have a lot to offer on this front. But they hope that by understanding our cognitive biases, and some ideas for managing them, planners and policymakers will have a better chance of engaging people to take action to minimize risks and make all of our futures brighter.

“If we as a society are to commit ourselves to reducing future losses from natural and man-made disasters in the truly long run, we need to do more than hope that individuals and policymakers will see wisdom in these investments on their own,” they write.

Hopefully, awareness of these biases is a step in the right direction.
