
Opinion | Cass Sunstein

Avoiding the cost of needless fear

Obama’s former regulatory czar explains how to avoid bad rules


It isn’t easy to love cost-benefit analysis. When the Department of Transportation compares the monetary benefits of a new vehicle safety requirement with the monetary costs, or when the Environmental Protection Agency does the same for air pollution controls, we don’t hear sustained applause.

While presidents from Reagan to Obama have insisted that regulators must use cost-benefit analysis to discipline their policymaking, many people, especially on the left, vigorously object that the approach is a business-friendly obstacle to sensible safeguards designed to protect health and the environment. Last month, President Obama directed the EPA to issue new rules to increase the fuel economy of heavy-duty vehicles. Environmentalists tend to push for strong fuel economy requirements, limited only by what is feasible — and to contend that rules protecting clean air should not be based on an effort to weigh benefits and costs.


Cost-benefit analysis is the practice of scrutinizing regulations to guarantee that the social benefits, understood in monetary terms, justify the social costs. One of the strongest justifications for that approach comes from psychologists and behavioral economists, who have shown that human beings are vulnerable to “misfearing.” Sometimes we fear the wrong things, devoting a lot of money to small problems and little or nothing to big ones. Public officials can also be vulnerable to misfearing. The good news is that cost-benefit analysis can help steer all of us straight.

Sometimes we lack a sense of proportion, arguing for expensive regulatory requirements that would increase the price of goods and services and thus impose real harm on workers, consumers, and businesses large and small. Cost-benefit analysis helps provide us with that sense of proportion.

A lot of recent work in behavioral science (including Daniel Kahneman’s masterpiece, “Thinking, Fast and Slow”) has explored two systems of cognitive operations in the human mind.

The first system is fast, emotional, associative, and intuitive. It’s frightened by loud noises, big animals, and things that seem disgusting or unnatural. It does not care much about the abstract ideas of air pollution and poor diet. By contrast, the second, slower system engages in some kind of assessment of whether the loud noises, the big animals, or the disgusting or unnatural things pose a genuine threat. So, for example, our emotions may lead us to be frightened of genetically modified food or flying in airplanes, but the human mind can also create a deliberative check, leading people to consider the possibility that the risks are trivial.

Under the influence of the fast system, misfearing can be found in some important judgments that we make in our daily lives. We might be excessively fearful of the risk of terrorist attacks or street crime, while neglecting the dangers associated with obesity, insufficient exercise, smoking, mental illness, and texting while driving. But the political process is also influenced by misfearing, including in the most responsive democracies.

When we misfear, it is often because of what comes first or most readily to mind. It is well-known that people use mental shortcuts, or heuristics, in thinking about risks. For purposes of misfearing, the most important example is the “availability heuristic.” People judge a risk as more probable if they can readily call to mind an instance of it actually occurring.

The availability heuristic helps explain why people tend to overestimate the number of deaths from highly publicized events (motor vehicle accidents, tornadoes, floods, botulism) and to underestimate the number from less publicized sources (strokes and stomach cancer).

When regulators devote large resources to relatively small risks, it is often because Congress has been moved by a particular incident and forces them to do so. In 2008, for example, lawmakers responded to some high-profile accidents by requiring railroads to adopt new safety technologies that cost billions of dollars. The resulting regulation is one of the very few recent rules with costs far in excess of benefits.

Our thought and behavior are often affected by the worst case, not the likelihood that it will occur. Here is another source of misfearing, both in ordinary life and potentially in public policy.

Consider the remarkable finding that if people are asked how much they will pay for flight insurance covering losses resulting from “terrorism,” they will pay more than if they are asked about flight insurance covering losses from all causes. And when people discuss a low-probability risk, their concern sometimes rises even if the discussion consists mostly of apparently trustworthy assurances that the probability of harm is small.

Especially in the modern era, risk perceptions tend to go viral. One reason is that when individuals do not have information of their own, the statements or actions of a few people can initiate a kind of cascade, with potentially distorting effects on policy. When the public misfears, it is frequently because cascade effects lead people to rely on what they think other people think, and thus lend their voice to an increasingly loud chorus — whether or not there is any real danger.

While the monetary equivalents may not tell us everything we need to know, cost-benefit analysis plays a natural role here as well. The effect of such analysis is to subject misfearing to a kind of critical scrutiny, by reducing the risk that the public demand for regulation will be rooted in rumor or myth and by ensuring as well that government is regulating real hazards even when the public demand is low.

This defense of cost-benefit analysis — both for ordinary life and for public policy — has a close connection to a famous exchange during the founding era. Thomas Jefferson was in France during the Constitutional Convention, and when he returned, he asked George Washington to explain why the Convention’s delegates had created a Senate.

Washington responded with a seemingly irrelevant question of his own: “Why did you pour that tea into your saucer?” Jefferson replied: “To cool it.” Then Washington gave his answer. “Even so, we pour legislation into the senatorial saucer to cool it.”

Cost-benefit analysis has a similar function — to help ensure that policy is driven not by hysteria or alarm but by a full appreciation of the human consequences. Nor is cost-benefit analysis only a check on unwarranted regulation. It can and should serve as a spur to regulation as well. If risks do not produce visceral reactions, partly because the underlying activities do not produce vivid mental images, cost-benefit analysis can show that they nonetheless deserve our attention. Consider, for example, this week’s important, cost-justified rule from the EPA, designed to reduce air pollution from gasoline.

True, it’s hard to love cost-benefit analysis. But when intuitions are unreliable, when anecdotes mislead us, and when misfearing leads us in the wrong direction, that unlovable approach turns out to be an indispensable safeguard.

Cass Sunstein is the Robert Walmsley University Professor at Harvard Law School. From 2009 to 2012, he served as the administrator of the White House Office of Information and Regulatory Affairs. His latest book, “Conspiracy Theories and Other Dangerous Ideas,” will be released later this month.
