Why we believe the ‘big lie’
National governments have been known to tell whoppers on occasion. Remember the time Kim Jong-Il went golfing and North Korean media reported that he’d scored eleven holes-in-one? Or when pro-Russian insurgents shot down a Malaysia Airlines flight over Ukraine, and Russian television cast the attack as a failed assassination attempt on Vladimir Putin? “The bigger the lie, the more it will be believed,” runs a maxim often attributed to Nazi Propaganda Minister Joseph Goebbels.
From the outside looking in, the lies are entertaining, but also hard to reckon with. You might think that wanton exaggerations and serial mistruths — like a habit of inflating economic data — would work against authoritarian governments, as a knock on their credibility. Political scientists have spent a lot of time trying to answer the question, "Why do governments lie?" A new paper released this month says the right answer might be the simplest one: because their citizens believe them.
"Maybe we've been a little too clever in a lot of our approaches to understanding things," says Andrew Little, author of the working paper, "Propaganda and Credulity," and a professor at Cornell University. "Let's assume there is some chunk of the population that buys what the government says. The idiots can end up being the people who lead what other people end up believing."
Other theories about why governments lie have focused on the lying as an expression of power. Like Kim Jong-Il with the golf. Even if no one believes he carded eleven aces, the lie is so over-the-top that it reminds North Korean citizens who is in charge. "It signals they are just big and bad and are able to get people to repeat the lie," says Scott Gehlbach, a political scientist at the University of Wisconsin.
In other cases, Little argues, the lies aren't merely a signal — they're actually meant to be swallowed. His current paper presents a game theory model that shows how a government lie might diffuse through a population. The model predicts what happens when each person has to guess what everyone else around him believes.
With a government lie — even an egregious one — a small percentage of the population might be gullible and actually believe it. These are the "blind" in Little's telling, and they can have a major effect on everyone else's actions.
Say you're one of the people who knows a lie told by the Kremlin is ridiculous — the "sighted," as Little labels them in his paper. You might look around, see that other people actually seem to believe it, and conclude it's not worth sticking your neck out to challenge the mistruth. Or as Gehlbach puts it, "You don't want to be the only person standing on Red Square holding a placard that says 'Down with Putin.'"
Government lies, Little finds, circulate especially well in conditions where there's strong social pressure to fall in line. "The ingredient to make this work is this desire for consensus," he says, adding that in freer societies, government lies are more likely to peter out. "It doesn't quite work when thinking about the US population."
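Little's paper works this out formally; as a very rough illustration of the dynamic he and Gehlbach describe — not Little's actual model — the same intuition can be sketched as a simple threshold cascade in the style of Granovetter. Everything here (the function name, the parameters, the threshold ranges) is a hypothetical construction for illustration only: "blind" agents always endorse the lie, while each "sighted" agent endorses publicly only once the share of endorsers around them exceeds a personal conformity threshold.

```python
import random

def endorsement_cascade(n=1000, blind_frac=0.05, t_min=0.0, t_max=0.5, seed=0):
    """Illustrative threshold-cascade sketch (not Little's model).

    A fraction `blind_frac` of agents believes the lie outright and
    endorses it.  Each remaining "sighted" agent has a conformity
    threshold drawn uniformly from [t_min, t_max] and publicly
    endorses the lie once the overall share of endorsers reaches
    that threshold.  Lower thresholds stand in for stronger social
    pressure to fall in line.  Returns the final endorsing share.
    """
    rng = random.Random(seed)
    n_blind = int(n * blind_frac)
    thresholds = [rng.uniform(t_min, t_max) for _ in range(n - n_blind)]
    endorsing = n_blind
    changed = True
    while changed:
        changed = False
        share = endorsing / n  # share of the population endorsing so far
        still_silent = []
        for t in thresholds:
            if share >= t:       # enough others endorse: this agent conforms
                endorsing += 1
                changed = True
            else:
                still_silent.append(t)
        thresholds = still_silent
    return endorsing / n
```

Under strong conformity pressure (thresholds drawn from [0.0, 0.5]), the 5 percent of blind believers is enough to tip sighted agents one after another until nearly everyone publicly repeats the lie; with weaker pressure (every threshold above 0.2), no sighted agent ever flips and the lie peters out at its gullible core — mirroring the article's contrast between closed and freer societies.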
Little's model might not apply to the United States as a whole, but it does help to explain the way information and beliefs circulate in communities within the country. In party politics there's a strong desire for consensus and also often a penalty for being the one person to raise your hand and challenge something that's not quite true. That's how, for example, one person floats the idea that the president was not born in the United States, a small group of people believes it, and even the skeptics among them start to feel reluctant to correct something they know is not true.
Kevin Hartnett is a writer in South Carolina. He can be reached at firstname.lastname@example.org.