IDEAS

Those infuriating Republican hypocrites are a lot like you and me

The human mind is disturbingly adept at justifying our actions.

Senator Lindsey Graham, who insisted four years ago that he would be opposed to doing what he's doing right now. Demetrius Freeman/Associated Press

Politicians are no strangers to being labeled hypocrites. But that label has perhaps never been more applicable than in the past two weeks, as Republican senators have moved quickly to confirm a new Supreme Court justice in an election year after having blocked President Obama’s nomination of Merrick Garland in 2016.

At the time Garland was nominated, Republicans said that it would be better to wait 237 days, until after the 2016 election, so that the American people’s voice could be heard. Some, like Senator Lindsey Graham, later said they would follow that policy even if such a situation arose during Trump’s term. Yet many of those same Republicans have been espousing the opposite view since Ruth Bader Ginsburg’s death created a vacancy on the court just 46 days before this fall’s election.

What’s perhaps most vexing, if not infuriating, to many isn’t that these politicians are behaving solely in their self-interest but rather that no attempt at logical argument seems to persuade them that they’re being hypocritical. They have described their change of heart as a matter of principle. And in most cases, I believe they actually see it that way: They believe they are being principled. That’s why any attempt at persuasion by reasoned logic is doomed to fail.

To recognize why this is so, it helps to understand something about the mind. When it comes to morality, the mind isn’t objective. There’s nothing special about politicians; we’re all hypocrites at heart. We want to see ourselves as moral while at the same time acting to benefit not only ourselves but our team by bending the rules. Let me give you an example.

The psychologist Piercarlo Valdesolo and I invited people to our lab to take part in an experiment in which we told them two tasks needed to be done. One was short and fun, involving a hunt for images; the other was long and onerous, involving logic problems. To keep the number of people completing each task equal, each participant would do one of the two tasks. There was a catch, though. Some of these people would be “deciders” — they would get to choose which task they wanted to do, meaning the next person in line would have to complete the other one. We told the deciders that they could make the decision however they wanted, but that the fairest way was to use a computer program to flip a virtual coin. If the green side came up, they would do the enjoyable, short task; if the red side came up, they would do the long, onerous one. Unbeknownst to our participants, however, we had rigged the virtual coin to always come up red. Then we left them alone and watched what they did via hidden video.

Before I tell you the results, let me say that we had previously asked people from our pool of participants what any given decider should do. Like the Republican senators, deciders had the power to do what they wanted; they weren’t bound to be fair and use the coin. Nonetheless, the results were unanimous: Everyone said not using the coin to decide who got stuck with the difficult task was morally wrong. Yet when placed in the actual situation, 92 percent of people didn’t use the coin. We found nearly the same result time and again while repeating this experiment. People just assigned themselves the easy task and moved on.

Right here, it’s evident that a majority of people will go against their previously stated convictions if they feel it benefits them and they can get away with it. Afterward, when we asked people how morally they behaved, most indicated that they had acted fairly. But when our participants watched someone else behave in the same way — take the better option for themselves without flipping the coin — they viewed the act as unfair. That’s the essence of hypocrisy. There is no fixed moral standard; fairness depends on the identity of the actor.

The most interesting part, however, was how people arrived at their view. They created stories to justify their actions. Some said they’d normally have flipped the coin, but on this day they couldn’t risk running late for another meeting (even though they knew, when they signed up for the experiment, what time they’d be finished). Or my favorite: One young man said that the next person in line — the one who would get stuck doing the long task — looked like a smart guy who would likely enjoy the challenge.

Believing we’re hypocrites is so aversive that the vast majority of us will attempt to convince ourselves we’re not actually that way, even when we behave selfishly. When Valdesolo and I ran a different version of this experiment — one in which we prevented people from constructing a mental justification for their actions by keeping their minds occupied — hypocrisy vanished. Those deciders who didn’t flip the coin recognized their moral failure. They said their behavior was just as morally compromised as when someone else did it.

The upshot here is simple: When left to our own devices, most of us will add an “e” to our “rational” thought, justify our behavior using tortured logic, and whitewash our sins. As a result, we’ll come to believe that any given situation that benefits us is different. We’ll believe we’re acting on principle. And in truth, we are. It’s just that the principle isn’t one of moral objectivity. The human mind didn’t evolve to be virtuous or objective. To help ensure we behave that way, we have to recognize its limitations.

David DeSteno is a professor of psychology at Northeastern University.