When the Oxford English Dictionary recently declared “post-truth” 2016’s international word of the year, it was seen by many commentators as a sign that emotions are winning out over facts, and that people might no longer be holding public figures accountable for actions based on false or shoddy evidence.
But a new study in the journal PLOS ONE suggests that at least some people see rational behavior as very much a moral issue. While moral attitudes are often seen as rooted in religious beliefs or ethical codes, the study finds that some people judge others as moral or immoral based on how much they ground their decisions in reason and evidence. The researchers hope that by measuring and studying this phenomenon, they can better understand, and perhaps even cultivate, this quality.
In the study, researchers led by Tomas Ståhl, a visiting assistant professor of psychology at the University of Illinois at Chicago, developed a scale for measuring how much people moralize rationality. “We’re not measuring how rational people are,” he says. “We’re measuring to what extent they believe that being rational is a moral value.” A moral value isn’t just your personal belief but one you think everyone should follow.
Although lots of beliefs could be considered “rational” (you might argue that it’s rational, for instance, to believe in astrology if doing so makes you feel happy), the study explicitly focused on what’s called “epistemic rationality” — how accurately you match your beliefs to reality and evidence.
In a series of studies, the researchers developed a set of questions to measure attitudes about rationality, testing them on several hundred respondents recruited online. They identified questions that best captured a tendency to moralize about rational behavior, which they found was distinct from attaching a personal importance to making rational choices.
Among people who scored high, further studies found, decisions seemed more moral if they were based in rational thinking. For instance, these respondents perceived someone who sees a doctor when sick as more moral than someone who chooses a homeopathic remedy, and a person who makes an investment based on astrology as less moral than someone who bases the decision on a rational analysis.
In a separate survey of more than 300 university students, faculty, and staff, the researchers found that people who moralized rationality expressed more willingness to contribute to a charity working to prevent the spread of irrational beliefs, showing that this value could motivate positive actions, not just judgments of others.
Ståhl says that in the respondents tested, high scores in moralizing rationality were more common in nonreligious people than religious people, but didn’t correlate with political affiliation or education.
This scale is a first step toward understanding how a moral sense of rationality develops, what influences it, and whether it's possible to persuade people to value rationality more. For instance, Ståhl says, encouraging people to adopt rationality as a moral value might be more effective than attacking false beliefs directly. But first he wants to understand whether people who hold this value actually behave rationally, and whether they update their own beliefs when evidence contradicts them. "Are they less biased," he says, "or are they just as biased as everyone else?"
Courtney Humphries is a freelance writer in Boston.