Can you spot fraud when you see it? In 2011 a prominent Dutch social psychologist named Diederik Stapel confessed to having made up or manipulated data in nearly half of the papers he’d ever written. Last month the research journal PLOS One published an examination of Stapel’s work by David Markowitz and Jeffrey Hancock, who study communication at Cornell University. Markowitz and Hancock analyzed 49 of Stapel’s papers—24 fraudulent ones and 25 genuine ones—looking for ways in which his writing style differed when he was lying.
They found that he had two key tells. First, his fake papers were filled with words related to research methods—“pattern,” “procedure,” “feedback.” Markowitz and Hancock describe this as “the overproduction of scientific discourse,” and speculate that Stapel may have been compensating for the fact that he never actually carried out many of the experiments he described.
Second, he was more likely to exaggerate the significance of his fake results, using terms like “profoundly” and “extreme,” while eschewing words like “partly” or “slightly.” The authors note that a similar tendency to exaggerate has been found when people lie in other domains, like hotel reviews and online dating profiles.
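The comparison Markowitz and Hancock describe boils down to counting how often words from a few categories appear in each paper. Here is a minimal sketch of that idea in Python; the word lists are invented illustrations, not the actual dictionaries used in the study.

```python
# Hypothetical word categories, loosely inspired by the ones described above.
# These short lists are examples only, not the study's real dictionaries.
METHOD_TERMS = {"pattern", "procedure", "feedback", "manipulation"}
AMPLIFIERS = {"profoundly", "extreme", "extremely", "substantially"}
HEDGES = {"partly", "slightly", "somewhat", "perhaps"}


def category_rates(text):
    """Return the rate (per 1,000 words) of each word category in `text`."""
    # Crude tokenization: split on whitespace, strip punctuation, lowercase.
    words = [w.strip(".,;:!?()\"'").lower() for w in text.split()]
    n = len(words) or 1  # avoid division by zero on empty input

    def rate(category):
        return 1000 * sum(w in category for w in words) / n

    return {
        "method": rate(METHOD_TERMS),
        "amplifier": rate(AMPLIFIERS),
        "hedge": rate(HEDGES),
    }


sample = "The procedure revealed an extreme pattern; feedback was profoundly clear."
print(category_rates(sample))
```

Running `category_rates` over a fraudulent paper and a genuine one would, on this account, tend to show higher method and amplifier rates and a lower hedge rate in the fake.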
The authors stress, however, that they haven’t created a secret weapon for detecting lies. When they ran tests to validate their results, they found that about one-third of the time, their method misidentified Stapel’s real and fake studies. It’s the kind of measured acknowledgement you’re unlikely to find in Stapel’s fake papers—and perhaps a pointer for the next generation of academic fraudsters: Qualify your results, and you’re more likely to get away with it!
Kevin Hartnett is a writer in South Carolina. He can be reached at firstname.lastname@example.org.