The most interesting conversation I had with a fishmonger was about the mRNA vaccine. As he was ringing up my usual order of salmon, we chatted about COVID-19 and the different vaccines rolling out to the public that spring of 2021.
“I’ve heard a lot of negative things. I’m not sure how I feel about getting a vaccine at all,” he said, blue eyes appearing wary.
We talked about his concerns, what he heard, and from what sources — mostly personal anecdotes. I spent most of the time listening, offering my perspective as a public health expert only at the very end.
In a 2021 Kaiser Family Foundation report, nearly 8 out of 10 adults in the United States believed or were unsure about one or more common falsehoods about COVID-19 or COVID-19 vaccines. This wasn’t without consequence; from burning masks to protesting vaccines to injecting disinfectant, misinformation successfully stymied efforts to curb the spread of COVID-19. A 2020 study estimated that nearly 6,000 people around the world were hospitalized as a result of believing disreputable information about coronavirus cures circulating on social media.
The spread of misleading health information is not new. False rumors that Benjamin Franklin’s son died from inoculation rather than from smallpox itself spread quickly after the boy’s death in 1736. In more recent years, celebrities like Jenny McCarthy have used their large platforms to amplify the discredited claim that the measles vaccine causes autism. Efforts such as these, with no basis in scientific evidence, have catalyzed vaccine hesitancy and anti-vaccination movements nationally and globally, both of which took hold before COVID-19.
This rapid rise of misinformation comes at a cost. In 2019, the United States barely held onto its measles elimination status, and the disease, declared eliminated in this country in 2000, resurfaced here and in four European countries, a reversal fueled largely by the rising wave of people who refuse vaccination. Even now, 15 percent of US adults remain unvaccinated against COVID-19, with 42 percent saying they don’t trust the vaccine.
Communicating up-to-date health recommendations grounded in science is a cornerstone of public health. However, health misinformation, defined as “any health-related claim of fact that is false based on current scientific consensus,” can spread faster and farther than accurate information. While our society is battling our way through one pandemic, we are also grappling with an infodemic that is equally dangerous to our society.
The World Health Organization agrees, cautioning against the spread of false information and launching an infodemic management initiative, with 132 member states pledging to do their part.
But what does it mean to “do one’s part” in the fight against fake health news?
First, it helps to know why health misinformation spreads. At the heart of untruths is mistrust. With increased fracturing across political, religious, and sociocultural lines, people doubt the expertise and intent of others. This extends both interpersonally (e.g., between physicians and patients) and institutionally (e.g., declining trust in scientific and medical communities).
Second, it is important to understand how misinformation spreads. Social media platforms provide ready tools for creating information silos, reducing exposure to content that contradicts the views of an individual’s carefully curated network. We can prune our information feeds by muting, blocking, and unfollowing what we don’t like, and by liking, sharing, and subscribing to what we do, reinforcing the beliefs and decisions we already gravitate toward. And with little to no regulation of information disseminated on social media, people must rely on their own ability to identify accurate information, which varies greatly with digital, information, and media literacy.
Misinformation also spreads via interpersonal conversations with family, friends, colleagues, neighbors, and community members — those with whom we have personal connection and trust.
To combat misinformation, we need to know what is being spread and where. Infodemics occur across platforms and topics, with Twitter feeds containing the most dubious health information online. A systematic review of 69 studies identified six principal categories of health misinformation on social media: vaccines (32 percent), drugs or smoking (22 percent), noncommunicable diseases (19 percent), pandemics (10 percent), eating disorders (9 percent), and medical treatments (7 percent).
Knowing this, how can those of us in public health, medicine, and science treat infodemics?
▪ Understand values. We cannot expect to move opinion and change behavior without knowing what matters to people, nor can we simply expect unearned trust in return. This means doing the work of understanding the values of each audience and tailoring messaging to be consistent with its worldview; engaging in discussion and discourse without judgment; and recognizing that “expertise” takes multiple shapes and forms.
▪ Nudge for accuracy. In an information-saturated environment, a study published in Nature recommends offering an accuracy nudge to “prime people to think about the accuracy of the information that they see … [and] reduce the spread of misinformation.” Doing so encourages others to reflect, even briefly, on the validity and reliability of the information they encounter from different sources.
▪ Engage with the public. Increase accurate health information accessibility by using narratives and plain language that reach people where they are, prioritizing topics where false information is most frequently discussed. Op-eds, video explainers, testimonials, broadcast interviews, explanatory visuals, interactive tools, and social media engagement are strategies that bring science to the people.
▪ Cultivate and maintain trust. Building trust is crucial for successfully communicating accurate information and countering misinformation. Strategies include partnering with trusted community leaders and sources, inviting dialogue, and addressing miscommunication upfront with transparency. Corrective messages or updates to previously disseminated content must negate the misinformation as well as provide an explanation for the correction to minimize public confusion.
▪ Recognize the power of everyday conversations. In an increasingly online world, we forget that misinformation also spreads offline — through our everyday interactions with family, friends, neighbors, and colleagues — which can then be reflected and amplified in the digital space. This means that there are ample opportunities to intervene and block the spread of misinformation with the people in our networks.
The next time I saw the fishmonger, the corners of his blue eyes crinkled into a smile wide enough to detect behind his mask. “I got my vaccine. Thanks.”
Monica L. Wang is chair of the Narrative Office at the Boston University Center for Antiracist Research, associate professor of community health sciences at the Boston University School of Public Health, and an adjunct associate professor at the Harvard T.H. Chan School of Public Health.