“We want to help physicians communicate and meet patients where they are. But we also have to look inward to see how we’ve contributed to this distrust.”
— Skyler Johnson
Skyler Johnson’s interest in combating health misinformation started a decade ago after his wife learned she had cancer. Immediately after getting the diagnosis, the couple went online looking for information and quickly found themselves wading through a digital sea of falsehoods, distortions and half-truths.
“You’re already dealing with so much uncertainty in those first few weeks (of a cancer diagnosis),” said Johnson, MD, who at the time was a second-year medical student and is now an assistant professor at the University of Utah Huntsman Cancer Institute. “It puts you in a really vulnerable position.”
Thankfully, his wife recovered. Because of the experience, Johnson said he developed a better understanding of how to help his own patients make sense of online information. He also saw misinformation's deadlier impacts, caring for patients who decided against evidence-based treatment and returned many months later with cancers that had progressed to incurable stages.
Johnson dug into the issue more, co-authoring peer-reviewed studies in 2017 and 2018 that found using alternative medicine instead of conventional cancer treatment was tied to higher risks of death. That led him and his colleagues to a main source of misinformation: social media. In a study published last year in the Journal of the National Cancer Institute, they reviewed 50 of the most popular social media articles on each of the four most common cancers. Nearly a third contained harmful information. Engagement with articles containing misinformation was also significantly higher than with factual ones.
Combating misinformation should be “a major public health priority, both in terms of research and funding,” Johnson said.
The problem of health misinformation has long existed, but the past two years have made its urgency grimly clear, especially as hospitals filled with unvaccinated COVID-19 patients in January. Last year, one of the country’s top health officials, U.S. Surgeon General Vivek Murthy, MD, MBA, issued an advisory on the issue, describing health misinformation as a “serious threat to public health” and calling on stakeholders, including technology platforms, to step up.
“There’s a lot of types of misinformation out there, but where I think health misinformation is unique is in its dire consequences,” Briony Swire-Thompson, PhD, director of the Psychology of Misinformation Lab at Northeastern University in Boston, told The Nation’s Health. “Political misinformation is everywhere, too, but it doesn’t necessarily lead to increased mortality.”
Misinformation is ubiquitous, but solid evidence on how to effectively counter it is still sparse, Swire-Thompson said. Some promising lessons have emerged, however. For example, she said trustworthiness often trumps expertise when it comes to picking an effective messenger. Another lesson: fact-checking can make a difference, but messengers should offer a factual alternative to replace the debunked information.
“You say ‘This is false, but this is true and this is why,’” she said.
Ideally, the accurate information should also be as easy to understand as the misinformation, which is more challenging than it seems.
“It’s tough, because misinformation is already really simple because it’s fabricated,” Swire-Thompson said. “But if information is removed, you need to add something back in. People are more comfortable with a complete but inaccurate model, rather than one that’s incomplete.”
Health advocates step up to combat trend
A number of public health groups are partnering to help health workers debunk health misinformation. One example is the Public Health Communications Collaborative, which provides guidance, monitors media for circulating misinformation and issues alerts.
Brian Castrucci, DrPH, MA, president and CEO of the de Beaumont Foundation, one of the collaborative’s founders, said knowing how to counter misinformation has quickly become a critical public health skill. He cited a recent de Beaumont-commissioned poll that found people who depend on social media as a primary source of information were 16% less likely to report receiving at least one COVID-19 vaccine dose.
At the local level, continuing to build trust and credibility is one of the best antidotes to misinformation, he said. At the same time, a national strategy is needed to deal with a problem that is so large and entrenched.
“This isn’t some guy yelling at the end of the bar — it’s a well-organized, well-funded assault on the truth,” said Castrucci, an APHA member. “No one really planned for this — no one had tabletop exercises about misinformation and social media. But it should totally change how we do pandemic preparedness going forward.”
Another strategy is to go after the sources. In the fall, de Beaumont and a new grassroots coalition, No License for Disinformation, issued a report calling on state medical boards to discipline physicians who spread COVID-19 falsehoods with intent to deceive, a category of misinformation known as disinformation. Nick Sawyer, MD, MBA, director of the coalition and an emergency room doctor in Sacramento, California, said most of the unvaccinated COVID-19 patients he sees are simply confused and not staunchly against vaccination.
“But then again, that’s the purpose of these misinformation campaigns — confusion,” he said.
Sawyer said he is especially frustrated with fellow doctors who spread COVID-19 lies, such as debunked myths about the vaccines causing infertility or magnetism. Last summer, after the Federation of State Medical Boards confirmed that such behavior could put a physician’s medical license at risk, Sawyer and some of his colleagues filed complaints with the California board, which declined to act. The experience led Sawyer to help launch No License for Disinformation to put pressure on medical boards to take the problem seriously.
According to the federation, in 2021, 67% of state medical boards reported rising complaints about doctors spreading misinformation.
“This is so incredibly urgent because disinformation groups are multiplying, and the networks behind them are massive and extraordinarily well-funded,” Sawyer told The Nation’s Health.
Imran Ahmed, MA, CEO of the Center for Countering Digital Hate, said while public health professionals need to understand the psychology of misinformation, there is far too little pressure on the platforms that facilitate its spread.
Right now, Ahmed said, social media companies financially benefit from misinformation and any private enforcement against spreaders is sporadic at best. He said new laws could start to make a difference, such as a bill Sens. Amy Klobuchar, D-Minn., and Ben Ray Luján, D-N.M., introduced last year that would take away liability shields from platforms that promote misinformation during public health emergencies.
“We can’t be so bloody credulous in dealing with social media companies, who’ve played a really disgraceful game with the lives of Americans and those all around the world,” Ahmed told The Nation’s Health. “Social media is the new battlefield.”
Back in Utah, Johnson is working on new tools to help his cancer patients sift through the confusion at home. One of them is a computer plug-in that will flag articles with harmful cancer misinformation. Ideally, it could be applied to all kinds of health misinformation.
“We want to help physicians communicate and meet patients where they are,” he told The Nation’s Health. “But we also have to look inward to see how we’ve contributed to this distrust.”
For more resources, including a guide for managing vaccine misinformation, visit www.publichealthcollaborative.org and www.nolicensefordisinformation.org.
Personal approach can help people combat health misinformation
In November, U.S. Surgeon General Vivek Murthy, MD, MBA, released “A Community Toolkit for Addressing Health Misinformation.” A follow-up to his July advisory on the health risks of misinformation, the toolkit shares practical tips that anyone can use to fight the problem, from health professionals and educators to faith leaders and community members.
The resource helps people identify different types of misinformation and provides ways they can spot it. It also explains why some people are tempted to share information that is not accurate, such as a need to feel connected to others or to try to make sense of situations.
“When we are unsure or frightened, we often seek and share information that can provide explanations — without checking where or who it came from,” the toolkit said.
The toolkit advises that when having discussions about misinformation, people should:
Listen: The best way to change someone’s mind about misinformation is to listen to their fears. Avoid minimizing concerns or telling them they are wrong.
Empathize: Ask questions to understand where people are coming from. Avoid implying that you never fall for false or misleading information.
Point to credible sources: Emphasize the need to find sources who are not in a position to personally profit or to gain power. Do not assume they should know where to go for accurate information.
Do not publicly shame: Have one-on-one conversations when possible and use a caring tone. Do not make fun of concerns.
Use inclusive language: Use phrases such as “I understand” or “It is so hard to know who to trust.” Do not share information that makes fun of people who are vaccine hesitant.
“Health misinformation is spreading fast and far online and throughout our communities,” Murthy said in a news release. “The good news is that we all have the power to help stop the spread of health misinformation during this pandemic and beyond.”
To download and share the toolkit, visit www.surgeongeneral.gov.
— Michele Late
Copyright The Nation’s Health, American Public Health Association