During infectious disease outbreaks, misinformation can spread online as rapidly as a virus, fueling fear, mistrust and prejudice. But reminding people that misinformation is out there can help them spot the difference between facts and fakery, a recent study finds.
In the study, published in September in Open Information Science, researchers found that when people were informed that deliberate misinformation on science exists, and that it sometimes resembles news, they were more likely to detect false information online.
“It is important to occasionally remind news consumers to use their critical thinking skills,” study researcher Bartosz Wojdynski, PhD, MA, said in a news release. “Since many articles are found through social media, this study shows the impact that even an automatic disclaimer on Facebook could have in reminding people to use their best judgment when they look at webpages.”
In the study, University of Georgia researchers gave two focus groups four webpages with articles to read. Two were evidence-based news articles and two were fiction. One group heard a statement informing them that false news on science-related topics reached many people, while the other did not. After reading, the participants were given a survey about the articles.
The forewarned participants were three times more likely to detect at least one false story and also tended to investigate the source material more thoroughly. Using an eye tracker, the researchers were able to see how readers examined not just the article's text, but the webpage surrounding it. Participants looked at page elements, such as headline links to other articles published on the same site, to help shape their opinion of the source's trustworthiness.
“This is important because a lot of literature talks about people looking at the URL and the source information, but this research shows that subjects are also trying to get a sense of who the publisher is by looking at what kinds of other stories they publish through external links,” said Wojdynski, a new media professor at the university’s Grady College of Journalism and Mass Communication.
Some social media platforms have been working to remind users about the existence of false information — and in some cases steering them away from it. In 2019, Instagram began blocking hashtags that led to misinformation on vaccines. And when Twitter users searched for COVID-19 recently, a message at the top of the results page directed people toward their country’s government health agency for official information.
Trusted health agencies are also directly addressing false information. Early in the COVID-19 outbreak, the World Health Organization created a “myth busters” page to fight misinformation on the disease, addressing issues such as spread, safety, vaccines and treatments. The organization created graphics to debunk the myths and shared them online.
WHO has also worked to counter the spread of harmful rumors online and in the news by organizing workshops with the media. Some of those workshops were held in Nigeria, where WHO had documented false reports of COVID-19 cases and claims that the disease could be treated with illegal drugs.
“Journalists and media are critical to getting the right messages to the community,” Dhamari Naidoo, a WHO emergency officer who conducted a training with more than 40 journalists in Abuja, Nigeria, in February, said in a news release.
For more information on the University of Georgia study, visit https://www.degruyter.com/.
- Copyright The Nation’s Health, American Public Health Association