As COVID-19 continues to spread, so does misinformation
Doctors are exasperated by the persistence of false and misleading claims about the coronavirus.
By Tiffany Hsu
As COVID cases, hospitalizations and deaths rise in parts of the country, myths and misleading narratives continue to evolve and spread, exasperating overburdened doctors and evading content moderators.
What began in 2020 as rumors that cast doubt on the existence or seriousness of COVID quickly evolved into often outlandish claims about dangerous technology lurking in masks and supposed miracle cures from unproven drugs such as ivermectin. Last year's vaccine rollout fueled another wave of unfounded alarm. Now, in addition to all the claims still being bandied about, there are conspiracy theories about the long-term effects of the treatments, researchers say.
The ideas still thrive on social media platforms, and the constant barrage, now a yearslong accumulation, has made it increasingly difficult for accurate advice to break through, misinformation researchers say. That leaves people already suffering from pandemic fatigue further inured to COVID's continuing dangers and more susceptible to other harmful medical content.
"It's easy to forget that health misinformation, including about COVID, can still contribute to people not getting vaccinated or creating stigmas," said Megan Marrelli, editorial director of Meedan, a nonprofit focused on digital literacy and information access. "We know for a fact that health misinformation contributes to the spread of real-world disease."
Twitter is of particular concern for researchers. The company recently gutted the teams responsible for keeping dangerous or inaccurate material in check on the platform, stopped enforcing its COVID misinformation policy and began basing some content moderation decisions on public polls posted by its new owner and chief executive, billionaire Elon Musk.
From Nov. 1 to Dec. 5, Australian researchers collected more than half a million conspiratorial and misleading English-language tweets about COVID, using terms such as "deep state," "hoax" and "bioweapon." The tweets drew more than 1.6 million likes and 580,000 retweets.
The researchers said the volume of toxic material surged late last month with the release of a film that included baseless claims that COVID vaccines set off "the greatest orchestrated die-off in the history of the world."
Naomi Smith, a sociologist at Federation University Australia who helped conduct the research with Timothy Graham, a digital media expert at Queensland University of Technology, said Twitter's misinformation policies helped tamp down anti-vaccination content that had been common on the platform in 2015 and 2016. From January 2020 to September 2022, Twitter suspended more than 11,000 accounts over violations of its COVID misinformation policy.
Now, Smith said, the protective barriers are "falling over in real time, which is both interesting as an academic and absolutely terrifying."
"Pre-COVID, people who believed in medical misinformation were generally just talking to each other, contained within their own little bubble, and you had to go and do a bit of work to find that bubble," she said. "But now, you don't have to do any work to find that information — it is presented in your feed with any other types of information."
Several prominent Twitter accounts that had been suspended for spreading unfounded claims about COVID were reinstated in recent weeks, including those of Rep. Marjorie Taylor Greene, R-Ga., and Robert Malone, a vaccine skeptic.
Musk himself has used Twitter to weigh in on the pandemic, predicting in March 2020 that the United States was likely to have "close to zero new cases" by the end of that April. (More than 100,000 positive tests were reported to the Centers for Disease Control and Prevention in the last week of the month.) This month, he took aim at Dr. Anthony Fauci, who will soon step down as President Joe Biden's top medical adviser and the longtime director of the National Institute of Allergy and Infectious Diseases. Musk said Fauci should be prosecuted.
Twitter did not respond to a request for comment. Other major social platforms, including TikTok and YouTube, said last week that they remained committed to combating COVID misinformation.
YouTube prohibits content — including videos, comments and links — about vaccines and COVID-19 that contradicts recommendations from the local health authorities or the World Health Organization. Facebook's policy on COVID content is more than 4,500 words long. TikTok said it had removed more than 250,000 videos for COVID misinformation and worked with partners such as its content advisory council to develop its policies and enforcement strategies. (Musk disbanded Twitter's advisory council this month.)
But the platforms have struggled to enforce their COVID rules.
NewsGuard, an organization that tracks online misinformation, found this fall that typing "covid vaccine" into TikTok caused it to suggest searches for "covid vaccine injury" and "covid vaccine warning," while the same query on Google led to recommendations for "walk-in covid vaccine" and "types of covid vaccines." One search on TikTok for "mRNA vaccine" brought up five videos containing false claims within the first 10 results, according to researchers. TikTok said in a statement that its community guidelines "make clear that we do not allow harmful misinformation, including medical misinformation, and we will remove it from the platform."
In years past, people would get medical advice from neighbors or try to self-diagnose via a Google search, said Dr. Anish Agarwal, an emergency physician in Philadelphia. Now, years into the pandemic, he still sees patients who believe "crazy" claims from social media, such as the idea that COVID vaccines will insert robots into their arms.
"We battle that every single day," said Agarwal, who teaches at the University of Pennsylvania's Perelman School of Medicine and serves as deputy director of Penn Medicine's Center for Digital Health.
Online and offline discussions of the coronavirus are constantly shifting, with patients bringing him questions lately about booster shots and long COVID, Agarwal said. He has a grant from the National Institutes of Health to study the COVID-related social media habits of different populations.
"Moving forward, understanding our behaviors and thoughts around COVID will probably also shine light on how individuals interact with other health information on social media, how we can actually use social media to combat misinformation," he said.
Years of lies and rumors about COVID have had a contagion effect, damaging public acceptance of all vaccines, said Heidi J. Larson, director of the Vaccine Confidence Project at the London School of Hygiene & Tropical Medicine.
"The COVID rumors are not going to go away — they're going to get repurposed, and they're going to adapt," she said. "We can't delete this. No one company can fix this."
Some efforts to slow the spread of misinformation about the virus have bumped up against First Amendment concerns.
A law that California passed several months ago, and that is set to take effect next month, would punish doctors for spreading false information about COVID vaccines. It already faces legal challenges from plaintiffs who describe the regulation as an unconstitutional infringement of free speech. Tech companies including Meta, Google and Twitter have faced lawsuits this year from people who were barred over COVID misinformation and claim that the companies overreached in their content moderation efforts, while other suits have accused the platforms of not doing enough to rein in misleading narratives about the pandemic.
Dr. Graham Walker, an emergency physician in San Francisco, said the rumors spreading online about the pandemic drove him and many of his colleagues to social media to try to correct inaccuracies. He has posted several Twitter threads with more than a hundred evidence-packed tweets trying to debunk misinformation about the coronavirus.
But this year, he said he felt increasingly defeated by the onslaught of toxic content about a variety of medical issues. He left Twitter after the company abandoned its COVID misinformation policy.
"I began to think that this was not a winning battle," he said. "It doesn't feel like a fair fight."
Now, Walker said, he is watching as a "tripledemic" of COVID-19, RSV and influenza bombards the health care system, causing emergency room waits in some hospitals to surge from less than an hour to six hours. Misinformation about easily available treatments is at least partly responsible, he said.
"If we had a larger uptick in vaccinations with the most recent vaccines, we probably would have a smaller number of people getting extremely ill with COVID, and that's certainly going to make a dent in hospitalization numbers," he said. "Honestly, at this point, we will take any dent we can get."
about the writer
Tiffany Hsu