If You Interacted With A Coronavirus Hoax On Facebook, You'll Soon Get An Alert

Social media companies have struggled to combat misinformation about the pandemic so far.



Over the next few weeks, Facebook will alert people who have liked, shared, reacted to, or commented on harmful misinformation about the coronavirus on its platform, the company announced on Thursday.

Facebook users who have interacted with misinformation that the company has removed from its platform will receive messages in their News Feed that will direct them to COVID-19 myths debunked by the World Health Organization and other credible information.

“We want to connect people who may have interacted with harmful misinformation about the virus with the truth from authoritative sources in case they see or hear these claims again off of Facebook,” wrote Guy Rosen, the company’s vice president for integrity, in a blog post.

Facebook, however, will not tell users what piece of misinformation they interacted with or what was wrong with it.

Founder and CEO Mark Zuckerberg said the company had removed hundreds of thousands of pieces of content related to the virus that fact-checkers had declared misinformation, including myths and hoaxes that said drinking bleach could help prevent COVID-19 or that social distancing was ineffective.

"If a piece of content contains harmful misinformation that could lead to imminent physical harm, then we'll take it down," he wrote in a post.

Facebook has also added a new section of fact-checked articles from the company’s fact-checking partners, which is currently available to people in the US.

In addition, the company released new data in its blog post about its efforts to combat coronavirus misinformation. It said it displayed warnings on 40 million COVID-19-related posts in March, based on 4,000 articles from its fact-checking partners. “When people saw those warning labels, 95% of the time they did not go on to view the original content,” Rosen wrote.

Social media companies have struggled to combat misinformation about the coronavirus ever since the outbreak began. Hoaxes and myths promoting fake cures for the coronavirus and blaming religious minorities for spreading the disease have thrived on platforms like Facebook, Twitter, and YouTube for the last few months.

Earlier this month, WhatsApp, the popular instant messaging app owned by Facebook, announced that it was imposing stricter limits on forwarded messages, a popular way for coronavirus misinformation to spread among its 2 billion users.
