Facebook Is Taking Down Posts That Cause Imminent Harm — But Not Posts That Cause Inevitable Harm

The social media platform is siding with scientists to stop the spread of harmful misinformation about the pandemic. If it can do it now, why wasn't it doing it all along?

In January 2020, Facebook executives began paying attention to a new coronavirus spreading across the globe. The virus was deadly and highly contagious, and when the World Health Organization shared guidance about it, the company’s leadership dropped their typically hands-off approach to misinformation.

“We decided we would remove content that directly contradicted [the WHO] and could contribute to risk of imminent physical harm,” Monika Bickert, Facebook’s head of global policy management, told BuzzFeed News.

Say “social distancing doesn’t work” or “wearing a mask can make you sick” on Facebook, and the company will direct its moderators to remove your post. A move of this scale is unprecedented for a company that has long sought to keep itself out of battles over the truth. Facebook has already removed hundreds of thousands of posts that violate the policy, according to Bickert, and reduced the distribution of tens of millions of others.

"It feels like a collision is inevitable."

Though Facebook has removed some health-related misinformation in the past, its decision to side with science, even if enforced spottily, has pushed the company into new territory. Removing coronavirus misinformation could make its largely hands-off approach to other harmful misinformation — such as climate denialism and anti-vaccine advocacy — harder to defend. It also places Facebook squarely in the middle of a political battle in the US, where opposition to social distancing and masks, though not widely shared, breaks along party lines.

“I don't know how they're going to reconcile being an open marketplace of political ideas, while at the same time not cracking down on certain political voices that are taking fringe anti-science beliefs and trying to bring them into the mainstream,” Andy Carvin, a senior fellow at the Atlantic Council's Digital Forensic Research Lab, which takes funding from Facebook, told BuzzFeed News. “It feels like a collision is inevitable.”

Bickert acknowledged her company’s tough position. “None of this is easy,” she said. “There are always difficult factors to balance in writing and applying content rules.”

Is Saving Lives Time-Bound?

Although Facebook has removed misinformation about measles in Samoa and polio in Pakistan, a deletion campaign of this scale is unprecedented, and as the pandemic continues, it may be the first of many.

Anti-vaxxers are spreading misinformation in anticipation of a coronavirus vaccine, should one be developed — with some declaring that it would be made mandatory — and their Facebook pages and Instagram profiles are booming. Facebook won’t take down these inaccurate posts, even if they could cause deaths in the future, because the company’s standard for removal requires “imminent harm.” In the meantime, anti-vax pages and profiles are adding hundreds of thousands of engagements each month.

A former Facebook policy employee told BuzzFeed News that the company’s aggressive enforcement of WHO rules didn't square with its hands-off stance on anti-vaccine misinformation. “Where I think the platforms are gonna be in a real shit position,” the ex-employee said, “is when we're closer to a vaccine for coronavirus, all of the anti-vax groups that have taken the last month, and will continue to take the next few months, to build their followings, get donations, make sure that they're right in front of people — they’re going to convince folks not to take the vaccine.”

Bickert defended applying the policy to the coronavirus but not vaccines by saying, "There's something about speech where the risk is immediate, where there is not necessarily going to be time for debate, that makes that speech especially important for us to address."

That’s not to say there isn’t lively ongoing debate about vaccine misinformation policy inside Facebook. Lower-level employees debate policy issues, the ex-employee said, but decisions on politically explosive topics like anti-vax content are made at the top.

“The anti-vaxxer lobby is very well connected, very loud, so it's a small group of people who are running things, but they know how to use the internet,” the ex-Facebook policy employee said. “[Facebook leadership] just doesn't want the headache. I also think there are people who are somewhat sympathetic to parents saying that they want to make choices for their children.”

Though Facebook added some information about vaccines from the Centers for Disease Control and Prevention after political pressure during a 2019 measles outbreak, it has largely allowed anti-vaccine misinformation to stay up and groups promoting it to remain active. Facebook also banned ads containing vaccine misinformation, a ban it hasn’t enforced very effectively.

Asked if Facebook would take a more restrictive approach to COVID vaccine misinformation, Bickert allowed for the possibility, but not yet. “The situation is obviously dynamic,” she said. “We're going to have to wait and see how it develops and what the state of treatment is, and when there are treatments what the risks are associated with those treatments and so forth, it's complicated territory.”

That position may inspire scrutiny from critics who would prefer the company to act more consistently — and aggressively. “They should follow through with what they say they're going to do,” Judd Legum, the author of the Popular Information newsletter, told BuzzFeed News. “If there's a page with hundreds of thousands of people following it, and the whole purpose of the page is to talk about how it's a government conspiracy, and hospitals are intentionally killing patients to drum up support for a mandatory vaccine, and this drug has a 99% effective rate against coronaviruses, they should probably be more aggressive about taking that type of content down.”

And there are forms of misinformation that could be even more damaging than content about the coronavirus or vaccines. “Climate change is actually an even bigger emergency than COVID, though society hasn't yet fully appreciated this — partly due to climate misinformation,” climate scientist Peter Kalmus told BuzzFeed News. He worried that the new policy was too narrowly focused on immediate harm.

"Facebook should apply similar standards to climate misinformation as to COVID."

“Climate breakdown threatens our food, water, infrastructure, health, economic, and geopolitical systems, has the potential to cause death and suffering on a massive scale, and will likely be effectively permanent,” Kalmus said. “So in my opinion, Facebook should apply similar standards to climate misinformation as it does to COVID misinformation.”

Facebook doesn’t appear ready to take action on climate crisis denial — drawing what some outside observers see as an indefensible distinction between the imminent and the merely inevitable. Asked if the company would apply the same standard it did to the coronavirus to areas like climate change, Bickert demurred, saying, “All of our policies evolve over time and we're learning a lot through the way we're seeing people respond to what we're doing.”

Fighting the Science

When Facebook sided with the WHO’s guidelines for the coronavirus — and hence, science — it was fairly uncontroversial politically. Most US states had stay-at-home orders in place, and social distancing was the norm. But it didn’t take long for a political fissure to form between those who subscribe to the WHO’s guidelines and those who don’t — putting Facebook between them.

In a CNBC and Change Research survey conducted last month, Republicans in battleground states outpaced Democrats in the belief that returning to daycares, bars, sporting events, and hair salons was safe. And now sustained protests against WHO-informed government policies are in full swing, sparked by right-leaning politicians, anti-vax groups, and ordinary people with social media accounts.

Many of these protests have event pages on Facebook. The company has removed some that defied their states’ stay-at-home orders, a decision some lawmakers have disagreed with. “You have to hear both sides of the debate,” Danny McCormick, a Republican Louisiana state representative, told BuzzFeed News. “If you just censor one side of the debate because you think the other is the side you agree with, you're not increasing education, you're stymieing education.”

Rep. Ro Khanna, a Democrat who represents parts of Silicon Valley, said this is an area where the government should step in. “It should be Congress and regulators that create clear guidelines for what speech is allowed and shouldn't be removed, and what speech should be removed,” he said. Facebook CEO Mark Zuckerberg has argued for government guidelines on speech in the past, but the line between cracking down on hoaxes and lies and censoring political speech is difficult to navigate.

When making policy decisions on things like the coronavirus, Facebook looks mostly to immunologists, doctors, and the medical establishment, another ex-Facebook employee told BuzzFeed News. “Facebook would be looking for — what is the medical consensus, not what is the political consensus,” he said.

In this case, that meant conflict.

And it could get worse for Facebook as President Donald Trump takes actions that oppose the health establishment’s guidance. Earlier this week, Trump said he was taking hydroxychloroquine, an antimalarial drug that the Food and Drug Administration had previously warned against taking outside of a hospital setting. (A study released Friday found the drug was tied to an increased risk of death among patients taking it.)

Asked about the possibility of removing something Trump said, Bickert didn’t rule it out. “We have removed content from high-profile individuals, including the president of Brazil and the president of Madagascar, where statements that they've made have contradicted health guidance,” she said. “Nobody is exempted from this policy.”

“Public health is squishy by definition,” Kenneth Bernard, an epidemiologist who has served in the WHO and set up the National Security Council’s health security office, told BuzzFeed News. “We don't have enough information and we're dealing with biological organisms that have a variety of responses. Different opinions can exist.”

Still, Bernard said Facebook is doing what’s necessary, even if it’s messy. “I think that it's irresponsible to allow information under 'free speech' to go out if it's actually going to directly cause harm,” he said. “This is not an easy problem. I think Facebook is trying. They don't have a lot to go on.”

