Facebook Will Ban White Nationalist And White Supremacist Content

A Facebook spokesperson said that Wednesday's new policies do not cover Holocaust denial, which will still be allowed on the platform to some extent.

Facebook will begin banning white nationalist, white separatist, and white supremacist content, and will direct users who attempt to post such content to the website of the nonprofit Life After Hate, which works to de-radicalize people drawn into hate groups.

The change, first reported Wednesday by Vice’s Motherboard, comes less than two weeks after Facebook was heavily criticized for its role in the Christchurch mosque attack. The gunman went live on the platform for several minutes before the attack began, showing off his guns and at one point ironically saying, “Remember lads, subscribe to PewDiePie,” referring to the Swedish YouTuber connected to a number of racist and anti-Semitic controversies.

BuzzFeed News has reached out to Facebook for more information on how the ban will work. In a blog post titled “Standing Against Hate,” published Wednesday, the company said the ban takes effect next week. As of midday Wednesday, the ban did not yet appear to be live, based on BuzzFeed News searches for terms like “white nationalist,” “white nationalist groups,” and “blood and soil.”

“It’s clear that these concepts are deeply linked to organized hate groups and have no place on our services,” the blog post reads. “Over the past three months our conversations with members of civil society and academics who are experts in race relations around the world have confirmed that white nationalism and separatism cannot be meaningfully separated from white supremacy and organized hate groups.”

Earlier this week, a French Muslim advocacy group filed a lawsuit against Facebook and YouTube for not removing footage of the attack quickly enough.

Facebook did not respond to an inquiry from BuzzFeed News last week about whether white nationalism and neo-Nazism were being moderated using the same image-matching and language-understanding technology it uses to police ISIS-related content. According to internal training documents leaked last year, Facebook has typically not considered white nationalism intrinsically linked to racism.

According to Motherboard’s report, the platform will use content-matching to delete images previously flagged as hate speech. There was no further elaboration on how that would work, including whether URLs to websites like the Daily Stormer would be affected by the ban.
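Neither Motherboard’s report nor Facebook has described the matching technique. As a rough illustration only, hash-based content matching in general often works by computing a compact fingerprint of each flagged image and comparing new uploads against that set; the sketch below uses a simple perceptual “average hash” in Python. The file names, distance threshold, and hash scheme here are assumptions for the example, not details of Facebook’s actual system.

```python
# Illustrative sketch of hash-based image matching (not Facebook's
# system, whose details have not been disclosed).
from PIL import Image


def average_hash(path: str, size: int = 8) -> int:
    """Downscale to a grayscale thumbnail and build a bitmask of
    pixels brighter than the thumbnail's mean brightness."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for px in pixels:
        bits = (bits << 1) | (px > mean)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Count differing bits between two hashes."""
    return bin(a ^ b).count("1")


# Hypothetical database of hashes of previously flagged images.
flagged_hashes = {average_hash("flagged_example.jpg")}


def is_flagged(path: str, threshold: int = 5) -> bool:
    """Treat an upload as a match if its hash is within a small
    Hamming distance of any previously flagged hash."""
    h = average_hash(path)
    return any(hamming_distance(h, f) <= threshold for f in flagged_hashes)
```

Unlike an exact file hash, a perceptual hash tolerates small edits such as re-compression or resizing; production content-matching systems rely on far more robust fingerprinting than this toy example.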

A Facebook spokesperson also said that Wednesday's policy update will not change how the company takes action against the topic of Holocaust denial, a subject that is often tied to white nationalists and white supremacists. Last July, CEO Mark Zuckerberg defended the right of Holocaust deniers to share their views on Facebook in an interview with technology site Recode, sparking widespread backlash and an examination of Facebook's rules.

"I’m Jewish, and there’s a set of people who deny that the Holocaust happened," Zuckerberg said at the time. "I find that deeply offensive. But at the end of the day, I don’t believe that our platform should take that down because I think there are things that different people get wrong. I don’t think that they’re intentionally getting it wrong."

A Facebook spokesperson said that while Holocaust denial is allowed to some extent on the platform, any content that violates the company's hate speech policies or attacks people based on their religion or national origin will not be tolerated. According to one senior source, Facebook will continue to view Holocaust denial as misinformation, which is allowed on the platform, and not outright hate speech. That source also noted more policy changes may be coming in the future as the company gets more feedback from the community and civil rights experts.

The progressive civil rights advocacy nonprofit Color Of Change called Facebook’s new moderation policy a critical step forward in a statement.

“Color Of Change alerted Facebook years ago to the growing dangers of white nationalists on its platform, and today, we are glad to see the company’s leadership take this critical step forward in updating its policy on white nationalism,” the statement reads. “We look forward to continuing our work with Facebook to ensure that the platform’s content moderation guidelines and trainings properly support the updated policy and are informed by civil rights and racial justice organizations.”

In another change to Facebook’s moderation policy following public outcry, the platform announced last month that anti-vax misinformation would appear less frequently across News Feeds, public and private pages and groups, search predictions, and recommendation widgets around the site. The announcement came after weeks of pressure from lawmakers and public health advocates to crack down on anti-vax content.
