Now You Can Read Facebook's Community Standards Playbook

The company is releasing its internal content moderation guidelines and creating an appeals process for individual posts that get taken down.

When Facebook talks about its policies, it often sounds like a government. Appearing before Congress earlier this month, Mark Zuckerberg offered testimony filled with state-like pronouncements. “I don’t want anyone to use our tools to undermine democracy. That’s not what we stand for,” he said.

But though it can sound like a state, Facebook has always been vague about its laws. The company does publish community guidelines, but it has offered little information about how it arrives at those guidelines, how it enforces them, and how people who disagree with its decisions can appeal them.

That's changing. On Tuesday, Facebook announced that it's releasing its internal content moderation guidelines and giving the public a look into how it develops its policies. Until now, the best glimpse of these guidelines came courtesy of the Guardian, which published a comprehensive report on them last year. Facebook is also introducing an appeals process for people whose posts were removed “for nudity or pornography, hate speech, and violence.”

“Our content reviewers consult a set of internal implementation guidelines in order to make decisions,” Facebook Global Policy Management VP Monika Bickert said in a blog post. “For the first time, we’re sharing updated Community Standards that include these internal guidelines.”

The added transparency, Bickert said, is coming for two reasons:

First, the guidelines will help people understand where we draw the line on nuanced issues. Second, providing these details makes it easier for everyone, including experts in different fields, to give us feedback so that we can improve the guidelines – and the decisions we make – over time.

The new appeals process will be the first of its kind for individual pieces of content; until now, Facebook offered appeals only to people whose profiles, pages, or groups had been taken down. To appeal a content takedown, you'll be able to click “Request Review,” and Facebook will make a decision within 24 hours.

Facebook has made a lot of promises “to do better” in the aftermath of its Cambridge Analytica scandal. So far, doing better has largely meant providing more transparency. “We should be clear about how data is used, and offer easy ways to control it,” Facebook Chief Product Officer Chris Cox told BuzzFeed News in a discussion about whether Facebook should revisit how much data it collects.

But while more transparency is a positive step, Facebook ultimately will have to address deeper issues, such as whether it alone should be making decisions to censor speech on its platform. In an interview last month, Zuckerberg expressed some reservations about that: “I feel fundamentally uncomfortable sitting here in California at an office, making content policy decisions for people around the world,” he told Recode. It's a problem that transparency alone won't solve.

