Facebook's Suspensions Of Political Speech Are Now A Pattern

The social platform's temporary suspension of several Palestinian journalists' accounts is its latest "error," with no policy change in sight.

Facebook, a vital forum for online speech, can’t seem to stop removing significant political content from its platform.

Last week, the company disabled several prominent Palestinian journalists’ accounts following user reports that they were violating Facebook standards. These weren't small-time reporters; they manage pages followed by millions. Facebook later reinstated their accounts, blaming their removal on an error: “The pages were removed in error and restored as soon as we were able to investigate,” a Facebook spokesperson said, using an excuse that didn’t need dusting off, since Facebook has offered variations of it at least four times in the past six months.

In April, Facebook removed six pro-Bernie Sanders groups before reinstating them and blaming a technical error. In July, Facebook pulled a video showing Philando Castile dying after being shot by police during a traffic stop, only to subsequently reinstate it and again blame its original removal on a glitch. In August, Facebook suspended two big libertarian Facebook pages for days before reinstating them, saying: “The pages were taken down in error.” Last week, it was an “error” again.

"We sometimes get things wrong."

After four such errors in six months, Facebook's takedowns seem less like occasional missteps and more like symptoms of a flawed policy that needs to be addressed. Asked if there are fundamental issues within Facebook’s systems that need to change, a Facebook spokesperson pointed BuzzFeed News to a public statement: “Our team processes millions of reports each week, and we sometimes get things wrong.”

The company did not respond to a follow-up question about whether Facebook plans to review its tendency to erroneously silence politically significant speech.

Facebook depends on a system of user reports to police content on its platform. When someone sees content they think violates Facebook’s community standards, they can flag it and send it into review. While this system might work well for content that's broadly recognized as objectionable and in clear violation of Facebook policies, it doesn't work quite as well in situations with more nuance. In some of those situations, it seems people with one political perspective are gaming Facebook's system to silence people with other perspectives.

User reports are used as weapons in other scenarios on Facebook, such as the company's "real names" policy, which has been exploited to suspend transgender Facebook users.

This probably won't be the last time a Facebook review team member makes a curation decision that the company reverses after complaints and further consideration. If the company doesn't change its review system, there’s little preventing errors like this from occurring again and again.
