Facebook Proves It Isn't Ready To Handle Fake News
Facebook says it knows the truth is messy; what it doesn't know is how to clean up all the fake news it is helping to spread.
On Wednesday afternoon, Facebook invited a handful of journalists to its New York offices for shrimp cocktail, a short presentation, and a question-and-answer session with its head of News Feed. The company screened its expensive 12-minute short film on fake news, directed by Academy Award–winning documentarian Morgan Neville, and then opened the floor for reporters to ask questions. The goal: to convince reporters that Facebook has finally found purchase in its long fight against misinformation.
It didn't go as planned. Midway through the Q&A session, CNN reporter Oliver Darcy grilled Head of News Feed John Hegeman about Facebook's decision to allow Alex Jones' conspiracy news site Infowars on its platform. Specifically, how could the company simultaneously tout its crackdown on misinformation while still permitting Infowars to operate a page with over 900,000 followers? Hegeman's response was simply that the company does not "take down false news."
Hegeman went so far as to suggest that Infowars — which in recent weeks has pushed the baseless conspiracy theory that Democrats were planning to start a civil war this July 4 — hadn't violated Facebook's rules. "I guess just for being false that doesn't violate the community standards," he said. "I think part of the fundamental thing here is that we created Facebook to be a place where different people can have a voice. And different publishers have very different points of view."
The cognitive dissonance of Hegeman's claim during an event the company billed as "a presentation about our work to prevent the spread of false news" is not lost on Facebook. News Feed product specialist Sara Su told the room of reporters that Infowars' conspiracy theories "can be really problematic and it bugs me, too."
Su's response echoes the overall message of the company's "Facing Facts" video: This is all quite hard and, just like you, Facebook is frustrated. But despite its media campaigns and good intentions, the Q&A session and the subsequent media reaction reveal a deeper issue: Facebook simply isn't willing to make the hard choices necessary to tackle fake news.
Though Facebook's misinformation fight is a new initiative, the rationale behind its implementation is rooted in a decade-old philosophy of dodging notions of political bias and censorship at all costs. The result is a near-pathological resistance toward taking a stand against actors that brazenly flout Facebook's rules. And by doing so, Facebook plays into the hands of those who seek to wage information war.
During yesterday's session, Su argued that Infowars operates in a gray area — often toeing the line of provably false but not always crossing it — and, according to CNN, suggested that the company was focusing its takedown efforts on outlets that "can be proven beyond a doubt to be demonstrably false."
Lost in Su's explanation is the fact that this gray area is part of what makes Infowars' conspiracy and false information machine so effective. By offering baseless theories with a kernel of truth and then distorting and sensationalizing them in bad faith, Jones is able to spread misinformation and then retreat from it with little penalty. Similarly, Infowars' ability to skirt community standards without flagrantly crossing them (for example, baselessly suggesting that experts argue Sandy Hook may have been a hoax, rather than asserting it outright) allows the outlet to inject false news into Facebook's ecosystem.
In a follow-up statement to CNN, a Facebook spokesperson clarified that it still might choose to downrank Infowars content. "We allow people to post it as a form of expression, but we're not going to show it at the top of News Feed," the spokesperson said. But this sort of shadow censorship likely does little to deter those who seek out Infowars content or those who share it. And it does nothing to stop the nearly 1 million users who subscribe to the page from accessing the content.
The problem is that Facebook's good-faith effort to combat misinformation while remaining unbiased and avoiding censorship simply doesn't work against an entity operating in bad faith. In the case of Jones and Infowars, Facebook allows itself to be played by an outlet operating by a different set of rules.
There's a moment in the middle of Facebook's "Facing Facts" video where a data science manager at the company divides information into four quadrants on a whiteboard: "wrong" information, "right" information, "propaganda," and "hoaxes." The visual is one of many in the video geared to drive home the point that the truth is quite messy, but that the company is determined to fight for it with the appropriate nuance.
However, as yesterday's session showed, the company appears to be approaching the issue of misinformation without much of that nuance. By focusing only on egregious examples of false news, Facebook allows its biggest purveyors of disingenuous conspiracies and polarizing content to operate with impunity while growing their audiences and expanding the footprint of low-quality information on the platform. All they need to know is how to game the system.
Despite investing considerable money in national ad campaigns and expensive mini documentaries, Facebook is not yet up to the challenge of purging misinformation from its platform. As its videos and reporter Q&As take pains to note, Facebook knows the truth is messy and hard, but it's still not clear whether the company is ready to make the difficult choices to protect it.