Welcome to Infowarzel, a newsletter about Big Tech, the pro-Trump media, and the internet's daily information wars. If you haven't yet, you can subscribe to this newsletter right here.
Two essays that are different but also very related!
- Facebook & YouTube v. Infowars (And The First Rule Of Crisis PR)
- What It Would Mean To De-Platform Alex Jones (And Why Infowars Is Spooked)
But first! A quick note on yesterday's news:
Facebook disclosed yesterday that it was (once again) the target of coordinated political influence campaigns on its platform. Facebook said the accounts sought to sow discord by posting about divisive social issues. From the NYT, here's a description of what that division looked like:
Facebook said it had discovered coordinated activity around issues like a sequel to last year’s deadly “Unite the Right” white supremacist rally in Charlottesville, Va. Activity was also detected around #AbolishICE, a left-wing campaign on social media that seeks to end the Immigration and Customs Enforcement agency.
First off, it's probably a positive step that Facebook is trying to disclose this stuff as it happens (although, according to NBC News, it looks like they didn't disclose the worst posts to the public, which is...peculiar!).
But despite the fact that Facebook caught this meddling campaign, it feels important to point out that a lot of the damage is already done. Even when foreign meddling campaigns on tech platforms are detected, they still succeed. Reports like the NYT's reveal to the public that major political discussion topics and hashtags are being gamed by propagandists, which helps cast doubt on the legitimacy of the larger social conversation. It's a version of what some smart researchers call “reality apathy”: beset by a torrent of constant misinformation, people simply start to give up.
What's perhaps most unsettling about yesterday's foreign influence campaign news is that these actors are constantly drafting off actual social movements in order to stoke division and lull unsuspecting citizens into trusting them. As a result, actual protest movements are getting caught up in the Facebook takedowns:
The disturbing reality, here, is how the trolls/meddlers win no matter the outcome. If they operate in secrecy, they meddle and succeed. If they are discovered, they cast doubt on the greater political and cultural conversation across these platforms. If they get shut down, they take down real advocacy orgs with them. Dark times.
Facebook & YouTube v. Infowars (And The First Rule Of Crisis PR)
If you’ve been following the Big Tech v. Infowars saga the past few weeks, chances are you’re exhausted. To (not so) quickly recap:
Asked point blank at a July 11th summit about eradicating fake news, Facebook suggested that Infowars had every right to stay up on Facebook because simply publishing unprovable, baseless allegations is not enough to warrant a ban. Then, it suggested it would downrank individual posts from the outlet, should they be flagged as fake or irresponsible in some way. A day later, after a brief outcry, the company used its Twitter account to suggest that Infowars was sort of like other sites on the left and right that publish controversial opinion pieces. Shortly after, Mark Zuckerberg made some comments about Holocaust deniers and how it's hard to police dangerous untruths from communities that wholeheartedly believe them. Soon after, he apologized. Then, Alex Jones accused Robert Mueller of raping children and flying them around on “sex planes.” Facebook doubled down.
At some point, YouTube — which had previously awarded a strike to Infowars for videos alleging that school shooting victims were crisis actors but decided that old videos suggesting Sandy Hook was a hoax weren’t worth more strikes — piped up and removed four videos from Jones’ account. Jones was given another strike, but only one instead of four — you need three to get banned — because of a “quirk” in the way the company bundles rules violations. Perhaps in response, Facebook scrambled, deleted some offensive videos off of its platform, and banned Jones for 30 days (the crisis actor videos among them). When asked, Facebook told reporters that it is “close” to unpublishing Jones’ pages altogether. Not quite there yet, but close. Jones, meanwhile, has been promoting baseless conspiracy theories to his audiences since well before Y2K.
To call the above scenario a clusterfuck is unfair to clusterfucks. If there is a desired outcome to the policy posturing above, Facebook and YouTube clearly don't know what it is — beyond blindly hoping Infowars cleans up its act or interest in it wanes. As Casey Newton noted last week, when Facebook embarked on its plan to “connect the world,” it didn’t anticipate a site like Infowars and had no plan for a mega-popular, bad-faith conspiracy news outlet.
It’s embarrassing, really. As Kevin Roose of the New York Times noted:
None of this is new; in fact, it’s well established. Platforms like Facebook, YouTube, and Twitter have long avoided taking responsibility for the content people upload to their sites by operating with vague content rules and arbitrary enforcement. What’s changed is the public’s interest in those rules and how they’re applied.
There’s a decent case to be made that imploring technology companies to determine what’s acceptable news and what’s not is a treacherous path to head down, and one that we could come to regret. I buy parts of that argument. Earlier this summer, Twitter banned tweets containing a link to a story from a national media outlet that included White House advisor Stephen Miller’s personal phone number (an action that, despite Miller’s status as a public figure, violated Twitter’s rules against publishing personal information). Many were furious at Twitter for effectively censoring a media outlet. The flipside: Twitter’s action was made possible by policies it created in response to years of lobbying on behalf of harassed users and countless stories about Twitter’s failure to curb abuse. Miller’s odious legislative accomplishments notwithstanding, many of the same users who’d clamored for Twitter to enforce its rules bristled when the enforcement seemed to benefit someone they disagreed with.
The platforms are deeply concerned about setting precedent when they moderate content, but it’s not clear they have to be. Facebook and YouTube and Twitter are not governments, despite having user numbers greater than most countries (or continents, in Facebook’s case). They do not have a judicial branch inside their campuses weighing past judgements for consistency.
As plenty of critics have written, the platforms, in many ways, operate like an authoritarian form of government — certainly, in the case of YouTube demonetization or Facebook nipple takedowns, enforcement can often feel as if there’s no due process. The platforms can do mostly whatever they want without having to worry much about hamstringing themselves in the future. In fact, they already operate this way. What we’re seeing right now with YouTube’s strikes policy is an excellent example. YouTube created the policy and has enforced it as it sees fit. In YouTubeland, Jones currently has one strike, though he could just as easily have more than a dozen if the powers that be looked at the issue a different way. Similarly, YouTube forums are full of angry posts from creators alleging YouTube deleted their (much smaller) accounts with little warning, for no reason or because of a cryptic rule violation. And while, sure, there’s an appeals process, the platform can do (and does) what it wants.
Which is not to say that these platforms must eradicate Jones or Infowars from their platforms (although, in the case of Facebook and YouTube, doing so may further their commitment to combating false news). The companies are beholden to shareholders, not the idea of a more perfect union, which makes the song and dance of the last few weeks so exhausting.
A few months ago, during the rapid fallout of Facebook’s Cambridge Analytica scandal, a smart person mentioned to me the first rule of crisis PR. The idea is to quickly figure out what the ultimate end game of a disaster will be, and then cut all the bullshit and just jump straight to doing whatever uncomfortable thing you’ll inevitably have to do under duress days, weeks, or months later. I’ve been thinking a lot about that maxim the past two weeks as the platforms make declarations about Infowars as a legitimate publisher, followed by some hedging, then a bit of backtracking, some light finger-wagging, a short timeout, and finally an ominous suggestion that the publisher is on thin ice. All the statements, interviews, and bad press seem to be careening toward a particular outcome for Facebook, YouTube, and Infowars, and it seems as if everyone but the platforms knows it.
What It Would Mean To De-Platform Alex Jones (And Why Infowars Is Spooked)
As the Infowars platform battle heated up two weeks ago, one talking point argued that banning Infowars outright from sites like Facebook or YouTube would turn Jones and his outlet into a martyr for the pro-Trump media and particular segments of the right. As it turns out, we didn’t even have to wait for that. Last Thursday evening, on his Fox News program, Tucker Carlson accused mainstream journalists of crusading for Jones’ banishment from the platforms, a move he considers an attack on free speech. He then offered his own defense of Jones:
"I know we're supposed to think Alex Jones is way more radical than Bill Maher or Michelle Wolf or Rosie O'Donnell, but he's got a point of view and CNN is trying to squelch that point of view."
Carlson’s defense of Jones may not feel surprising, but it’s notable, given how little play Jones gets from mainstream conservative media like Fox. More often, the establishment branch of the pro-Trump media pretends Infowars doesn’t exist in the same ecosystem (even though Jones’ talking points have made their way to Fox News’ airwaves). But Tucker wasn’t the only one coming to Jones’ defense. Sen. Ted Cruz tweeted his outrage that Jones was being silenced:
Though Cruz can claim he’s actually denouncing Jones (and defending the Constitution), it makes little difference to the pro-Trump media and fever swamps, who’ll read this as an endorsement of Jones’ operation. Similarly, Cruz and Carlson’s defenses (as prominent members of their fields) may send signals to their respective colleagues, opening the door to the possibility of defending Jones along the same speech grounds.
It doesn’t matter that Infowars’ penalty from Facebook or YouTube is a penalty for violating the sites’ community guidelines on things like hate speech, not a suppression of its free speech; Jones is already becoming a martyr.
But I’m not so sure that the martyr argument is a convincing enough reason to allow Jones on the site in perpetuity. TechCrunch writer Josh Constine said that a ban would cause Jones’ defenders to “push their political allies to vindictively regulate Facebook beyond what’s actually necessary. They’d call for people to delete their Facebook accounts and decamp to some other network that’s much more of a filter bubble than what some consider Facebook to already be. That would further divide the country and the world.”
Maybe! But there’s a decent case to be made that everything Constine describes above is precisely the kind of tactic that Jones employs daily in front of his massive audience, a substantial portion of which accesses him through the tech platforms. The pro-Trump media is going to rail against perceived bias and censorship and call for justice against Big Tech whether Jones and Infowars stay up on the platforms or not. Constine’s argument implies that the pro-Trump media’s modes and methods of attack are based on a principled argument about free speech rather than what they’re really built around: ginning up a culture war and winning at all costs.
Jones quite literally named his publication after information warfare. His outlet’s slogan declares “there’s a war on for your mind.” If you’re Facebook and you’re looking to slow division (and it is still not clear that Facebook sees this as its responsibility), then there’s a pretty convincing argument to be made that limiting Jones’ reach entirely (not just de-ranking his content or making it harder to find) would be a drastic way to do that.
Jones and Infowars seem keenly aware of the dangers of de-platforming. Jones was an early adopter of the internet and has used the web to grow his audience since the ‘90s, after traditional media thumbed its nose at his conspiracy theories. Jones knows how important the platforms are in cultivating his audience and distributing his message. A quick glance at Jones’ YouTube page suggests there’s some anxiety in the Infowars newsroom — on Friday afternoon, 12 of the last 18 videos on the page were about tech censorship:
Or this, from two nights ago:
There’s a lot at stake for Jones (and the pro-Trump media as a whole). Their movement has been wildly successful at exploiting and leveraging the major social platforms to access people outside their ideological spectrum. Losing those platforms could be devastating. I think a lot about Milo Yiannopoulos’ increasing cultural irrelevance ever since his Twitter ban in the summer of 2016. Some of Milo’s wounds have been self-inflicted, but it feels hard to deny that his de-platforming from Twitter decreased his ability to hijack news cycles. Twitter, especially for a troll like Milo who thrives on provocation, is a way to attract the attention of mainstream journalists and stay relevant. Milo may still plot and scheme and catch the eye of the press, but far less frequently.
The de-platforming of the pro-Trump media and the conspiracy industrial complex isn’t a free speech case in the traditional sense that the group is arguing, but, in many ways, the threat is equally existential. For Jones and his ilk, success comes not just from cultivating a loyal audience but from having access to a broader swath of people — especially his biggest foil: the mainstream media. It might not be every day that Jones wins new fans, but across his platforms, with every broadcast, he ignites a slew of enemies, a tactic that, as he’s shown, reliably divides.
THINGS YOU SHOULD READ
- Here is an unbelievably thoughtful piece on the tech backlash and why it will ultimately be insufficient. This has stayed with me all week and will definitely be the subject of another newsletter.
- Our scoop from last week: A memo from departing Facebook chief Alex Stamos, which has a number of candid thoughts about how the company can right some of its wrongs.
- A really eye-opening podcast between Ezra Klein and Jaron Lanier about the case for deleting your social media accounts. My favorite part is when Lanier starts talking about how the mechanics of the social platforms make it so that progressive movements often have severe (and even more powerful) backlashes. Highly recommend.
- The Daily Beast's Will Sommer (who also has a newsletter that you should subscribe to) follows the fever swamps very closely and has this excellent Qanon primer...which you're (sadly) probably going to need.
Okay, that's it for today! Tell me what you thought (email me at email@example.com) and send suggestions, questions, or things you'd like me to look into/discuss in these emails. This is an experiment/work in progress and your input is so appreciated!