Reddit Is An Incubator Of Hate

Reddit’s decision to police “behavior, not ideas” isn’t just foolish — it’s reckless.

If, last week, as the world mourned the brutal church murder of nine black men and women in Charleston, South Carolina, you had visited Reddit's most popular racist community and scrolled down the right side of the landing page — just past the rotating memorial carousel of white people murdered, injured, or robbed by black people — you'd have seen Reddit's familiar, cheerful alien logo, originally drawn by Reddit co-founder and chair Alexis Ohanian. Above that logo was a screenshot of a comment Reddit CEO Ellen Pao had made the week prior about the company's decision to shut down five hateful subreddits. It read, "We're banning behavior, not ideas." And above both Pao's screenshot and Ohanian's creation rested three lines of bolded, shadowed text: "Coontown Supported By Reddit."

It was a striking little triptych, one intended to provoke visitors while justifying the existence of a virulently anti-black web forum that is growing at a rapid clip.

Perhaps most unsettling of all about the "Supported by Reddit" tagline, though? It was completely true.

As Reddit celebrates its 10th birthday, it is stuck in the untenable position of trying to clean up its most vile nether regions while still creating conditions that attract racism and misogyny the way a porch light draws in moths. The site's management has recently taken several genuine steps to make users feel safer: In February, it set out to define the site's standards on harassment and revenge porn; earlier this month, it acted to enforce those standards by banning five of the site's most odious subreddits. In the post announcing that change, Reddit's executives — including Pao and Ohanian — stated that "ultimately, we hope to have less involvement, but right now, we know we need to do better and to do more."

But if the company's rhetoric suggests an organization committed to eradicating hate and bigotry, its actions are those of one ultimately unwilling to do the hard work of abolishing the conditions in which hate and bigotry thrive. r/CoonTown is far from the only such forum on the site: Before being banned for harassing redditors in other subreddits, r/n*****s was Reddit's most infamous racist community. And in 2013, the popular racist subreddit r/GreatApes branched off into a network of smaller groups known as "The Chimpire." According to the Southern Poverty Law Center, "within a year, the Chimpire network had grown to include 46 active subreddits spanning an alarming range of racist topics, including 'Teenapers,' 'ApeWrangling,' 'Detoilet,' and 'Chicongo,' along with subreddits for both 'TrayvonMartin' and 'ferguson.'" And that list fails to include the darkest, most egregious communities, like r/WatchN*****sDie, whose content very much reflects its name. On Reddit, hate is alive and well and, in some cases, growing.

At the core of the problem is Reddit's newfound vow to police hate only when it manifests as real-world harassment — that is, to draw a distinction between ideas and behavior that doesn't actually exist. Ideas inform and incite behavior; we see this both in the physical world and on Reddit, where the ideas and discussions of its thousands of communities are reflected in the actions — both good (raising money for a Kenyan orphanage and for a terminally ill cancer patient) and bad (Violentacrez, r/creepshots, and The Fappening) — of its members.

What's more, there's credible research to suggest that right-wing extremist online communities are frequently linked to hate crimes. An April 2014 report from the Southern Poverty Law Center found that more than 100 murders have been linked to Stormfront.org, a white nationalist website and forum that first emerged in 1995. The author of the report, Heidi Beirich, told The Guardian that her research showed that online hate forums helped nurture and strengthen already formed prejudices and, in the case of Stormfront, transform them into real-world violence:

"It's pretty clear that websites like Stormfront are breeding grounds for people who are just enraged at their situation ... Stormfront helps them find the enemy that is standing in their way – whether it be Jews, African Americans, immigrants and so on ... Unfortunately it's not very surprising that people who live in this kind of stew of violent racism eventually pick up a gun and do something about it at some point."

While — to be clear — Reddit doesn't have the same documented ties to hate crimes as Stormfront, its racist forums have recently shown alarming support for the kind of thinking that motivates and accompanies violence in the physical world. In the wake of the Charleston shooting, r/CoonTown members expressed sympathy for and identified with shooter Dylann Roof, suggesting that he was "one of us." Other r/CoonTown posts, surfaced this week by Gawker, clearly endorse Roof's act of terrorism, with comments like "the only good n*****r is a dead n*****r" and "I have to say it was a great bit of news this morning to see 9 n******s get good."

Similarly, here's how the SPLC's Keegan Hankes described a visit to r/CoonTown during its earlier days:

"These gruesome videos show black men being hit in the head repeatedly with a hammer, burned alive, and killed in a variety of other ways. The subreddit's banner features a cartoon of a black man hanging, complete with a Klansman in the background. One fairly typical user, 'Bustatruggalo' applauded the graphic violence as '[v]ery educational and entertaining.' He or she continued on a separate thread: 'I almost feel bad for letting an image like this fill me with an overwhelming amount of joy. Almost….'"

This history makes Reddit's "behavior, not ideas" distinction incredibly precarious. It's nearly impossible to quantify or measure the effect Reddit's most hateful communities have on any one individual — or, in r/CoonTown's case, what effect prolonged exposure to an echo chamber of hate might have on any of its nearly 16,000 members. Reddit's decision to wait until harassment scales its walls and spills into the physical world is an act of good faith in a community that has historically and repeatedly translated its hateful speech into barbaric action.

Reddit's choice to police behavior, not ideas — that is, to ignore the hate in its community until that hate makes itself effectively unignorable — reflects the company's unwillingness to do the difficult, frustrating, and never-ending whack-a-mole work of fighting hate and violence.

You can see it in the company's user agreement. In 2013, that document required those signing up for the service to "agree not to use any obscene, indecent, or offensive language" and not to post "graphics, text, photographs, images, video, audio or other material that is defamatory, abusive, bullying, harassing, racist, hateful, or violent." It also contained a clause asking new sign-ups "to refrain from ethnic slurs when using the Website." Today, that agreement reads much differently: The words "racist," "hateful," "violent," and "abusive" appear nowhere in the 2015 version of the document, because including them would force the site's hand and commit it to the arduous work of banning more communities.

Stamping out hate on Reddit is roughly as easy as stamping out hate anywhere, which is to say it's nearly impossible. "You can't treat this kind of hate or structural violence as a bug — it's a feature in the system," David Banks, a social scientist who has written extensively about Reddit's ability to foster hate, told BuzzFeed News. "Structural violence exists in the site because it exists in society, and so it will keep showing up. So relying on reporting and flagging and tagging to get rid of this will never fix the issue for good."

It's hard, going forward, to see how Reddit can address its failings and come out whole on the other side. To make the structural changes necessary to protect its users from harassment, Reddit may well have to give up a not-insignificant piece of its identity and its community. As Banks told BuzzFeed News, "The things which make Reddit unpredictable are the things that make it interesting at all." But as we look back at Reddit, which is now a decade old and still embodies the quixotic ideals of the early social web, perhaps that change is the only way forward. "Maybe it speaks to just how effectively we, as a society, ignore violence against women and people of color that we'd think this sort of community and the premise of Reddit was ever a good idea in the first place," Banks said of the site.

But Reddit is not simply some dark corner of the internet — it's a media platform that has been home to celebrities, sitting presidents, powerful investors, and advertisers. And its status as an incubator for hate is a choice with very real consequences. The process by which a racist idea curdles into racist action is impossible to observe and chart in a single human being. Believing it can be observed across 9,000 communities and hundreds of millions of individuals, in time to protect redditors, isn't just foolish — it's reckless.
