BuzzFeed News

iFunny Moderators Say They Have A Nazi Problem That The Site's Leaders Won't Fix

An extremist subculture festers unchecked on a meme-sharing site popular with young white men.

Posted on August 15, 2019, at 7:21 p.m. ET

Last March, within minutes of the Christchurch mosque attacks, footage from the shooter’s livestream began circulating on the meme-sharing app and site iFunny.

The site’s users, predominantly young white men and teenage boys, declared the shooter a hero. They photoshopped him to look like Jesus. One user re-created the entire attack as a Minecraft simulation. And, according to a current moderator for iFunny, there wasn’t a thing he could do about it.

“Our app was so filled to the brim with [reuploaded] footage from the livestream, I was actually losing sleep at night,” the moderator told BuzzFeed News. “Their screams were burned into my mind.”

In the hours immediately following the attack, the site’s COO told him and the other volunteer moderators to leave up the footage of the shooting, as well as content that glorified the killer. Only days later did the site reverse course, under external pressure.

“Moderators were not even allowed to remove all clips of the livestream from the app until certain governments started declaring possession of the footage a crime,” said the moderator, who requested anonymity to speak freely.

An iFunny spokesperson denied that extremist content is allowed to remain on the site, saying that it is removed within a day of being reported: “All content of mass shootings is bannable as ‘death/gore’ if it shows victims or potential victims.”

This week, authorities charged Justin Olsen, an 18-year-old Youngstown, Ohio, resident, with threatening a federal officer over posts he made on a Discord server full of other iFunny users, a server he advertised from an iFunny account full of memes calling for attacks on Planned Parenthood and a holy war between Christians and Muslims. Law enforcement seized 15 rifles, 10 semiautomatic pistols, and 10,000 rounds of ammunition from Olsen’s father’s home on Aug. 7, according to court documents made public this week.

Said the moderator, “I sincerely fear the next El Paso–type shooter will come from the depths of our app.”

The site's executives declined multiple requests for comment.

Since its founding in 2011, iFunny has become the beating heart of the teen boy internet, a place to share funny images — think 9gag for the mobile age. On its homepage this week you’d find pictures of Larry David, overflowing bottles of beer, and copies of tweets that read, “did you know girls can die from lack of attention?”

But under that veneer is a radicalized culture of white nationalism. According to a current moderator, a former moderator, and dozens of iFunny users who spoke to BuzzFeed News, iFunny is swarming with extremist content, and the company’s COO, Eugene Litvinov, a Russian living in Los Angeles, isn't doing a particularly good job of policing it.

In Slack messages provided to BuzzFeed News, Litvinov seemed to celebrate iFunny’s negative press coverage following Olsen's arrest.

"[The] Saboteur story brought us some new users," Litvinov said in reference to iFunny user Samuel Woodward, an alleged member of the violent neo-Nazi group Atomwaffen Division, accused last year of killing his gay, Jewish former classmate and currently facing life in prison without parole. "Will see how many users this story [about Olsen] will generate."

The current and former moderators BuzzFeed News spoke to claimed that iFunny is actually run by Litvinov, not by David Chef, whom the community refers to as the "only official voice of iFunny."

According to a spokesperson for iFunny, Litvinov communicates with moderators in Slack, but is only involved in high-level discussions. But after the news of Olsen’s charge broke this week, moderators in Slack fought with Litvinov about how the site was being run. In chat logs reviewed by BuzzFeed News, concerned moderators worried that an iFunny user could carry out a terror attack in the near future.

“'OK but like what happens when one of our users shoots up some place' — we have over 10 million [monthly active users] now. For sure some of them will shoot somebody, commit suicide, and the rest, and so on,” Litvinov said. “It happens.”

The moderators BuzzFeed News spoke to said that iFunny’s guidelines on extremist content are woefully inadequate. They also said that there is a group called the Abuse Neutralization Bureau (ANB) that Litvinov calls in to moderate extremist content. It is unclear to the moderators who actually works on the ANB team.

“Moderators are no longer allowed to so much as touch the neo-Nazis and are only able to report them to a team called ANB, which may or may not remove the posts in a timely manner,” the current moderator told BuzzFeed News. “But ANB provides no consequences to these users at all.”

According to a copy of iFunny’s 100-page Moderation Guidelines that BuzzFeed News obtained, moderators should remove posts that dox users, as well as GIFs that could trigger seizures.

"Memes and jokes about gay or LGBT issues are fine,” the handbook reads. “Memes and images that show disabled people are fine if they use text or meme format to make a joke.” Any post including “The N-Word" is to be removed. Posts that call for political violence, as well as neo-Nazi propaganda, are not. Instead, the handbook directs moderators to report them to the ANB.

“Users are free to use political symbols in their jokes and memes, for example: soviet hammer and sickle, photos of Hitler, swastikas, donkeys and elephants, rainbow flag, etc.,” the handbook reads. “The spelling of the word ‘nigga’ is like saying bro, and too many people use it to police it.”

“Nazis can only get a one-day/one-week time out from posting the N-word or personal info,” the moderator said. “Accounts such as these are never deactivated or removed.”

Users who are banned for harassment or extremism are able to continue influencing the site via Discord servers. They coordinate with users still on the site, steering conversations and directing them to repost memes. It was in one of these servers that Olsen allegedly threatened to kill federal agents earlier this month.

An iFunny user named Gamermunchies420 invited BuzzFeed News into a 400-person Discord server run by a banned iFunny poster who goes by Boson. The Discord server’s chat logs only go as far back as last week. The oldest message reads, “All channels have been cleared! Thank you for visiting federal agents, nothing to see here.”

The chats that remained were deeply paranoid arguments about who on the server could secretly be a journalist or a federal agent. Users also accused one another of being the source leaking information about the radicalized iFunny communities.

“[Boson] had his account deleted due to the community leaking images of his family, [but] his influence still shines,” Gamermunchies420 said. “He also operates a Discord server named after the app, where he monologues anti-Semitic rants and makes jokes about killing homosexuals in chat and bans users for ‘blasphemy.’”

Gamermunchies420 said not much is known about Boson, but said he believes him to be a 17-year-old white nationalist living in southeast Montana.

Another user, who asked to be referred to as S, said that two far-right groups of iFunny users — an anti-Brony group called $WA and a white nationalist group called 4R (Fourth Reich) — fantasize about political violence on Discord servers.

“Many people are violently radicalized — give it a little longer and I guarantee there will be more shooter/murderers from the app,” S told BuzzFeed News.

Just as 8chan users venerate certain mass shooters, iFunny has its own celebrities. One of the earliest was a 13-year-old going by “Shaugureth" or “Shaug." Law enforcement investigated the boy after he allegedly threatened to carry out a mass shooting at Land O’ Lakes High School in Florida in 2014. Memes circulating on iFunny this week showed Shaug and Olsen meeting Samuel Woodward in jail.

Shaugureth, now 18, told BuzzFeed News that another user had orchestrated the incident as a prank. “An iFunny member called Yoloswagdawg had taken my picture and posted it with a caption threatening a high school in Florida called Land o Lakes that was the high school of another iFunny member,” Shaugureth said. “I had to send pictures of myself to the Sheriff and was visited by local police (I live in Iowa) to clarify that I was not a threat or that I had made the threat.”

Shaugureth understands how purportedly satirical content on iFunny can blend into actual extremism. iFunny users call it “LARPing."

“Sometimes people get so wrapped up in political ideologies that they ‘roleplay’ what would happen in the case of something happening, like conspiracy theories,” he said. “Generally it’s just teenagers who say edgy things for notoriety.”

Edgy jokes from teenagers or not, the line between irony and real-life violence is thin.

Reddit and 4chan didn’t start as hubs of violence and extremism. Without proper moderation, though, that’s what they became. Of the dozens of iFunny users who contacted BuzzFeed News this week, the majority were concerned about losing their app to far-right extremists.

“We still love this app and the people we’ve met on it,” a user named VRtist told BuzzFeed News. “We don’t know where we’d go if that happened, there’s not many places quite like the meme-making, GIF-captioning app we’ve all come to love.”

The current moderator who spoke to BuzzFeed News isn’t optimistic things will get better on the site.

“Our users are radicalizing themselves at an alarming rate as they use each other as a springboard for the next most edgy thing,” he said. “With every domestic terrorist we see, it seems our users grow bolder."
