YouTube's conspiracy video problem may be more widespread than previously thought, according to new research by professor and data journalist Jonathan Albright. Venturing down the platform's algorithmic rabbit hole, he found a network of nearly 9,000 conspiracy-related videos with almost four billion views in total.
In the wake of the Parkland school shooting, YouTube has come under criticism for allowing conspiracy videos suggesting the survivors of the shooting are crisis actors to appear high in search results and Trending pages. YouTube has attempted to remove some of the offending videos and, according to CNN, issued one strike (of a possible three) to Alex Jones' Infowars channel for one such video.
Albright's initial research, published early Sunday morning to his Medium page, is a new look at the breadth and depth of YouTube's conspiracy problem. To start, Albright searched YouTube's API for "crisis actors" and culled the "next up" recommendations for each of the results. Albright ended up with a list of about 9,000 conspiracy-themed videos. A sampling of the videos surfaces everything from false flag videos to disturbing sexual content — one YouTube-hosted video flagged by Albright bears the title "Truth or Dare with rape."
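The crawl Albright describes (seed search, then snowball through recommendations) can be sketched against the real YouTube Data API v3 `search.list` endpoint. This is an illustrative reconstruction, not Albright's actual code: the key handling and fan-out logic are assumptions, and the `relatedToVideoId` parameter, which existed at the time of the research, has since been retired by Google.

```python
# Hedged sketch of a "crisis actors" recommendation crawl against the
# YouTube Data API v3. Only builds request URLs; fetching, quota
# management, and response parsing are left out for brevity.
from urllib.parse import urlencode

API_BASE = "https://www.googleapis.com/youtube/v3/search"

def search_url(query, api_key, max_results=50):
    """Build a search.list URL for a text query (the seed step)."""
    params = {
        "part": "snippet",
        "q": query,
        "type": "video",
        "maxResults": max_results,
        "key": api_key,
    }
    return f"{API_BASE}?{urlencode(params)}"

def related_url(video_id, api_key, max_results=50):
    """Build a search.list URL for 'next up'-style recommendations.

    Uses relatedToVideoId, a real search.list parameter at the time of
    Albright's crawl (Google has since removed it from the API).
    """
    params = {
        "part": "snippet",
        "relatedToVideoId": video_id,
        "type": "video",
        "maxResults": max_results,
        "key": api_key,
    }
    return f"{API_BASE}?{urlencode(params)}"

# One-hop snowball, as the article describes it:
#   1. fetch search_url("crisis actors", KEY) -> seed video IDs
#   2. for each seed ID, fetch related_url(seed_id, KEY) -> recommendations
#   3. de-duplicate the union of results into a video list
```

De-duplicating after a single recommendation hop is what yields a network on the order of Albright's ~9,000 videos from a 50-result seed search.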
While not all videos in Albright's results are conspiratorial or offensive — some are from major news organizations and others are innocuous late-night TV or comedy clips — the majority of the 9,000 videos appear to contain conspiracy theories. They come from hundreds of disparate accounts that post and repost videos to the tune of hundreds of thousands of views. Indeed, many of the videos have gone viral — as Albright notes in his Medium piece, the top 50 mass shooting–related conspiracy videos he surfaced have around 50 million views.
Albright told BuzzFeed News on Sunday morning that he was concerned by the depth of the search results he uncovered. "It's fueling the expansion of a genre of socially harmful material that's quickly leading to our inability to fact check and counter false claims," he said.
A review of the raw data that Albright shared with BuzzFeed News shows how YouTube's recommendation algorithm can push a user deeper into the murky world of conspiracy theories. Albright's initial search for "crisis actor" videos surfaces videos about the Parkland children, but then quickly branches off into recommended videos for dozens of other popular conspiracies about subjects including 9/11, the JFK assassination, Waco, the Oklahoma City bombing, Pizzagate, the Illuminati, chemtrails, vaccines, Freemasons, and the Sandy Hook, Aurora, and Las Vegas shootings.
Albright said the results suggest that the conspiracy genre is embedded so deeply into YouTube's video culture that it could be nearly impossible to eradicate.
"It's already tipped in favor of the conspiracists, I think," Albright told BuzzFeed News. “There are a handful of debunking videos in the data. They can't make up for the thousands of videos with false claims and rumors."
Albright also suggested that the proliferation of these videos makes it more attractive for others to create this content. "It's algorithmically and financially incentivizing the creation of this type of content at the expense of truth," Albright said. This, he argues, makes moderation incredibly difficult. "Journalists and affected parties (parents, survivors, first responders, etc.) are not only fighting the content on YouTube, they are fighting its algorithms — first at the 'trending' level and then again at the search and recommendation levels."
Currently, it's unclear from Albright's findings just how many of the conspiracy videos are monetized through YouTube's ad platform. Albright told BuzzFeed News he is working with researchers to attempt to understand the monetization of conspiracy videos.
YouTube did not immediately respond to a request for comment Sunday.