YouTube Is STILL Hosting Graphic Images Of Bestiality Alongside Children's Videos

In April of last year, a YouTube spokesperson told BuzzFeed News, "These images are abhorrent to us and have no place on YouTube."

Nearly a year after YouTube pledged to remove images of graphic bestiality from its platform, simple search queries that include the word "girl" along with "horse" or "dog" ("girl horse" or "girl and her horse") return dozens of videos promoted with thumbnails of women seemingly engaged in sexual acts with those animals.

While the videos behind these bestiality thumbnails do not depict explicit sexual acts, they are often quite suggestive, featuring women in sundresses caring for animals or crotch shots of women as they bathe or play with horses or dogs. Some of these videos have racked up millions of views. Beyond clearly violating YouTube's policies, they suggest the company is struggling to make good on its April 2018 pledge to develop "better tools for detecting inappropriate and misleading metadata and thumbnails so we can take fast action against them."

Nearly half of the top 20 search results for “girl and her horse” feature thumbnail pictures of women petting horses with erections, fondling horse erections, or seemingly being mounted by aroused horses. Similarly, nine of the top 20 YouTube search results for “girl and her dog” returned videos with thumbnails that feature, among other things, a dog sniffing around a young girl's crotch, a dog mounting a girl, and a girl touching the dog's penis. “CATCHING A WOMAN DOING IT WITH HER DOG!” blares one search result.

One video with an explicit "horse and girl" thumbnail amassed more than 11 million views for the Johny Johny Yes Papa YouTube channel, which currently boasts more than 30,000 subscribers. That same channel, apparently named after a children's meme, features dozens of videos targeted at kids.

When BuzzFeed News flagged other, similar videos with bestiality thumbnails to YouTube last April, the company immediately removed them and issued this statement: "These images are abhorrent to us and have no place on YouTube. We have strict policies against misleading thumbnails, and violative content flagged to us by BuzzFeed has been removed. We're working quickly to do more than ever to tackle abuse on our platform, and that includes developing better tools for detecting inappropriate and misleading metadata and thumbnails so we can take fast action against them."

It's unclear just what YouTube has done in service of this pledge. In a statement issued after the publication of this article, the company said it has "worked to aggressively enforce our monetization policies to eliminate the incentive for this abuse," adding that it is ramping up enforcement against abusive thumbnails. "We recognize there’s more work to do and we’re committed to getting it right."

AI experts say "getting it right" is certainly possible, but perhaps unpleasant.

"It is definitely possible for AI to detect bestiality-related porn, but it would need to be trained on images related to that. So, it requires a special effort to do that kind of training and it's not 'fun to work on,'" Bart Selman, a Cornell University professor of artificial intelligence, told BuzzFeed News. "Another issue is that the content spreading mechanisms may actually push this stuff widely, going around content safety checks."

A senior employee at YouTube who spoke with BuzzFeed News last year speculated that the graphic thumbnails might have originated at a Cambodian content farm that had previously been banned by YouTube. The videos BuzzFeed News uncovered today appear similar. They do not appear to be monetized, but the same YouTube employee explained that content farm accounts often keep their videos ad-free in the hopes of monetizing them later, once their view counts spike.

The recurrence of the bestiality thumbnail issue suggests that content moderation may be an intractable problem for YouTube. Certainly the company continues to struggle with it despite nearly two years of reports about false or inappropriate content on its platform. In November 2017, reports surfaced of unsettling animated videos and bizarre content aimed at children on YouTube. Weeks later, the company announced it would crack down on child-exploitative videos.

But a December 2017 BuzzFeed News report revealed a flawed content moderation process with confusing and sometimes contradictory guidelines. Several YouTube moderators told BuzzFeed News that these guidelines instructed them to promote “high quality” videos with strong production, regardless of the content.

That same month, YouTube CEO Susan Wojcicki said the company would increase its number of human moderators to more than 10,000 in 2018 to help rein in unsavory content on its platform.

“I’ve seen how some bad actors are exploiting our openness to mislead, manipulate, harass or even harm,” Wojcicki wrote in the December 2017 blog post announcing that hiring push. “Our goal is to stay one step ahead of bad actors, making it harder for policy-violating content to surface or remain on YouTube.”

