Here's What YouTube Is Doing To Stop Its Child Exploitation Problem

The company plans to have over 10,000 content moderators on staff by the end of 2018, YouTube CEO Susan Wojcicki said.

YouTube is adding more human moderators and expanding its use of machine learning in an attempt to curb its child exploitation problem, the company's CEO, Susan Wojcicki, said in a blog post on Monday evening.

The company plans to grow its staff of content moderators and other employees who address rule-violating content to more than 10,000 in 2018, in order to help screen videos and train the platform's machine learning algorithms to spot and remove problematic children's content. Sources familiar with YouTube's workforce numbers say this represents a 25% increase from where the company is today.

In the last two weeks, YouTube has removed hundreds of thousands of videos featuring children in disturbing and possibly exploitative situations, including being duct-taped to walls, mock-abducted, and even forced into washing machines. The company said it will employ the same approach it used this summer as it worked to eradicate violent extremist content from the platform.

Though it's unclear whether machine learning can adequately catch and limit disturbing children's content — much of which is creepy in ways that may be difficult for a moderation algorithm to discern — Wojcicki touted the company's machine learning capabilities, when paired with human moderators, in its fight against violent extremism.

YouTube said it has used machine learning to remove more than 150,000 videos for violent extremism since June; such an effort "would have taken 180,000 people working 40 hours a week," according to the company. The company also claimed its algorithms are getting steadily better at identifying violent extremism — in October it said that 83% of videos removed for extremist content were originally flagged by machine learning; just one month later, it says that number is 98%.

Wojcicki also pledged on YouTube's behalf to find a "new approach to advertising on YouTube" for both advertisers and content creators. In the last two weeks, YouTube said it has removed ads from nearly 2 million videos and more than 50,000 channels "masquerading as family-friendly content." The crackdown came after numerous media reports revealed that many of the videos — often with millions of views — ran with pre-roll advertisements for major brands, a few of which suspended advertising with the platform in November.

Though Wojcicki offered no concrete plans for advertising going forward, she said that the company would be "carefully considering which channels and videos are eligible for advertising." The blog post also said the company would "apply stricter criteria, conduct more manual curation, while also significantly ramping up our team of ad reviewers."

It's unclear when the advertising changes will go into effect. For now, controversial videos still appear to be running alongside advertisements. In a review of videos masquerading as family-friendly content, BuzzFeed News found advertisements running on a number of popular "flu shot" videos, a genre that typically features infants and young children screaming and crying.

On Monday afternoon, two flu shot videos on a family account called "Shot Of The Yeagers" were found running advertisements for Lyft, Adidas — which had previously told the Times of London it had suspended advertising on the platform — Phillips, Pfizer, and others. When BuzzFeed News contacted Adidas and Lyft about their ads running near the videos, both companies said they would look into the matter.

"A Lyft ad should not have been served on this video," a Lyft spokesperson told BuzzFeed News. "We have worked with platforms to create safeguards to prevent our ads from appearing on such content. We are working with YouTube to determine what happened in this case."

Adidas offered BuzzFeed News a statement dated Nov. 23 and added, "we recognize that this situation is clearly unacceptable and have taken immediate action, working closely with Google on all necessary steps to avoid any reoccurrences of this situation." Less than an hour after their initial responses, the flu shot videos appeared to have been deleted from YouTube entirely.

UPDATE

This piece has been updated to clarify that the 10,000 employees include content moderators as well as others who will address content that violates the company's rules.

