After The Discovery Of A Pedophile Ring, YouTube Will Disable Comments On Some Videos Featuring Children

A small number of channels can keep comments enabled, but they must actively moderate them.

YouTube is disabling comments on some videos featuring young children in response to public and advertiser backlash following the discovery of an organized pedophile ring sharing disturbing content on the platform.

The video-sharing company, which is owned by Google’s parent company Alphabet, said it will no longer allow users to leave comments on videos featuring children 13 and under. It will also disable comments on videos featuring older minors, up to age 18, that YouTube believes risk attracting predatory behavior.

YouTube’s decision follows the discovery of a network of pedophiles who were using its platform to find and share videos of young children dancing and doing gymnastics. They commented on the videos, leaving timestamps and details noting when children were in states of undress or in suggestive positions.

“We will begin suspending comments on most videos that feature minors, with the exception of a small number of channels that actively moderate their comments and take additional steps to protect children,” a company spokesperson said in a statement Thursday. “We understand that comments are an important way creators build and connect with their audiences, [but] we also know that this is the right thing to do to protect the YouTube community.”

Media outlets, including BuzzFeed News, and concerned members of the public have long scrutinized YouTube for hosting harmful and exploitative videos, including content that depicted minors in abusive or violent situations, encouraged self-harm, or attracted pedophiles. While YouTube had previously banned only specific channels and users that uploaded or shared these videos, Thursday’s decision marks the company’s most comprehensive action to date, one that could significantly affect engagement on kids’ videos and the revenue they generate.

The company said it expects the change to take several months to implement. It also plans to launch a new automated tool that will detect and remove twice as many individual comments that violate its policies.

“While we have been removing hundreds of millions of comments across the entirety of YouTube every quarter for violating our policies, we had been working on an even more effective classifier,” the company said in an email to advertisers.

Following the publication of stories over the last two weeks about videos encouraging the exploitation of minors and self-harm, several companies, including Hasbro, Disney, and AT&T, pulled their ads from the platform.

On Thursday, the company also said that “no form of content that endangers minors is acceptable on YouTube,” whether on its main platform or its kids-focused app. To that end, it terminated the channels of FilthyFrankClips and other users who were found to be splicing self-harm clips into longer videos featuring cartoons.
