In an effort to crack down on the disturbing and exploitative content plaguing its family-friendly videos, YouTube has purged hundreds of thousands of videos in the last week. The company has also instituted a policy of disabling the commenting feature on any video suspected of having predatory comments (that is, those that are sexual in nature, appear intended to convince children in videos to act in a sexualized manner, or attempt to get the children to engage privately on a separate platform). But the platform's steps to rid the site of exploitative content often fail to stop a crucial party: the predatory commenters themselves.
A BuzzFeed News review of YouTube videos with exploitative and predatory comments directed at children shows that, while the company is quick to act on the videos themselves, in at least 13 instances the commenters were never suspended for their posts.
Of the 13 videos, 11 were deleted entirely shortly after they were reported to YouTube, and two had comments disabled. But the accounts of users engaging in potentially predatory behavior — leaving sexualized compliments, making troubling requests (such as asking children to disrobe or give themselves wedgies), or asking for or providing phone numbers or social media handles to get in contact — did not appear to be deleted.
The videos — which were provided to BuzzFeed News and then reported by a member of YouTube's Trusted Flagger Program — were mostly posted by children's accounts. While few, if any, of the videos were likely made for the purpose of exploiting children, all depict young kids in situations that elicited predatory or explicit comments. Many of the videos were of the popular "challenge" genre (similar to the Ice Bucket Challenge), in which YouTubers compete in any number of competitions or scenarios for fans. The flagged videos reviewed by BuzzFeed News included children in bathing suits, in the shower, or engaging in actions that might be targeted by those with fetishes, such as the "hold it in" or "try not to pee" challenges.
BuzzFeed News provided four such examples to YouTube on Thursday morning for guidance as to why the accounts were still active. Shortly after, YouTube told BuzzFeed News the accounts "seem to be old as our teams just looked for some of the accounts/videos and some actions have been taken." However, none of the predatory commenters' accounts had been taken down. Roughly one hour later, the four accounts were taken down. When asked for comment, a YouTube spokesperson provided BuzzFeed News with the following statement:
"Last week we took action to shut comments down on tens of thousands of videos at scale. Our teams are now methodically reviewing the accounts behind the inappropriate and unacceptable comments, terminating these accounts, and reporting illegal predatory behavior to NCMEC. We have shut down hundreds of predatory accounts in the last week and we continue to work to terminate more."
Below are some examples of videos that YouTube deleted or disabled comments on — but didn't disable the predatory commenters' accounts.
Here's a screenshot of a video of some kids swimming. Before it was reported, it had 49 comments.
Among those comments were a number of predatory ones:
Shortly after the video was reported, YouTube disabled its comments.
But the accounts that left the comments were still active 24 hours later.
YouTube deleted a number of these commenters' accounts after BuzzFeed News provided screenshots to the company. This account was left up for more than 36 hours after the video it commented on had comments disabled.
Another video — this one of two young girls practicing gymnastics moves in leotards — was flagged by BuzzFeed News and reported by a member of YouTube's Trusted Flagger program for predatory comments.
Shortly after the video was flagged for review, the video's comments were disabled.
But when BuzzFeed News searched for the user who'd sent the comment, the user's anonymous account was still up.
In another instance, a video from a child's account showing the kid participating in a challenge — in which children film themselves taking a cold shower — was flagged after it received multiple inappropriate comments.
Less than 24 hours later, the video was taken down by YouTube. The commenters, however, were seemingly not penalized. Their accounts remained active long after the video was removed.
This pattern held across videos BuzzFeed News reviewed both before and after they were flagged.
Like this video of a young shirtless boy attempting an ice bath challenge. After it was flagged for a predatory comment by a user with the name "Yoga Fanirl," the video was taken down.
Over 36 hours later, the "Yoga Fanirl" account was still active. According to one flagger, had the account been taken down, the page would display a red bar with white text saying "This account has been terminated for violating YouTube's Community Guidelines."
Policing predatory commenters appears to be especially challenging for YouTube, which has hundreds of hours of content uploaded every minute. To combat the issue of scale, YouTube has committed to employing and investing in machine learning to help flag and moderate videos when humans cannot. Still, the issue of moderation can be treacherous: Some comments that appear to be predatory toward children may in fact be innocent. In other situations, comments that appear innocuous — such as commenters asking child vloggers to participate in a particular YouTube "challenge" — may be predators goading the children into creating custom fetish content.
The distinction between predatory and innocent accounts isn't always clear. But the problem persists, despite YouTube's efforts. One volunteer moderator told the BBC — and later BuzzFeed News — that they believe between 50,000 and 100,000 predatory accounts remain across the platform. While YouTube's disabling of comments is a swift correction, without disabling the predatory accounts themselves, it appears to be only a cosmetic one.