YouTube Is Assembling New Teams To Spot Inappropriate Content Early

Following a series of scandals and advertiser boycotts, YouTube is taking a more proactive approach to hateful and offensive content.

YouTube is planning to proactively seek out and police inappropriate or offensive content following public backlash over its repeated failures to keep hateful, exploitative, or otherwise unsavory videos off its platform.

The company is creating what it calls an "Intelligence Desk," a multipronged "early detection" initiative intended to ferret out controversial content before it spirals into a bigger problem, BuzzFeed News has learned. The desk is part of a broader push to improve YouTube’s content moderation system following a series of humiliating failures.

YouTube’s Intelligence Desk will rely on Google data, user reports, social media trends, and third-party consultants to detect inappropriate content early, and either remove it or prevent advertiser messaging from appearing near it. The Intelligence Desk was described in an advertiser briefing obtained by BuzzFeed News.
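To make the concept concrete, here is a purely illustrative Python sketch of what combining such signals into an "early detection" flag might look like. The signal names, weights, and threshold below are hypothetical assumptions for illustration; the advertiser briefing does not describe how the Intelligence Desk actually scores or prioritizes content.

```python
# Illustrative only: hypothetical signals and weights, not YouTube's actual system.
from dataclasses import dataclass


@dataclass
class ContentSignals:
    user_reports: int          # reports filed against the video
    social_trend_score: float  # 0-1, how quickly it is spreading off-platform
    consultant_flagged: bool   # flagged by a third-party consultant


def needs_early_review(signals: ContentSignals, threshold: float = 1.0) -> bool:
    """Combine weighted signals into one score; above the threshold, route the
    video to reviewers before ads can appear near it."""
    score = (0.01 * signals.user_reports
             + 1.5 * signals.social_trend_score
             + (2.0 if signals.consultant_flagged else 0.0))
    return score >= threshold


# Example: a lightly reported video that is trending fast off-platform gets flagged.
print(needs_early_review(ContentSignals(user_reports=20, social_trend_score=0.8,
                                        consultant_flagged=False)))  # True
```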

A YouTube spokesperson confirmed the Intelligence Desk's creation in a statement to BuzzFeed News, stating: "As we outlined in a blog in December, we're expanding our work against bad actors trying to abuse our platform. This includes hiring more people working to address potentially violative content and increasing our use of machine learning technology. We can confirm that part of those efforts will include assembling new teams dedicated to protecting our platform against emerging trends and threats."

The establishment of an Intelligence Desk comes amid a period of turmoil for YouTube. The platform has come under repeated fire for, among other things, allowing major corporations’ ads to appear next to extremist content, and for failing to recognize it was hosting a large number of videos depicting children in disturbing and abusive situations. Earlier this month, YouTube was widely criticized for its slow response to a video from vlogger Logan Paul that appeared to show a dead body in Japan’s suicide forest; the video ranked among the site's top 10 trending videos before it was removed.

A team that brings together Google data, social media trends, and third-party expertise to spot such videos early would presumably help YouTube manage incidents like the Paul video more quickly, heading off advertiser and user outrage.

John Montgomery, executive vice president of brand safety at GroupM, a major media buying agency, welcomed the more proactive moderation stance. “This will hopefully help Google to anticipate any negative content trends and allow them to nip them in the bud before they become serious issues,” he told BuzzFeed News. “It’s a decisive move in the right direction.”

This isn't the first time YouTube has turned to technological solutions for its content moderation problems. In its early days, the platform was plagued by piracy, which turned off advertisers. In response, it built a system called ContentID to detect copyrighted content, which not only kept advertisers happy but also let rights holders monetize videos uploaded by third parties.
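For readers unfamiliar with how such a matching system might work in principle, here is a minimal conceptual sketch: uploads are fingerprinted and compared against a database of rights holders' reference files. The function names and the hash-based "fingerprint" are illustrative assumptions; ContentID's actual matching technology is proprietary and far more robust than this, and it is not described in this article.

```python
# Conceptual sketch only: a real system uses perceptual audio/video fingerprints
# that survive re-encoding and editing, which a plain hash does not.
import hashlib

# Hypothetical reference database: fingerprint -> rights holder
reference_fingerprints: dict[str, str] = {}


def fingerprint_clip(clip_bytes: bytes) -> str:
    """Stand-in for a perceptual fingerprint of an audio/video clip."""
    return hashlib.sha256(clip_bytes).hexdigest()


def register_reference(clip_bytes: bytes, rights_holder: str) -> None:
    """Rights holders register reference files to be matched against uploads."""
    reference_fingerprints[fingerprint_clip(clip_bytes)] = rights_holder


def check_upload(clip_bytes: bytes) -> str | None:
    """Return the matched rights holder (who could then monetize or block the
    upload), or None if no reference matches."""
    return reference_fingerprints.get(fingerprint_clip(clip_bytes))
```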

The YouTube document also indicated the platform will partner with more than 100 NGOs, government entities, and academics in an effort to add greater expertise to its handling of controversial content.

YouTube is making a series of changes to its platform in an effort to win back advertiser trust. On Tuesday, it said creators on its platform would now need 4,000 hours of total watch time in the previous 12 months and 1,000 subscribers in order to get paid. YouTube will also add human vetting to all Google Preferred videos before ads can run on them. And the company is planning to add 10,000 content moderators by the end of 2018.
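The new monetization bar amounts to a simple two-part rule. The sketch below encodes the thresholds reported above; the function name and structure are hypothetical, not YouTube's actual code or API.

```python
# Thresholds as reported in the article; everything else is illustrative.
WATCH_HOURS_THRESHOLD = 4_000   # total watch time in the previous 12 months
SUBSCRIBER_THRESHOLD = 1_000


def is_monetization_eligible(watch_hours_last_12_months: float,
                             subscribers: int) -> bool:
    """A channel must clear both bars before it can earn ad revenue."""
    return (watch_hours_last_12_months >= WATCH_HOURS_THRESHOLD
            and subscribers >= SUBSCRIBER_THRESHOLD)


# Example: 3,500 watch hours with 2,000 subscribers does not qualify;
# 4,200 watch hours with 1,050 subscribers does.
assert is_monetization_eligible(3_500, 2_000) is False
assert is_monetization_eligible(4_200, 1_050) is True
```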

YouTube has been promising to do better by advertisers for months. “We work hard every day to earn our advertisers’ and agencies’ trust, and we apologize for letting some of you down,” YouTube CEO Susan Wojcicki said in May. “I’m here to say that we can and we will do better.”
