Sheryl Sandberg Says She's "Disgusted" By Hateful Ad Targeting As Facebook Makes Changes

"Seeing those words made me disgusted and disappointed – disgusted by these sentiments and disappointed that our systems allowed this," Facebook COO Sheryl Sandberg said.

Facebook is making changes to its ad platform in an attempt to prevent people from using it for hateful ad targeting.

On Wednesday afternoon, Facebook COO Sheryl Sandberg responded to a ProPublica report published last week that found advertisers could use Facebook to target people interested in topics like “Jew hater,” “How to burn jews,” and the Nazi Party. In a Facebook post, Sandberg called the targeting criteria totally inappropriate and said Facebook will make changes to prevent similar abuse in the future.

"Seeing those words made me disgusted and disappointed – disgusted by these sentiments and disappointed that our systems allowed this," she said.

Before last week's report, whenever someone entered text into Facebook's self-reported profile fields — education, employer, job title, and field of study — the ad system automatically turned that entry into a targeting option. So when people listed “Jew hater” as their field of study, “Jew hater” became an available targeting category. ProPublica, acting on a tip, found the criteria inside Facebook's ad system; the "Jew hater" category it discovered contained 2,274 people. To fix the problem, Facebook is adding human review to the process, hoping it will act as a firewall against something like this happening again.

Facebook is also working on a program "to encourage people on Facebook to report potential abuses of our ads system to us directly," Sandberg said. The company will also clarify its ad policies and tighten its enforcement of them, she added, without providing much more detail.

Sandberg acknowledged in her post that Facebook was unprepared for such abuse because it hadn't considered the possibility. "We never intended or anticipated this functionality being used this way – and that is on us. And we did not find it ourselves – and that is also on us," she said.

Facebook's inability to anticipate how less-than-altruistic people might abuse its products has been a long-running problem and has factored into a number of its biggest crises, from its fake news scandal to the shocking level of violence that's aired on Facebook Live. As BuzzFeed News' Mat Honan put it in April, "The problem with connecting everyone on the planet is that a lot of people are assholes."

Asked if the need to add human reviewers means there's a fundamental flaw with its technology, Facebook directed BuzzFeed News to this line in Sandberg's post: "The fact that hateful terms were even offered as options was totally inappropriate and a fail on our part."