Facebook Claims It Can Now Detect Revenge Porn Automatically — Though It Didn't Explain How

How can Facebook's machine learning and artificial intelligence systems tell whether a nude photo or video was uploaded consensually?

Facebook claimed on Friday that it can now detect revenge porn videos and images posted on both Facebook and Instagram before anyone reports them, though it didn't explain how its machine learning and artificial intelligence systems determine whether the content was uploaded consensually.

“Finding these images goes beyond detecting nudity on our platforms,” wrote Antigone Davis, Facebook’s global head of safety. “By using machine learning and artificial intelligence, we can now proactively detect near nude images or videos that are shared without permission on Facebook and Instagram.”

Facebook said that once its technology detects what it thinks is revenge porn, a human moderator will review the content and remove it if it violates the company's Community Standards. In most cases, Facebook will also disable the account that shared it.

This isn’t the first time Facebook has announced measures to curb revenge porn shared on its platforms. In 2017, the company started letting users proactively upload their own nude photos that they think may be distributed without consent. It then claimed a “specially-trained representative” would review the images and create a unique, machine-readable numerical fingerprint, known as a hash, to prevent future uploads of the same image.
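Facebook hasn't published the details of its hashing scheme, but the general technique, perceptual hashing, is well documented. The sketch below is a hypothetical illustration in Python using the Pillow imaging library: it fingerprints an image with a simple "average hash" and flags re-uploads whose fingerprints are nearly identical. The file names, blocklist, and matching threshold are all assumptions for illustration, not Facebook's actual system.

```python
# Hypothetical sketch of hash-based re-upload blocking. Facebook has not
# disclosed its algorithm; this uses an "average hash," one common
# perceptual-hashing technique. File names and thresholds are illustrative.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Shrink the image, grayscale it, and build a 64-bit fingerprint:
    each bit records whether a pixel is brighter than the image's mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Count differing bits; a small distance means near-identical images."""
    return bin(a ^ b).count("1")

# A hypothetical blocklist built from hashes of victim-reported images.
blocked_hashes = {average_hash("reported_image.jpg")}

def is_blocked(upload_path: str, threshold: int = 5) -> bool:
    """Flag an upload if its fingerprint sits within `threshold` bits of
    any blocked hash, so minor crops or re-compressions still match."""
    h = average_hash(upload_path)
    return any(hamming_distance(h, b) <= threshold for b in blocked_hashes)
```

Unlike a cryptographic hash, a perceptual hash changes only slightly when an image is resized or re-compressed, which is why matching here compares bit distance rather than requiring exact equality.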

Facebook also said that it would launch a support hub for revenge porn victims, where they can find organizations and resources to support them, along with information about the steps they can take to remove their pictures and videos from the platform and prevent them from being shared further.

