BuzzFeed News


Pornhub Banned Deepfake Celebrity Sex Videos, But The Site Is Still Full Of Them

More than 70 deepfake videos — depicting graphic fake sex scenes with Emma Watson, Scarlett Johansson, and other celebrities — were easily searchable from the site's homepage using the search term "deepfake."

Last updated on April 18, 2018, at 6:13 p.m. ET

Posted on April 18, 2018, at 5:14 p.m. ET

Last February, Pornhub announced that it would no longer tolerate machine learning–powered deepfake videos on its platform. The site said the videos — which feature realistic celebrity faces swapped onto the bodies of adult film actors — were a form of nonconsensual content and would be purged from the site, which averages over 100 billion video views a year. But despite that initial pledge, celebrity deepfake porn videos continue to thrive on Pornhub.

While banned material frequently slips through the cracks on large sites that allow users to upload content, the deepfake violations on Pornhub are especially flagrant. Shortly after the ban in February, Mashable reported that there were dozens of deepfake videos still on the site. Pornhub removed those videos after the report, but a few months later, BuzzFeed News easily found more than 70 deepfake videos using the search term "deepfake" on the site's homepage. Nearly all the videos — which included graphic, fake depictions of celebrities like Katy Perry, Scarlett Johansson, Daisy Ridley, and Jennifer Lawrence — had the word "deepfake" prominently in their titles, and many of the uploaders' usernames contained the word "deepfake" as well. Similarly, a search for "fake deep" returned over 30 of the nonconsensual celebrity videos.

Most of the videos surfaced by BuzzFeed News had view counts in the hundreds of thousands — one video featuring the face of actor Emma Watson garnered over 1 million views. Some accounts posting deepfake videos appeared to have been active for as long as two months and had racked up over 3 million video views.

Some of the nonconsensual deepfake videos also appeared to be monetized: BuzzFeed News observed a K-Y lubricant pre-roll ad playing before a fake, graphic video of Emma Watson.

Similarly, Pornhub's algorithms appeared to be promoting the nonconsensual content.

After BuzzFeed News clicked through the videos, Pornhub's recommendation algorithm surfaced more deepfake videos on its homepage (on its press page, Pornhub says the site receives over 80 million daily visits to its homepage).

In response to a series of detailed questions, Pornhub provided BuzzFeed News with a boilerplate statement from company vice president Corey Price.

"Content that is flagged on Pornhub that directly violates our Terms of Service is removed as soon as we are made aware of it; this includes non-consensual content," the statement read. "To further ensure the safety of all our fans, we officially took a hard stance against revenge porn, which we believe is a form of sexual assault, and introduced a submission form for the easy removal of non-consensual content." The company also provided a link where users can report any "material that is distributed without the consent of the individuals involved."

Pornhub did not comment on why the deepfake videos were allowed to stay up on the site, or on whether the site actively polices deepfake uploads beyond relying on user reports of offending material. Shortly after the company was contacted by BuzzFeed News on Tuesday morning, a search for "deepfake" on Pornhub returned zero results, suggesting some of the offending videos had been removed. However, a different search Tuesday afternoon for "deep fakes" turned up dozens of videos with view counts in the hundreds of thousands. It is unclear whether Pornhub has any official policy against running ads before videos it deems "non-consensual," or whether the site plans to modify its recommendation algorithms so that its homepage stops surfacing new videos from categories it deems "non-consensual."

Beyond the obvious concerns — that the videos are potentially harmful to the celebrities depicted in them and are posted without their consent — the proliferation of deepfake videos on Pornhub appears to be fueling secondary markets for the creation and sale of more celebrity deepfake videos. One account uploading fake celebrity videos, allegedly located in Russia, directed visitors to its deepfakes store and invited them to connect via the secure messaging app Telegram. "What we have for sale: 500 deep fakes and number is growing you can purchase them one by one or in one full pack with a big discount," the account's page read, before advertising additional services, including "access to private forums," a "guide on making deep fakes," and a tutorial on "how to make money selling deep fakes."

Other Pornhub videos began with a call for interested visitors to purchase custom-made deepfakes with cryptocurrency.

Given Pornhub's ineffective policing of celebrity deepfake videos, and as the machine-learning technology behind them becomes more sophisticated and its output more realistic, this content will likely continue to spread. And though the stakes for those depicted in the videos are getting higher, those hosting the content don't seem to be any more vigilant.
