The Pelosi Video Was Altered. Why Won’t Facebook Just Say That?

Facebook's handling of a forged video of House Speaker Nancy Pelosi baffles leading researchers.


With the now infamous Nancy Pelosi video that spread across the internet last week, Facebook had a clear opportunity to set a precedent for how it will handle the manipulated political videos certain to flood its pages during the 2020 US presidential election. And according to researchers who study misinformation and design, Facebook blew it.

The altered video of Pelosi first appeared online last week and spread widely on Facebook, YouTube, and other platforms. It had been purposefully slowed down to make the Democratic leader's speech sound slurred and garbled. YouTube promptly pulled it, citing a violation of “clear policies that outline what content is not acceptable to post.” Twitter declined to comment. Facebook left the video up, but reduced its distribution in News Feed and directed anyone who shared it to “Additional reporting on this” via a pop-up window that pointed to fact-checks. What Facebook did not do was tell people the video had been manipulated. And this, the experts say, was a mistake.

“Referring to 'additional reporting' is pretty weak and unclear—it could mean almost anything,” Stephan Lewandowsky, a psychology professor at the University of Bristol, told BuzzFeed News. “It would have been better to unambiguously identify the video as altered (or better yet, ‘forged’) rather than pointing to something that sounds as though it may just provide more detail rather than a correction.”

Calling a video altered or “forged” is a scary thing for a company like Facebook. It involves making a statement about a piece of media, rather than offloading that statement (and the accompanying accountability) to third-party fact-checkers. Facebook has backed away from making such calls in the past, and even from the appearance of making them, arguing it shouldn’t add “disputed” flags to misinformation because, among other reasons, these flags can backfire “and further entrench someone’s beliefs.” Yet when it comes to manipulated videos like this one, there’s nothing to dispute — they are altered without question. And the backfire effect research that Facebook has linked to in the past was conducted by Lewandowsky himself, who said the effect is not strong enough to warrant leaving a straightforward label off the video.

“Under some circumstances, i.e. with worldview-challenging material in particular, there is some evidence for a backfire effect. However, the keywords here are some and some,” Lewandowsky said. “So the question then becomes whether you gain more by being explicit and taking the risk with a backfire effect, or by sticking to ‘additional information’ (thus avoiding backfire) but being insufficiently explicit for the majority of people who might not be susceptible to a backfire effect. This is a difficult question that does not have a one-size-fits-all answer. However, given the relatively infrequent occurrence of backfire effects — they occur less frequently than initially thought — I would lean towards being more explicit as this might maximize the overall impact even if the occasional person backfires.”

Aviv Ovadya, a disinformation researcher who has warned of “reality apathy,” or a world where misinformation is so prevalent that people stop paying attention to news because of the difficulty discerning real from fake, similarly argued that Facebook’s labeling didn’t go far enough. Finding a ground truth is complicated, he said, but looking at whether an original video has been modified is more straightforward. “There is an opportunity to not waffle around with ‘there is additional reporting on this,’ but to actually describe the forms of distortion,” he said.

When it comes to video, Ovadya said, Facebook has already built the technology that can be used to deliver further information about manipulated content: its advertising system, which it can use to explain to people that the videos they’re watching are altered. “There’s this frame of, ‘Oh we can’t do anything.’ Well actually, you already do things that relate to video in terms of ads. And you have the infrastructure in place,” he said.

Don Norman, director of the Design Lab at UC San Diego, took a harder line. “Only accept unaltered media,” he told BuzzFeed News. “Stating that a video has been altered is useless. We know that people will still watch and the disclaimer will not be understood. If it was altered — do not allow it. This also means no cropping or enhancement of photos. Even if all it does is make them better.”

Facebook did not comment.

On Wednesday, Pelosi addressed the video, condemning Facebook’s decision to leave it up. “We have said all along, ‘Poor Facebook, they were unwittingly exploited by the Russians.’ I think wittingly, because right now they are putting up something that they know is false. I think it’s wrong,” Pelosi told the radio station KQED.

The altered video is now approaching 3 million views, and many of the comments suggest that viewers believe it’s real. It’s a foreboding moment, with an era filled with manipulations of this sort right around the corner.
