BuzzFeed News Home Reporting To You


Facebook Won’t Accept New Political Ads The Week Before The Election — But Older Ads With Lies Are Still OK

Facebook will also impose restrictions on forwarding links in Messenger.

Posted on September 3, 2020, at 9:20 a.m. ET

Pool / Getty Images

Facebook CEO Mark Zuckerberg

Facebook will stop accepting political advertising in the United States a week before Election Day on Nov. 3, CEO Mark Zuckerberg announced in a post on Thursday.

But the social network will continue showing users all political ads that candidates or political action committees buy before that day, and continue to let these groups adjust who they target.

Candidates for political office will also still be able to run ads containing lies.

In Thursday’s announcement, Zuckerberg claimed he is adding the one-week ban on new ads prior to the election because he is “concerned” about the challenges people could face while voting.

“I’m also worried that with our nation so divided and election results potentially taking days or even weeks to be finalized, there could be an increased risk of civil unrest across the country,” Zuckerberg said.

Social networks are facing increasing pressure to police political advertising on their platforms ahead of the US elections, and some critics have urged tech companies to stop running political ads altogether. Last year, Twitter banned all political advertising from its platform, and Google restricted micro-targeting of political ads on certain products.

Beyond announcing that it would let people across Facebook and Instagram turn off political advertising altogether, however, Facebook hasn’t done much. Earlier this year, the company declined to change its policies allowing politicians to lie in ads, saying that it didn’t think that “decisions about political ads should be made by private companies.”

Critics have accused Facebook of accepting ad money from politicians who use its platform to spread misinformation and propaganda. Last year, the Trump campaign released a 30-second video ad on Facebook that falsely claimed that Democratic presidential nominee Joe Biden offered a $1 billion bribe to officials in Ukraine to stop investigating his son. Facebook declined to take the ad down, saying that it did not violate the company’s policies. In July, the Biden campaign wrote to Zuckerberg asking him to revise the social network’s policies around free speech ahead of the November elections.

The company also hasn’t always been transparent about how it enforces its political advertising policies. A Sunday report in the Wall Street Journal claimed that Facebook declined to act after it discovered that India’s ruling nationalist Bharatiya Janata Party was circumventing its policies that required political parties to disclose their identities.

In his post on Thursday, Zuckerberg wrote that Facebook would also take a series of steps to prevent voters from being misinformed about the elections through the platform, such as removing lies claiming that going to vote could get people infected with the coronavirus, and removing claims like “you can send in your mail ballot up to 3 days after election day” that could cause somebody to lose the chance to vote.

Facebook will also try to slow the spread of misinformation through Messenger, the company’s instant messaging app. Starting this month, users will no longer be allowed to forward links in Messenger to more than five people, a rule that Facebook first put in place on WhatsApp, the company’s other instant messaging app, in countries like Brazil and India to slow down rumors and hoaxes.

Zuckerberg also claimed that Facebook would continue to “ramp up enforcement” against militias, conspiracy networks like QAnon, and other groups that could be used to promote violence in the weeks after the elections. “We have already removed thousands of these groups and removed even more from being included in our recommendations and search results,” he wrote.

Last week, however, a BuzzFeed News report showed that Facebook let a militia page remain on the platform despite users reporting it to the company 455 times. The page had urged followers to bring weapons to the protest in Kenosha, Wisconsin, where a 17-year-old suspect allegedly shot and killed two protesters.

Facebook employees have been left frustrated at how the company is responding to political and civil unrest, including the violence in Kenosha. For months, staffers have been speaking out as they reckon with the real-world effects the social media giant is having on democracies around the world.

“Come November,” Yaël Eisenstat, Facebook's former election ads integrity lead, told BuzzFeed News in July, “a portion of Facebook users will not trust the outcome of the election because they have been bombarded with messages on Facebook preparing them to not trust it.”

