Facebook’s rules to combat misinformation and hate speech are subject to the whims and political considerations of its CEO and his policy team leader.
“How are the emojis being recommended in this and can we remove this so this doesn’t perpetuate Asian racial stereotypes?”
In an internal post, the company outlined how it will try to protect people opposing Myanmar’s military coup.
Following An Insurrection, Lawmakers And State Attorneys General Are Asking Facebook To Halt Ads Of Military Gear
“Whether through negligence or with full knowledge, Facebook is placing profit ahead of our Nation’s democracy.”
Facebook Says It’s Standing Up Against Apple For Small Businesses. Some Of Its Employees Don’t Believe It.
“It feels like we are trying to justify doing a bad thing by hiding behind people with a sympathetic message.”
"We all get the privilege of seeing the future because we are making it.”
A departing Facebook employee said the social network's failure to act on hate speech “makes it embarrassing to work here.”
Internal data shows that labels on President Trump’s posts decrease reshares by about 8%. They still account for some of the most engaging posts on the platform.
Facebook Has A Rule To Stop Calls To Arms. Moderators Didn't Enforce It Ahead Of The Kenosha Shootings.
“What we learned after Kenosha is that Facebook’s call to arms policy didn’t just fail. It was never designed to function properly in the first place.”
Facebook’s CEO also said Steve Bannon’s comments about beheading government officials did not warrant his complete removal from the platform.
The metric, which assesses the potential for danger based on keywords, rose to 580 from 400 this week — a 45% increase.
On the same day Facebook announced only 51% of its employees believed the social network was having a positive impact on the world, a vice president commended their efforts to prepare for the 2020 presidential election.
“This is a measure we put in place in the lead-up to Election Day. We will assess when to lift them afterwards, but they are temporary.”
Despite the request, Facebook said it registered an estimated 4.4 million people to vote in the upcoming presidential election.
In a companywide meeting, Facebook’s CEO said recent content rules banning hate and conspiracy content were implemented because of the US presidential election and that a wide margin of victory for either candidate could prevent violence following the vote.
A new report finds “Facebook is routinely behind the curve in cracking down on domestic extremists on its platform.”
Facebook Took Down A Trump Post That Compared COVID-19 To The Flu And Said That We Should Just Learn To Live With It
Twitter placed a misinformation label on the same message but did not remove it.
A 6,600-word internal memo from a fired Facebook data scientist details how the social network knew about specific examples of global political manipulation — and failed to act.
“I don’t know what the damn problem is at Facebook with anti-Muslim hate, but I would just say at this point that they don’t seem to care.”
Facebook said it removed a militia event associated with the shooting of three protesters in Kenosha, Wisconsin. It didn't. Here's what really happened, and why it could happen again.
A Kenosha Militia Facebook Event Asking Attendees To Bring Weapons Was Reported 455 Times. Moderators Said It Didn’t Violate Any Rules.
CEO Mark Zuckerberg said that the reason the militia page and an associated event remained online after a shooting that killed two people was due to “an operational mistake.”
At a companywide meeting, Facebook's CEO also said President Donald Trump brought up China during an October dinner at the White House.
Facebook Employees Are Outraged At Mark Zuckerberg's Explanations Of How It Handled The Kenosha Violence
Following days of violence and civil unrest, Facebook employees wonder if their company is doing enough to stifle militia and QAnon groups stoking violence on the social network.
Ankhi Das expressed regret in an internal Facebook post — but some Muslim employees think the company needs to go further.
Facebook’s employees and fact-checking partners say they are left in the dark about how the company decides what content stays up and what comes down.
“I certainly think that there are valid national security questions about having an app that has a lot of people’s data that follows the rules of another country, a government that is increasingly kind of seen as a competitor.”
Facebook Fired An Employee Who Collected Evidence Of Right-Wing Pages Getting Preferential Treatment
Facebook employees collected evidence showing the company is giving right-wing pages preferential treatment when it comes to misinformation. And they’re worried about how the company will handle the president’s falsehoods in an election year.
“Facebook Is Hurting People At Scale”: Mark Zuckerberg’s Employees Reckon With The Social Network They’ve Built
As the US heads toward a crucial and contentious presidential election, the world's largest social network is facing an unprecedented cultural crisis.
“It’s encouraging to see Zuck post this, but I’ll maintain my skepticism until some sort of action is taken by the company,” one Facebook employee told BuzzFeed News.