As votes are being tallied across the country to determine the next US president, internal Facebook data shows that the company has seen a significant increase in what it calls “violence and incitement trends.”
In a post to a group on Facebook’s internal message board, one employee alerted their colleagues to a nearly 45% increase in the metric, which assesses the potential for danger based on hashtags and search terms, over the last five days. The post, which was seen by BuzzFeed News, notes that trends were previously rising “slowly,” but that recently “some conspiracy theory and general unhappiness posts/hashtags” were gaining popularity.
“The ‘probable violence and incitement’ metric for explore has been slowly rising over the last few days,” the post’s author wrote, referring to Facebook’s “Explore” search tool. “The risky hashtags seem conspiracy-theory-esque.”
The internal metric, which has not previously been reported, stood at a score of around 400 on Oct. 31. It rose to a 24-hour average of nearly 580 as of Thursday morning.
Although Facebook has a policy on violence and incitement, the existence of a metric shows the extent to which the company actively measures activity on its platform that can foster unrest. In recent years, the social media platform was widely criticized for inciting genocide in Myanmar, fostering political unrest in the Philippines, and abetting the organization of domestic extremism in places like Kenosha, Wisconsin.
The rapid uptick in the “violence and incitement” number indicates that the company's own internal metrics have found Facebook posts are contributing to an unstable situation around the counting of ballots in the US presidential election as President Donald Trump and his supporters attempt to inject unfounded doubts into the process.
In the post, the Facebook employee highlighted a number of hashtags ranging from popular Trump slogans, including #DrainTheSwamp, #Trump2020, and #KeepAmericaGreat, to “qanon-esque” sayings like #WatchTheWater. QAnon is the collective delusion pushed by a growing number of the president’s supporters and Republican politicians that a secret cabal of deep state operatives and people who sexually abuse children control the government.
These hashtags, which are gaining popularity, point to an overall theme, the employee wrote, of “democrats stealing the election, not trusting the results, and supporting Trump.”
“I wouldn’t panic because most of these hashtags aren’t very prevalent, but it does seem to be a building trend that ideally we stop before it gains momentum,” the employee said in their post. “While these hashtags aren’t as explicitly risky as #RiggedElection and #VoterFraud, they do promote similar ideas; given we have approval to block those hashtags, it would be great if we could come up with a standard for these ‘borderline’ hashtags.”
The employee’s comments suggest that the company has already been monitoring and suppressing hashtags that seed doubt about the election process and violate its terms of service. Facebook spokesperson Liz Bourgeois declined to say which hashtags the company had suppressed, or how long the "violence and incitement" metric had been in use.
"We're staying vigilant in detecting content that could incite violence during this time of heightened uncertainty," she said. "We've readied products and policies in advance of this period so we can take action quickly and according to our plans."
Nina Jankowicz, a disinformation researcher and fellow at the Woodrow Wilson International Center for Scholars, said she wasn’t aware Facebook had a metric for “violence and incitement” trends and was heartened that they were tracking it. Still, she said, suppression of individual hashtags “is not going to do the trick.”
“We’re talking about the broader structure of Facebook that incentivizes these communities to organize and foster offline violence,” Jankowicz said. “I’m not sure they have a handle on it at all. It’s a structure that they’re relying on to keep people engaged and make money these days.”
The employee’s report on the “violence and incitement” metric came hours before the social network banned a fast-growing Facebook group called Stop the Steal, due in part to calls for violence circulating among its more than 365,000 members. The group was formed on Wednesday by apparent Republican operatives, and members used it to sow discord about the vote-counting process, organize protests, and make threats.
“In line with the exceptional measures that we are taking during this period of heightened tension, we have removed the group ‘Stop the Steal,’ which was creating real-world events,” Facebook spokesperson Andy Stone said in a statement. “The group was organized around the delegitimization of the election process, and we saw worrying calls for violence from members of the group.”
While Facebook acted after multiple complaints and press questions about the group, its organizers had already prepared for a scenario in which it was banned. A sign-up process for the group on Thursday morning asked potential members to join a direct mailing list “in the event that social media censors this group.”
On a companywide call on Thursday, Facebook CEO Mark Zuckerberg and Vice President of Global Affairs Nick Clegg said the group was removed for attempting to delegitimize the vote-counting process. Clegg, the former United Kingdom deputy prime minister, added that Facebook had been “prepared for this,” despite the group amassing more than 360,000 members in less than 48 hours.
Prior to the election, the company temporarily disabled the recommendation of political and social issue groups, though Stop the Steal seemed to spread organically across the platform without any algorithmic distribution.
Wisconsin Sen. Tammy Baldwin criticized Facebook for its slow response to groups like Stop the Steal, adding that it’s time the company started “providing answers.”
“Mark Zuckerberg needs to stop getting caught flat-footed when Facebook is used to promote violence and undermine democracy,” she said in a statement. “Facebook has the tools to track violence and now they need to use their tools to stop its spread on their platform.”
“It’s worrisome that Facebook continues to refuse to make meaningful changes to the products and features that have become essential tools for violent groups,” said Virginia Sen. Mark Warner. “When the risks of a poorly designed product are foreseeable, businesses that don’t take steps to fix their products to address those risks are normally held liable for the harms they cause.”
During a Thursday all-hands meeting, Zuckerberg opened by discussing the election and thanking his employees for their work in preparing for the contentious vote. He answered questions about when employees would return to the office and the progress of the company’s cryptocurrency arm, and said that any regulation of the company may be harder to achieve because the country appears to be heading toward a split government.
One question, however, about how to make Facebook “filled with more love and less divisiveness” gave him pause. Election results so far seem to indicate that the country is pretty divided, he said.
“My hope is that by building up a strong ecosystem of communities — where people can join groups meaningful to them — that’s how people can build a stronger social fabric,” Zuckerberg said.
An hour earlier, Facebook had banned a group trying to undermine US democracy. Zuckerberg did not acknowledge this seeming contradiction and moved on. He closed his discussion with employees by answering questions about stress-eating and his least favorite Halloween candy.
“Milky Way,” he said. “I’ve never been a Milky Way person.” ●