Instagram Censored Posts About One Of Islam’s Holiest Mosques, Drawing Employee Ire
The photo-sharing app mistakenly removed content about the Al-Aqsa Mosque, the site of clashes between Israeli police forces and Palestinian worshippers, after associating the site with terrorism.
Instagram removed posts and blocked hashtags about one of Islam’s holiest mosques because its content moderation system mistakenly associated the site with a designation the company reserves for terrorist organizations, according to internal employee communications seen by BuzzFeed News. The mistake is just the latest content moderation failure by Instagram and its parent company Facebook, which has faced accusations from users around the world that it’s censored content about Israeli aggression toward Palestinians.
The error, which was flagged internally by upset employees on Tuesday, caused Instagram to remove or block posts with hashtags for the Al-Aqsa Mosque, the third-holiest site in the Islamic faith. Since Friday, the mosque has been the location of clashes between Israeli police forces and Palestinians, many of whom had visited the site to pray during the last days of Ramadan.
In an attempt to draw attention to the violence, Instagram users posted videos tagged with the hashtag #AlAqsa or its Arabic counterparts #الاقصى or #الأقصى, only to find that their posts had been taken down or hidden from search results. Some notifications showed that Instagram, which is owned by Facebook, removed the posts because they were associated with “violence or dangerous organizations.” When employees learned of the removals and the justification behind them, some filed internal complaints.
In one case, an employee saw that Instagram had removed an infographic describing the situation at Al-Aqsa, because of its association with “violence or a terrorist organization.” After the employee filed a grievance, they wrote in an internal post, they were informed that the image was taken down “based on a reference to ‘alaqsa’ which is a designated organization,” a Facebook term that refers to “dangerous individuals and organizations.” (The content was eventually restored following the complaint.)
“Both of these mistakes and many others are entirely unacceptable,” the Facebook employee wrote on an internal communications platform on Tuesday. “Al-Aqsa is the 3rd holiest site in Islam and is a central aspect of faith for some 1.8 billion people.”
Facebook’s censorship of posts about Al-Aqsa comes during a period of extreme tension and violence in the region. Since fighting broke out last week, 53 Palestinians, including more than a dozen children, and six Israelis have died, and more than 300 people have been wounded. As people have used Instagram and Facebook to disseminate information from the ground — from the forced evictions of Palestinians in the East Jerusalem neighborhood of Sheikh Jarrah to the violence at Al-Aqsa — some have found their posts blocked or removed.
For critics and even some employees, Facebook’s latest content moderation failures are evidence of the American company’s lack of understanding and resources in the region, and show how even careless mistakes can have an outsize impact when its products are used by more than 3 billion people around the world.
Facebook previously told Middle Eastern news outlet the National that posts with the Al-Aqsa hashtags “were restricted in error,” but an internal post obtained by BuzzFeed News on Wednesday went further, noting that the content was taken down because Al-Aqsa “is also the name of an organization sanctioned by the United States Government.”
A Facebook spokesperson declined to comment beyond what was in Wednesday’s internal post.
Last week, Palestinian Instagram users also complained that Instagram stories, or ephemeral videos and images that last for 24 hours on the platform, about the conflict were also being removed. On Friday, the company attributed that mistake to a bug on the social network that affected users sharing stories around the world.
Those mistakes have triggered reflection among some Facebook employees. In a post over the weekend, one employee wrote in an internal group that “the external perception is FB is timely silencing political speech and apologizing later.”
“Some of those incidents are human review errors and others are automated and I am not familiar with which is more prevalent but why can’t decision makers use the local expertise in the [Middle East and North Africa] region like Public Policy or Comms and consult with them before taking the decision on removing sensitive hashtags or political content,” they wrote, before sharing screenshots of various users complaining that their Instagram posts had been censored. They also noted that Instagram users around the world had started a campaign to give poor ratings to Instagram’s apps in the Google Play store.
In response, Guy Rosen, Facebook’s vice president of integrity, wrote a day later that the company had teams “triaging and unblocking any issues as they come up.”
That effort, however, did not prevent the continued removal of content about the Al-Aqsa Mosque, where the conflict began last Friday when Israeli police stormed the compound where Palestinians had gathered to observe the last Friday of the Muslim holy month of Ramadan. Complaints about the censoring of content with the Al-Aqsa hashtags continued into Tuesday, when the concerned employee reported the incorrect removal of a post.
While there is an armed Palestinian coalition in the West Bank known as the Al-Aqsa Martyrs' Brigades that’s been deemed a terrorist entity by the United States and European Union, and other similarly named organizations like the Al-Aqsa Foundation are considered part of its support network by the US government, the critical Facebook employee said this was no excuse for censoring the Al-Aqsa Mosque hashtags.
“If there was a designated group called Washington’s troublemakers and posts that simply mentioned the word Washington were being taken down it would have been entirely unacceptable,” they wrote. “I really want to emphasize that this portion of our userbase already feels alienated and censored and after having so many issues like these — be they technical or product based — our users will not give us the benefit of the doubt.”
On Wednesday, an employee on the company’s Dangerous Organizations and Individuals policy team wrote in their internal post that the term Al-Aqsa (الأقصى) “should not and does not violate our policies.”
“As many of you have rightly pointed out, simply using the same name as a designated organization does not make the place and the organization the same,” they wrote. “Our policies do not call for the removals of people, places or things that simply share a name with a designated organization — so any removals based solely on a mention of the name of the mosque are certainly enforcement errors and they never should have happened under our policies.”
Others were less confident in Facebook’s internal explanation. Ashraf Zeitoon, who served as Facebook’s head of policy for the Middle East and North Africa region from 2014 to mid-2017, noted that the company employed some of the top terrorism experts in the world who could surely distinguish mentions of Al-Aqsa from the Al-Aqsa Martyrs' Brigades.
“For them to go and identify one word of a two-word name as associated with a terrorist organization is a lame excuse,” he said, noting that he was involved in drafting policies on how the company designated terrorist groups and their content. “They are more qualified than this and more competent than this.”
Zeitoon cited an internal fear at Facebook of upsetting Israeli interests and overreporting of the content as potential reasons why the Al-Aqsa videos and images were removed.
In response, a Facebook spokesperson told BuzzFeed News that the Al-Aqsa content was restricted due to human error, and not because of any government requests.
Facebook’s removal and blocking of some Palestinian content has caused the social network’s employees to speak up internally. Ahead of a regular companywide meeting on Thursday that is expected to be led by CEO Mark Zuckerberg, some workers began upvoting a question that asked, “Our integrity systems are failing marginalized groups (see: Palestine, BLM, Indigenous women). What will we do about it?”
The question ranks low on the list of top questions, behind at least three different questions about Facebook’s work-from-home policies and one wondering if Mark Zuckerberg will ever host Saturday Night Live, following an appearance by Tesla CEO Elon Musk on the variety show this past weekend.
In another question, one employee asked whether Facebook would move its regional office from Tel Aviv, which cannot be accessed by some Palestinian American employees because of Israeli restrictions. Noting that Human Rights Watch had designated Israel as an apartheid state, they asked if Facebook would ever reconsider its location in the Israeli city.
A Facebook spokesperson declined to comment on the matter.