How Facebook Failed The Rohingya In Myanmar

The UN calls Myanmar’s treatment of the Muslim Rohingya minority a genocide, and says Facebook has done little to tackle hate speech. Analysis by BuzzFeed News sheds new light on Facebook's failures.

Lawmakers from the home state of Myanmar’s persecuted Rohingya minority regularly posted hateful anti-Muslim content on Facebook and, in some cases, explicitly called for violence, according to an analysis by BuzzFeed News. The posts were made both before and for months after state-led violence displaced 700,000 Rohingya Muslims last year, in what the UN has described as genocide.

UN investigators published a damning report on Monday calling for a number of senior military figures, including the head of the armed forces, to be prosecuted by the International Criminal Court. Soldiers murdered, tortured, and raped members of the Rohingya minority as part of a “widespread and systematic attack on a civilian population,” according to the report.

But the UN report also took Facebook to task, describing it as a “useful instrument for those seeking to spread hate,” adding that the company’s response to concerns about its role had been “slow and ineffective.” It said that the “extent to which Facebook posts and messages have led to real-world discrimination must be independently and thoroughly investigated.”

Facebook banned 20 organizations and individuals in Myanmar on Monday, including the head of the armed forces, in an effort to stop “the spread of hate and misinformation.” The company said it was making the move in response to a disinformation campaign it had uncovered that it called “coordinated inauthentic behavior.” But many fear it is too little, too late.

Facebook once regarded itself as a largely neutral platform for content. But the company has reevaluated this notion amid calls from the UN and other groups to take greater responsibility for what users post — especially calls for violence.

BuzzFeed News’ analysis shows how widespread the problem of hate speech is on Facebook’s platform. A review of more than 4,000 posts by politicians from the Arakan National Party found that 1 in 10 of the posts — made between March 2017 and February 2018 — contained hate speech as defined by Facebook’s own public community standards. The ANP is the most popular party in Rakhine state, which was home to hundreds of thousands of Rohingya before they were expelled last year. It says it represents the interests of the ethnic Rakhines, the dominant group in the state, which is also home to the Rohingya and other groups.

Posts by members of Rakhine state’s parliament compared Rohingya to dogs, said Muslim women were too ugly to rape, falsely stated Rohingya torched their own houses for a payout from NGOs, and accused Muslims of seeking to become the dominant group in the state by having too many children. Some even told Muslims to get ready to be killed. Some of the most popular posts identified by BuzzFeed News as hate speech garnered 3,400 reactions or were shared up to 9,500 times. Asked about the posts, Tun Aung Kyaw, general secretary and spokesperson for the ANP, said he had never seen the party’s MPs post about other religions on Facebook, despite the evidence. “As general secretary of the party, I have never seen my party members post hate speech online,” he said.

The analysis shows that as the crisis worsened last year, Facebook took no action for months. The platform finally removed many posts earlier this month, after BuzzFeed News sent links to a spokesperson.

A recent report by Reuters found more than 1,000 posts, comments, images, and videos attacking Rohingya and other Muslims on Facebook. Beyond Myanmar, Facebook is also facing an onslaught of public pressure over its role in spreading and amplifying dangerous speech in places like Sri Lanka and South Sudan.

Facebook’s decision to ban top military officials shows that the company is intent on making Myanmar the centerpiece of a new campaign to tackle hate speech. This is particularly important in parts of the world where a lack of rule of law and deep societal divisions mean racist and abusive language against minority groups can translate into real-world violence. Facebook’s standards ban “violent or dehumanizing speech” against people based on their religion, ethnicity, national origin, race, and other characteristics.

Facebook admits that it acted too slowly in the past and has promised to pour more resources into digital literacy campaigns and company personnel to target hate speech in Myanmar.

“I do want to say that we know that over the past three years, we’ve taken too long to remove content. It’s important that we acknowledge that,” David Caragliano, content policy manager at Facebook, told BuzzFeed News in July. “We’ve learned from these experiences and not just on the policy side, but on the enforcement side, we are improving our emergency response processes.”

Victoire Rio, a digital rights researcher in Myanmar, had long noticed that even when posts including hate speech were reported, Facebook’s reaction was not fast enough, particularly given how quickly violence can follow the dissemination of hate speech. Rio raised the issue with Facebook officials last year, she said, and was told the company generally responds within six hours to posts threatening violence when they’re reported through Facebook’s system.

“We were told the best way to report things was to go through their internal process,” she said. “We were told this takes an average of six hours, which seemed like an absolute aberration to me.”

Late last year, she decided to conduct an experiment, systematically reporting dozens of posts that she believed clearly violated Facebook’s standards and evaluating the turnaround time for each report.

“What we found was a pretty compelling pattern of 48 hours plus on most reports, which was eight times longer than what they had announced to us,” Rio said.

Asked about this experiment, Facebook said posts are often taken down before a user receives a message about whether action has been taken, which could lead to the impression that moderators act more slowly than they actually do. Sara Su, a product manager for the company, told BuzzFeed News that content moderators move faster in cases where there’s a credible threat of violence and usually respond within hours.

After decades of military rule and economic stagnation, much of Myanmar’s population came to smartphones and the internet after 2015, at a time when Facebook was already popular in Asia. User numbers grew at lightning speed, and today the country has between 15 and 20 million monthly active Facebook users. The word “Facebook” has become synonymous with the internet itself in Myanmar.

When asked about the results of BuzzFeed News’ analysis, Facebook said it has adapted its policies to changing concerns about hate speech, working with independent organizations that flag fake news and rumors that may spur violence offline. Company officials say they are taking a more proactive approach to removing hate speech rather than only waiting for posts to be reported by users.

In interviews, Facebook officials said part of their plan for Myanmar is to remove profiles that constitute “the worst of the worst” — a step that mirrors its recent decision to remove the US conspiracy site Infowars from its platform (made after other tech platforms, including Apple, did so). On Monday, Facebook removed “18 Facebook accounts, one Instagram account and 52 Facebook Pages,” which it said were followed by “almost 12 million people.” It said that it was “banning 20 individuals and organizations from Facebook in Myanmar — including Senior General Min Aung Hlaing, commander-in-chief of the armed forces,” and the military’s television network.

“At a high level, we’ve gotten really aggressive in a way that we weren’t before,” Caragliano told BuzzFeed News in July, before the UN report was published. “We are getting really aggressive at reducing the bad actors and the bad content.”

The move is welcome news to researchers and civil society groups that have for years called on Facebook to take a tougher approach. But BuzzFeed News’ analysis shows that the problem is more complicated than simply banning a handful of individuals and groups. Questions remain as to how Facebook can scale its approach to regulating content in a country where hate speech is overwhelmingly common on social media, and enforce its rules without alienating its users or becoming the country’s de facto censor.

Last summer, Facebook announced it would remove posts that included the anti-Muslim racial epithet kalar when it is used to attack a person or group. But BuzzFeed News’ analysis shows that hundreds of posts in the sample set used the term in exactly this manner. (The term kalar also has many other uses that have nothing to do with the slur, including in compound words — a fact that sparked a pushback in Myanmar after initial reports that Facebook decided to ban the word altogether.) It did not appear that the simple use of the word — even as a slur — was enough to get Facebook’s content moderators to take the posts down.

The analysis also offers a snapshot of anti-Rohingya talking points used by nationalists, some of which seem designed to get past Facebook’s moderators. Often they stop short of explicit calls for violence, but repeatedly state that kalars are “breeding” too many children, or that Rakhine state would be more beautiful without Muslims, likening their presence in the area to an “invasion.” Others compared Rohingya to animals, but in photos and memes rather than text, making them hard to search for. Some of the explicit calls for violence discovered by BuzzFeed News were taken down — albeit months after they were posted — but many posts evaded Facebook’s censors. Still other posts spread made-up stories, suggesting Rohingya groups were being trained by ISIS or had waged genocide against other groups in Myanmar.

Facebook founder Mark Zuckerberg has publicly said he hopes to automate a substantial part of the company’s content moderation process using artificial intelligence tools, a statement echoed by company officials in interviews with BuzzFeed News. Local NGOs and activists say it’s tough to imagine machine learning will produce the kind of linguistic and cultural understanding it will take to combat these forms of speech.

Civil society groups that have been briefed by Facebook say the problem is that the company does not have enough content reviewers who can speak Burmese, the majority language in Myanmar, or the country’s many other minority languages.

Facebook says it has 60 people moderating content in Burmese, and that by the end of the year, there will be 100. Mia Garlick, head of Asia Pacific for Facebook, said that the company also has dozens of staff members across many teams who are working on Myanmar. Facebook employment ads show the company is recruiting Dublin-based content reviewers to work on Myanmar.

But Facebook says its core problem is not a lack of moderators who can speak Burmese, but the fact that users in Myanmar are reporting content at lower rates than other markets. To help remedy this issue, it’s investing in digital literacy campaigns in Myanmar and moving users to a more user-friendly text format. But critics say that the reporting process on Facebook has been clunky, which is the real reason people in Myanmar don’t use it.

Facebook also discloses next to nothing about its employees’ qualifications or identities, other than that they are native speakers from Myanmar and the diaspora who are, as Caragliano put it, “immersed in the online ecosystem and real experts.” Considering Facebook’s dominance as a platform for news and information, it’s striking how little people in Myanmar know about the professional backgrounds, viewpoints, and biases of a group of people who essentially act as content censors.

In the past, Facebook has depended almost totally on content flagged by users, including government and civil society leaders who have met company officials in real life and flagged content via direct emails or messages. And the platform is still heavily reliant on civil society groups in Myanmar to inform it of problematic content. This process became news earlier this year after a controversial interview with Zuckerberg. At the time, Zuckerberg said Facebook’s own processes ensured it took down quickly circulating messages calling for violence against both Rohingya Muslims and Buddhists. In reality, Rio and other campaigners got Facebook to take action on the messages only after days of pleading with the company.

Rio was in the capital city of Naypyidaw when the Zuckerberg interview was published. When she read it, she was about to hop on a six-hour bus ride back to Yangon, where she lived. She was infuriated. She spent the trip texting with other campaigners, drafting an open letter to Zuckerberg that would take him to task for omitting their work. In the end, Zuckerberg replied directly with an apology.

“Facebook is relying on Myanmar’s civil society to do their work for them. They want to collaborate with us, but they ask us to notify them of these things privately,” said Myat Thu of the NGO Burma Monitor. “When everyone can report easily and it works, that will solve the problem. They say if you found something, please tell us — that’s not problem solving. That doesn’t work.”

Hardline groups in Myanmar have already started pushing back against Facebook over its new, tougher line against hate speech. Facebook has no personnel or offices in Myanmar, so it would be tough for extremists in the country to directly target its staff. Reuters reported that Facebook outsourced its hate speech monitoring to the firm Accenture in a venture dubbed Project Honey Badger, which hired its first two Myanmar-language speakers three years ago in Manila.

But civil society activists said they’re worried Facebook’s actions could be blamed on them, putting them at risk.

Facebook officials said the company is working to make it easier for people in Myanmar to report problematic content more often.

“We want to do everything we can to get on top of and take action on the problematic content, and civil society can help us,” said Caragliano. “The data tells us and we can see that our action rate has significantly increased.”

The UN report calling for senior military figures to be prosecuted for genocide underscores just how widespread anti-Rohingya sentiment has become in Myanmar, as well as the state’s role in directing and fomenting violence. The report’s criticisms of Facebook reveal how social media has been exploited to spread hate speech, making the problem exponentially worse.

As Facebook’s user base in Myanmar grew, it’s unclear whether the company fully appreciated a notion familiar to social scientists and historians: that dehumanizing rhetoric targeting minority ethnic groups, spread through mass media channels, can be a catalyst for ethnic cleansing.

“I can’t think of a genocide where there hasn’t been a media component,” said Alexa Koenig, executive director of the Human Rights Center at University of California, Berkeley. “There are many social scientists and scholars who have documented the same patterns. As humans, our gut instinct isn’t to perform violence against each other.”

“There has to be a historical trigger,” she added. “But you have to have tinder in place for a spark to become a flame.” In this case, she said, the tinder was hateful posts on Facebook.

To human rights scholars, Facebook’s role in the Rohingya crisis carries echoes of the role mass media played in the run-up to other atrocities. After the 1994 Rwandan genocide, executives of domestic radio stations who had broadcast calls for violence against the ethnic Tutsis were convicted by a UN tribunal. One radio station broadcast lists of names and locations of people to be killed and called on people to “exterminate the cockroaches.” Some 800,000 Tutsis as well as moderates from the Hutu ethnic group were killed in subsequent violence. Calls for violence in the media have played a key role in atrocities from the Bosnian genocide to the Holocaust.

Similar dehumanizing language — comparing Rohingya to dogs and pests, for instance — spread widely on Facebook over the past few years, as did explicit calls for violence and targeted, race-based harassment. But the role Facebook played in Myanmar’s Rohingya crisis is different from the role media organizations played in the run-up to atrocities of the past, legal experts say. For one thing, it would be tough to find evidence that Facebook understood or directly profited from hate speech or calls for violence on its platform. And courts have repeatedly found that social media platforms are not responsible for the content they broadcast, including in one recent case that rejected a complaint seeking to hold Twitter legally liable for allowing ISIS members to have accounts on its platform.

As in other countries, activists in Myanmar say they repeatedly tried to warn Facebook that dehumanizing rhetoric targeting Rohingya and other Muslims could quickly spiral out of control. Facebook representatives, they said, listened politely but ultimately didn’t do as much as they had hoped.

In June, Facebook representatives from product, policy, and partnerships teams took a trip to Myanmar to meet with civil society groups, publishers, and government officials, planning to hear their concerns. The company’s staff had made visits like this before, but this one came not long after Zuckerberg had taken heat in a congressional hearing over the company’s treatment of hate speech in Myanmar, among other issues. In meetings with civil society groups, Facebook officials spoke about the company’s systems and policies, and the activists gave their own presentations and raised questions.

Myat Thu of Burma Monitor said he and others used the opportunity to renew a call for Facebook to provide more data about the number of takedown requests it receives and how it acts. Facebook still hasn’t responded, he said.

“In the end, it was just an introduction session,” he said. “There were so many problems, and we didn’t have a lot of time to discuss. The time was very limited.”

“To solve the problem, we need to stay longer, and we need to discuss a lot more,” he said. ●
