6 November 2018 (updated 5 October 2023, 8:25am)

Facebook thinks it might be responsible for ethnic cleansing in Myanmar

In a report released late last night, the social media giant acknowledged some responsibility for the incitement of violence over the past two years.

By Sarah Manavis

Late on Monday night, Facebook released a report it had commissioned from an outside body that said the social network may hold some responsibility for the incitement of violence in Myanmar.

In the Human Rights Impact Assessment: Facebook in Myanmar, BSR (Business for Social Responsibility), the non-profit that produced the report, found that Facebook’s platforms (including WhatsApp, Messenger, Facebook, and Instagram) were used to incite violence against the Rohingya people living in Myanmar – violence that the UN has described as “ethnic cleansing”. The report states that “Facebook has become a means for those seeking to spread hate and cause harm [in Myanmar], and posts have been linked to offline violence”. BSR also deemed it “noteworthy” that Facebook remains the first port of call for most activists organising peaceful protests. However, that was just one line in the 62-page report.

What makes the report even more troubling is the ubiquity of Facebook in Myanmar. “For the majority of Myanmar’s 20 million internet-connected citizens,” the report reads, “Facebook is the internet.” According to the report, the social media platform is so popular that most mobile phones sold in Myanmar come with Facebook preloaded. It is used by everyone from the commander-in-chief of Myanmar’s armed forces, General Min Aung Hlaing, down. His account, along with 19 others and 52 Facebook pages, was removed by Facebook in August 2018 to prevent them from further inflaming ethnic tensions.

This confirms earlier reports suggesting that Facebook use exacerbated the Rohingya crisis in Myanmar. Earlier this year, the Guardian reported that hate speech on Facebook in Myanmar had exploded during the crisis – a two-year period in which the stateless Rohingya people living in the country were actively and violently persecuted. The human rights abuses against the Rohingya during this time included rape, ethnic cleansing, murder, and infanticide. Doctors Without Borders estimated nearly 7,000 deaths in the first month of the crisis alone, and Time magazine reported in March that an estimated 43,000 Rohingya parents were missing and presumed dead.

Facebook’s product policy manager Alex Warofka published a blog post shortly after the report was released, agreeing with its findings and saying that Facebook “can and should do more”. In the post, he addressed many of the report’s key recommendations and, more specifically, how Facebook planned to act on them: principally by hiring 100 native speakers to review potentially dangerous content at Facebook’s headquarters in Myanmar, and by modifying its hate speech policy. The post also outlined the work Facebook had already done to combat hate speech and dehumanising language on the platform. The BSR report, however, argued that even with those changes, Facebook still hasn’t done enough.

With Facebook in the spotlight over questions of how it sways US and UK elections, the idea that it is causing this kind of harm in other parts of the world may not come as a huge surprise. While Facebook is reportedly making efforts to improve how its platform operates in Myanmar, this report, the first of its kind at Facebook, has likely only scratched the surface. We can, depressingly, probably expect to see more findings of this nature.
