
Facebook promoted violence against Rohingya; Meta owes reparation: Amnesty

Rohingya children in India’s Haryana refugee camp. Photo: Irfan Hadi K

Facebook owner Meta’s dangerous algorithms and reckless pursuit of profit substantially contributed to the atrocities perpetrated by the Myanmar military against the Rohingya people in 2017, human rights group Amnesty International said in a new report published on Thursday.

The report titled “The Social Atrocity: Meta and the right to remedy for the Rohingya” details how Meta knew or should have known that Facebook’s algorithmic systems were supercharging the spread of harmful anti-Rohingya content in Myanmar, but the company still failed to act.

The Rohingya are a predominantly Muslim ethnic minority based in Myanmar’s northern Rakhine State.

In August 2017, more than 700,000 Rohingya fled Rakhine State when the Myanmar security forces launched a targeted campaign of widespread and systematic murder, rape and burning of homes.

The violence followed decades of state-sponsored discrimination, persecution, and oppression against the Rohingya that amounts to apartheid.

Amnesty International also launched a new campaign calling for Meta Platforms, Inc. to meet the Rohingya’s demands for remediation.

In the months and years prior to the crackdown, Facebook in Myanmar had become an echo chamber of anti-Rohingya content, according to the report.

Actors linked to the Myanmar military and radical Buddhist nationalist groups flooded the platform with anti-Muslim content, posting disinformation that claimed an impending Muslim takeover and portraying the Rohingya as “invaders”.

In one post that was shared more than 1,000 times, a Muslim human rights defender was pictured and described as a “national traitor”. The comments left on the post included threatening and racist messages, including “He is a Muslim. Muslims are dogs and need to be shot”, and “Don’t leave him alive. Remove his whole race. Time is ticking”.

The UN’s Independent International Fact-Finding Mission on Myanmar ultimately concluded that the “role of social media [was] significant” in the atrocities in a country where “Facebook is the Internet”.

Mohamed Showife, a Rohingya activist, said: “The Rohingya just dream of living in the same way as other people in this world… but you, Facebook, you destroyed our dream.”

Internal studies dating back to 2012 indicated that Meta knew its algorithms could result in serious real-world harms.

In 2016, Meta’s own research clearly acknowledged that “our recommendation systems grow the problem” of extremism.

Between 2012 and 2017, Meta received repeated communications from, and visits by, local civil society activists warning that the company risked contributing to extreme violence. In 2014, the Myanmar authorities even temporarily blocked Facebook because of the platform’s role in triggering an outbreak of ethnic violence in Mandalay.

However, Meta repeatedly failed to heed these warnings and consistently failed to enforce its own policies on hate speech.

Amnesty International’s investigation includes analysis of new evidence from the ‘Facebook Papers’ – a cache of internal documents leaked by whistleblower Frances Haugen.

In one internal document dated August 2019, one Meta employee wrote: “We have evidence from a variety of sources that hate speech, divisive political speech, and misinformation on Facebook… are affecting societies around the world. We also have compelling evidence that our core product mechanics, such as virality, recommendations, and optimizing for engagement, are a significant part of why these types of speech flourish on the platform.”
