Facebook owner Meta’s dangerous algorithms and profit motive were a major contributor to the atrocities committed by the Myanmar military against the Rohingya population in 2017, according to a new Amnesty International report released on Thursday.
In 2017, thousands of Rohingya were killed, tortured, raped and displaced in an ethnic cleansing campaign by Myanmar’s security forces. “In the months and years leading up to the atrocities, Facebook’s algorithms fueled a storm of hatred against the Rohingya which contributed to real-world violence,” says the report, titled “The Social Atrocity: Meta and the Right to Remedy for the Rohingya.”
Supporters of the Myanmar military and radical Buddhist nationalist groups flooded the platform with anti-Muslim content, spreading disinformation that a Muslim takeover was imminent and describing the Rohingya as “invaders”. Posts also incited violence, and much of this anti-Muslim content remained available on Facebook long after it was published.
An independent United Nations international fact-finding mission likewise found that social media played a significant role in the atrocities committed in the country. The report further details how Meta repeatedly failed to conduct human rights due diligence in its Myanmar operations, despite being required to do so under international standards.
Amnesty International is therefore launching a new campaign calling on Meta Platforms, Inc. to meet Rohingya demands for reparations. Rohingya refugee groups have asked Meta directly to make amends by funding a million-dollar education project in a refugee camp, a request the company denied in 2021. “Facebook must pay. If they don’t, we will go to every court in the world. We will never give up our fight,” one Rohingya campaigner quoted in the report said. Amnesty International is advocating sweeping reforms to Facebook’s algorithmic systems “to prevent abuse and increase transparency.”