International human rights organisation Amnesty International has said that Facebook should pay compensation to the Rohingya, because its owner Meta’s dangerous algorithms and reckless pursuit of profit substantially contributed to the violence the Myanmar military committed against the Rohingya.
Millions of Rohingya people were forced from their ancestral homes in Myanmar’s Rakhine state and took refuge in neighbouring Bangladesh. Reckless hate speech spread online, largely through Facebook posts, hastened their displacement. Amnesty International made these findings in a report published on 29 August, according to the news agency AFP.
Myanmar’s military targeted and slaughtered the Rohingya Muslim minority in 2017. Many fled to Bangladesh to save their lives and have been living in refugee camps there ever since. Victims’ organisations and rights activists say that violence against the Rohingya increased because of Facebook’s algorithms, which amplified extremist content and in doing so encouraged harmful rumours and hate speech against the group. There are over 20 million Facebook users in Myanmar, and many residents rely on the platform for news on various issues. Thus, in the months and years before the crackdown, Facebook in Myanmar had become an echo chamber of anti-Rohingya content.
How do Facebook’s algorithms work?
Meta uses engagement-based algorithmic systems to power Facebook’s news feed, ranking, recommendation, and group features, shaping what users see on the platform. Meta profits when users stay on the platform as long as possible, because that allows it to sell more targeted advertising. Inflammatory content, including content that advocates hatred and incites violence, hostility and discrimination, is a particularly effective way of keeping people on the platform longer. As such, the promotion and amplification of this type of content is central to Facebook’s surveillance-based business model.
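The engagement logic described above can be illustrated with a minimal, hypothetical sketch. This is not Meta’s actual code: the post fields and the scoring rule are invented for illustration. The point is that a ranker optimising purely for raw interaction counts will surface whichever post draws the most reactions, regardless of what it says.

```python
# Toy illustration (not Meta's real system): an engagement-based ranker
# that orders posts by total interactions, blind to the content itself.

def rank_feed(posts):
    """Sort posts by raw engagement: reactions + comments + shares."""
    return sorted(
        posts,
        key=lambda p: p["reactions"] + p["comments"] + p["shares"],
        reverse=True,
    )

feed = [
    {"id": "local_news",   "reactions": 40, "comments": 5,  "shares": 2},
    {"id": "inflammatory", "reactions": 90, "comments": 60, "shares": 30},
    {"id": "family_photo", "reactions": 25, "comments": 10, "shares": 1},
]

# The inflammatory post, which provokes the most interaction,
# rises to the top of the feed.
print([p["id"] for p in rank_feed(feed)])
```

Because the objective is time-on-platform rather than content quality, nothing in such a scoring rule distinguishes outrage-driven engagement from any other kind.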
Many influencers in Myanmar linked to the military and radical Buddhist nationalist groups flooded the platform with anti-Muslim content, posting misinformation and propaganda claiming that a Muslim takeover was imminent and portraying the Rohingya as invaders. In one post that was shared more than a thousand times, a Muslim human rights defender was portrayed as a national traitor. The comments left on the post included threatening and racist messages, such as “He is a Muslim. Muslims are dogs and need to be shot” and “Do not leave him alive. Remove his whole race. Time is ticking.” Content inciting violence and discrimination went to the very top of Myanmar’s military and civilian leadership. General Min Aung Hlaing, the leader of Myanmar’s military, posted on his Facebook page: “We openly declare that absolutely, our country has no Rohingya race.” He later went on to seize power in a coup in February 2021. Facebook allowed these hateful statements to spread and reach a large audience in Myanmar.
Many Rohingya tried to report anti-Rohingya content through Facebook’s ‘report’ function, but to no avail, Amnesty said in the report. The organisation highlighted issues raised in the controversial ‘Facebook Papers’ published in October 2021, which indicated that company executives knew the platform was helping spread harmful content against ethnic minorities but took no action because they were profiting from it. The report also states that Facebook repeatedly ignored warnings and consistently failed to enforce its own policies on hate speech, despite its responsibility under international standards to do so. In 2014, Facebook attempted to support an anti-hate initiative known as ‘flower speech’ by creating stickers for users to post in response to content promoting violence, racism or discrimination. However, activists soon noticed that the stickers were having the opposite effect: Facebook’s algorithms interpreted their use as a sign that people were enjoying a post and began promoting it. Instead of reducing the number of people who saw a post promoting hatred, the stickers actually made such posts more visible.
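The ‘flower speech’ failure mode described above is a feedback-loop problem, which a small hypothetical sketch makes concrete. Again, this is not Facebook’s code; the scoring function and the numbers are invented. If a ranker counts every reaction as positive engagement, counter-speech stickers posted under a hateful post raise that post’s score rather than lowering it.

```python
# Toy illustration of the 'flower speech' feedback loop (invented numbers,
# not Facebook's real scoring): protest stickers count as engagement,
# so responding to a hateful post boosts its visibility.

def engagement_score(post):
    # Every reaction, including counter-speech stickers, adds to the score.
    return post["reactions"] + post["stickers"]

hateful_post = {"reactions": 50, "stickers": 0}
before = engagement_score(hateful_post)

# Activists respond with 30 'flower speech' stickers...
hateful_post["stickers"] += 30
after = engagement_score(hateful_post)

# ...and the post's ranking score rises instead of falling.
print(before, after)  # 50 80
```

The counter-speech campaign thus fed the very amplification it was meant to counteract, because the ranking signal could not tell protest apart from approval.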
Rohingya representatives have filed a total of three cases against Facebook: complaints in the United States and the United Kingdom under the Responsible Business Conduct Guidelines of the Organisation for Economic Co-operation and Development (OECD), and a US lawsuit filed last December in the state of California, where Facebook and its parent company Meta are headquartered. In that case, the refugees are seeking 15 billion dollars in compensation. Facebook has also been called on to take proactive, responsible steps to protect human rights across its platform, and national authorities have been urged to step up their oversight.
In addition, a legal body in the United Kingdom has written to Facebook on behalf of some Rohingya living in the country. The allegations include:
- Facebook spread hate speech against the Rohingya people.
- Facebook failed to invest in moderators and fact checkers knowledgeable about Myanmar’s political situation.
- Facebook failed to remove posts and accounts inciting violence against the Rohingya.
- Despite repeated warnings from various media outlets and organisations, Facebook failed to take appropriate measures.
A complaint has also been filed against Facebook in San Francisco, United States, alleging that Facebook traded the lives of the Rohingya to capture the market.
Facebook yet to comment
Facebook, or its parent company Meta, did not immediately comment on the allegations brought by the Rohingya. However, the company admitted in 2018 that it had not done enough to prevent incitement against the Rohingya. That same year, the United Nations complained that Facebook had been slow and ineffective in tackling hate speech on the internet. Facebook did not comment on that either.