
‘Kill more’: Facebook fails to detect hate against Rohingya


A new report has found that Facebook failed to detect blatant hate speech and calls to violence against Myanmar’s Rohingya Muslim minority years after such behaviour was found to have played a determining role in the genocide against them.


The report, shared exclusively with The Associated Press, showed that the rights group Global Witness submitted eight paid ads to Facebook for approval, each including a different version of hate speech against the Rohingya. Facebook approved all eight ads for publication.

The group pulled the ads before they were posted or paid for, but the results confirmed that despite its promises to do better, Facebook’s leaky controls still fail to detect hate speech and calls for violence on its platform.

Clearance campaign

The army conducted what it called a clearance campaign in western Myanmar’s Rakhine state in 2017 after an attack by a Rohingya insurgent group. More than 700,000 Rohingya fled into neighbouring Bangladesh, and security forces were accused of mass rapes, killings and torching thousands of homes.

On Monday, US Secretary of State Antony Blinken announced that the US views the violence against the Rohingya as genocide. The declaration is intended both to generate international pressure and to lay the groundwork for potential legal action, Blinken said.

On February 1 of last year, Myanmar’s military forcibly took control of the country, jailing democratically elected government officials. Rohingya refugees have condemned the military takeover and said it makes them more afraid to return to Myanmar.

FILE PHOTO: Rohingya refugees cross the Naf River with an improvised raft to reach to Bangladesh in Teknaf, Bangladesh, November 12, 2017. 

Experts say such ads have continued to appear and that despite its promises to do better and assurances that it has taken its role in the genocide seriously, Facebook still fails even the simplest of tests — ensuring that paid ads that run on its site do not contain hate speech calling for the killing of Rohingya Muslims.

“The current killing of the Kalar is not enough, we need to kill more!” read one proposed paid post from Global Witness, using a slur often used in Myanmar to refer to people of east Indian or Muslim origin. “They are very dirty. The Bengali/Rohingya women have a very low standard of living and poor hygiene. They are not attractive,” read another.

“These posts are shocking in what they encourage and are a clear sign that Facebook has not changed or done what they told the public they would do: properly regulate themselves,” said Ronan Lee, a research fellow at the Institute for Media and Creative Industries at Loughborough University, London.

The eight ads from Global Witness all used hate speech language taken directly from the report of the United Nations Independent International Fact-Finding Mission on Myanmar to the Human Rights Council. Several examples came from past Facebook posts.

The fact that Facebook approved all eight ads is especially concerning because the company claims to hold advertisements to an “even stricter” standard than regular, unpaid posts, according to its help centre page for paid advertisements.

Smoke is seen on the Myanmar border as Rohingya refugees walk on the shore after crossing the Bangladesh-Myanmar border by boat through the Bay of Bengal, in Shah Porir Dwip, Bangladesh, September 11, 2017.

“I accept the point that eight isn’t a very big number. But I think the findings are really stark, that all eight of the ads were accepted for publication,” said Rosie Sharpe, a campaigner at Global Witness. “I think you can conclude from that that the overwhelming majority of hate speech is likely to get through.”

Improving safety and security controls

Facebook’s parent company Meta Platforms Inc said it has invested in improving its safety and security controls in Myanmar, including banning military accounts after the Tatmadaw, as the armed forces are locally known, seized power and imprisoned elected leaders in the 2021 coup.

“We’ve built a dedicated team of Burmese speakers, banned the Tatmadaw, disrupted networks manipulating public debate and taken action on harmful misinformation to help keep people safe. We’ve also invested in Burmese-language technology to reduce the prevalence of violating content,” Rafael Frankel, director of public policy for emerging markets at Meta Asia Pacific wrote in an e-mailed statement to AP on March 17.

“This work is guided by feedback from experts, civil society organisations and independent reports, including the UN Fact-Finding Mission on Myanmar’s findings and the independent Human Rights Impact Assessment we commissioned and released in 2018.” Facebook has been used to spread hate speech and amplify military propaganda in Myanmar in the past.

Shortly after Myanmar became connected to the internet in 2000, Facebook paired with the country’s telecom providers to let customers use the platform without paying for data, which was still expensive at the time. Use of the platform exploded. For many in Myanmar, Facebook became the internet itself.

Targeting Muslims

Local internet policy advocates repeatedly told Facebook that hate speech was spreading across the platform, often targeting the Muslim Rohingya minority in the majority-Buddhist nation. For years, Facebook failed to invest in content moderators who spoke local languages or fact-checkers who understood the political situation in Myanmar, and it failed to close specific accounts or delete pages being used to propagate hatred of the Rohingya, said Tun Khin, president of Burmese Rohingya Organisation UK, a London-based Rohingya advocacy organisation.

In March 2018, less than six months after hundreds of thousands of Rohingya fled violence in western Myanmar, Marzuki Darusman, chairman of the UN Independent International Fact-Finding Mission on Myanmar, told reporters social media had “substantively contributed to the level of acrimony and dissension and conflict, if you will, within the public”.

Rohingya refugee children slide down the road at Balu Khali refugee camp near Cox’s Bazar, Bangladesh, November 16, 2017.

“Hate speech is certainly of course a part of that. As far as the Myanmar situation is concerned, social media is Facebook, and Facebook is social media,” Darusman said. Asked about Myanmar a month later at a US Senate hearing, Meta CEO Mark Zuckerberg said Facebook planned to hire “dozens” of Burmese speakers to moderate content and would work with civil society groups to identify hate figures and develop new technologies to combat hate speech.

Hard to detect hate speech

“Hate speech is very language specific. It’s hard to do it without people who speak the local language and we need to ramp up our effort there dramatically,” Zuckerberg said. Yet in internal files leaked by whistleblower Frances Haugen last year, AP found that breaches persisted. The company stepped up efforts to combat hate speech but never fully developed the tools and strategies required to do so.

Rohingya refugees have sued Facebook for more than $150 billion, accusing it of failing to stop hate speech that incited violence against the Muslim ethnic group by military rulers and their supporters in Myanmar. Rohingya youth groups based in the Bangladesh refugee camps have filed a separate complaint in Ireland with the 38-nation Organisation for Economic Cooperation and Development calling for Facebook to provide some remediation programs in the camps.

The company now called Meta has refused to say how many of its content moderators read Burmese and can thus detect hate speech in Myanmar. “Rohingya genocide survivors continue to live in camps today and Facebook continue to fail them,” said Tun Khin. “Facebook needs to do more.”

Published on March 22, 2022
