
Meta Oversight Board report raises concerns over lack of appeals from India


Meta’s Oversight Board, in its first annual report, covering the period between October 2020 and December 2021, has highlighted the low number of appeals from users in India, the country with the largest number of Facebook and Instagram users.


Even as the report noted an “enormous pent-up demand among Facebook and Instagram users for some way to appeal Meta’s content moderation decisions to an organisation independent from the company”, it highlighted the low number of user appeals from countries outside Europe, the US and Canada.

Overall, 1,152,181 cases were submitted to the Board during the period, including 47 referred by Meta. Central and South Asia accounted for merely 2.4 per cent of the estimated cases submitted to the Board, and of the 47 cases referred by Meta, only three were from the region.

“The lower numbers of user appeals from outside Europe and the US and Canada could also indicate that many of those using Facebook and Instagram in the rest of the world are not aware that they can appeal Meta’s content moderation decisions to the board,” said the board in its report. 

“We have reason to believe that users in Asia, Africa, and West Asia experience more, not fewer, problems with Meta’s platforms than other parts of the world. Our decisions so far, which covered posts from India and Ethiopia, have raised concerns about whether Meta has invested sufficient resources in moderating content in languages other than English,” it added. 

According to Prateek Waghre, Policy Director at Internet Freedom Foundation: “It is a challenge for platforms across the world and in India. It is a question of resources, not a question of the ability of the language models they’re working on to detect and classify speech and content in those languages. This is an issue across social media platforms, and they all need to invest more in this.”

According to Raman Jit Singh Chima, Asia Policy Director and Senior International Counsel at Access Now, “the problem that comes up in the context of the Oversight Board report and Facebook and WhatsApp operations in India is the extent to which the processes for these companies, how people can raise a complaint, and even the very terms of service themselves need to be more clearly available in Indian languages and made available to Indian users.”

At the very least, Chima said, the larger tech platforms should be doing this.

Questions Meta did not answer

In total, the Board published 20 case decisions in 2021: 14 decisions overturned Meta, while six upheld the company’s actions. It also shared data on the questions it put to the social media major as part of its case reviews and on those the company answered. The Board asked Meta 313 questions, of which 19 went unanswered.

The report highlighted two cases from India. Meta did not answer one question in a case pertaining to a protest in India against France, involving a picture posted by a user in a Facebook group that showed a man holding a sheathed sword, with accompanying text describing France’s President Emmanuel Macron as the devil.

Meta did not answer whether it had previously enforced violations of the Violence and Incitement Community Standard against the user or the group.

The second case was the ‘Punjabi concern over the RSS in India’ case, where Meta did not answer two questions. The case concerned a video post from Global Punjab TV, a Punjabi-language online media company, featuring a 17-minute interview with a professor described as “a social activist and supporter of the Punjabi culture”. In an accompanying text, the user asserted that the Hindu nationalist organisation Rashtriya Swayamsevak Sangh (RSS) and India’s ruling party Bharatiya Janata Party (BJP) were threatening to kill Sikhs, a minority religious group in India.

After being reported by a user, the post was removed by a human moderator for violating Facebook’s Dangerous Individuals and Organisations Community Standard. 

“This triggered an automatic restriction on the user’s account. The user then appealed to the company. Meta told the user it could not review this appeal, citing a temporary reduction in capacity caused by Covid-19. As a result of the Board selecting the case, Meta belatedly restored the content, conceding that its initial decision was wrong,” said the report.

One of the questions Meta did not answer concerned what specific language in the content caused Meta to remove it under the Dangerous Individuals and Organisations Community Standard. Meta responded that it was unable to identify the specific language that led to the erroneous conclusion that the content violated the policy, the report said.

The second question asked how many strikes a user needs before Meta imposes an account restriction, and how many violations of the Dangerous Individuals and Organisations policy trigger account-level restrictions.

“Meta responded that this information was not reasonably required for decision-making in accordance with the intent of the Charter,” it said.

Concerns about Meta’s alleged lack of resources for content moderation outside English have also been highlighted in complaints made by former employee turned whistleblower Frances Haugen.

Quasi-judicial body not the answer, say experts

The Oversight Board’s report comes at a time when major social media platforms are at loggerheads with the Centre over a Grievance Appellate Committee (GAC) proposed in amendments to the Information Technology (Intermediary Guidelines and Digital Media Ethics) Rules, 2021 (IT Rules, 2021), which would have the power to overrule decisions by social media platforms to take down, remove or block user content.

However, according to experts, while Meta may need to work on its resources for handling content moderation in different languages, a quasi-judicial body may not be the answer.

“Where the challenge comes in, and the question that has not been answered globally, is how to address this: is it essentially adding a much larger team of content moderators? Or far better-working machine learning models? Or is there a third solution? What kind of resources does that require? I don’t think we have an answer to that yet. And that’s partly where some of the complication is, because if you look at the larger companies, yes, they can afford to hire relatively large numbers of content moderators. But the smaller companies may not be able to,” said Waghre.

However, such a committee is not the answer, he said, especially at this point, when larger details are yet to be known, including how the committee is going to work, how it is being set up, and what its composition will be.

“That’s not the answer, because that is potentially bringing the government into content moderation decisions, which is normatively not a good thing, and practically a very hard thing to do because of the scale,” he added.

“The fundamental point is that we’ve not really understood how they (social media platforms) are affecting us, and there needs to be more analysis of that. While we’re figuring this out, what should we do to regulate them? I think that question still remains largely unanswered, but what is undesirable is a quasi-judicial body inserting itself into the process and adjudicating over decisions,” added Waghre.

“Ultimately, there is an argument you could make very clearly that when people’s legal rights are being impacted by content, should it be companies and platforms that take it down, or should it be courts of law or government mechanisms?” said Chima.

Chima further opined that the current consultations from the government were more focussed on the idea that people can be de-platformed on certain services. 

“The theme of this is that companies need to be more accountable in how they respond to user complaints. I would actually say the companies already have a lot of resources in their content operations teams; it is a question of where they deploy them and how they manage those concerns,” Chima said.

“And why are we not trying to solve the actual problem? If the Oversight Board recognises, for example, that there are not enough appeals happening from India, then there seems to be a larger structural problem. In fact, I think the report is welcome. But even Facebook and others are obviously making a mistake, and the Oversight Board itself has to do much more. At least there’s data coming out about the true nature of the problem. And that’s what we need to hear more of,” he added.

Meta’s actions on Oversight Board’s recommendations 

The Oversight Board stated in its report that it made 86 recommendations to Meta in 2021 in a bid to push for more transparency about the company’s policies. In response, Meta now gives people using Facebook in English who break its rules on hate speech more detail on what they have done wrong.

The company is also rolling out new messaging in certain locations telling people whether automation or human review resulted in their content being removed. It has committed to provide new information on government requests and its newsworthiness allowance in its transparency reporting.

“Meta translated Facebook’s Community Standards into Punjabi and Urdu, and committed to translate the platform’s rules into Marathi, Telugu, Tamil and Gujarati. Once completed, more than 400 million more people will be able to read Facebook’s rules in their native language,” the report further added.

Published on June 23, 2022


