
ETtech Opinion: It’s time for a fresh look at social media regulations


If recent comments by Rajeev Chandrasekhar, minister of state for electronics and IT, are anything to go by, India’s IT laws are in for a major overhaul.


Not long ago, Union minister for electronics and IT Ashwini Vaishnaw also alluded to the need for greater accountability and self-regulation from social media platforms. Similar views were expressed in the Joint Parliamentary Committee's (JPC) report on data protection, tabled in Parliament last December. The report noted that there was a need to re-evaluate the intermediary status of social media platforms and make them responsible for the content they host.

While these views are bound to raise concerns among Big Tech companies and free speech absolutists, there is no denying that the exponential growth of social media over the years is a double-edged sword. Hate speech, fake news, cyber harassment and misuse of data are just a few of the many challenges that governments, individuals and communities have been grappling with.

The world over, governments have become acutely aware of these issues and have been drawing up legislative frameworks that seek to rein in social media platforms. The UK Online Safety Bill is a case in point. It proposes doing away with the intermediary and safe harbour constructs altogether and imposing extensive 'duties of care' on social media platforms and search engines with respect to the content on their platforms.

The conundrum

Whether revoking the intermediary status of social media platforms is the right way forward is not an easy question to answer. After all, social media does, in essence, serve as an intermediary, providing users with a window to cyberspace through which they can project their views, knowledge, skills and creativity. Removing safe harbour protection and making platforms wholly liable for content shared by users, over which they have no editorial control, may not be the most cogent way forward.


On the other hand, calls for more responsible social media require platforms to proactively block harmful content and suspend offending accounts. The paradox is that the more extensive this policing activity becomes, the more it erodes platforms' status as passive intermediaries of content.

A possible solution

A way out might be to retain the intermediary construct, but with more clear-cut obligations for platforms to qualify as intermediaries and enjoy safe harbour protection. This may be achieved with the following steps, carried out through amendments to the IT Act:

  • Define intermediaries: The definition of intermediaries needs to be updated, rationalised and subdivided according to functionality, such as social media platforms, private messaging platforms, search engines, web-hosting services and internet service providers. Obligations for each of these sub-categories may need to be suitably tailored, given their distinct modes of operation.
  • List banned content: The types of content that are deemed to be harmful and illegal should be specifically and exhaustively set out. This is required as the IT Act presently does not adequately address content that can adversely affect individuals at a personal level, such as online threats, harassment and breach of privacy. It would also help address challenges such as hate speech, fake news, information and psychological warfare.
  • Lay out govt’s powers: The powers of the government with respect to illegal content should be set out. This could include the power and procedure to take down content, situations where decryption or interception of communications is necessary, and appropriate punishments for certain forms of content. While the IT Act already covers parts of this, there needs to be a consolidated, updated and rationalised framework that deals with these issues.
  • Encourage verified accounts: Liability for illegal content will only be meaningful if such liability can be backstopped against an actual person or entity. This in turn is only possible if the proliferation of fake or fictitious accounts is contained. Therefore, platforms could be required to incentivise verification of accounts with disclosure of real identities. While verification should not be made mandatory, given users' right to privacy, platforms could offer verified users additional privileges such as increased reach and grievance redressal options, while restricting the account features of non-verified users.
  • Minimise platforms’ takedown powers: Suo motu content takedown powers of platforms should be kept to a minimum in the case of verified users, limited to things like pornographic and child sexual abuse content. This would prevent platforms from becoming laws unto themselves, with a free hand in controlling the discourse on their platforms.
  • Discourage echo chambers: Platforms should be encouraged to do away with algorithms that promote certain types of content by way of user profiling and selective highlighting. This will help prevent platforms from becoming information echo chambers that reinforce polarisation and prejudices.
  • Set up a tribunal: Perhaps most importantly, a quasi-judicial body could be set up as a nodal tribunal for all actions taken by the government and platforms on user content, and for hearing users' grievances against such actions. Suitable exceptions could be built in for emergency action against content such as pornography and child sexual abuse. Such a setup would bring much-needed transparency and due process to the handling of illegal content, and hopefully help reduce government and Big Tech overreach in content moderation.

The authors are lawyers based in Delhi.
