The request for information focuses on Meta's risk assessment and mitigation measures "linked to the protection of minors, including regarding the circulation of self-generated child sexual abuse material (SG-CSAM) on Instagram", the European Commission said.
Meta must also give information about “Instagram’s recommender system and amplification of potentially harmful content”, it added.
The investigation is the first step in procedures launched under the EU’s Digital Services Act (DSA), but does not itself constitute an indication of legal violations or a move towards punishment.
Meta must respond by December 22.
A report by Stanford University and the Wall Street Journal in June this year said Instagram is the main platform used by paedophile networks to promote and sell content showing child sexual abuse.
Meta at the time said it worked "aggressively" to fight child exploitation.

The Commission has already started a series of investigations against large digital platforms, seeking information about how they are complying with the DSA.
It sought information from Meta in October about the spread of disinformation, and last month it requested details on how the company protects children online.
The DSA is part of the European Union's powerful regulatory armoury to bring big tech to heel, and requires digital giants to take more aggressive action against the spread of illegal and harmful content as well as disinformation.
Platforms face fines of up to six percent of global turnover for violations.