The decision provides ammunition to campaigners elsewhere in Europe who want tougher controls to prevent the spread of racist and discriminatory content on Twitter and other social media platforms.
It upheld a ruling by a lower court last year that ordered Twitter to provide details on the number, nationality, location, and spoken language of people it employs to moderate content on the French version of the platform.
The lower court decision also required Twitter to disclose any contractual, administrative, commercial and technical documents that would help determine the financial and human resources it has deployed to fight hate speech online in France.
The appeals court upheld the first ruling in full and ordered Twitter to pay 1,500 euros ($1,700) in damages to each of the six plaintiffs, according to a copy of the ruling seen by Reuters.
A Twitter spokesperson said the company’s top priority was to ensure the safety of the people using its platform, adding that the group was reviewing the court decision. The U.S. company declined to comment on the financial and operational implications of the ruling.
Campaigners were jubilant, though. The six lobby groups that sued Twitter had maintained that only a fraction of hateful messages were removed from the platform within 48 hours of being reported.
‘Forbidden to forbid’
“I’m tired of this reign where everything is allowed and where it is ‘forbidden to forbid’,” said Marc Knobel, the president of J’Accuse! (I Accuse), one of the groups, referring to a famous slogan painted on Paris’ walls during the 1968 protests.
“We have to stop with this delusion: not everything should be allowed in our society.”
The ruling sets France apart from countries such as Denmark, Britain and the United States, as the country’s stringent anti-racism laws allowed such litigation to succeed.
In France, racism and anti-Semitism are not treated as opinions that may be expressed publicly, but as criminal offences.
Global technology giants have been accused of doing too little to address online abuse. An upcoming EU regulation, the Digital Services Act (DSA), is slated to provide procedures for faster removal of illegal content, such as hate speech.
Last May, Britain said a planned new law would allow social media companies to be fined up to 10% of turnover or 18 million pounds ($25 million) if they failed to stamp out online abuses such as racist hate crimes, while senior managers could face criminal action.