
Epic Games to pay $520 million over children’s privacy and trickery charges


When Epic Games released Fortnite five years ago, the video game quickly became a cultural sensation among millions of teenagers and children. It was easy to sign up, start playing, and talk or text-chat with strangers through the game.


Some children also racked up hundreds of dollars on their parents’ credit cards as they bought digital items like colorful outfits for their game characters. Those purchases, along with branded merchandise like action figures, helped make Fortnite a billion-dollar blockbuster for Epic Games, with more than 400 million users.

On Monday, the Federal Trade Commission accused the company of illegally collecting children’s personal information, of harming young players by matching them with strangers on Fortnite while enabling live communications and, separately, of using manipulative techniques, called “dark patterns,” to trick millions of players into making unintentional purchases.

In a historic deal that puts the entire video game industry on notice, Epic agreed to pay a record $520 million in fines and refunds to settle the FTC’s accusations.

The FTC acted at a moment of heightened public concern over the mental health, safety and privacy risks that some popular social media networks and multiplayer video games may pose to children and teenagers.

The crackdown is also the latest indication that the agency is following through on pledges by Lina M. Khan, its chair, to take a more assertive stance toward regulating the tech industry.

Earlier this month, the agency made an aggressive move to stop consolidation among video game makers when it filed a lawsuit to try to block Microsoft’s $69 billion acquisition of Activision Blizzard, the company behind the popular Call of Duty franchise.

“Protecting the public, and especially children, from online privacy invasions and dark patterns is a top priority for the commission,” Khan said in a statement Monday about the Fortnite case. “These enforcement actions make clear to businesses that the FTC is cracking down on these unlawful practices.”

Epic Games said in a statement that it had instituted multiple children’s privacy and purchasing safeguards over the years and that “the practices referenced in the FTC’s complaints are not how Fortnite operates.”

The company’s proposed settlement agreements with the FTC involve record amounts in two separate cases as well as novel legal remedies.

Epic agreed to pay $275 million to settle regulators’ accusations that it violated a federal law, the Children’s Online Privacy Protection Act, by collecting personal information from children younger than 13 who played Fortnite without obtaining verifiable consent from a parent.

In addition, the company made parents “jump through hoops” to have their children’s data deleted and sometimes failed to honor parents’ deletion requests, the agency said in a legal complaint filed on Monday in U.S. District Court for the Eastern District of North Carolina, where Epic is based.

The amount dwarfs the $170 million penalty – the previous record for child privacy violations – that Google agreed to pay in 2019 over accusations that it illegally harvested data from children on YouTube and used it to target them with ads.

Regulators and lawmakers around the world are intensifying their scrutiny of the ways popular social networks and video game companies deal with young people. In September, California enacted a sweeping online children’s safety law that will require companies to turn on the highest privacy settings for children by default and turn off certain features, like precise location tracking, that might expose them to risk.

That law is scheduled to take effect in 2024. Last week, a tech industry trade group sued the state of California in an effort to block it.

The California law came one year after Britain instituted comprehensive online protections for minors. Last year, as British regulators were developing that effort, Google, YouTube, Instagram, TikTok, Snap, Epic Games and other major platforms announced new safeguards for younger users worldwide.

Congress is also working to boost online safety for young people.

Last year, Sen. Edward Markey, D-Mass., introduced a bill that would expand the federal children’s online privacy law to cover ages 13 to 16. Last week, the senator, together with several prominent children’s advocacy groups, urged members of Congress to include provisions from that children and teen privacy bill in end-of-the-year omnibus spending legislation.

“An entire generation of young people is being targeted, tracked and traumatized, as popular platforms rake in profits every day,” Markey said in a statement Monday commending the FTC for taking action against Epic Games. “Congress must meet this moment in history by stepping up and stepping in for the well-being of young people across America.”

In the privacy complaint filed Monday, the FTC said Epic had caused “substantial” injury to children by matching children and teenagers with adult strangers to play Fortnite while enabling live voice and text chats by default.

Children and teenagers had been “bullied, threatened and harassed within Fortnite, including sexually,” the complaint said, adding that some had also been exposed to traumatizing issues like suicide and self-harm through the game. Epic Games was aware of such harms, the complaint said, noting that the company’s user experience director sent an email to executives in 2017 seeking “basic toxicity prevention” for children on Fortnite and asking them to turn off voice chat.

Although Fortnite introduced some relevant privacy and parental controls over the years, regulators said the changes “have not meaningfully alleviated these harms or empowered players to avoid them.”

As part of the proposed settlement, Epic Games is required to adopt high-privacy default settings for children and teenagers, including turning off live text and voice chats by default for younger users – the first time the FTC has imposed such a remedy. The company said it had already done so.

This month, Epic Games said it was introducing a new type of account for young users, called “cabined accounts,” in which features like live chat and in-app purchases are disabled by default. The company said it would default players who indicated they were younger than 13, or the digital age of consent in their countries, to cabined accounts and ask them to provide a parent’s email address before they could gain access to features like live voice chats.

Epic also agreed to pay $245 million to refund consumers over accusations that it used manipulative online practices to trick players of all ages into making unintended purchases. The FTC announced last year that it intended to crack down on such “illegal dark patterns” that can manipulate consumers.

The amount dwarfs previous refunds in such FTC cases, including a settlement last month in which Vonage, an internet phone service provider, agreed to refund $100 million to consumers over accusations that it used dark patterns to make it difficult for consumers to cancel their service.

In a separate dark patterns complaint against Epic Games, regulators said Fortnite’s user interface had a counterintuitive, inconsistent and confusing layout that led users to incur charges with the press of a single button. Players could be charged while trying to activate the game from sleep mode or while the game was loading a screen, the complaint said. Children ended up racking up charges without their parents’ knowledge.

Despite a range of complaints from players and pleas from employees, the company used manipulative tactics to deter consumers from canceling unauthorized charges or requesting refunds, regulators said. These dark-pattern techniques resulted in hundreds of millions of dollars in unwanted charges for users, the agency said.

In a statement, Epic said that, in a fast-moving video game business, long-standing industry practices were no longer enough to protect players.

“No developer creates a game with the intention of ending up here,” the company said in a statement. “We accepted this agreement because we want Epic to be at the forefront of consumer protection and provide the best experience for our players.”

Well before the FTC action Monday, British legislators pushed for stronger safeguards for children on propulsive, multiplayer video games. In a parliamentary hearing on immersive and potentially addictive technologies in 2019, members of the Digital, Culture, Media and Sport Committee grilled video game company officials about children signing up for accounts and racking up purchases without parental knowledge or consent.

“So there is no age verification at all to start with?” Ian C. Lucas, then a member of Parliament for the Labour Party, asked during the hearing about users who start playing Fortnite.

“No. We don’t collect that information,” Matthew Weissinger, the vice president of marketing at Epic Games, replied.

Epic Games instituted age verification for children in Britain last year, requiring users younger than 13 to obtain verified consent from a parent before they could use Fortnite.


