But the company dropped a component of the tool that would notify parents when their children under 13 view or send nudes. That decision has won praise from some advocates, who say it protects the privacy of kids’ communications, and has rankled others who say it could leave kids vulnerable to predators and bullies.
“They should have kept parental notifications for kids under 13,” said Stephen Balkam, chief executive officer of the nonprofit Family Online Safety Institute. “There’s no reason we should ask an 11-year-old to keep herself safe online.”
Apple’s new feature, expected this month in iOS 15.2, raises questions about how tech companies balance the rights of both parents and children when they develop products, and how much control kids should have over their tech use.
Many parents would say it’s normal, even expected, for them to know what their kids are doing on their devices. Sexting among kids under 13 has increased in recent years, and so has online child predation.
Thorn, a technology nonprofit focused on protecting children from online sexual exploitation, conducted online surveys of 1,000 kids ages 9 to 17 in both 2019 and 2020. In a report issued last month, 14% of the 9- to 12-year-olds surveyed in 2020 said they had shared sexually explicit images of themselves in the past year, up from 6% in the 2019 survey. The same 2020 survey found that 21% of 9- to 12-year-olds said it’s normal for kids their age to share nudes, up from 13% the year before.
Giving kids the chance to think before sending or opening a nude image could stop them from making a rash decision with major consequences, according to Apple. Kids sometimes share nudes of other kids as a form of bullying, and some teens have died by suicide after such incidents. In many states, it’s illegal for kids to have nude photos of other minors on their phones, even if the exchange was consensual.
Mr. Balkam’s organization has been urging Apple for years to do something about sexually explicit images involving children. He applauded Apple for taking steps, though he said parents shouldn’t be left out. “I think they’re 85% there with this,” he said.
This anti-sexting tool for Messages was announced earlier this year alongside another initiative aimed at protecting victims of child pornography. Apple had planned to roll out a system that would identify known child-pornography images and alert the company if a certain number of them were uploaded to iCloud, Apple’s cloud storage service. After receiving criticism that a tool capable of flagging users’ content could be misused by governments, Apple said it would take more time to collect input. This component won’t be part of the iOS 15.2 release.
Some privacy experts say Apple’s anti-sexting solution for Messages is a compromise that protects children and their privacy.
“The idea of notifying parents of kids under 13 operates with the assumption that there’s a relationship with the child that is safe,” said Elissa Redmiles, a privacy scholar and faculty member at the Max Planck Institute for Software Systems.
In the case of LGBTQ youth, for example, a parent seeing an image that reveals something about their child’s sexuality could lead to conflict or even abuse, she said.
Notifying parents also conveys to kids that they’re being watched, she said.
An Apple engineer the company made available to discuss the feature said that, under the earlier plan, kids warned about nude photos would also have been told that their parents would be notified if they proceeded. While that message was designed to be clear, the engineer said, some kids would likely have glossed over it or might not have understood the implications.
The system Apple decided on still provides useful protection, he said. Parents can choose to monitor their kids’ devices in other ways, such as by manually reviewing their texts and photos (if kids don’t delete them first).
The company also hopes to help stop children from sharing photos with predators. Thorn’s study found that of the 9- to 12-year-olds who have shared nude photos, 36% did so with people they believed to be 18 or older.
How Apple’s new safety feature will work
When Apple releases iOS 15.2, it is expected to include the new communication safety feature as part of Family Sharing settings, which parents will have to turn on from their own phones. When enabled, artificial-intelligence software on the child’s device will detect nude images received in the Messages app or added to it by the child. A nude image will appear blurred in the app until the child chooses to open it.
Attempts to send or open such an image will trigger a message asking kids whether they’re sure they want to proceed. They will also get a second message urging them to talk to someone they trust if they feel pressured to view or send naked photos, along with an option within the Messages app to text a trusted adult for help.
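For readers curious what such an on-device check looks like in code, here is a minimal sketch in Swift. It leans on SensitiveContentAnalysis, a public nudity-detection framework Apple later opened to third-party developers; the Messages implementation itself is private, so the classifier choice and the blur-and-warn flow below are illustrative assumptions, not Apple’s actual code.

```swift
import SensitiveContentAnalysis  // Apple's public on-device nudity classifier
import CoreGraphics

/// Illustrative sketch: decide whether an incoming image should be shown
/// blurred behind a warning. Analysis runs entirely on the device; the
/// image is never uploaded. (Using this framework requires the
/// com.apple.developer.sensitivecontentanalysis.client entitlement.)
func shouldBlurBeforeShowing(_ image: CGImage) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // Analysis is available only when sensitive-content warnings have
    // been switched on for the device, e.g. by a parent via Screen Time.
    guard analyzer.analysisPolicy != .disabled else { return false }

    do {
        let analysis = try await analyzer.analyzeImage(image)
        return analysis.isSensitive  // true -> blur and warn before revealing
    } catch {
        return false                 // analysis failed; show the image normally
    }
}
```

In Messages itself, a positive result would swap the photo for a blurred placeholder and queue the warning screens described above.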
Apple said it won’t have access to the messages or images on its servers, as they remain end-to-end encrypted; the nudity detection happens entirely on the child’s device.
The feature will work on any child-designated account for kids under age 18, as long as the kids’ accounts are linked to the parents’ accounts through Family Sharing.
The feature is also slated for the next updates to iPadOS 15 and macOS, which have their own versions of the Messages app.
The image blurring and warnings will work only in Apple’s Messages app. The feature won’t apply to third-party messaging apps such as WhatsApp and Signal, or to photo-sharing social-media apps such as Instagram and Snapchat.