
Apple’s platforms will soon scan images on users’ devices, checking photos against a database of known abuse imagery before they are uploaded to iCloud.

This week Apple said its iPhones and iPads will soon begin scanning for images of child sexual abuse, automatically reporting matches when photos are uploaded to Apple’s online storage in the United States.

The new tool reportedly uses a “neural matching function,” dubbed NeuralMatch, to detect whether images on a user’s device match the fingerprints of known child sexual abuse material (CSAM); Apple’s own technical documentation calls the function NeuralHash. Although Apple says it has designed the system with user privacy in mind, critics worry the technology could open the door to misuse, with surveillance the most acute concern.
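
To make the “fingerprint” idea concrete, the sketch below shows how perceptual-hash matching works in general. It is a minimal illustration, not Apple’s implementation: it uses a classic difference hash (dHash) where Apple uses the neural-network-based NeuralHash, and every function and name in it is hypothetical.

```python
# Toy illustration of perceptual-hash matching. Apple's NeuralHash is a
# neural-network-based perceptual hash; this sketch uses a classic
# "difference hash" (dHash) purely to show the general idea. Nothing
# here is Apple's API.

def dhash(pixels):
    """Compute a 64-bit difference hash from an 8x9 grayscale grid.

    pixels: 8 rows of 9 brightness values (0-255). Each bit records
    whether a pixel is brighter than its right-hand neighbor, so the
    hash changes little under resizing, re-encoding, or small edits.
    """
    bits = 0
    for row in pixels:
        for left, right in zip(row, row[1:]):  # 8 comparisons per row
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a, b):
    """Number of bits that differ between two 64-bit hashes."""
    return bin(a ^ b).count("1")

def matches_known_csam(image_hash, fingerprint_db, max_distance=4):
    """Flag an image whose hash lands near any known fingerprint."""
    return any(hamming(image_hash, fp) <= max_distance for fp in fingerprint_db)

# Simulate a near-duplicate (e.g. a re-encoded copy) by flipping two
# bits of a known hash; it still falls within the match distance.
grid = [[(r * 9 + c) * 3 for c in range(9)] for r in range(8)]
known = dhash(grid)
print(matches_known_csam(known ^ 0b101, {known}))  # True: 2 bits differ
```

The point of using a perceptual hash rather than a cryptographic one is exactly this tolerance: near-duplicates land near each other, so resized or re-encoded copies of a known image can still be caught with a small Hamming-distance threshold.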

Privacy experts are raising red flags

Matthew Green, an associate professor at the Johns Hopkins Information Security Institute and a well-known security researcher, is among those raising the alarm. Green has written extensively about Apple’s privacy practices over the years. “I’ve had independent confirmation from multiple people that Apple is releasing a client-side tool for CSAM scanning tomorrow,” Green tweeted in a thread late last night. “This is a really bad idea.”

“These tools will allow Apple to scan your iPhone photos for photos that match a specific perceptual hash, and report them to Apple servers if too many appear.”
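
Green’s “if too many appear” points to a threshold rule: nothing is reported until some number of matches accumulates. The sketch below assumes a naive client-side counter, which is deliberately simplistic; Apple’s published design attaches an encrypted “safety voucher” to each upload and uses threshold secret sharing so that individual matches stay unreadable to Apple until the threshold is crossed. The names and the default threshold here are illustrative assumptions.

```python
def should_report(image_hashes, fingerprint_db, is_match, threshold=30):
    """Return True only once enough images match known fingerprints.

    is_match: callable(image_hash, fingerprint_db) -> bool, e.g. the
    hypothetical matches_known_csam function from the earlier sketch.
    threshold: an assumed value; Apple later said its real threshold
    is on the order of 30 matching images.
    """
    matches = sum(1 for h in image_hashes if is_match(h, fingerprint_db))
    return matches >= threshold
```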

In a blog post, Apple said it wants to protect children from predators by limiting the spread of CSAM. “This program is ambitious, and protecting children is an important responsibility,” Apple said. “Our efforts will evolve and expand over time.”

But Greg Nojeim of the Center for Democracy and Technology in Washington, DC, said that “Apple is replacing its industry-standard end-to-end encrypted messaging system with an infrastructure for surveillance and censorship.”

This, he said, would make users “vulnerable to abuse and scope-creep not only in the United States, but around the world.”

“Apple should abandon these changes and restore its users’ faith in the security and integrity of their data on Apple devices and services,” he added.

The feature is slated to arrive later this year in updates to iOS 15, iPadOS 15, and macOS Monterey, for accounts set up as families in iCloud.