Apple to Scan iPhones for Images of Child Sex Abuse

Apple on Friday unveiled plans to scan U.S. iPhones for images of child sexual abuse, drawing applause from child protection groups but raising concern among some security researchers.

Apple’s tool, known as “neuralMatch,” will use an on-device machine learning algorithm to detect images of child sexual abuse without decrypting private communications. The tool will scan the device, and once it finds a match, the image will be reviewed by a human, who can notify law enforcement if necessary.

“Apple’s method of detecting known CSAM (child sexual abuse material) is designed with user privacy in mind … Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC (National Center for Missing and Exploited Children) and other child safety organizations,” said Apple.
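In essence, the system Apple describes compares a fingerprint of each on-device photo against a list of known fingerprints, with a human review step before any report is made. The sketch below is a rough illustration of that flow, not Apple’s implementation: the image_fingerprint function, the KNOWN_HASHES set, and the scan_photo helper are hypothetical stand-ins, and a cryptographic hash is used in place of Apple’s perceptual “NeuralHash” only so the example runs without any image libraries.

    # Minimal sketch of on-device hash matching (hypothetical; not Apple's NeuralMatch code).
    import hashlib

    def image_fingerprint(image_bytes: bytes) -> str:
        # Stand-in fingerprint: Apple describes a perceptual "NeuralHash"; a
        # cryptographic hash is used here only to keep the example self-contained.
        return hashlib.sha256(image_bytes).hexdigest()

    # Hypothetical database of known-CSAM fingerprints (in practice derived from
    # hashes supplied by NCMEC and other child safety organizations).
    KNOWN_HASHES = {"<known-csam-fingerprint>"}

    def scan_photo(image_bytes: bytes) -> bool:
        # A match is only a candidate: per Apple's description, a human reviewer
        # checks flagged content before law enforcement is notified.
        return image_fingerprint(image_bytes) in KNOWN_HASHES

    print("match" if scan_photo(b"example photo bytes") else "no match")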

Researchers said the tool could be put to other purposes such as government surveillance of dissidents or protesters.

Matthew Green, a cryptography researcher at Johns Hopkins University, warned that the technology could theoretically be used to frame innocent people by sending them seemingly innocuous images engineered to trigger matches with known child abuse material.
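Green’s concern rests on the fact that perceptual hashes, unlike cryptographic ones, deliberately discard fine detail so that resized or lightly edited copies of an image still match. The toy “average hash” below is a generic sketch of that idea, not Apple’s NeuralHash: it shows how two images with different pixel values can share a fingerprint, which is the property an attacker crafting collisions would exploit.

    # Toy perceptual "average hash" (generic illustration only, not NeuralHash).
    def average_hash(pixels):
        # pixels: flat list of grayscale values (0-255); each bit records whether
        # a pixel is brighter than the image's mean, discarding fine detail.
        mean = sum(pixels) / len(pixels)
        return tuple(1 if p > mean else 0 for p in pixels)

    # Two 4x4 "images" with different pixel values but the same bright/dark pattern.
    image_a = [200, 10, 200, 10, 10, 200, 10, 200, 200, 10, 200, 10, 10, 200, 10, 200]
    image_b = [180, 40, 180, 40, 40, 180, 40, 180, 180, 40, 180, 40, 40, 180, 40, 180]

    print(average_hash(image_a) == average_hash(image_b))  # True: identical fingerprints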

NeuralMatch will land in iOS 15 and macOS Monterey, which are scheduled to be released in the next month or two.

