Apple unveils plans to scan US iPhones for images of child sex abuse

Apple will roll out an update later this year that adds technology to iPhones and iPads allowing the tech giant to detect images of child sexual abuse stored in iCloud, the company announced Thursday. 

The feature is part of a series of updates Apple unveiled aimed at increasing child safety, but security researchers and advocates warn that the scanning update, along with another that aims to give parents protective tools in children’s messages, could pose data and security risks beyond their intended purpose. 

With the new scanning feature, Apple will be able to report detected child sexual abuse material to the National Center for Missing and Exploited Children (NCMEC), which acts as a comprehensive reporting center and works in collaboration with law enforcement agencies across the country. The company will also disable users’ accounts if abusive content is found, Apple said in the announcement. 

Apple said its method to detect the abusive material is “designed with user privacy in mind.” Instead of scanning images in the cloud, the system performs “on-device matching” using a database of known child sexual abuse material image hashes provided by child safety organizations. 

Before an image is stored in iCloud Photos, an on-device matching process is run against the known hashes, and the result is encoded in a cryptographic safety voucher uploaded with the image. Apple said another technology ensures the contents of these safety vouchers cannot be interpreted by Apple unless the iCloud account crosses a threshold of known child sexual abuse content. 
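
Apple has not released the matching code, but the flow it describes can be sketched in rough terms: hash each photo on the device, check the hash against the known database, and attach the result to the upload as a voucher. The Python sketch below is illustrative only; the SHA-256 hash, the threshold value and the plain-dictionary voucher are stand-ins, since Apple’s real system uses a perceptual hash and cryptographically protected vouchers that are not public.

```python
import hashlib

# Illustrative stand-ins: Apple's actual system uses a perceptual hash and
# encrypted safety vouchers, neither of which has been published.
KNOWN_ABUSE_HASHES: set[str] = set()  # hashes supplied by child-safety groups
MATCH_THRESHOLD = 30                  # hypothetical; Apple has not disclosed it

def image_hash(image_bytes: bytes) -> str:
    """Stand-in for the on-device hash of a photo (not a perceptual hash)."""
    return hashlib.sha256(image_bytes).hexdigest()

def make_safety_voucher(image_bytes: bytes) -> dict:
    """On-device step: record whether the photo matches a known hash.
    In Apple's design this result is encrypted, not stored in the clear."""
    return {"matched": image_hash(image_bytes) in KNOWN_ABUSE_HASHES}

def account_crosses_threshold(vouchers: list[dict]) -> bool:
    """Server-side step: only a count above the threshold becomes readable."""
    return sum(v["matched"] for v in vouchers) >= MATCH_THRESHOLD
```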

“The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account,” Apple said.
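
Apple did not publish the parameters behind that figure, but the arithmetic of a threshold scheme shows how such a small number can arise: if each innocent image has a tiny chance of a false hash match, an account is flagged only when many independent false matches pile up, and those probabilities compound. The numbers below are purely hypothetical, chosen to illustrate the compounding rather than reproduce Apple’s actual analysis.

```python
from math import comb

# Hypothetical parameters; Apple has not disclosed its real values.
n = 10_000   # photos in an account
p = 1e-6     # assumed per-image false-match probability
t = 30       # assumed voucher threshold

# Leading term of the binomial tail P(X >= t); later terms are smaller still.
approx = comb(n, t) * p**t * (1 - p) ** (n - t)
print(f"{approx:.1e}")  # ~3.6e-93, far below one in a trillion
```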

Apple can interpret the content only if the threshold is exceeded; the company will then manually review each report to confirm there is a match, disable the user’s account and send the report to NCMEC. Users who believe an account has been mistakenly flagged can file an appeal to have it reinstated. 

The update was first reported by the Financial Times. Security researchers who spoke to the Times voiced concerns about potential risks the update could have. 

“It is an absolutely appalling idea, because it is going to lead to distributed bulk surveillance of … our phones and laptops,” Ross Anderson, professor of security engineering at the University of Cambridge, told the newspaper. 

Similarly, Matthew Green, a security professor at Johns Hopkins University, told the Times, “This will break the dam — governments will demand it from everyone.” 

The Center for Democracy & Technology (CDT) also released a statement Thursday warning that the update will threaten messaging security, and urged the company to abandon the plans.

“Apple is replacing its industry-standard end-to-end encrypted messaging system with an infrastructure for surveillance and censorship, which will be vulnerable to abuse and scope-creep not only in the U.S., but around the world,” Greg Nojeim, co-director of CDT’s Security & Surveillance Project, said in a statement. “Apple should abandon these changes and restore its users’ faith in the security and integrity of their data on Apple devices and services.”

Along with the scanning feature, Apple announced an update to the Messages app that will warn children and their parents when sexually explicit photos are received or sent. 

When such content is received, the photo will be blurred and the child will be warned and “presented with helpful resources,” according to Apple. Children will also be warned that their parents will get a message if they choose to view the content. 
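
Apple described the feature’s observable behavior but not its implementation. A minimal sketch of that behavior, assuming a hypothetical on-device classifier (the real detector is not public), might look like this:

```python
from dataclasses import dataclass

@dataclass
class Photo:
    data: bytes
    blurred: bool = False

def looks_explicit(photo: Photo) -> bool:
    """Hypothetical stand-in for Apple's on-device image classifier."""
    return False  # the real detection logic is not public

def handle_incoming_photo(photo: Photo, is_child_account: bool,
                          child_confirms_view: bool) -> None:
    if not looks_explicit(photo):
        return                            # delivered normally
    photo.blurred = True                  # photo arrives blurred
    print("Warning shown, with helpful resources.")
    if is_child_account and child_confirms_view:
        print("Parents receive a message.")
        photo.blurred = False             # child chose to view anyway
```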

The CDT also voiced concern over the messaging update, warning that the tool Apple intends to use against predators could instead expose sensitive information about young people’s sexual identities to “unsympathetic adults.” 

“Apple’s retreat from providing secure end-to-end encrypted services opens the door to privacy threats for all users, while creating new threats for young people. In particular, LGBTQ youth and children in abusive homes are especially vulnerable to injury and reprisals, including from their parents or guardians, and may inadvertently expose sensitive information about themselves or their friends to adults, with disastrous consequences,” Nojeim said. 

Apple is also planning to expand guidance in its Siri and Search features, updating them to “intervene” when users search for queries related to child sexual abuse material. 

The “interventions” will explain to users that interest in the topic searched is “harmful and problematic” and provide resources for help with the issue, Apple said.
