Thousands sign open letter arguing against Apple plan to scan US iPhones for child sexual abuse images
A group of security and privacy advocates is pushing back against Apple’s recently announced plan to scan iPhones and iPads for images of child sexual abuse stored in the cloud, citing concerns over privacy and surveillance.
An open letter, made public online late last week, had as of Monday afternoon been signed by almost three dozen organizations and more than 6,600 individuals, including cryptographers, researchers and security, privacy and legal experts.
The groups and individuals raised concerns about Apple’s new policy, unveiled last week, which would allow the company to scan photos stored on some Apple devices for child abuse imagery and report them to the National Center for Missing and Exploited Children, along with disabling user accounts if such content is found.
The letter’s signatories, however, emphasized that the policy could open a “backdoor” for wider surveillance.
“While child exploitation is a serious problem, and while efforts to combat it are almost unquestionably well-intentioned, Apple’s proposal introduces a backdoor that threatens to undermine fundamental privacy protections for all users of Apple products,” the letter reads.
They specifically raised concerns that the on-device checks Apple plans to use to scan for child abuse imagery could bypass end-to-end encryption and compromise user privacy.
They pointed to concerns raised last week by groups including the Electronic Frontier Foundation, which accused Apple of “opening the door to broader abuses,” and the Center for Democracy and Technology, which said in a statement that the policy would “mark a significant departure from long-held privacy and security protocols.”
The signatories requested that Apple halt its proposed new policy, and that the company issue a statement “reaffirming their commitment to end-to-end encryption and user privacy.”
“Apple’s current path threatens to undermine decades of work by technologists, academics and policy advocates towards strong privacy-preserving measures being the norm across a majority of consumer electronic devices and use cases,” the letter reads. “We ask that Apple reconsider its technology rollout, lest it undo that important work.”
Apple did not respond to The Hill’s request for comment on the petition.
In announcing the new policy last week, Apple stressed its desire to “protect children from predators,” and that the method to detect images of child sexual abuse was “designed with user privacy in mind.”
“Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations,” the statement announcing the new policy read. “Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.”
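For readers unfamiliar with hash matching, the general idea is to compare a fingerprint of an image against a list of fingerprints of known abuse material, rather than inspecting the image content directly. The minimal sketch below illustrates that concept only; it is not Apple’s system, which relies on a perceptual “NeuralHash” and a blinded database combined with private set intersection rather than plain cryptographic hash lookups.

```swift
import Foundation
import CryptoKit

// Illustrative sketch only: check whether an image's cryptographic hash
// appears in a set of known hashes. Apple's announced design instead uses
// a perceptual hash matched against a blinded, on-device database, so the
// device never learns the raw contents of the hash list.
func matchesKnownHash(_ imageData: Data, knownHashes: Set<String>) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}

// Hypothetical usage with a placeholder database of hex-encoded hashes.
let knownHashes: Set<String> = [
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
]
let photo = Data() // stands in for image bytes from a user's photo library
// Prints "true": the placeholder entry is the SHA-256 digest of empty data.
print(matchesKnownHash(photo, knownHashes: knownHashes))
```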