Apple delaying plan to scan phones for child sex abuse images

Apple announced Friday that it will delay a suite of features aimed at limiting the spread of Child Sexual Abuse Material (CSAM) that had raised serious privacy concerns.

“Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” Apple said in a statement to The Hill.

Two of the three features had drawn significant criticism.

One would alert parents if their children were sending or receiving sexually explicit images. The other would scan photos in a user’s iCloud for CSAM and report any infringing images to Apple moderators.

Apple would then report detected material to the National Center for Missing and Exploited Children, a national clearinghouse that works with law enforcement.

The company had said the cloud scanning feature was “designed with user privacy in mind.” The feature would use a database of known CSAM image hashes and check for matches before photos are uploaded.

If the number of matched images passed a certain threshold, Apple would then step in and determine whether the material should be flagged.
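As a rough illustration of how a threshold-based hash check of this kind might work, here is a minimal sketch. The names (KNOWN_HASHES, MATCH_THRESHOLD, should_escalate) and the plain SHA-256 comparison are placeholders for illustration only; Apple's announced design relied on perceptual hashing and on-device cryptographic matching, which this sketch does not reproduce.

```python
import hashlib

# Hypothetical sketch: compare uploads against a database of known-image
# hashes and escalate only once the number of matches crosses a threshold,
# mirroring the flow described above. Not Apple's actual implementation.

KNOWN_HASHES: set[str] = set()  # would be populated from a known-CSAM hash database
MATCH_THRESHOLD = 30            # placeholder value, not Apple's announced figure

def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash; here simply SHA-256 of the raw bytes."""
    return hashlib.sha256(image_bytes).hexdigest()

def count_matches(images: list[bytes]) -> int:
    """Count how many images match entries in the known-hash database."""
    return sum(1 for img in images if image_hash(img) in KNOWN_HASHES)

def should_escalate(images: list[bytes]) -> bool:
    """Human review would begin only after the match count passes the threshold."""
    return count_matches(images) >= MATCH_THRESHOLD
```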

Despite those privacy checks, watchdogs, experts and advocates quickly lined up to oppose the features.

Critics said that the image scanning capability could function as a backdoor for new surveillance and censorship.

The final feature in the anti-CSAM suite would direct Apple's Siri voice assistant and search features to "intervene" when users perform related searches.
