Apple confirms that it has stopped plans to roll out CSAM detection system

Back in 2021, Apple announced a number of new child safety features, including Child Sexual Abuse Material (CSAM) detection for iCloud Photos. However, the move was widely criticized due to privacy concerns. After putting it on hold indefinitely, Apple has now confirmed that it has stopped its plans to roll out the CSAM detection system.





from 9to5Mac https://ift.tt/NczyOXl
December 07, 2022 at 11:58PM