As part of its efforts to improve child safety features, Apple revealed plans last month to scan iCloud Photos for potential Child Sexual Abuse Material (CSAM). Following backlash from security experts and digital rights groups like the Electronic Frontier Foundation, Apple has now delayed the rollout of CSAM detection.
Apple was initially set to roll out CSAM detection later this year for accounts set up as families in iCloud on iOS 15, iPadOS 15, and macOS Monterey. The Cupertino giant has not yet revealed a new date for the feature's rollout. Apple has also not detailed which aspects of CSAM detection it plans to improve, or how it will approach the feature to strike a healthy balance between privacy and safety.