Apple has long positioned user privacy as a cornerstone of its products and services. But now, to protect minors from “predators who use communication tools to recruit and exploit them”, the Cupertino giant has announced that it will scan photos stored on iPhones and in iCloud for child abuse imagery.

The company added that it will integrate “new technology” into iOS 15 and iPadOS 15 to detect Child Sexual Abuse Material (CSAM) stored in iCloud Photos. If the system detects CSAM content, Apple will disable the user’s account and send a report to the National Center for Missing and Exploited Children (NCMEC). However, users who believe they have been mistakenly flagged can file an appeal to have their account reinstated.
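
Apple has not published the code behind this system, but at a high level it is described as matching photos against a database of hashes of known abusive imagery. The sketch below illustrates that general idea only: it uses a cryptographic SHA-256 hash as a stand-in for Apple’s perceptual “NeuralHash”, and every function and name in it is hypothetical, not Apple’s API.

```swift
import Foundation
import CryptoKit

// Illustrative sketch only. SHA-256 stands in for Apple's perceptual hash,
// and all names here are hypothetical; this is not Apple's implementation.

/// Loads hashes of known abusive imagery, as might be supplied by a
/// child-safety organization such as NCMEC. Returned empty as a placeholder.
func loadKnownHashDatabase() -> Set<String> {
    return []
}

/// Returns true if a photo's hash matches an entry in the known-hash database.
func matchesKnownHash(_ photoData: Data, against database: Set<String>) -> Bool {
    let digest = SHA256.hash(data: photoData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return database.contains(hex)
}

/// Simplified flow mirroring the article's description: if uploaded photos
/// match known hashes, the account is flagged and a report would be sent
/// to NCMEC for human review.
func scanUploads(_ photos: [Data]) {
    let database = loadKnownHashDatabase()
    let matchCount = photos.filter { matchesKnownHash($0, against: database) }.count
    if matchCount > 0 {
        print("\(matchCount) match(es) found; account flagged for report to NCMEC.")
    } else {
        print("No matches; no action taken.")
    }
}

// Example usage with stand-in data.
scanUploads([Data("sample photo bytes".utf8)])
```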