Apple has quietly removed details of its CSAM (Child Sexual Abuse Material) Detection feature from its website, hinting that it may have abandoned the feature entirely after delaying it over the negative feedback it received. However, that might not be the case.
Apple’s Child Safety page no longer mentions the CSAM Detection feature. CSAM Detection, which has been a subject of controversy ever since it was announced in August, uses on-device hashing (NeuralHash) to match photos being uploaded to iCloud Photos against a database of known CSAM imagery, an approach Apple says preserves user privacy. Even so, the feature drew widespread scrutiny over its privacy implications and concerns about how easily it could be misused.
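To make the basic idea concrete, here is a minimal, purely illustrative sketch of on-device hash matching against a blocklist of known hashes. It is not Apple's implementation: the real system uses a perceptual hash (NeuralHash) plus cryptographic threshold techniques, whereas this sketch uses a plain SHA-256 digest and a hypothetical `KNOWN_CSAM_HASHES` set just to show the concept of checking a photo before upload.

```python
import hashlib
from pathlib import Path

# Hypothetical database of hashes of known CSAM images, supplied by child-safety
# organizations. In Apple's design this is a blinded set of perceptual (NeuralHash)
# hashes; a plain set of hex digests is used here purely for illustration.
KNOWN_CSAM_HASHES: set[str] = set()

def image_hash(path: Path) -> str:
    """Placeholder hash. Apple's system uses a perceptual hash, which matches
    visually similar images, not an exact cryptographic digest like this one."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def matches_known_csam(path: Path) -> bool:
    """Illustrative on-device check run before a photo is uploaded to iCloud Photos."""
    return image_hash(path) in KNOWN_CSAM_HASHES
```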
Now, it remains to be seen how and when Apple plans to roll out the CSAM Detection feature officially. Since the feature has not received a warm welcome, Apple will have to tread carefully whenever it is ready for an official release. We will keep you posted, so stay tuned for updates.