
Last week, Apple announced a set of controversial child safety features for iOS, iPadOS, and macOS. Despite their well-intentioned purpose, the CSAM features have received significant backlash over privacy concerns. Today, the Cupertino-based giant published a detailed FAQ answering some critical questions about CSAM detection and user privacy.
Apple posted the six-page FAQ after the announcement drew heavy criticism. Privacy advocates, including Edward Snowden, have slammed the feature, saying it will turn iPhones into "iNarcs."
Apple CSAM FAQ
The FAQ details the features, namely Communication Safety in Messages, CSAM detection in iCloud Photos, and the security of the CSAM detection process. You can read the full FAQ here.
With the FAQ, Apple is aiming to resolve the privacy concerns surrounding CSAM detection. But are you convinced? What do you think about Apple scanning the iCloud Photo Library for CSAM? Do you consider it a breach of your privacy? Drop a comment and let us know your thoughts!