Apple Unveils Limits to Its Child Sex Abuse Scanner After Critics Question Users’ Privacy
Sputnik news agency and radio 14:12 GMT 14.08.2021
Last week, Apple announced details of its “neuralMatch” software, designed to detect child sexual abuse material on US users’ devices.
Apple has unveiled limits to its new scanner for child abuse content after the technology was criticised over possible violations of user privacy.
The company will allow the “neuralMatch” software, designed to detect Child Sexual Abuse Material (CSAM), to scan photos uploaded to iCloud by US users. If a case of child sexual abuse imagery is confirmed, the National Center for Missing & Exploited Children (NCMEC) will be notified of the user’s account.
Many, however, were quick to claim that the system could be expanded to scan for images unrelated to child abuse, something that could ride roughshod over the privacy of Apple users.
In a 14-page document released on Friday, the company addressed those concerns, explaining that “instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations”.
The document then notes that “Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices”.
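How such on-device matching might work can be illustrated with a short sketch. The Swift code below is not Apple’s implementation: it substitutes a SHA-256 digest for Apple’s NeuralHash perceptual hash and a plain in-memory set for the “unreadable” blinded database the document describes, and the CSAMMatcher type is a hypothetical name used only for illustration.

```swift
import CryptoKit
import Foundation

// Illustrative only: SHA-256 stands in for Apple's NeuralHash perceptual
// hash, and a plain Set stands in for the blinded, "unreadable" database
// of known CSAM hashes described in Apple's document.
struct CSAMMatcher {
    /// Hypothetical local copy of the known-hash database (hex strings).
    let knownHashes: Set<String>

    /// Hash a photo's bytes on the device and check the result
    /// against the local set; no image data leaves the device.
    func matches(photoData: Data) -> Bool {
        let digest = SHA256.hash(data: photoData) // stand-in for NeuralHash
        let hex = digest.map { String(format: "%02x", $0) }.joined()
        return knownHashes.contains(hex)
    }
}

// Usage: the match decision happens entirely on the device.
let matcher = CSAMMatcher(knownHashes: []) // placeholder, empty database
let isMatch = matcher.matches(photoData: Data("example".utf8))
```

A real perceptual hash, unlike the exact-match digest above, is designed so that visually similar images (resized, recompressed) hash to the same value.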
According to the explainer, it would take 30 matching images for the system to activate, which means that “the possibility of any given account being flagged incorrectly is lower than one in one trillion”.
“If and only if you meet a threshold of something on the order of 30 known child pornographic images matching, only then does Apple know anything about your account and know anything about those images, and at that point, only knows about those images, not about any of your other images”, Apple SVP Craig Federighi, for his part, said in an interview with The Wall Street Journal.
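The threshold behaviour Federighi describes can be sketched in a few lines. The constant and function names below are illustrative assumptions based only on the “on the order of 30” figure quoted above, not Apple’s code:

```swift
// Sketch of the reporting threshold described above: Apple learns
// nothing about an account until roughly 30 known-image matches
// accumulate. Threshold value and names are illustrative assumptions.
let matchThreshold = 30

func accountExceedsThreshold(matchedImageCount: Int) -> Bool {
    matchedImageCount >= matchThreshold
}
```

In Apple’s published design the count is enforced cryptographically, via threshold secret sharing, rather than by a plain counter, so the matched images themselves stay unreadable to Apple until the threshold is crossed.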
The remarks followed an open letter released by a group of security and privacy advocates, who warned that Apple’s new software “introduces a backdoor that threatens to undermine fundamental privacy protections for all users of Apple products”.
The letter came after the company said in a statement that it was rolling out the new system because it wants “[…] to help protect children from predators who use communication tools to recruit and exploit them”.