Apple criticised for system that detects child abuse images
Tech & IT desk || shiningbd
Apple is facing criticism over a new system that finds child sexual abuse material (CSAM) on US users' devices.
The technology will search for matches against known CSAM before an image is uploaded to iCloud Photos.
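In outline, the matching step compares a fingerprint of each photo against a database of fingerprints derived from known abuse images. The sketch below is a deliberately simplified illustration of that idea: it uses an ordinary cryptographic hash and an invented KNOWN_HASHES set, whereas Apple's actual system uses a perceptual hash (NeuralHash) and cryptographic protocols so that non-matching photos reveal nothing.

```python
import hashlib
from pathlib import Path

# Hypothetical database of fingerprints of known images, as supplied by
# child-safety organisations. A real deployment uses perceptual hashes,
# not plain cryptographic hashes, so that resized or re-encoded copies
# of the same image still match.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def matches_known_database(path: Path) -> bool:
    """Return True if the image's fingerprint appears in the known-hash set."""
    digest = hashlib.sha256(path.read_bytes()).hexdigest()
    return digest in KNOWN_HASHES

def check_before_upload(path: Path) -> None:
    # In the scheme Apple describes, this comparison happens on the device,
    # before the photo is uploaded to iCloud Photos.
    if matches_known_database(path):
        print(f"{path.name}: match against known database")
    else:
        print(f"{path.name}: no match, upload proceeds normally")
```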
But there are concerns that the technology could be expanded and used by authoritarian governments to spy on their own citizens.
WhatsApp head Will Cathcart called Apple's move "very concerning".
Apple said that new versions of iOS and iPadOS - due to be released later this year - will have "new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy".
The system will report a match, which is then reviewed by a human. Apple can then disable the user's account and make a report to law enforcement.
The company says the new technology offers "significant" privacy benefits over existing techniques - as Apple only learns about users' photos if they have a collection of known child sexual abuse material in their iCloud account.
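The "collection" wording reflects a threshold mechanism: single matches are not supposed to reveal anything, and an account is only surfaced for review once the number of matched images passes a limit. The snippet below illustrates that threshold idea only; Apple's actual design uses threshold secret sharing so match "vouchers" stay cryptographically unreadable until the limit is crossed, and the numeric value here is invented for demonstration.

```python
# Illustrative threshold; not Apple's published parameter.
MATCH_THRESHOLD = 30

def account_flagged_for_review(match_count: int) -> bool:
    """An account is flagged for human review only once the number of
    matched images reaches the threshold; isolated matches are ignored."""
    return match_count >= MATCH_THRESHOLD

# Example: a handful of matches stays below the line, a large collection does not.
print(account_flagged_for_review(3))   # False
print(account_flagged_for_review(45))  # True
```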
But WhatsApp's Mr Cathcart says the system "could very easily be used to scan private content for anything they or a government decides it wants to control. Countries where iPhones are sold will have different definitions on what is acceptable".
He argues that WhatsApp's system to tackle child sexual abuse material has reported more than 400,000 cases to the US National Center for Missing and Exploited Children without breaking encryption.