Apple has defended itself against concerns about its upcoming child safety features, saying it doesn’t believe its tool to identify child sexual abuse material on a user’s device creates a backdoor that reduces privacy.
The Cupertino, California-based tech giant made the comments at a briefing Friday, a day after it revealed new features for iCloud, Messages and Siri to combat the spread of sexually explicit images of children. The company reiterated that it does not scan a device owner’s entire photo library for offensive images, but instead uses cryptographic hashing to compare images against a known database provided by the National Center for Missing and Exploited Children.
Some privacy advocates and security researchers were concerned after Apple’s announcement, fearing the company would scan a user’s entire photo collection; instead, the company uses an on-device algorithm to detect known sexually explicit images. Apple said it will only manually review flagged photos from a user’s device if the algorithm finds a certain number of them. The company also said it could tweak the algorithm over time.
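The matching-plus-threshold flow described above can be sketched in a few lines. This is a deliberately simplified illustration, not Apple’s actual system: Apple uses perceptual hashes (NeuralHash) and cryptographic protocols rather than plain SHA-256, and the real review threshold is not public, so `REVIEW_THRESHOLD` and the database contents here are assumptions.

```python
import hashlib

# Hypothetical known-image hash database. In Apple's system the hashes
# come from NCMEC and are perceptual hashes, not SHA-256; SHA-256 is
# used here only to keep the sketch self-contained.
KNOWN_HASHES = {
    hashlib.sha256(b"known-offending-image-1").hexdigest(),
    hashlib.sha256(b"known-offending-image-2").hexdigest(),
}

# Assumed value for illustration; Apple has not published the real number.
REVIEW_THRESHOLD = 2

def count_matches(photo_bytes_list):
    """Count how many photos hash-match the known database."""
    return sum(
        hashlib.sha256(photo).hexdigest() in KNOWN_HASHES
        for photo in photo_bytes_list
    )

def needs_manual_review(photo_bytes_list):
    """Flag an account for human review only once the match count
    reaches the threshold, mirroring the behavior described above."""
    return count_matches(photo_bytes_list) >= REVIEW_THRESHOLD
```

The point of the threshold is that a single accidental match never triggers human review; only an accumulation of matches does.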
Apple said it’s not breaking end-to-end encryption with a new feature in the Messages app that analyzes photos sent to and from a child’s iPhone for explicit material, and the company won’t be able to access a user’s messages. When asked at the briefing whether the new tools mean the company will add end-to-end encryption to iCloud storage backups, Apple said it would not comment on future plans. End-to-end encryption, the strictest form of privacy, allows only the sender and receiver to see the messages sent between them.
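The core idea of end-to-end encryption, that a relaying server handles only ciphertext it cannot read, can be shown with a toy example. This is a minimal one-time-pad sketch for illustration only; real messengers use key-agreement protocols and authenticated encryption, none of which is modeled here.

```python
import secrets

def xor_bytes(data, key):
    # One-time-pad-style XOR; the key must be at least as long as the data.
    return bytes(d ^ k for d, k in zip(data, key))

# Sender and receiver share a secret key; the relay server never sees it.
message = b"meet at noon"
shared_key = secrets.token_bytes(len(message))

# The server relays only this ciphertext and cannot recover the message
# without the key.
ciphertext = xor_bytes(message, shared_key)

# The receiver, holding the shared key, recovers the original message.
decrypted = xor_bytes(ciphertext, shared_key)
```

Because decryption requires the shared key, anyone in the middle, including the service operator, sees only unreadable bytes.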
The Electronic Frontier Foundation said Thursday that Apple is opening a back door that undermines privacy protections for users of the new tools. “It is impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children,” the EFF said in a post on its website. “As a result, even well-intentioned efforts to build such a system will break key promises of the messenger’s encryption itself and open the door to broader abuses.”
Apple said the system had been in development for years and was not designed for governments to monitor citizens. The system is available only in the US, the company added, and works only if the user has enabled iCloud Photos.
Dan Boneh, the cryptography researcher tapped by Apple to support the project, defended the new tools.
“This issue affects many cloud service providers,” he said. “Some cloud service providers address this issue by scanning photos uploaded to the cloud. Apple chose to invest in a more complicated system that provides the same functionality, but does so without its servers looking at every image.”