WhatsApp will not adopt Apple’s new child safety measures, which are aimed at stopping the spread of child abuse images, according to WhatsApp chief Will Cathcart. In a Twitter thread, he explained his belief that Apple “built software that can scan all private photos on your phone,” and said that Apple has taken the wrong approach in trying to improve its response to child sexual abuse material, or CSAM.
Apple’s plan, announced Thursday, involves taking hashes of photos uploaded to iCloud and comparing them against a database containing hashes of known CSAM images. According to Apple, this allows it to keep user data encrypted and run the analysis on-device, while still letting it report users to the authorities if they are found to be sharing child abuse imagery. Another prong of Apple’s child safety strategy involves optional warnings for parents if their child under 13 years old sends or views photos containing sexually explicit content. An internal Apple memo acknowledged that people would be “concerned about the fallout” of the systems.
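To make the mechanism concrete, here is a heavily simplified sketch of the matching flow Apple describes. This is not Apple’s implementation: the real system uses a perceptual “NeuralHash,” blinded on-device matching, and only surfaces results after a threshold of matches; this toy version substitutes a plain SHA-256 digest and an in-memory set, and all names (`KNOWN_HASHES`, `scan_upload_queue`, the threshold value) are illustrative assumptions.

```python
# Illustrative sketch (NOT Apple's implementation) of matching photo
# fingerprints against a database of known-image hashes. Apple's real
# system uses a perceptual NeuralHash and cryptographic blinding; this
# toy uses SHA-256 and a plain set to show the basic flow only.
import hashlib

KNOWN_HASHES = set()   # stand-in for the database of known-image hashes
MATCH_THRESHOLD = 3    # report only after several matches, as Apple describes

def fingerprint(photo_bytes: bytes) -> str:
    """Hypothetical fingerprint: an exact cryptographic digest."""
    return hashlib.sha256(photo_bytes).hexdigest()

def scan_upload_queue(photos: list[bytes]) -> bool:
    """Return True only if the account crosses the match threshold."""
    matches = sum(1 for p in photos if fingerprint(p) in KNOWN_HASHES)
    return matches >= MATCH_THRESHOLD
```

Note the threshold: in Apple’s description, a single match reveals nothing, and the account is only flagged once multiple uploads match the database.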
I read the information provided by Apple yesterday and I am concerned. I think this is the mistaken approach and a setback for the privacy of people all over the world.
People asked if we would adopt this system for WhatsApp. The answer is no.
– Will Cathcart (@wcathcart) August 6, 2021
Cathcart called Apple’s approach “very concerning,” saying it would allow governments with different ideas about what kinds of images are and are not acceptable to require Apple to add non-CSAM images to the databases it compares images against. Cathcart says WhatsApp’s anti-child-exploitation system, which relies in part on user reports, preserves encryption like Apple’s and led the company to report more than 400,000 cases to the National Center for Missing and Exploited Children in 2020. (Apple is also working with the center on its CSAM detection efforts.)
WhatsApp’s owner, Facebook, has its own reasons to pounce on Apple over privacy concerns. Apple’s changes to how ad tracking works in iOS 14.5 started a battle between the two companies, with Facebook buying newspaper ads criticizing Apple’s privacy changes as harmful to small businesses. Apple hit back, saying the change “simply requires” that users be given a choice about whether or not to be tracked.
WhatsApp wasn’t alone in criticizing Apple’s new child safety measures. The list of concerned people and organizations includes Edward Snowden, the Electronic Frontier Foundation, professors, and others. We’ve collected some of that feedback here as an overview of the criticisms leveled at Apple’s new policy.
Matthew Green, an assistant professor at Johns Hopkins University, revealed the feature before it was publicly announced. He tweeted about Apple’s plans and how the hashing system could be abused by governments and malicious actors.
These tools will allow Apple to scan your iPhone photos for photos that match a particular perceptual hash, and report them to Apple’s servers if too many appear.
– Matthew Green (@matthew_d_green) August 5, 2021
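The “perceptual hash” Green refers to differs from an ordinary cryptographic hash in that visually similar images produce similar fingerprints. The toy average-hash below (an assumption for illustration; Apple’s NeuralHash is a neural-network-based perceptual hash, not this) models an image as a flat list of grayscale pixel values and shows why a slightly edited copy still matches while a very different image does not.

```python
# Toy "average hash" to illustrate perceptual hashing: small changes to
# an image barely move the fingerprint, so near-duplicates still match.
# Images are modeled as flat lists of grayscale pixel values; real
# systems hash decoded image features, not raw bytes.

def average_hash(pixels: list[int]) -> int:
    """One bit per pixel: 1 if the pixel is brighter than the mean."""
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

original = [10, 200, 30, 220, 15, 210, 25, 205]
brightened = [p + 5 for p in original]   # slight edit: hash is unchanged
inverted = [255 - p for p in original]   # drastic edit: every bit flips

# hamming(average_hash(original), average_hash(brightened)) -> 0
# hamming(average_hash(original), average_hash(inverted))   -> 8
```

This fuzziness is exactly what makes the matching robust to re-encoding and cropping, and it is also why critics worry about false or engineered matches: nearby-but-different images can collide in ways an exact hash never would.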
The EFF issued a statement criticizing Apple’s plan, describing even a “thoroughly documented, carefully thought-out, and narrowly-scoped backdoor” as still a backdoor. The press release goes into detail about how the EFF believes Apple’s child safety measures could be abused by governments and how they reduce user privacy.
Apple’s filtering of iMessage and iCloud is not a slippery slope to backdoors that suppress speech and make our communications less secure. We’re already there: this is a fully built system just waiting for external pressure to make the slightest change. https://t.co/f2nv062t2n
– EFF (@EFF) August 5, 2021
Kendra Albert, an instructor at Harvard University’s Cyberlaw Clinic, has a thread about the potential dangers to queer children and Apple’s initial lack of clarity about age ranges for the parental notifications feature.
The idea that parents are safe people for teens to have conversations about sex or sexting with is admirable, but in many cases, not true. (And as far as I can tell, this stuff doesn’t just apply to kids under 13.)
– Kendra Albert (@KendraSerra) August 5, 2021
The EFF reports that iMessage nudity notifications won’t go to the parents if the child is between 13 and 17 but that’s nowhere in the Apple documentation that I can find. https://t.co/Ma1BdyqZfW
– Kendra Albert (@KendraSerra) August 6, 2021
Edward Snowden retweeted a Financial Times article about the system, offering his own characterization of what Apple is doing.
Apple plans to modify iPhones to always search for contraband:
“It’s an absolutely horrific idea, because it would lead to widespread surveillance of our phones and laptops,” said Ross Anderson, professor of security engineering. https://t.co/rS92HR3pUZ
– Edward Snowden (@Snowden) August 5, 2021
Politician Brianna Wu called the system “the worst idea in Apple history.”
This is the worst idea in Apple history, and I don’t say it lightly.
It destroys their credibility on privacy. It will be abused by governments. It will get queer children killed and disowned. This is the worst idea ever. https://t.co/M2EIn2jUK2
– Brianna Wu (@BriannaWu) August 5, 2021
Just to state: Apple’s scanning does not detect photos of child abuse. It detects a list of known banned images added to a database, which are initially child abuse imagery found circulating elsewhere. What images are added over time is arbitrary. It doesn’t know what a child is.
– SwiftOnSecurity (@SwiftOnSecurity) August 5, 2021
Security researcher Matt Blaze also tweeted about concerns that overreaching governments could misuse the technology in an effort to block content beyond CSAM.
In other words, not only does the policy have to be exceptionally robust, so does the implementation.
– Matt Blaze (@mattblaze) August 6, 2021
Epic CEO Tim Sweeney also criticized Apple, saying the company “vacuums up everyone’s data into iCloud by default.” He also promised to share more thoughts specifically about Apple’s child safety system.
It’s atrocious how Apple vacuums up everyone’s data into iCloud by default, hides the 15+ separate options to turn off parts of it in Settings underneath your name, and forces you to have an unwanted email account. Apple would NEVER allow a third party to ship an app like this.
– Tim Sweeney (@TimSweeneyEpic) August 6, 2021
Later I’ll share some detailed thoughts on this related topic.
– Tim Sweeney (@TimSweeneyEpic) August 6, 2021
However, not every reaction was critical. Ashton Kutcher (who has done advocacy work to end child sex trafficking since 2011) described Apple’s work as “a major step forward” for efforts to eliminate child sexual abuse.