A monorail train displaying Google signage passes a billboard advertising Apple iPhone security during the 2019 Consumer Electronics Show (CES) in Las Vegas, Nevada, U.S., on Monday, Jan. 7, 2019.
Bloomberg | Bloomberg | Getty Images
Apple this week announced a system that will enable it to detect child abuse photos uploaded to iCloud storage in the U.S. and report them to the authorities.
Child protection advocates praised the move. John Clark, CEO of the National Center for Missing and Exploited Children — a nonprofit organization created by congressional mandate — called it a “game changer” in a statement.
But the new system, which is now being tested in the U.S., has been fiercely opposed by privacy advocates who warn that it is a slippery slope and could be modified and exploited to censor other kinds of content on people’s devices.
Apple is not alone in its efforts to rid cloud storage of illegal child sexual abuse imagery. Other cloud services already do this. Google has been using hashing technology since 2008 to identify illegal images on its services. In 2019, Facebook said it removed 11.6 million pieces of content related to child nudity and child sexual exploitation in just three months.
Apple says its system is an improvement over industry-standard methods because it uses its control of the hardware and sophisticated cryptography to learn as little as possible about the photos on a person’s phone or cloud account while still flagging illegal child abuse imagery on its cloud servers. It does not scan the actual images; it only compares hashes, the unique numbers that correspond to the image files.
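The core idea of hash matching can be sketched in a few lines. This is a simplified illustration, not Apple’s actual implementation: Apple’s system uses a perceptual hash (NeuralHash) combined with private set intersection so that neither side learns about non-matches, whereas the sketch below uses an ordinary cryptographic hash and a hypothetical in-memory blocklist purely to show the principle of comparing fingerprints rather than image content.

```python
import hashlib

# Hypothetical blocklist of known-image fingerprints. In reality such
# databases are maintained by child-safety organizations; the value here
# is just the SHA-256 digest of the bytes b"known-image" for illustration.
KNOWN_HASHES = {hashlib.sha256(b"known-image").hexdigest()}

def file_hash(data: bytes) -> str:
    """Return the SHA-256 hex digest of a file's raw bytes."""
    return hashlib.sha256(data).hexdigest()

def matches_known_image(data: bytes) -> bool:
    """Compare a file's fingerprint against the blocklist.

    The image content itself is never inspected or stored by the
    checker -- only its hash is compared.
    """
    return file_hash(data) in KNOWN_HASHES
```

One important difference from this sketch: a cryptographic hash like SHA-256 matches only byte-identical files, so resizing or recompressing an image would defeat it. Perceptual hashes such as the one Apple describes are designed to produce the same fingerprint for visually similar images, which is what makes them useful for this task.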
But privacy advocates see the move as the start of a policy change in which Apple could be pressured by foreign governments, for example, to repurpose the system to suppress political speech by requiring Apple to flag images of protests or political memes. Skeptics are not concerned about how the system works today, nor are they defending people who collect known child exploitation images. They worry about how it might evolve in the coming years.
Skeptics worry about how the system might evolve
“Make no mistake: If they can look up child porn today, they can look up anything tomorrow,” NSA whistleblower Edward Snowden wrote on Twitter.
The Electronic Frontier Foundation (EFF), which has supported Apple’s policies on encryption and privacy in the past, criticized the move in a blog post, calling it a “backdoor,” or a system built to give governments a way to access encrypted data.
“Apple can detail how its technical implementation will maintain privacy and security in its proposed backdoor, but at the end of the day, a well-documented, carefully considered, and meticulously limited backdoor is still a backdoor,” the influential nonprofit said in a blog post.
Apple’s new system has also been criticized by the company’s competitors, including Facebook subsidiary WhatsApp, which also uses end-to-end encryption for some of its messages and is under pressure to provide more access to people’s content to combat child exploitation.
“Instead of focusing on making it easier for people to report content that’s shared with them, Apple has built software that can scan all the private photos on your phone – even photos you haven’t shared with anyone,” WhatsApp chief Will Cathcart tweeted on Friday. He said WhatsApp would not adopt a similar system. “That’s not privacy.”
Privacy has become a central part of iPhone marketing. Apple has been public about the security architecture of its systems and is one of the most vocal advocates of end-to-end encryption, meaning even Apple cannot read the content of messages or other data protected that way.
Most notably, in 2016, it faced the FBI in court to protect the integrity of its encryption systems in a mass shooting investigation.
Apple has been heavily criticized for this position. Law enforcement officials around the world have pressured the company to weaken its encryption for iMessage and other software services like iCloud to investigate child exploitation or terrorism.
Apple sees it as a win-win
Apple sees the new system as part of its privacy-protecting tradition: a win-win that protects user privacy while eliminating illegal content. Apple also claims that the system cannot be repurposed for other types of content.
But this is also why privacy advocates see the new system as a betrayal. They feel they have lost an ally who built computers designed to prevent – as much as possible – data from leaking to governments, to Apple, and to other companies. Now they see, as Snowden put it, a system that compares user photos against a “secret blacklist.”
This is partly due to Apple’s own marketing. In 2019, it bought a giant billboard in Las Vegas during an electronics trade show with the slogan “What happens on your iPhone, stays on your iPhone.”
Apple CEO Tim Cook has addressed the “chilling effect” of knowing that what’s on your device may be intercepted and reviewed by third parties. Cook said the lack of digital privacy could lead people to censor themselves even if the person using the iPhone did nothing wrong.
“In a world without digital privacy, even if you do nothing wrong but think differently, you start censoring yourself,” Cook said in his 2019 commencement speech at Stanford University. “Not entirely at first. Just a little, little by little. To risk less, to hope less, to imagine less, to dare less, to create less, to try less, to speak less, to think less. The chilling effect of digital surveillance is profound, and it touches everything.”
Apple’s focus on privacy has been a success for the company. This year, it introduced paid privacy services such as Private Relay, which hides a user’s IP address and thus their location.
Privacy has also been part of the sales pitch as Apple has pushed into lucrative new industries such as personal finance, with a Goldman Sachs-backed credit card, and healthcare, with software that lets users download medical records to their iPhones.
But reputations can erode quickly, especially when they seem to contradict earlier public stances. Privacy and security are complicated and are not accurately conveyed by marketing slogans. Critics of Apple’s new plan to combat child exploitation don’t see a better-engineered system that improves on what Google and Microsoft have been doing for years. Instead, they see a major policy shift from the company that said “what happens on your iPhone stays on your iPhone.”