Apple iPhone, iPad and Mac owners are currently facing multiple threats, but Apple’s new CSAM detection system has generated greater controversy than all the rest combined. And it just took another twist.
In a stunning new post, Edward Snowden has delved into Apple’s CSAM (child sexual abuse material) detection system coming to approximately 1.65 billion active iPhones, iPads and Macs next month. He states: “Apple’s new system, regardless of how anyone tries to justify it, will permanently redefine what belongs to you, and what belongs to them.” He also shows what you can do about it — for now.
CSAM detection works by matching a user’s images against a database of known illegal material. “Under the new design, your phone will now perform these searches on Apple’s behalf before your photos have even reached their iCloud servers, and… if enough ‘forbidden content’ is discovered, law enforcement will be notified,” Snowden explains. “Apple plans to erase the boundary dividing which devices work for you, and which devices work for them.”
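To make the mechanism concrete, here is a heavily simplified sketch of on-device hash matching before upload. Apple’s real system uses a perceptual hash (NeuralHash), private set intersection and encrypted “safety vouchers”; everything below — the plain SHA-256 digests, the toy database and the threshold value — is an invented simplification for illustration only.

```python
import hashlib

THRESHOLD = 3  # hypothetical: matches required before any report is triggered

def digest(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash; a real system must match
    # visually similar images, not just byte-identical files.
    return hashlib.sha256(image_bytes).hexdigest()

# Toy stand-in for the on-device database of known-material hashes.
known_db = {digest(b"known-image-1"), digest(b"known-image-2")}

def scan_before_upload(photos: list[bytes]) -> bool:
    """Return True if enough photos match the database to cross the threshold."""
    matches = sum(1 for photo in photos if digest(photo) in known_db)
    return matches >= THRESHOLD
```

The key point Snowden makes maps directly onto this sketch: the scan runs on the device itself, before anything reaches a server, and the owner controls neither `known_db` nor `THRESHOLD`.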
“The day after this system goes live, it will no longer matter whether or not Apple ever enables end-to-end encryption, because our iPhones will be reporting their contents before our keys are even used [his emphasis],” says Snowden. Moreover, while he cites “compelling evidence” from researchers that Apple’s CSAM detection system is seriously flawed, he draws attention to a much bigger point:
“Apple gets to decide whether or not their phones will monitor their owners’ infractions for the government, but it’s the government that gets to decide what constitutes an infraction… and how to handle it.”
Furthermore, Snowden points out that the entire system is easily bypassed, which undermines the stated aim behind its creation:
“If you’re an enterprising pedophile with a basement full of CSAM-tainted iPhones, Apple welcomes you to entirely exempt yourself from these scans by simply flipping the ‘Disable iCloud Photos’ switch, a bypass which reveals that this system was never designed to protect children, as they would have you believe, but rather to protect their brand. As long as you keep that material off their servers, and so keep Apple out of the headlines, Apple doesn’t care.”
And, for those of you already thinking ahead, Snowden points out there is an obvious next step to this process: governments compelling Apple to remove the option to disable photo uploads to iCloud.
“If Apple demonstrates the capability and willingness to continuously, remotely search every phone for evidence of one particular type of crime, these are questions for which they will have no answer. And yet an answer will come — and it will come from the worst lawmakers of the worst governments. This is not a slippery slope. It’s a cliff.”
Researchers have already pointed out all the ways this could be exploited and the markets Apple could be removed from if it doesn’t comply with government requests. There is already precedent here. In May, Apple was accused of compromising on censorship and surveillance in China after agreeing to move the personal data of its Chinese customers to the servers of a state-owned Chinese firm. Apple also states that it provided customer data to the US government almost 4,000 times last year.
“I can’t think of any other company that has so proudly, and so publicly, distributed spyware to its own devices… There is no fundamental technological limit to how far the precedent Apple is establishing can be pushed, meaning the only restraint is Apple’s all-too-flexible company policy, something governments understand all too well.”
Interestingly, Snowden doesn’t touch upon a further key threat: what happens if Apple gets hacked. Building a backdoor into such a far-reaching detection system raises the possibility that Apple itself would not know how its devices were being scanned and manipulated.
“[Apple is] inventing a world in which every product you purchase owes its highest loyalty to someone other than its owner. To put it bluntly, this is not an innovation but a tragedy, a disaster-in-the-making.”
To date, Apple has defended its CSAM detection system, conceding only that it was poorly communicated. But last week researchers, who had worked on a similar system for two years, concluded “the technology was dangerous,” saying “we were baffled to see that Apple had few answers for the hard questions we’d surfaced.”
CSAM detection will launch with iOS 15, iPadOS 15, watchOS 8 and macOS Monterey next month. I have reached out to Apple for comment and will update this post when/if I receive a response.
In the meantime, I would advise all Apple fans to read Snowden’s full post and make up their own minds.