iPhone owners are currently facing multiple threats, but Apple’s new CSAM detection system has generated greater fears than even the biggest hacks. And it just got worse.
In a shocking new report (via BleepingComputer), a team of researchers at Imperial College London has found fundamental flaws in the technology behind Apple’s CSAM (Child Sexual Abuse Material) detection system. Apple intends to roll the system out across all iPhones and iPads running iOS 15, but the report states that it is simple for images to evade detection and that the approach raises “strong privacy concerns” for users.
CSAM detection operates by comparing perceptual hashes (digital fingerprints) of images uploaded from iPhones and iPads against databases of known abuse imagery provided by child safety organizations. If enough matches are found, the account is flagged and, where appropriate, the authorities are notified. In theory, this makes the system rigorous and private. The problem is the Imperial researchers found the whole system (and all systems of its type) can be bypassed simply by applying a near-invisible filter that alters an image’s hash.
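The matching step can be sketched with a toy perceptual hash. The 8x8 “average hash” and the 10-bit threshold below are illustrative assumptions chosen for demonstration only; Apple’s NeuralHash uses a learned neural network whose real parameters are not public.

```python
# Toy perceptual-hash matching: an 8x8 "average hash" stands in for
# Apple's (far more sophisticated) NeuralHash. Each bit records whether
# a pixel is brighter than the image's mean, and two images "match"
# when their hashes differ in only a few bits.

def average_hash(pixels):
    """64-bit perceptual hash of an 8x8 grayscale grid (values 0-255)."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming_distance(h1, h2):
    """Number of bit positions in which two hashes differ."""
    return sum(a != b for a, b in zip(h1, h2))

def matches(h1, h2, threshold=10):
    """Near-duplicate images should land within a few bits of each other."""
    return hamming_distance(h1, h2) <= threshold

# A mild, uniform brightness change leaves the hash intact:
image = [[(r + c) * 16 for c in range(8)] for r in range(8)]
brighter = [[p + 1 for p in row] for row in image]
print(matches(average_hash(image), average_hash(brighter)))  # True
```

This tolerance to small edits is the point of perceptual (rather than cryptographic) hashing: a resaved or resized copy of a known image should still match.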
The filter changes the hash an image produces, and this fooled the detection systems 99.9% of the time. Moreover, the filter is virtually invisible, so images appear unchanged to the human eye. The researchers also found that the only countermeasure Apple could take would be to increase the hash size (from 64 to 256 bits), but such a move significantly increases false positives while also encoding more user data into images, which introduces serious privacy concerns.
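To see why such a filter can be both invisible and effective, consider a toy hash of the same general family. Everything below (a simple 8x8 average hash and a greedy perturbation) is an illustrative sketch under my own assumptions, not the researchers’ actual black-box attack on NeuralHash — but it shows the principle: tiny, targeted pixel changes can flip hash bits.

```python
# Illustrative evasion sketch: against a toy 8x8 "average hash" (each
# bit = pixel brighter than the image mean), pixels sitting within a
# few grey levels of the mean can be pushed just across it. Their hash
# bits flip, yet the pixel changes are far too small to see.

def average_hash(pixels):
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming_distance(h1, h2):
    return sum(a != b for a, b in zip(h1, h2))

def evade(pixels, nudge=3):
    """Shift pixels within `nudge` grey levels of the mean across it."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [
        [
            p + 2 * nudge if mean - nudge <= p <= mean      # lift just above
            else p - 2 * nudge if mean < p <= mean + nudge  # drop just below
            else p
            for p in row
        ]
        for row in pixels
    ]

image = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]
filtered = evade(image)
flips = hamming_distance(average_hash(image), average_hash(filtered))
biggest_change = max(
    abs(a - b) for ra, rb in zip(image, filtered) for a, b in zip(ra, rb)
)
print(flips, biggest_change)  # bits flipped vs. max per-pixel change
```

Attacking a learned hash like NeuralHash requires optimization rather than this greedy rule, but the trade-off is the same: the perturbation is bounded to stay imperceptible while the hash is pushed past any matching threshold.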
“Our results shed strong doubt on the robustness to adversarial black-box attacks of perceptual hashing-based client-side scanning as currently proposed,” explain the researchers. “The detection thresholds necessary to make the attack harder are likely to be very large, probably requiring more than one billion images to be wrongly flagged daily, raising strong privacy concerns.”
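The scale problem the researchers describe can be illustrated with a back-of-the-envelope model. Under the simplifying (and non-empirical) assumption that unrelated images produce uniformly random hashes, the chance that an innocent image lands within the matching threshold grows combinatorially as that threshold is raised to blunt the attack:

```python
from math import comb

def collision_prob(bits, threshold):
    """P(two independent uniform random hashes differ in <= threshold bits).

    A toy random-hash model for intuition only -- real perceptual hashes
    are not uniform, and the paper's figures come from empirical tests.
    """
    return sum(comb(bits, k) for k in range(threshold + 1)) / 2 ** bits

# Loosening the threshold rapidly inflates the odds that an unrelated
# image is wrongly flagged against any one database entry:
for t in (2, 10, 20):
    print(t, collision_prob(64, t))
```

Multiply a per-comparison probability like this by the billions of photos uploaded daily and the many entries in the database, and the “more than one billion images wrongly flagged daily” figure the researchers cite becomes easier to picture.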
CSAM has already been widely condemned. In August, Edward Snowden said it “will permanently redefine what belongs to you, and what belongs to them”, pointing out that governments could force Apple to search for any images they desire.
“I can’t think of any other company that has so proudly, and so publicly, distributed spyware to its own devices… There is no fundamental technological limit to how far the precedent Apple is establishing can be pushed, meaning the only restraint is Apple’s all-too-flexible company policy, something governments understand all too well.”
Having initially defended CSAM detection as merely poorly communicated, Apple subsequently delayed its mass release on iPhones and iPads until 2022. Following these latest revelations, however, questions must be asked about the viability of the system as a whole.
I have reached out to Apple for comment on these findings and will update this post when/if I receive a reply.