As the old saying goes: If you aren’t doing anything illegal, then you have nothing to fear from surveillance.
Smartphones already act like tracking devices broadcasting the whereabouts of their owners, but Apple is about to open the door to far more advanced forms of smartphone-based surveillance by launching a new program designed to detect and report iPhone users found to possess child pornography – known by the academic-speak acronym CSAM, for Child Sexual Abuse Material. That's according to a handful of academics who were offered a sneak preview of the company's plans, then promptly spilled the beans on Twitter and in interviews with the press.
The new system, called "neuralMatch", is expected to be unveiled by Apple later this week, with the software installed on American iPhones via a software update. According to the FT, the automated system can proactively alert a team of human reviewers if it believes CSAM is present on a user's iPhone. If the reviewers can verify the material, law enforcement will be contacted.
This is how “neuralMatch” will work, per the FT:
Apple’s neuralMatch algorithm will continuously scan photos that are stored on a US user’s iPhone and have also been uploaded to its iCloud back-up system. Users’ photos, converted into a string of numbers through a process known as “hashing”, will be compared with those on a database of known images of child sexual abuse.
[…]
The system has been trained on 200,000 sex abuse images collected by the US non-profit National Center for Missing and Exploited Children.
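To make the hashing-and-matching step above concrete, here is a minimal sketch in Python. Apple's actual NeuralHash algorithm has not been published, so this stands in a simple "difference hash" (dHash), a common perceptual-hashing technique: the image becomes a short bit string, near-duplicate images yield nearby bit strings, and a photo is flagged when its hash falls within a small Hamming distance of any hash in the known-image database. All function names here are illustrative, not Apple's.

```python
def dhash(pixels):
    """Difference hash of a grayscale pixel grid.

    pixels: list of rows; each adjacent-pixel comparison contributes one bit,
    so an 8x9 grid yields a 64-bit hash. Returns the hash as a list of bits.
    """
    bits = []
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits.append(1 if left < right else 0)
    return bits

def hamming(a, b):
    """Number of bit positions where two hashes differ."""
    return sum(x != y for x, y in zip(a, b))

def matches_database(photo_hash, known_hashes, threshold=5):
    """Flag the photo if its hash is within `threshold` bits of any known hash."""
    return any(hamming(photo_hash, h) <= threshold for h in known_hashes)

# A slightly re-encoded copy of an image still matches, because the
# brightness *gradients* (which direction each pixel steps) are unchanged:
original  = dhash([[10, 20, 30], [30, 20, 10]])
recompressed = dhash([[11, 21, 29], [29, 21, 11]])
print(matches_database(recompressed, [original], threshold=1))  # True
```

The tolerance threshold is what makes the approach robust to resizing and recompression – and also what critics say makes it possible to mislead, since unrelated images can be engineered to collide within the threshold.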
One academic who was offered a preview of the software, Johns Hopkins cryptography professor Matthew Green, explained why it could create serious privacy risks. Apple has gotten a lot of positive press for its commitment to user privacy – remember when it refused to crack an iPhone belonging to one of the San Bernardino shooters? That encryption technology has become a perennial headache for law enforcement, and last January Apple quietly abandoned plans to allow users to fully encrypt their iCloud backups due to complaints from law enforcement.
Now, Apple has found a middle ground: it will assume responsibility for policing iPhones – at least to a degree. To accomplish this, the company is rolling out a new machine-learning tool that will scan iPhones for images matching certain "perceptual hashes" known to represent child pornography. But as academics have complained, that matching process could potentially be misled.
What’s more, the tool that’s being used today to unearth child pornography could one day be abused by authoritarian governments (like the CCP). And once Apple has committed to this type of surveillance, governments will demand it of everyone.
Green isn’t the only expert who objects to the idea. “It is an absolutely appalling idea, because it is going to lead to distributed bulk surveillance of…our phones and laptops,” said Ross Anderson, professor of security engineering at the University of Cambridge.
Though the FT managed to find at least one academic willing to defend Apple’s approach.
Apple’s system is less invasive in that the screening is done on the phone, and “only if there is a match is notification sent back to those searching,” said Alan Woodward, a computer security professor at the University of Surrey. “This decentralised approach is about the best approach you could adopt if you do go down this route.”
Still, others warned that the system is only a few steps removed from ‘1984’-style surveillance. Alec Muffett, a security researcher and privacy campaigner who formerly worked at Facebook and Deliveroo, called Apple’s move “tectonic” and a “huge and regressive step for individual privacy”. “Apple are walking back privacy to enable 1984,” he added.