Apple Swears Governments Can’t Co-Opt Its Child Abuse Detection Tools for Surveillance

Apple recently confirmed a report that claimed it was planning to go hunting for illegal materials on iPhones. This, from a company that has long promoted its privacy protections. Condemnation was swift from civil liberties groups, but many have withheld judgment because of Apple’s intended target: images of child sexual abuse. Apple has now circled back to promise that it won’t allow governments to co-opt this tool for surveillance purposes.
As detailed last week, Apple’s upcoming “Expanded Protections for Children” initiative has two features. First, iOS 15 will be able to scan the photos on a device for known Child Sexual Abuse Material (CSAM). The system creates hashes of the files on your phone and compares them to hashes of known child abuse images provided by the National Center for Missing and Exploited Children (NCMEC). This check is completed on the device before anything is uploaded to iCloud. The second feature is iMessage protection for child accounts: using machine learning, iOS can determine whether an incoming photo is sexually explicit and warn the user before showing the image.
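The hash-matching idea is simple in principle. Here’s a minimal sketch in Swift of that general approach, using a plain SHA-256 digest and a made-up hash list; Apple’s actual system relies on a perceptual hash designed to survive resizing and recompression plus a more involved on-device matching protocol, so every name and detail below is illustrative only.

```swift
import Foundation
import CryptoKit

// Hypothetical, simplified illustration of hash-based matching. The hash list
// and file path are placeholders, not anything from Apple's real system.

/// Stand-in for the database of known hashes (hex strings) derived from NCMEC.
let knownHashes: Set<String> = [
    "placeholder-hash-1",
    "placeholder-hash-2",
]

/// Hash a photo's raw bytes and report whether it matches a known entry.
func matchesKnownDatabase(photoData: Data) -> Bool {
    let digest = SHA256.hash(data: photoData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}

// Usage: check a file before it would be uploaded.
if let data = try? Data(contentsOf: URL(fileURLWithPath: "/path/to/photo.jpg")),
   matchesKnownDatabase(photoData: data) {
    print("Match found; flag for review")
}
```

The key property of this design is that the check only ever answers “is this file already in the known list?” rather than inspecting what a photo depicts, which is why the contents of that hash list matter so much.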
This is a delicate situation: “won’t someone please think of the children” has been used to justify plenty of questionable activities, both online and off. Whatever the intent, Apple has built a tool that can scan your phone for files based on their hashes, and it says only known CSAM will trigger an alarm. However, there is theoretically nothing stopping Apple from updating those hashes to go snooping for non-CSAM pictures. Well, nothing but Apple’s word.
In its latest statement (PDF), Apple has tried to assuage fears of file scanning overreach. “Apple’s CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future,” Apple said.

Apple has also clarified that a matching hash is not forwarded to law enforcement immediately. Instead, Apple will conduct human verification before disabling the account and forwarding the data to NCMEC. In addition, the same set of CSAM hashes will be stored locally on every iOS device, so it will be “impossible” for someone to inject a hash that causes a false positive.
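As a rough sketch of the flow Apple describes, here’s what that gating looks like, with entirely hypothetical names (none of these types or functions are Apple’s actual code or APIs): a match opens a case, a human reviewer confirms or rejects it, and only a confirmed case leads to a disabled account and a report to NCMEC.

```swift
// Hypothetical illustration of the review flow described above. Everything
// here is assumed for the example; it only shows that a hash match is gated
// behind human review before anything is disabled or reported.

struct MatchCase {
    let accountID: String
    let matchedHash: String
}

enum ReviewOutcome {
    case confirmed      // reviewer verifies the flagged content is CSAM
    case falsePositive  // reviewer rejects the match; nothing else happens
}

func disableAccount(_ id: String) {
    print("Account \(id) disabled")
}

func reportToNCMEC(_ matchCase: MatchCase) {
    print("Report for account \(matchCase.accountID) forwarded to NCMEC")
}

/// A match is only acted on after a human reviewer confirms it.
func handle(_ matchCase: MatchCase, reviewer: (MatchCase) -> ReviewOutcome) {
    switch reviewer(matchCase) {
    case .confirmed:
        disableAccount(matchCase.accountID)
        reportToNCMEC(matchCase)
    case .falsePositive:
        break  // the match never leaves the review pipeline
    }
}

// Example: a reviewer who rejects the match produces no report at all.
handle(MatchCase(accountID: "user-123", matchedHash: "abc123")) { _ in .falsePositive }
```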
Again, this is what Apple says about the technology. Your willingness to believe Apple will depend on how you feel about the company’s track record. It’s fair to say that Apple has stood up to law enforcement, often in high-profile ways. However, it’s still a company that exists to make money, and governments have monetary and legal tools to force companies to do their bidding.