Google’s Child Abuse Detection Tools Can Also Identify Illegal Drawings of Children

Apple was hit with a wave of criticism earlier this year when it announced plans to scan iPhones to stop the distribution of Child Sexual Abuse Material (CSAM). Critics fretted that Apple’s hash-checking system could be co-opted by governments to spy on law-abiding iPhone users. In response to the backlash, Apple might end up making changes to that program, but Google has its own way of spotting CSAM, and it might be even more intrusive for those who use all of Google’s cloud services.
The specifics on Google’s CSAM scanning come by way of a warrant issued in early 2020 and spotted by Forbes. According to the filing, Google detected CSAM in Google Drive, its cloud storage platform. And here’s where things get a little weird: the warrant stemming from that report targeted digital artwork, not a photo or video depicting child abuse.
Apple’s system, part of its “Expanded Protections for Children” initiative, compares hashes of files on iDevices against hashes of known child abuse material. Matching against known hashes should prevent false positives, and it doesn’t require Apple to look at any of the files on your phone. The objection raised most often is that Apple is still scanning personal files on your smartphone, and that it could become a privacy nightmare if someone managed to substitute different hashes. Apple says this isn’t possible, though.
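To make the hash-matching idea concrete, here is a minimal, hypothetical Python sketch. It is not Apple’s or Google’s actual implementation: real systems rely on perceptual hashes that survive re-encoding and resizing, whereas this example uses an ordinary SHA-1 digest and a placeholder KNOWN_HASHES set purely to illustrate the matching step.

```python
import hashlib

# Hypothetical database of hashes of known illegal material (placeholder values only).
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c4",
    "9d5ed678fe57bcca610140957afab571372e2d1e",
}

def file_digest(path: str) -> str:
    """Compute a SHA-1 digest of the file at `path` (a stand-in for a perceptual hash)."""
    h = hashlib.sha1()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def is_known_match(path: str) -> bool:
    """Flag a file only if its hash appears in the known-hash database."""
    return file_digest(path) in KNOWN_HASHES
```

The appeal of this design is that nothing is flagged unless it matches a curated database of already-identified material, which is why exact-match systems rarely produce false positives and why the scanner never needs to interpret the content of non-matching files.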
Google, as it turns out, does something similar. It uses a technique initially developed for YouTube to look for hashes of known CSAM, but it also uses machine learning to identify new, previously unseen images of child abuse. It’s not clear which method flagged the problematic files in 2020, but the unidentified individual is described as an artist. That suggests he created the drawings at issue himself, and Google’s systems identified them as CSAM.
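Google’s approach effectively adds a second stage on top of that kind of hash check: a trained classifier that can flag material no one has hashed before. The sketch below is purely illustrative, assuming a hypothetical classify function that returns a probability and a made-up 0.98 threshold; Google has not published the details of its model.

```python
from dataclasses import dataclass
from typing import Callable, Optional, Set

@dataclass
class ScanResult:
    path: str
    reason: str                   # "hash_match" or "classifier"
    score: Optional[float] = None

def scan_file(path: str,
              known_hashes: Set[str],
              hash_fn: Callable[[str], str],
              classify: Callable[[str], float],
              threshold: float = 0.98) -> Optional[ScanResult]:
    """Two-stage check: exact match against known hashes first, then a
    machine-learning score for previously unseen material."""
    if hash_fn(path) in known_hashes:
        return ScanResult(path, "hash_match")
    score = classify(path)        # hypothetical model; not Google's actual classifier
    if score >= threshold:
        return ScanResult(path, "classifier", score)
    return None
```

The second stage is what lets a system like Google’s surface original drawings that exist in no hash database, and it is also where judgment calls about what counts as art enter the picture.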

After Google’s system spotted the drawings, it sent the data to the National Center for Missing and Exploited Children, which passed it along to the DHS Homeland Security Investigations unit. Investigators filed the warrant to get access to the user’s data. The artist has not been identified, as no charges were ever brought. However, US law holds that drawings depicting child abuse can still be illegal if they lack “serious literary, artistic, political, or scientific value.” That’s hard to prove, and even agreeing on a definition of “art” can be a challenge, which may explain why no charges were brought in this case.
While Google’s use of AI is more aggressive than Apple’s, it also appears to be limited to cloud services like Gmail and Drive. Google isn’t set up to scan Android phones for hashes the way Apple is on the iPhone, but Google’s approach can sweep up original artwork that may or may not be illegal, depending on who you ask. Whatever counts as “art,” Google isn’t doing this on a whim: CSAM is an undeniable problem on every cloud service. Google says it reported 3.4 million pieces of potentially illegal material in 2021, up from 2.9 million the year before.