Apple’s Image Scanning Tool Is, Well, Complicated
At first blush, the idea of scanning images synced to iCloud against a hash list of known child sexual abuse material (CSAM) seems like a good idea. As a survivor of childhood sexual abuse myself, I want tech companies to take some initiative to deal with this issue. Apple also wants to scan images on kids’ phones using AI to see if kids are getting into any trouble sending or receiving sexual material. Again, that sounds like a good thing. But, as the EFF points out, this all requires a backdoor, and backdoors, once created, almost never remain used for just one purpose.
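To make the mechanics a little more concrete, here is a rough sketch of the hash-list idea in Swift. This is purely illustrative and not Apple’s actual design: the real system reportedly uses a perceptual hash (NeuralHash) plus cryptographic machinery like private set intersection, while this toy version just checks an ordinary SHA-256 fingerprint against a stand-in blocklist.

```swift
import Foundation
import CryptoKit

// Hypothetical illustration only. Apple's announced system uses a perceptual
// hash and private set intersection; this sketch only shows the basic
// "does this photo's fingerprint appear on a list?" idea.

// Stand-in for a database of fingerprints of known images (placeholder value).
let knownBadHashes: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"
]

/// Returns true if the image data's SHA-256 fingerprint appears on the blocklist.
func matchesKnownHash(_ imageData: Data, against blocklist: Set<String>) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return blocklist.contains(hex)
}

// Example: check a photo before it would be synced.
let photoData = Data("pretend these are image bytes".utf8)
if matchesKnownHash(photoData, against: knownBadHashes) {
    print("Flag for review")   // in Apple's design, a safety voucher is generated instead
} else {
    print("No match")
}
```

The implementation details aren’t really the point, though. The point is that once a device can check your photos against a list someone else controls, what goes on that list becomes the whole question.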
And, that’s a problem.
That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change. Take the example of India, where recently passed rules include dangerous requirements for platforms to identify the origins of messages and pre-screen content. New laws in Ethiopia requiring content takedowns of “misinformation” in 24 hours may apply to messaging services. And many other countries—often those with authoritarian governments—have passed similar laws. Apple’s changes would enable such screening, takedown, and reporting in its end-to-end messaging. The abuse cases are easy to imagine: governments that outlaw homosexuality might require the classifier to be trained to restrict apparent LGBTQ+ content, or an authoritarian regime might demand the classifier be able to spot popular satirical images or protest flyers.
See, that’s the complication: once the system exists for one purpose, it’s nothing for some government actor to start demanding it be used for the next thing, and the next, and the next. Apple will end up either cooperating or choosing not to sell iPhones in that country anymore, and I’m not holding my breath that they will choose the latter.
That doesn’t even touch on someone misusing the tool. Once there is a way to scan what should have been encrypted content, it becomes a juicy target for internal and external bad actors as well. The idea that your photos and iMessages are private may have to be a thing of the past. Maybe we’re OK with that if it means catching some child predators and protecting kids from sexting, but are we OK with it if it also enables Russia to persecute LGBTQ+ folks? We won’t be the only ones drawing the line on what it scans for; everyone is going to want a say in the matter, and Apple will no longer be able to claim it isn’t possible. It clearly is possible.
There’s no simple answer here. We can have full encryption and privacy in our messages and photos, or we can have an ever-changing definition of what Apple will end up looking for. There’s absolutely no guarantee that Apple will stop here, not when there are so many other groups who will be applying pressure to use these tools for their own purposes.
What price are you willing to pay, and what line would Apple have to cross with its scanning before you get angry about it?
For another take, in which Daring Fireball comes away saying the all-important phrase “if they only work as described,” you can check here: https://daringfireball.net/2021/08/apple_child_safety_initiatives_slippery_slope