At first blush, the idea of scanning images synced to iCloud against a hash list of known child sexual abuse material (CSAM) seems like a good one. As a survivor of childhood sexual abuse myself, I want tech companies to take some initiative on this issue. They also want to use AI to scan images on kids’ phones to flag sexually explicit material being sent or received. Again, that sounds like a good thing. But, as the EFF points out, all of this requires a backdoor, and backdoors, once created, almost never stay limited to their original purpose.