At first blush, the idea of scanning images synced to iCloud for child sexual abuse material (CSAM) by matching them against a hash list of known CSAM images seems like a good idea. As a survivor of childhood sexual abuse myself, I want tech companies to take some initiative to deal with this issue. They also want to scan images on kids’ phones using AI to see if kids are getting into any trouble with sending or receiving sexual material. Again, that sounds like a good thing. But, as the EFF points out, this all requires a backdoor, and backdoors, once created, almost never remain used for just one purpose.
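For the curious, the hash-matching idea boils down to something like the sketch below. This is a simplified illustration, not how Apple's system actually works: real CSAM-detection systems (PhotoDNA, Apple's NeuralHash) use perceptual hashes that survive resizing and re-encoding, whereas a plain cryptographic hash like SHA-256 only catches byte-identical copies. The hash value and helper names here are made up for the example.

```python
import hashlib
from pathlib import Path

# Hypothetical set of known-bad hashes. In real systems these lists come
# from organizations like NCMEC and use perceptual, not cryptographic, hashes.
KNOWN_HASHES = {
    # SHA-256 of an empty file, used here purely as a stand-in value.
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def file_sha256(path: Path) -> str:
    """Return the hex SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def scan(directory: Path) -> list[Path]:
    """Return every file under `directory` whose hash is on the known list."""
    return [
        p
        for p in sorted(directory.rglob("*"))
        if p.is_file() and file_sha256(p) in KNOWN_HASHES
    ]
```

The point of the sketch is just that matching against a fixed list is mechanically simple; the policy problem is who controls the list and what gets added to it later.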
I hope the folks who lost data can somehow get it back. Losing data to a failure of any kind is a pain in the ass. On the other hand, if the ransomware plague has taught us anything, it’s to have backups, both online and offline. Anything connected to the infected device is at risk, but a copy that isn’t connected to anything is safe.
Yes, it’s more work. Yes, it takes time and effort.
So does figuring out how to deal with losing all of your data.
If you’ve seen references to a court ruling sort of redefining the Computer Fraud and Abuse Act recently, or even if you haven’t, this paragraph from the folks at McGuireWoods boils down the real-life implications pretty well.
Truthfully though, I think this story shows a few really important points:
The fact that mistakes happen
The importance of two-factor authentication
The importance of taking action as soon as you realize the mistake
The importance of getting the technical folks involved immediately instead of hiding it
AI is not without bias, and maybe the best thing we can do is know that going in, instead of assuming that technology will solve this problem.
Hmm, looks like Customs and Border Protection is taking aim at cars, and all the various information stored within: According to statements by Berla’s own founder, part of the draw of vacuuming data out of cars is that so many drivers are oblivious to the fact that their cars are generating so much data in…