Shared Links (weekly) Dec. 21, 2025
For more like this, subscribe to the newsletter and get these links and more in your email.
Every employee is probably learning about AI because their job demands it, learning feature after feature of the tools they use to do their job, learning new systems that get rolled out every year, and dealing with technological change at a ridiculous pace.
Then, we make them responsible for learning how to stay secure and deal with all of the hack attempts that may come their way, too.
It’s all too much. Most of your users aren’t going to put in that kind of effort, and a yearly reminder about data security isn’t going to help them keep up with the variety of risks that are out there. It might not be worth the money you spend on it.
At first blush, the idea of scanning images synced to iCloud for child sexual abuse material against the hash list of known CSAM images seems like a good idea. As a survivor of childhood sexual abuse myself, I want tech companies to take some initiative to deal with this issue. Apple also wants to scan images on kids’ phones using AI to see if kids are getting into any trouble with sending or receiving sexual material. Again, that sounds like a good thing. But, as the EFF points out, this all requires a backdoor, and backdoors, once created, almost never remain used for just one purpose.
Charlton makes some interesting observations and draws on some history to make this point, but it’s one we all need to keep in mind when we think of how we might be relying on AI to help us solve societal issues: We often call on technology to help solve problems. But when society defines, frames,…