At first blush, the idea of scanning images synced to iCloud for child sexual abuse material against a hash list of known CSAM images seems like a good one. As a survivor of childhood sexual abuse myself, I want tech companies to take some initiative in dealing with this issue. They also want to use AI to scan images on kids’ phones to see whether kids are getting into trouble by sending or receiving sexual material. Again, that sounds like a good thing. But, as the EFF points out, all of this requires a backdoor, and backdoors, once created, almost never remain limited to a single purpose.
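For readers wondering what “scanning against a hash list” actually involves, here is a minimal, hypothetical sketch of the general technique in Python: compute a fingerprint (hash) of each file and check it against a set of known fingerprints. The hash value and folder below are placeholders, and the sketch uses an exact cryptographic hash (SHA-256); real photo-matching systems rely on perceptual hashes that can match visually similar images even after resizing or re-compression, which this toy example does not attempt.

```python
import hashlib
from pathlib import Path

# Placeholder standing in for a list of known-image hashes supplied by a
# child-safety organization. This value is made up for illustration only.
KNOWN_HASHES = {
    "0000000000000000000000000000000000000000000000000000000000000000",
}

def file_hash(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def flag_matches(folder: Path) -> list[Path]:
    """Return files in the folder whose hashes appear on the known-hash list."""
    return [
        p for p in folder.iterdir()
        if p.is_file() and file_hash(p) in KNOWN_HASHES
    ]

if __name__ == "__main__":
    # Scan the current directory as a stand-in for a synced photo library;
    # nothing will match the placeholder hash above.
    for match in flag_matches(Path(".")):
        print(f"Match found: {match}")
```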
Cybersecurity for Attorneys: The Ethics of Incident Response
Legal Tech Trends from 2020 and How to Prepare for 2021
New Report Shows Cellphone Encryption Isn’t Really Stopping Cops From Searching Phones
Small Business: Mental Health Resources During the Pandemic
The Challenges of Chat in eDiscovery as COVID Brings Changes in Work Behaviour and Working From Home
3,000 law firms “could be forced to close or merge”
Your Boss is Your Biggest Cyber-threat, Global Remote Work Survey Finds
Cybersecurity giant FireEye says its hacking tools were stolen by a nation-state
How to get your boss to approve the training you want
The eDiscovery Channel (Blog) Has Become the History Channel
The Importance Of Authentic Networking
Here Are 4 Ways You Can Address And Support Employee’s Mental Health
While watching the Alex Winter film about the Panama Papers, I was struck by one quote, given all of the talk about the “dangers” of encryption.
In the early days of investigating the data leaked from the Mossack Fonseca law firm, it was critical that no one learn the data had leaked, or that it was being investigated at all, so the journalists working with the International Consortium of Investigative Journalists lived by this slogan:
“Shut Up and Encrypt”