Try not to get whiplash: at the same time that public officials in the US, and all over the world, are screaming about how much information tech companies have about us, and the violation of our privacy involved in that, the FBI loses its mind the moment one of those companies even hints at creating a fully encrypted communication tool that even the company itself can't access.
Here is what they have to say about encryption:
“It can’t be a sustainable end state for there to be an entirely unfettered space — that’s utterly beyond fully lawful access — for criminals, terrorists and spies to hide their communications,” Wray said at the RSA security conference. He said he wants tech companies and law enforcement to reach a compromise.
So if you’re a tech company, especially in light of the attack in New Zealand overnight, which the attacker live-streamed, here are the things we seem to be asking of you.
1. Block any content that might be considered fake or part of a radicalization effort.
2. Block violent, racist, or sexist content, and content that violates copyright.
3. Share information with law enforcement when requested.
4. Don’t violate the privacy of any of your users.
5. Keep all the data, but never get hacked.
Sure, good luck with that.
Look, the reality is that if we want tech and social media companies to do a better job of blocking things we find offensive, they will, as a matter of fact, have to start actively monitoring all of their users. Waiting for other users to “report” a live stream of violence is too late; by then it’s out there, and it’s all just whack-a-mole.
But you can’t actively monitor users without violating their privacy, and without making some bad calls along the way. We are already seeing the downside of this, and it’s not just innocent content getting taken down or biased enforcement, either. That much information, once collected, is going to be misused. Somewhere, somehow, people who shouldn’t have access will get it, or people who should have access will use it against someone in ways they shouldn’t.
But if you can’t see it, and you can’t collect it, you also can’t block it. And you can’t share it with law enforcement.
Eventually we’re going to have to make a choice.