Just Once I’d Like To See a Tech Company Not Release New Toys Before Realizing the Obvious Risks

I think we all knew this would happen, right? Thousands scammed by AI voices mimicking loved ones in emergencies. From the description of what happened: “Tech advancements seemingly make it easier to prey on people’s worst fears and spook victims who told the Post they felt ‘visceral horror’ hearing what sounded like direct pleas from friends…”

Linked: Dutch MPs in video conference with deep fake imitation of Navalny’s Chief of Staff

As I think about this, it occurs to me that many of the tells we’d expect to give away a deep fake video are things that happen all the time on Zoom or Teams calls anyway: video that’s a little slow or jerky, or that doesn’t keep up fluidly with the movement of the people on screen. That makes it much harder to notice that the “person” on the call with you isn’t really who you think it is. And once you do realize it, you’re left wondering who it actually was, and what information they walked away with by pretending to be someone else.

Are we ready for that?