Fooling your car's GPS system into believing you are somewhere else has, apparently, already been done:
“In December, CNN confirmed that instead of showing the cars where they really were — cruising along the Moskva River — the GPS suddenly insisted the cars were 20 miles away, at the Vnukovo International Airport.”
What’s next? How about sending your autonomous car fake information?
“The self-driving car doesn’t have ESP,” Humphreys says. “It gets information from its sensors. It determines its location from its sensors — whether there’s a crash coming up ahead, whether the light is green or red.” Right now, a hacker could still send a confusing signal to a car, drowning out the real data coming from the satellites and making the car appear to be somewhere else. But drivers are still in control, and they typically know where they are, regardless of what the GPS says.
In an autonomous car, however, if the operating systems are fed bad data, the car acts on that data itself, allowing a hacker to remotely send the vehicle off the road or steer it down a different route.
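One plausible line of defense against this kind of spoofing is a sanity check on incoming position fixes: a car that was just on the Moskva River cannot physically be at Vnukovo airport one second later. Here is a minimal, hypothetical sketch of such a check — the function names, coordinates, and the 70 m/s speed threshold are all illustrative assumptions, not anything from the article:

```python
import math

# Hypothetical sketch: reject a new GPS fix if it implies an impossible
# jump in position given the time elapsed since the previous fix.
EARTH_RADIUS_M = 6_371_000
MAX_SPEED_MPS = 70  # ~155 mph; assumed ceiling for a road vehicle

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def fix_is_plausible(prev_fix, new_fix):
    """Each fix is (timestamp_s, lat, lon); flag physically impossible jumps."""
    dt = new_fix[0] - prev_fix[0]
    if dt <= 0:
        return False  # non-monotonic timestamps are themselves suspicious
    dist = haversine_m(prev_fix[1], prev_fix[2], new_fix[1], new_fix[2])
    return dist / dt <= MAX_SPEED_MPS

# Cruising near the Moskva River (illustrative coordinates)...
prev = (0.0, 55.7447, 37.6086)
# ...then suddenly "teleported" ~20 miles to Vnukovo one second later.
spoofed = (1.0, 55.5915, 37.2615)
print(fix_is_plausible(prev, spoofed))  # prints False
```

A real system would fuse GPS with wheel odometry and inertial sensors rather than rely on timestamps alone, but the principle is the same: cross-check the claimed position against what the vehicle's own motion makes possible.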
Hopefully, by the time these fully autonomous cars reach the market, the companies building them will have taken the steps necessary to secure them from outside interference.
On the other hand, given the way IoT devices were released with almost no thought given to security, we should hold their feet to the fire on this — and be glad there are already people like Humphreys working on it.