#SpookyNews: Hackers can make your Tesla speed up suddenly
All 'connected' devices come with the risk of hacking. We have a number of cases to back up this claim, but the latest one is raising major alarms. A group of researchers has shown that autonomous cars made by Tesla, a leading EV-maker, can be fooled into speeding up even when it is neither needed nor permitted. Here are more details.
Tesla cars 'see' and drive using a combination of cameras, radar, and AI running in real time. The cameras and radar collect data from all around the vehicle, while the AI, powered by a neural network chip, processes this information, allowing the vehicle to take appropriate action depending on the situation on the road. There is another chip on board for backup.
As Tesla has built several security features to prevent the camera and radar systems of its cars from being hijacked, the researchers, who hailed from security firm McAfee, took a different approach: attacking what the cameras see. They manipulated a speed limit sign on the side of a road and managed to trick multiple vehicles from the EV maker into speeding up by 50 mph (80 km/hr).
The researchers stuck a tiny, hard-to-notice sticker on a 35 mph (56 km/hr) speed limit sign to make it look like 85 mph (136 km/hr). Their manipulation was so subtle that the driver couldn't see any change while passing by, but the car's MobilEye EyeQ3 camera system picked up 85 mph instead of 35 mph - and accelerated by 50 mph (80 km/hr).
The EyeQ3 system has been specifically designed to read speed limit signs and feed them into autonomous driving features built into Tesla cars, including automatic cruise control. This is exactly why the readings taken by the camera were able to make 2016 Tesla Model X and Model S vehicles speed up even when it was not needed.
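To see why a single misread sign is so dangerous, consider a minimal sketch of the pipeline described above. This is purely hypothetical logic, not Tesla's or MobilEye's actual code: the function name, parameters, and the 20 mph threshold are all made up for illustration. It shows how a cruise controller that trusts the camera outright accepts the 35-to-85 misread, while a simple plausibility check against an independent source (such as map data) would reject it.

```python
# Hypothetical sketch of cruise-control target selection; illustrative
# only, not Tesla's actual implementation.
MAX_PLAUSIBLE_JUMP_MPH = 20   # made-up threshold for illustration


def update_cruise_target(current_target_mph: int,
                         camera_reading_mph: int,
                         map_limit_mph: int) -> int:
    """Return the new cruise-control target speed in mph.

    A naive controller would return camera_reading_mph unconditionally,
    which is the behavior the article describes. Here, a reading that
    disagrees wildly with an independent map-based limit is rejected,
    and the previous target is kept instead.
    """
    if abs(camera_reading_mph - map_limit_mph) > MAX_PLAUSIBLE_JUMP_MPH:
        return current_target_mph   # implausible reading: keep prior target
    return camera_reading_mph


# The attack scenario: map data says 35, but the stickered sign reads 85.
print(update_cruise_target(35, 85, 35))   # -> 35 (misread rejected)
# A plausible update, e.g. a genuine 45 mph zone, still goes through.
print(update_cruise_target(35, 45, 35))   # -> 45
```

The point of the sketch is the cross-check: a perception reading that feeds directly into acceleration gives a single sensor veto power over the car's speed.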
The work done by McAfee shows how easily the machine learning algorithms of self-driving cars can be fooled into going haywire. Theoretically, this trick could be used to fool an autonomous car into accelerating in a slow-moving lane, resulting in major accidents. Or, in the opposite scenario, a vehicle could be tricked into slowing down or stopping on a fast-moving highway, again causing accidents.
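The sticker trick is a physical-world version of what the research literature calls an adversarial example: a small, targeted change to an input that flips a model's prediction. The toy below is a deliberately simplified stand-in (a linear classifier, nothing like the real EyeQ3 network) showing the core idea: for a known model, nudging each input element slightly in the direction of the score's gradient is enough to change "35 mph" into "85 mph".

```python
import numpy as np

# Toy stand-in for a speed-sign classifier; purely illustrative.
# Score = w @ x: positive means the model reads "85 mph", negative "35 mph".
w = np.linspace(-1.0, 1.0, 256)   # hypothetical learned weights


def read_sign(x: np.ndarray) -> str:
    return "85 mph" if w @ x > 0 else "35 mph"


# A "clean" sign image (pixels in [0, 1]) that reads correctly as 35 mph.
x_clean = 0.5 - 0.2 * np.sign(w)

# Gradient-sign attack: for a linear model, the gradient of the score with
# respect to the input is just w, so shifting each pixel by eps in the
# direction sign(w) raises the score the most per unit of perturbation --
# the digital analogue of a small, well-placed sticker.
eps = 0.25
x_adv = np.clip(x_clean + eps * np.sign(w), 0.0, 1.0)

print(read_sign(x_clean))   # -> 35 mph
print(read_sign(x_adv))     # -> 85 mph (no pixel moved by more than eps)
```

Real attacks on deep networks work the same way in spirit, but must also survive printing, lighting, and viewing angle, which is what made the McAfee sticker notable.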
That said, it is worth noting that this is not the first attack on Tesla's AI system. Just last year, a similar sticker-based trick was used to fool the company's algorithms and veer a car into the wrong lane in traffic.
The dangerous loophole was disclosed to Tesla and MobilEye last year, but it's not clear if a patch has been issued. When the folks at MIT Technology Review reached out for comment, Tesla declined to say anything, though it has indicated that the issue won't be fixed on that generation of hardware. Meanwhile, MobilEye said that even a human could be fooled this way.