Elite hackers from China have found a way to trick a Tesla Model S into going into the wrong lane by strategically placing some simple stickers on the road.
Keen Labs, widely regarded as one of the most technically ingenious cybersecurity research groups in the world, developed two kinds of attack to mess with the Tesla autopilot’s lane-recognition tech.
First, the researchers tried altering lane markings by adding a large number of patches to the line to make it appear blurred. It worked, but the patches looked much too conspicuous, so the Keen hackers decided it would be too difficult to carry out in the real world.
So the researchers tried to create a “fake lane.” They discovered that Tesla’s autopilot would detect a lane where there were just three inconspicuous tiny squares strategically placed on the road. By leaving small stickers at an intersection, the hackers believed they could trick the Tesla into reading the patches as the continuation of the right lane. On a test track, their theory proved correct: the autopilot followed the fake markings and took the car into the real, oncoming left lane.
“Our experiments proved that this architecture has security risks and reverse-lane recognition is one of the necessary functions for autonomous driving in non-closed roads,” Keen Labs wrote in a paper. “In the scene we build, if the vehicle knows that the fake lane is pointing to the reverse lane, it should ignore this fake lane and then it could avoid a traffic accident.”
In other attacks, the Keen crew claimed the ability to remotely control the steering wheel and turn on the windscreen wipers. In the former, via a complex series of steps that broke through some of the security barriers around the onboard network, Keen found a way to control the steering wheel with a gamepad, though the researchers were in the vehicle at the time. While that initially sounds serious, the attack didn’t work when a car had been shifted manually from reverse to drive and was travelling above 8km/h. When the car was in cruise control, however, the attack worked “without limitations.”
As for the windscreen-wiper hack, it would be tricky, in a real-world scenario, to deploy the specially crafted image that fooled the Tesla into believing it was raining. But the fake lane would be easy to recreate with cheap materials, Keen Labs said.
Tesla hadn’t responded to a request for comment at the time of publication.
It’s not the first time Keen Labs has exposed potential problems in the safety and security of Tesla’s digital systems. Back in 2016, the hackers discovered a way to remotely take control of a Tesla’s brakes.
In March, during the CanSecWest security conference in Canada, prizes totalling more than $900,000 were on offer to anyone who could hack a Tesla. Only one team demonstrated a successful exploit: a hack of the onboard browser that let researchers Richard Zhu and Amat Cama display their own messages on the infotainment system. They walked off with $35,000 and the car. None of the car’s control systems were commandeered, however.
UPDATE: A Tesla spokesperson told Forbes the company had addressed the vulnerabilities regarding remote control of the steering wheel before the Keen researchers had even been in touch. As for the other issues, the spokesperson added: “The rest of the findings are all based on scenarios in which the physical environment around the vehicle is artificially altered to make the automatic windshield wipers or Autopilot system behave differently, which is not a realistic concern given that a driver can easily override Autopilot at any time by using the steering wheel or brakes and should always be prepared to do so, and can manually operate the windshield wiper settings at all times.”