
Researchers have used off-the-shelf tools to trick the autopilot sensors on a Tesla Model S, demonstrating that it’s simple to blind the car so it doesn’t see obstacles in its path.

The research comes from a group of scientists from the University of South Carolina, China’s Zhejiang University and the Chinese security firm Qihoo 360.

They gave details in a talk last week at the Defcon hacker conference in Las Vegas, and on Thursday they published this paper on their work.

As they say in the paper, unlike traditional network security, autonomous vehicles rely heavily on sensory input about their surroundings to make driving decisions, which makes sensors an obvious place for attackers to focus their efforts.

Their work was done in a lab as well as outdoors on a Tesla Model S. It’s not time to worry about rampant carnage, though: both Tesla and the researchers agree that we won’t see any crashes caused by signal tampering any time soon.

This is not the first time that Tesla’s autopilot has been in the news. In May, 40-year-old Joshua Brown lost his life when a tractor-trailer turned in front of his Tesla Model S, which was driving under autonomous cruise control.

According to a preliminary report from the National Transportation Safety Board, Brown was using the car’s automatic emergency braking and lane-keeping features at the time of the crash: features that are part of the Tesla S autopilot system.

As of late July, Tesla engineers had two theories about the cause of the crash.

The first theory: the car’s camera and radar failed to detect the truck as it turned left from the oncoming lane across the divided highway where Brown was driving, because the white trailer was hard to distinguish from the bright sky behind it.

The other theory: the cameras didn’t see the rig, and the car’s computer dismissed the radar signal as a false positive, possibly mistaking the trailer for an overpass or a road sign.

Whichever theory is correct, both point to the same conclusion: the sensors can get it wrong. Brown’s car didn’t brake.

The sensors that Tesla’s Model S autopilot relies on include millimeter-wave radar and ultrasonic sensors, which measure the echoes of signals reflected by obstacles, along with forward-looking cameras.

The Chinese researchers set out to see how easy it would be to fool those sensors using spoofing – sending a device carefully crafted signals that mimic real ones – and jamming – flooding it with signals that drown out the genuine ones.
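To see why spoofing works, consider how ultrasonic ranging operates: the sensor emits a pulse and converts the echo’s round-trip time into a distance. An attacker who answers with a well-timed pulse of their own can make any distance appear. Here is a minimal sketch of that arithmetic (the 343 m/s speed of sound is a standard value; the distances are picked purely for illustration, not taken from the paper):

```cpp
#include <cstdio>

// Ultrasonic ranging: the sensor emits a pulse and converts the echo's
// round-trip time into a distance via d = v_sound * t / 2.  A spoofer
// that answers after a chosen delay makes the sensor report whatever
// distance the attacker wants.
int main() {
    const double v_sound = 343.0;  // m/s in air at ~20 °C

    // Delay a spoofer must wait before replying to fake an obstacle
    // at 0.5 m (distance chosen purely for illustration).
    double fake_distance_m = 0.5;
    double reply_delay_us = 2.0 * fake_distance_m / v_sound * 1e6;
    std::printf("Fake obstacle at %.2f m -> reply after %.0f us\n",
                fake_distance_m, reply_delay_us);

    // And the inverse: the distance reported for a measured round trip.
    double round_trip_us = 2915.0;
    std::printf("Echo after %.0f us -> reported distance %.2f m\n",
                round_trip_us, v_sound * round_trip_us * 1e-6 / 2.0);
    return 0;
}
```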

They experimented on all three types of sensor, but they found that only the radar attacks had the potential to cause a high-speed crash.

The researchers built a DIY ultrasonic jammer based on an Arduino board. The Tesla Model S typically uses its ultrasonic sensors to detect nearby objects, for example when the car is parking itself.
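The paper doesn’t reproduce the researchers’ firmware, but the principle behind such a jammer is simple: drive an ultrasonic transducer continuously at roughly the sensors’ operating frequency, so genuine echoes are buried under the constant tone. A hypothetical Arduino sketch along those lines (the pin number and the 40 kHz figure, a common parking-sensor band, are illustrative assumptions, not details from the paper):

```cpp
// Hypothetical sketch of a continuous-tone ultrasonic jammer on an
// Arduino, in the spirit of the device described in the paper.  An
// ultrasonic transducer is assumed to be wired, through a suitable
// driver stage, to digital pin 9.
const int TRANSDUCER_PIN = 9;            // assumption: any timer-backed pin
const unsigned int JAM_FREQ_HZ = 40000;  // common parking-sensor band (assumption)

void setup() {
  pinMode(TRANSDUCER_PIN, OUTPUT);
  // Emit a continuous square wave at the jamming frequency.  Real
  // echoes arrive far weaker than this constant tone, so the sensor's
  // receiver can no longer pick them out of the noise.
  tone(TRANSDUCER_PIN, JAM_FREQ_HZ);
}

void loop() {
  // Nothing to do: tone() keeps running in the background until noTone().
}
```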

These are the attacks the researchers found they could carry out:

  • Jamming attacks can prevent the ultrasonic sensors from detecting objects and so cause collisions: in self-parking and Summon mode, the Tesla Model S will ignore obstacles and crash into them during a jamming attack.
  • Spoofing attacks can manipulate the sensor measurements and cause the car to register a pseudo-obstacle that isn’t really there.
  • Acoustic cancellation is possible in theory, though it requires sophisticated hardware and algorithms.

This video shows their equipment, sitting on a cart, being detected as another vehicle: it registers on the car’s screen as a blue car icon.

When the radio interference is switched on, the radio waves bouncing from the cart back to the Tesla are drowned out: you can see, in the video below, how the blue car icon disappears from the screen, meaning that the car’s autopilot has been blinded to the obstacle in its path.

[embedded content]

Wired quotes University of South Carolina computer science professor Wenyuan Xu:

It’s like a train has gone by and it’s loud enough to suppress our conversation.
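That analogy has real physics behind it. A radar echo has to travel out to the target and back, so its strength falls off with the fourth power of range, while a jammer’s signal makes only the one-way trip and falls off with the square of range: beyond some distance, even a weak jammer drowns out the echo. A back-of-the-envelope comparison (every power, gain and cross-section figure below is invented purely for illustration):

```cpp
#include <cmath>
#include <cstdio>

// Rough illustration of why jamming a radar is easy: the radar's own
// echo obeys the radar range equation, falling off as 1/R^4, while the
// jammer's signal only suffers one-way 1/R^2 spreading.
int main() {
    const double pi = 3.141592653589793;
    const double Pt = 1.0;        // radar transmit power, W (invented)
    const double G = 100.0;       // antenna gain, linear (invented)
    const double lambda = 0.0039; // wavelength of a ~77 GHz automotive radar, m
    const double sigma = 1.0;     // target radar cross-section, m^2 (invented)
    const double Pj = 0.1;        // jammer power toward the radar, W (invented)

    const double ranges_m[] = {5.0, 20.0, 50.0};
    for (double R : ranges_m) {
        // Radar range equation: echo power back at the receiver (~1/R^4).
        double echo = Pt * G * G * lambda * lambda * sigma /
                      (std::pow(4.0 * pi, 3.0) * std::pow(R, 4.0));
        // One-way spreading for the jammer's signal (~1/R^2).
        double jam = Pj * G * lambda * lambda /
                     std::pow(4.0 * pi * R, 2.0);
        std::printf("R = %4.0f m: echo %.2e W, jammer %.2e W, J/S = %.1f\n",
                    R, echo, jam, jam / echo);
    }
    return 0;
}
```

The crossover point moves with the numbers chosen, but the shape of the curves does not: the farther away the obstacle, the easier its echo is to mask.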

The worst-case scenario would be that, while the car is in self-driving mode and relying on the radar, the radar is jammed and fails to detect an obstacle ahead of it.

Tesla has downplayed the results, saying it “[hasn’t] been able to reproduce any real-world cases that pose risk to Tesla drivers.”