
Ruh Roh! Autonomous Car Crashes Could be Caused by Phantom Images

Release Date: 2020-03-25

MARCH 24, 2020, 7:00 AM

Life imitates art. An autonomous-driving research group has found a way to fool self-driving cars into reacting to images projected onto roads or other objects in the line of sight of a car’s cameras. Isn’t there an episode of Scooby Doo where the dastardly villain does something very similar? And he would have gotten away with it, if not for those meddling kids.

  • A research team has been able to trick Tesla’s Autopilot by projecting images in the environment around the car at night.

  • A projected picture of a person caused the car to brake, projected lane markings caused it to drive into the oncoming lane, and projected speed limit signs altered its speed.

  • The published paper suggests hackers could use these techniques to cause everything from traffic jams to car crashes.




Tesla Autopilot deceived by a projected image

This image projected on the ground was enough to make a Tesla brake while being driven autonomously. I would argue that the majority of human drivers would do the same. (Photo: YouTube screenshot, video below)

A new research paper on vulnerabilities of autonomous cars has been published through the International Association for Cryptologic Research (IACR) Cryptology ePrint Archive. The paper presents the results of intensive testing by researchers at Georgia Tech and Ben-Gurion University of the Negev in Israel, using a Tesla Model X and a light projector.

The testing procedure involved driving the car autonomously at night while researchers projected realistic-looking objects onto the road, or onto buildings and trees around the car. Systems like Tesla’s Autopilot rely heavily on cameras, rather than lidar, to guide the vehicle. In the first test, an image of a person was projected in the center of the road; ironically, the team chose an image of Elon Musk. The Autopilot system applied the brakes and slowed the car to 15 mph, but still drove over the image.
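To see why a camera-based pipeline reacts this way, consider the minimal sketch below. It is not Tesla’s actual perception stack: it simply runs an off-the-shelf, pre-trained object detector (torchvision’s Faster R-CNN) on a single dashcam-style frame, and the frame file name is hypothetical. Because the detector reasons only about pixels, a bright, person-shaped projection on the asphalt produces a “person” detection much as a real pedestrian would.

```python
# A minimal sketch, NOT Tesla's perception stack: an off-the-shelf,
# camera-only object detector applied to one night-time frame.
# A person-shaped projection on the road produces the same pixel evidence
# as a real pedestrian, so the detector reports "person" either way --
# nothing in this pipeline checks depth, heat, or radar returns.
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

# Pre-trained COCO detector; COCO class index 1 is "person".
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(pretrained=True)
model.eval()

def detect_people(frame_path, score_threshold=0.7):
    """Return bounding boxes the detector believes are people."""
    frame = to_tensor(Image.open(frame_path).convert("RGB"))
    with torch.no_grad():
        result = model([frame])[0]
    return [
        box.tolist()
        for box, label, score in zip(result["boxes"], result["labels"], result["scores"])
        if label.item() == 1 and score.item() >= score_threshold
    ]

# Hypothetical file name: a frame containing only a projected image of a
# person yields "person" boxes just as a frame with a real pedestrian would.
print(detect_people("night_frame_with_projected_person.jpg"))
```

Whatever the downstream braking logic decides, the point is that the detection stage alone cannot tell light bounced off a person from light projected onto the road.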

Tesla following phantom lane markings


This test, involving the projection of phantom lane markings to make the car steer into the oncoming lane, is straight from the Wile E. Coyote playbook. I suspect most drivers are as smart as a roadrunner. (Photo: YouTube screen capture, video below)

In another test, speed limit signs were projected onto buildings, billboards and trees, which altered the car’s speed. Later, the team projected lane markings on the road that caused the car to turn into the oncoming lane. This is probably the most concerning of the tests, and the one that represents the biggest threat to an autonomously piloted vehicle.
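The lane-marking result is easy to reproduce even with classical computer vision. The sketch below is not any vendor’s lane-keeping code; it is a textbook Canny-edge plus Hough-transform lane detector built on OpenCV, and the frame file name is hypothetical. Light projected onto dark asphalt creates exactly the high-contrast straight edges such a detector looks for, so phantom markings are picked up just as readily as painted ones.

```python
# A minimal sketch, not any vendor's lane-keeping code: a classical
# Canny-edge + Hough-transform lane detector. Bright lines projected onto
# the asphalt create the same high-contrast edges as painted markings,
# so this kind of detector follows them just as readily.
import cv2
import numpy as np

def detect_lane_lines(frame_path):
    """Return straight segments that look like lane markings in one frame."""
    frame = cv2.imread(frame_path)
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    edges = cv2.Canny(blurred, 50, 150)

    # Keep only the lower half of the image, where the road surface is.
    height, width = edges.shape
    mask = np.zeros_like(edges)
    mask[height // 2:, :] = 255
    road_edges = cv2.bitwise_and(edges, mask)

    # The Hough transform finds straight segments -- painted or projected alike.
    lines = cv2.HoughLinesP(road_edges, rho=1, theta=np.pi / 180,
                            threshold=50, minLineLength=40, maxLineGap=20)
    return [] if lines is None else [seg[0].tolist() for seg in lines]

# Hypothetical file name: phantom markings projected toward the oncoming lane
# yield segments a naive lane-keeping controller would steer along.
print(detect_lane_lines("night_frame_with_phantom_lanes.jpg"))
```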

The group claims that hackers could use any of these techniques to cause everything from traffic jams to car accidents. Although most of the testing was performed with a stationary projector positioned at the side of the road, the researchers suggest that hackers could strap a projector to a drone. This would allow the “attack” to take place remotely, and the hackers could remain anonymous and leave no trace at the crime scene. There would be no clues to lead Shaggy and Scooby to Old Man Witherspoon’s hideout.

WHY THIS MATTERS

Is this a case where we expect too much from autonomous and ADAS systems? If human drivers suddenly saw an image of a person projected in front of them, many would likely brake as well. False lane markings and speed limit signs probably wouldn’t fool an attentive human in context, but they might fool AI systems. This is another caution light against over-reliance on technology to deliver flawless safety.

Source

Link to PDF of the research paper


Source: https://ride.tech/self-driving/jinkies-autonomous-car-crashes-could-be-caused-by-phantom-images/
