A security researcher used a homemade $60 system to outsmart self-driving car lidar sensors that cost thousands. He was able to trick an autonomous vehicle into slowing down and even launched a denial of service attack on a self-driving car’s tracking system so that it came to a complete stop.
Lidar, a remote sensing technology, is most commonly known as the circular “eye” mounted on the roof of most self-driving cars; it works somewhat like radar, with spinning lasers that scan the surroundings and detect objects. Lidar devices come in a range of sizes and prices, and the lidar (Light Detection and Ranging) market is estimated to reach $1 billion by 2020. The technology isn’t used exclusively for driverless cars, as seen in recent news about autonomous golf carts and surveying drones. Yet Jonathan Petit, a principal scientist at Security Innovation, believes lidar sensors are “the most susceptible technologies” in self-driving vehicles.
At Black Hat Europe in November, Petit will present “Self-driving cars: Don’t trust your sensors!” “Automated vehicles are equipped with multiple sensors (LiDAR, radar, camera, etc.) enabling local awareness of their surroundings. A fully automated vehicle will solely rely on its sensors readings to make short-term (i.e. safety-related) and long-term (i.e. planning) driving decisions,” he wrote. His Black Hat presentation will focus on “remote attacks on camera-based system and LiDAR using commodity hardware. Results from laboratory experiments show effective blinding, jamming, replay, relay, and spoofing attacks. We propose software and hardware countermeasures that improve sensors resilience against these attacks.”
“I can take echoes of a fake car and put them at any location I want. And I can do the same with a pedestrian or a wall,” he told IEEE Spectrum. Petit recorded pulses from a commercial IBEO Lux; the lidar device, advertised for use “in urban traffic and on the motorway,” has sensors that “work accurately and reliably even at high speeds, in poor weather conditions and heavy traffic.” Yet since the lidar pulses were neither encrypted nor encoded, Petit could simply replay the recorded pulses.
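The weakness follows from the pulse format: a return is just light at the right wavelength arriving in the right window, with nothing binding an echo to the specific pulse the sensor emitted. The contrast can be sketched with a toy model (illustrative only, hypothetical class names, not the IBEO firmware or Petit’s actual tooling):

```python
# Toy model: a lidar whose pulses carry no per-pulse code accepts any
# replayed echo, while one that tags each outgoing pulse can reject
# stale replays -- one flavor of the "encoding" countermeasure.

class NaiveLidar:
    def emit(self):
        return "PULSE"                          # every pulse is identical

    def accept(self, echo):
        return echo == "PULSE"                  # any replay passes

class TaggedLidar:
    def __init__(self):
        self.counter = 0

    def emit(self):
        self.counter += 1                       # unique per-pulse tag
        return ("PULSE", self.counter)

    def accept(self, echo):
        # Only an echo of the most recent pulse is accepted.
        return echo == ("PULSE", self.counter)

naive, tagged = NaiveLidar(), TaggedLidar()

recorded = naive.emit()          # attacker records one pulse...
naive.emit()                     # ...the sensor keeps pulsing...
print(naive.accept(recorded))    # ...yet the old replay is accepted: True

recorded = tagged.emit()         # record a tagged pulse
tagged.emit()                    # sensor has moved on to a new tag
print(tagged.accept(recorded))   # stale replay rejected: False
```

A real countermeasure would randomize the tag (a predictable counter is trivially forged), but the sketch shows why unencoded pulses make record-and-replay so cheap.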
He compared his $60 attack device to a “laser pointer,” before adding, “You don’t need the pulse generator when you do the attack. You can easily do it with a Raspberry Pi or an Arduino. It’s really off the shelf.”
The timing of the attack is the “only tricky part,” as the spoofed pulses must fool the lidar into sensing an object that isn’t really there. The attack could slow an autonomous car, or spoof so many signals that it forces the vehicle to stop “for fear of hitting phantom obstacles.”
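Why timing is the tricky part comes down to time-of-flight arithmetic: the sensor converts the round-trip time of its own pulse into a distance, so a replayed pulse delayed by the right amount places a phantom object at any chosen range. A minimal sketch of the calculation (illustrative only, not Petit’s tooling):

```python
C = 299_792_458.0  # speed of light, m/s

def round_trip_delay(distance_m: float) -> float:
    """Delay after the sensor fires at which an echo from an object
    `distance_m` away would arrive (lidar ranging: distance = c * t / 2)."""
    return 2.0 * distance_m / C

# Placing a phantom car 50 m ahead means replaying a pulse roughly
# 333.6 ns after the sensor fires -- hitting that nanosecond-scale
# window reliably is the attacker's real work.
print(f"{round_trip_delay(50.0) * 1e9:.1f} ns")
```

The nanosecond scale also explains why commodity hardware suffices: the attacker needs precise delay, not high power or a precisely aimed beam.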
IEEE Spectrum added:
Petit was able to create the illusion of a fake car, wall, or pedestrian anywhere from 20 to 350 meters from the lidar unit, and make multiple copies of the simulated obstacles, and even make them move. “I can spoof thousands of objects and basically carry out a denial of service attack on the tracking system so it’s not able to track real objects,” he says. Petit’s attack worked at distances up to 100 meters, in front, to the side or even behind the lidar being attacked and did not require him to target the lidar precisely with a narrow beam.
Petit previously co-authored “Revisiting attacker model for smart vehicles,” which identified “two principal threats to solo robot cars: blinding cameras or inserting fake video into vision systems, and jamming or spoofing GPS signals.” Petit told SAE International: “Cameras are mobile eyes. They’re hacked easily. You could feed a system’s recorded images or mess up the cameras by playing with the brightness using something as simple as a laser pointer. And it doesn’t take a great amount of resources to do GPS spoofing.” Medium-level risks to self-driving cars included “electromagnetic pulses (EMPs) that could shut down the electronics altogether or environmental confusion inflicted on radar and lidar scanners.”
Google, Apple, Audi, Ford, Lexus, Mercedes and Toyota are but a few of the companies that use lidar sensing systems in their self-driving vehicle prototypes. Petit said his attack has been demonstrated against only one specific lidar unit, but he believes car manufacturers haven’t considered his attack scenario yet; he called lidar systems “ripe for attack” and hopes this research acts as a “wake-up call.”
Source: $60 device spoofs phantom objects and tricks self-driving cars into stopping | Network World