
"LIDAR is used on most self-driving cars - Tesla is the exception - and uses laser pulses to measure the physical environment, but is known to struggle with reflective surfaces. Last year a team of eggheads managed to fool LIDAR with tinfoil and colored swatches. The European researchers went one better with a technique they called an Object Removal Attack (ORA) which used mirrors of various sizes to cover a traffic cone."
""We show that by exploiting the physics of specular reflection, an adversary can inject phantom obstacles or erase real ones using only inexpensive mirrors," the researchers wrote in a paper submitted to the journal Computers & Security. "Experiments on a full AV platform, with commercial-grade LIDAR and the Autoware stack, demonstrate that these are practical threats capable of triggering critical safety failures, such as abrupt emergency braking and failure to yield.""
Inexpensive mirrors can manipulate LIDAR perception to create phantom obstacles or hide real ones, producing dangerous AV behavior. In demonstrations in a university parking lot, a LIDAR-equipped car running Autoware either failed to recognize a traffic cone masked by a mirror or braked hard to avoid an object that wasn't there. The researchers developed two attack types: the Object Removal Attack (ORA), which masks obstacles with mirrors, and the Object Addition Attack (OAA), which injects phantom objects with small reflective tiles. By adjusting mirror size and placement, they could fully mask an object or create detections 20 meters away (see the sketch below). The experiments used commercial-grade LIDAR and a full AV stack and produced safety-critical failures.
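To make the mechanism concrete, here is a minimal geometric sketch of why a flat mirror confuses a time-of-flight LIDAR. It is not the researchers' code, and every number and name in it is hypothetical: the point is only that the sensor assumes each echo travelled straight out and back along the emitted beam, so a bounce off a mirror gets recorded at the combined path length along the original beam direction.

```python
import numpy as np

def reflect(direction, normal):
    """Reflect a unit direction vector about a plane with unit normal."""
    return direction - 2.0 * np.dot(direction, normal) * normal

def perceived_point(sensor, beam_dir, mirror_point, scene_hit):
    """Where a time-of-flight LIDAR places a return that bounced off a mirror.

    The sensor assumes the echo travelled straight out and back along
    beam_dir, so it records a point at the total one-way path length
    (sensor -> mirror -> real surface) along the original beam direction.
    """
    d1 = np.linalg.norm(mirror_point - sensor)     # sensor to mirror
    d2 = np.linalg.norm(scene_hit - mirror_point)  # mirror to real surface
    return sensor + (d1 + d2) * beam_dir           # phantom location

# Toy scene (all values hypothetical): sensor at origin, beam along +x,
# a flat mirror 5 m ahead tilted 45 degrees, and the deflected beam
# striking a wall 15 m further on.
sensor = np.array([0.0, 0.0, 0.0])
beam_dir = np.array([1.0, 0.0, 0.0])
mirror_point = np.array([5.0, 0.0, 0.0])
mirror_normal = np.array([-1.0, 1.0, 0.0]) / np.sqrt(2.0)
wall_hit = mirror_point + 15.0 * reflect(beam_dir, mirror_normal)

print(perceived_point(sensor, beam_dir, mirror_point, wall_hit))
# [20.  0.  0.] -- a phantom return 20 m straight ahead of the car,
# even though nothing is physically there.
```

Read the other way round, the same geometry suggests how the removal attack works: pulses that would have returned from the cone are deflected elsewhere by the mirror covering it, so the cone's points never appear in the point cloud at their true location.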
Read at The Register