
Sunday, July 02, 2017

LiDAR Spoofing Proposed and Tested

The Register reports on a KAIST paper on automotive LiDAR spoofing, published in the International Association for Cryptologic Research's pre-print archive: "Illusion and Dazzle: Adversarial Optical Channel Exploits against Lidars for Automotive Applications" by Hocheol Shin, Dohyun Kim, Yujin Kwon, and Yongdae Kim.

"In this work, we have presented and experimentally verified two types of attacks that can severely degrade the reliability of lidars. Although we have listed many mitigative approaches in the discussion, they are either technically/economically infeasible or are not definitive solutions to the presented attacks. We do not advocate the complete abandonment of the transition toward autonomous driving, because we believe that its advantages can outweigh the disadvantages, if realistic adversarial scenarios are appropriately mitigated. However, such considerations are currently absent; therefore, automakers and device manufacturers need to start considering these future threats before too late."

3 comments:

  1. One can already attack a car by blinding the driver with a strong light.

    1. The word "attack" in this case doesn't mean brute-force blinding with a powerful laser. It means surreptitiously altering an object's apparent location or size (from enormous to microscopic), creating false targets (or making other vehicles invisible), tricking collision avoidance into causing an accident, moving buildings or the roadway, altering apparent elevation from hill to chasm to trigger acceleration or braking, and so on.

  2. This is why there should be (and eventually will need to be) a network of sensors spanning different modalities.

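The last comment points at the standard mitigation: corroborate lidar detections against an independent modality. A minimal sketch of such a cross-check follows, assuming a hypothetical radar track list and arbitrary agreement thresholds; it illustrates the idea rather than a production sensor-fusion algorithm.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Detection:
    range_m: float      # distance to the detected object
    bearing_deg: float  # angle off the vehicle's forward axis

def corroborated(lidar_det: Detection, radar_dets: List[Detection],
                 max_range_gap_m: float = 2.0,
                 max_bearing_gap_deg: float = 5.0) -> bool:
    """True if at least one radar detection roughly agrees with the lidar one."""
    return any(
        abs(lidar_det.range_m - r.range_m) <= max_range_gap_m
        and abs(lidar_det.bearing_deg - r.bearing_deg) <= max_bearing_gap_deg
        for r in radar_dets
    )

if __name__ == "__main__":
    radar = [Detection(range_m=30.2, bearing_deg=0.5)]   # radar sees the real car
    lidar = [Detection(range_m=30.0, bearing_deg=0.0),   # real car ahead
             Detection(range_m=2.0, bearing_deg=0.0)]    # injected phantom return
    for det in lidar:
        status = "corroborated" if corroborated(det, radar) else "suspect"
        print(f"lidar object at {det.range_m:5.1f} m, {det.bearing_deg:+.1f} deg -> {status}")
```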
