Monday, April 23, 2018

LiDAR Patents Review

EETimes publishes Junko Yoshida's article "Who’s the Lidar IP Leader?" A few quotes:

"Pierre Cambou, activity leader for imaging and sensors at market-research firm Yole Développement (Lyon, France), said he can’t imagine a robotic vehicle without lidars.

Qualcomm, LG Innotek, Ricoh and Texas Instruments... contributions are “reducing the size of lidars” and “increasing the speed with high pulse rate” by using non-scanning technologies. Quanergy, Velodyne, Luminar and LeddarTech... focus on highly specific patented technology that leads to product assertion and its application. Active in the IP landscape are Google, Waymo, Uber, Zoox and Faraday Future. Chinese giants such as Baidu and Chery also have lidar IPs.

Notable is the emergence of lidar IP players in China. They include LeiShen, Robosense, Hesai, Bowei Sensor Tech.


  1. I am wondering how effective LIDAR is in fog, rain, and snow, including standing water and snow on the roadway. RADAR seems like it would be more effective. Does anyone have a good handle on the state of the art in LIDAR for various weather conditions? It seems like a fused multi-mode system is needed.

    1. Not all LiDARs are equal in that respect:

      - Scanning LiDARs emitting a narrow laser beam are quite sensitive to rain. Each raindrop acts like a small prism that diverts the beam in a random direction, so the reflection comes back from a random direction too. The resulting picture is quite garbled. Flash LiDARs, which illuminate the whole scene at once, are less sensitive to this.

      - Scanning LiDARs emit a relatively low-power beam and rely on the fact that it remains narrow. Fog scatters the light, so the beam power density drops much faster with distance. In a flash LiDAR emitting into, say, a 90 or 120 deg angle, fog or smog scattering is somewhat less of an issue.

      - SPAD detectors have a dynamic range limitation. Once they get a strong reflection from a nearby raindrop or snowflake, they basically shut down and take a long time to recover. APD- and PD-based LiDARs are less sensitive to that.

      - CW coherent-detection LiDARs have difficulties with multipath reflections, and raindrops and snowflakes create a lot of them.

      There are more differences depending on the technology used. Some time ago I prepared a couple of slides comparing them. If there is interest, I can post them here.
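To put a rough number on the fog point above, here is a small Python sketch of round-trip attenuation using the Beer-Lambert law. The extinction coefficients are assumed, order-of-magnitude illustration values, not measurements, and geometric 1/R^2 spreading is deliberately left out to isolate the fog effect:

```python
import math

def surviving_fraction(range_m: float, alpha_per_m: float) -> float:
    """Fraction of emitted power surviving the round trip to a target
    at range_m, per the Beer-Lambert law: exp(-2 * alpha * R).
    Geometric 1/R^2 spreading is ignored to isolate the fog effect."""
    return math.exp(-2.0 * alpha_per_m * range_m)

# Assumed, order-of-magnitude extinction coefficients:
# clear air ~1e-4 1/m; heavy fog ~0.03 1/m (visibility around 100 m).
for label, alpha in [("clear air", 1e-4), ("heavy fog", 0.03)]:
    for r_m in (10, 20, 100):
        frac = surviving_fraction(r_m, alpha)
        print(f"{label:9s} R = {r_m:3d} m: surviving fraction = {frac:.3g}")
```

Under these assumed coefficients the clear-air loss is negligible even at 100 m, while in heavy fog only a fraction of a percent of the power survives a 100 m round trip, which is why the usable range collapses.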

    2. Thanks Vlad. From what you said, I don't understand why people are so focused on LIDAR. Maybe they are just doing their research in dry desert cities.

    3. Well, automotive radars have a lot of limitations too. TI presented its latest-generation reference design at one of the forums at ISSCC. If my memory serves me, it has an SNR of 10 at 20 m distance with 5 ms averaging time. SNR = 10 is about the minimum needed to achieve a false alarm rate of 1e-5 and a detection probability of 0.99999, the typical automotive requirements.

      BTW, TI and NXP are the leaders in automotive radars.

      In order to get a longer range, they need to increase the averaging time at the price of motion blur - not an easy trade-off.

      In order to get a reasonable angular resolution, radars work in the 77 or 79 GHz band. They are then quite limited in the power they can transmit at these frequencies, and the Rx noise is not as good as at lower frequencies.

      On the other hand, one does not expect to drive at 100 km/h in heavy rain or fog. What is a reasonable speed then? 10 km/h? At least, humans can drive no faster. Then the LiDAR range requirement can be reduced to 20 m, or maybe even 10 m. Some of the LiDARs can see that far in bad weather.
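As a sanity check on the "SNR of 10" rule of thumb above, the required amplitude SNR for a simple threshold detector in Gaussian noise is Q^-1(Pfa) + Q^-1(1 - Pd), where Q^-1 is the inverse Gaussian tail function. This is a textbook model assumed here for illustration, not TI's actual receiver architecture; only the Pfa and Pd figures are taken from the comment:

```python
from statistics import NormalDist

# Required amplitude SNR for a simple threshold detector in Gaussian
# noise:  A/sigma = Q^-1(Pfa) + Q^-1(1 - Pd).
# Textbook model, assumed for illustration; not TI's actual design.
pfa = 1e-5    # false alarm rate (figure from the comment)
pd = 0.99999  # detection probability (figure from the comment)

N = NormalDist()
required_snr = N.inv_cdf(1.0 - pfa) + N.inv_cdf(pd)
print(f"required amplitude SNR: {required_snr:.2f}")  # ~8.5, i.e. roughly 10
```

Under this simplified model the answer comes out to about 8.5, which is consistent with "SNR = 10 is about the minimum" quoted in the comment.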

  2. I didn't quite understand why flash LiDARs are good in rain and fog. I'd be glad if you could post your slides.

    1. The scanning LiDAR emits a narrow beam, say 0.2 deg by 0.2 deg, for the sake of discussion. Then heavy fog scatters it, in an extreme case, into a full 360 deg sphere. So the laser power density on the scene drops by a factor of 2,000,000, give or take.

      A flash LiDAR emits light into an angle of, say, 30 deg by 90 deg, for the sake of discussion. Heavy fog scatters it into 360 deg. So the laser power density drops by a factor of 50, give or take.

      So the scanning LiDAR's scene power density degrades much more.

      Does this answer your question?
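The back-of-the-envelope factors above can be checked with a small solid-angle calculation. The flat-patch approximation below is crude (especially for the wide flash angle), and the beam geometries are simply the figures quoted in the comment, but it reproduces the same orders of magnitude:

```python
import math

def solid_angle_sr(az_deg: float, el_deg: float) -> float:
    """Flat-patch (small-angle) approximation of the solid angle,
    in steradians, of a rectangular az x el illumination cone."""
    return math.radians(az_deg) * math.radians(el_deg)

FULL_SPHERE_SR = 4 * math.pi  # ~12.57 sr

# Beam geometries taken from the comment above
for label, az, el in [("scanning, 0.2 x 0.2 deg", 0.2, 0.2),
                      ("flash, 30 x 90 deg", 30.0, 90.0)]:
    dilution = FULL_SPHERE_SR / solid_angle_sr(az, el)
    print(f"{label}: power density drops ~{dilution:,.0f}x")
```

This gives roughly a million-fold dilution for the narrow scanning beam versus a few tens for the flash case - not identical to the round "2,000,000" and "50" above, but the same orders of magnitude, and the same conclusion: the narrow beam loses vastly more power density to scattering.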

  3. Thank you.
    As far as I understand, flash LiDARs are like cameras, with each "sensor" (pixel) responsible for only a tiny portion (say 0.2 deg) of the whole FOV (correct me if I am wrong). If that's the case, what matters for each pixel is like 0.2 deg out of the 90 deg illumination. So the situation looks similar to the scanning type to me.

    1. Your view of the flash LiDAR is correct. In your analogy, the laser is also an array of smaller lasers, each one illuminating a 0.2 deg angle. So, if one of the laser beams in this virtual array is diverted to the left while another is diverted to the right, it does not change much in terms of the illumination power on the scene.

      In a scanning LiDAR, by contrast, the diverted light illuminates the wrong part of the scene. The resulting errors differ depending on the Rx-side scheme, but the net result is worse than in the flash LiDAR case.

      That said, not all flash LiDARs are created equal. Some designs are much worse than others.

      In total, there are over 90 companies designing automotive LiDARs now. My notes about scanning vs flash are a gross generalization. All designs are different and have different trade-offs.

  4. My understanding of the physical situation was wrong. (I'm not going to explain how I was wrong; it's kind of a shame.) I think I understand now. Thanks for the explanation.

