
Sunday, January 13, 2019

1550nm LiDAR Damaged Sony Camera at CES

Ars Technica reports that a man who photographed AEye's LiDAR demo at CES claims that the 1550nm LiDAR permanently damaged his expensive Sony ILC camera. Every image he takes now shows two bright spots with vertical and horizontal lines emanating from them (click and download the full-resolution images to see the fine details):


AEye CEO Luis Dussan stressed in an email to Ars Technica that AEye lidars pose no danger to human eyes, but he did not deny that they can damage camera sensors. AEye has offered to buy the man a new camera to replace the damaged one.

"Cameras are up to 1000x more sensitive to lasers than eyeballs," Dussan wrote. "Occasionally, this can cause thermal damage to a camera's focal plane array."

1550nm LiDARs leverage the fact that this wavelength is absorbed within the eye before it reaches the retina, which allows a dramatic increase in laser power. AEye is reported to use a powerful fiber laser. While remaining eye-safe, that high power can apparently damage the image sensors inside cameras.

Thanks to TG for the link!

42 comments:

  1. Hmm. “Absorbed by the eye before it reaches the retina.”
    It wouldn’t be absorbed unless it interacted with the molecules in the lens and vitreous humor.
    This clearly raises some concerns regarding cataracts and similar problems.

  2. OMG! This is pretty bad news... can this issue be "simply" solved by adding a stronger cut filter in all future cameras, with stronger rejection/extinction at this 1550nm laser wavelength?

    Replies
    1. Sure. But the problem is whether these lasers are dangerous to the naked eye.

  3. What would be the failure mechanism in this case?

    I suppose that at 1550nm the transmission through the camera's IR-cut filter is much higher than at 905nm, as most of these filters seem to become transmissive again above ~1200nm. So, if the 1550nm laser is 1000x more intense than a 905nm LiDAR laser, the amount of light actually hitting the sensor may be another 10-1000x higher.

    The absorption in the camera chip would mostly be through free-carrier absorption, so what we see is actually irreversible thermal damage rather than a carrier-generation effect?
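
    A minimal back-of-envelope sketch of this ratio argument in Python (the power ratio and both filter transmissions are illustrative assumptions, not measured values):

      # Relative laser power reaching the sensor at 1550nm vs 905nm.
      # All numbers below are illustrative assumptions.
      power_ratio = 1000.0   # assumed: 1550nm source ~1000x stronger than a 905nm one
      t_905nm = 0.001        # assumed IR-cut filter transmission at 905nm (0.1%)
      t_1550nm = 0.01        # assumed transmission at 1550nm, where many filters reopen

      exposure_ratio = power_ratio * (t_1550nm / t_905nm)
      print(f"sensor exposure ratio (1550nm vs 905nm): ~{exposure_ratio:,.0f}x")
      # -> sensor exposure ratio (1550nm vs 905nm): ~10,000x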

    Replies
    1. Interesting that the camera "sees" the car LiDAR window in the top picture, even though it's supposed to be 1550nm.

      I wonder about microlens and CFA materials. Are they transparent at 1.5um?

    2. I share your opinion, Vladimir.

      Unless some material other than silicon is used for photon detection, I see the microlenses and CFA materials as the most probable parts that could absorb this much at 1550nm.

  4. I am wondering about Lidar and surveillance cameras on streets and in front of banks and stores. I am also wondering really if we understand the impact on our eyes, despite all the calculations and assurances. Sometimes we get it wrong on invisible radiation - something supposedly perfectly safe turns out not to be so safe after all. Consider the lowly shoe-fitting fluoroscope (and other fluoroscopes) that everyone thought were perfectly safe. https://en.wikipedia.org/wiki/Shoe-fitting_fluoroscope

    I remain skeptical of the value of Lidar in self-driving cars, especially in places like New Hampshire where there is a lot of snow. And I worry about the long term impact of laser radiation on infrastructure and people. Visible light sensors, thermal sensors and radar seem to be the more sensible ticket all the way around.

  5. I wonder how much of this incident comes from the specifics of AEye's iDAR technology, rather than from LiDARs in general.

    AEye iDAR uses adaptive scanning, meaning that the laser does not blindly sweep the entire FOV, but rather scans only specific "interesting points" as determined by its internal camera. Indeed, half of a typical FOV is sky, so there is no need to send laser energy there. Also, the most interesting events happen on the road and in the nearby area, which the camera can identify so that the laser beams are steered there.

    The adaptive scanning concept is, in theory, quite nice and has been implemented by some other companies too. In the future, other sensors can be integrated into the search for the "interesting points", such as radar, GPS, Google maps, compass, thermal camera, etc. A car's central CPU with all its neural super-intelligence would decide what is interesting and what's not. Indeed, if the surrounding buildings are mapped by Google and other sensors do not recognize anything unusual, why spend the precious laser energy on them?

    Getting back to the CES incident, what if Aeye iDAR decided that this guy with his camera is an "interesting point" and put all its laser power on him?

    Most other scanning LiDARs just sweep the entire FoV uniformly, with no smarts. Flash LiDARs distribute their laser energy through a diffuser over the entire scene. So far, nothing bad has been reported about either of them.

    Replies
    1. >Getting back to the CES incident, what if Aeye iDAR decided that this guy with his camera is an "interesting point" and put all its laser power on him?
      Interesting point... Eye safety should be determined by this worst-case scenario; I wonder if there are any documents based on it.

    2. The worst-case scenario in eye safety checks is when a person puts his eye directly against the laser output window. In that case, it does not matter where the laser beam is directed. If a device passes that test, it is easier for it to be OK at any "interesting point."

      BTW, passing eye safety regulations is not a science; rather, it's an art. There are many tricks and non-trivial ideas that can get a device certified as eye-safe, even if it emits high laser power.

  6. Since the AEye LiDAR presumably damaged the CMOS imager inside the camera, it would presumably damage the imagers inside self-driving cars too.

    Replies
    1. Indeed, great point. Certainly other cars would be scanned more than anything else. Goodbye backup camera. Unless every other camera in range is equipped with a cutoff filter that blocks 1550nm, this seems like a large liability for AEye.

  7. Silicon is transparent at 1550nm. The problem was probably local heating of the conductors and/or dopants.

  8. In defence of the company: we have strong regulations with respect to laser radiation for eye safety, and it may well be that the laser power, wavelength, dispersion, and duration are well within all legal eye-safe limits.

    The requirement to be safe for other beings, and for scenarios such as this one (cameras), may well not have been a base or legal requirement. It is perhaps unfortunate for this guy and his camera, but if this was never part of the requirements, then it is no wonder the design is not directly suitable.

    In fact, from the comments above, if the laser power needs to satisfy both eye safety and image sensor safety, the LiDAR may be unable to image correctly due to too low a return. I echo Eric's comment that the other sensing methods may need to take up the mantle in this use case.

    Regarding new requirements discovered after a product has been completed: this applies to many products, and it is why we loop back around when starting a new design rather than attempting to jerry-rig an existing one. For example, how often do we consider a new, unforeseen application for an image sensor, only to find that the sensor requires too much power? Rather than trying to modify the product, we take 'low power' as a requirement for a second-generation design. For the company here, this looks like an unforeseen case that would need to be addressed in further generations of the product.

  9. None of these comments actually contains science; it is all speculation. I suggest actually starting a study to see whether the LiDAR could really have damaged the imager, and how. From that point, directions can be discussed. Right now this discussion looks like the coffee table at my grandma's place...

  10. At AltaSens we had a similar customer failure: a sensor in an electronic news gathering (ENG) camera was hit by a green laser while filming in a nightclub. The failure mode was pretty much identical to the one shown here.

    We could see significant physical damage under a SEM, but never really understood the failure mechanism.

    Replies
    1. The sensor is powered. Light sends something into conduction that shouldn't be (possibly causing SCR latchup?) and blows it out?
      The energy then comes from the power supply; the too-intense light is just the trigger.

  11. Photographers have known for a long time that disco lasers can kill sensors. Here is a video from 2010 where someone's camera captured the moment it happened:
    https://www.youtube.com/watch?v=J0TgaGePhJA

  12. Can someone explain why the horizontal line is not flat?

    Replies
    1. My guess is that it simply follows a scan line of the LiDAR, slightly curved by the laser scanning optics. The vertical lines are straight because they are possibly damaged readout columns of pixels.

    2. Then it must be a very strong laser. I have tried my 30mW green laser pointer before, and it can hardly do damage like this.

    3. I think the camera applies lens distortion correction internally, so a straight horizontal line appears curved once the correction is applied.
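
      A toy illustration of this idea, using a simple radial distortion model (the resolution and distortion coefficient are assumed, not this camera's actual parameters):

        import numpy as np

        # A damaged row is straight on the raw sensor, but in-camera distortion
        # correction remaps pixels radially, so the row bends in the output.
        w, h = 6000, 4000          # assumed sensor resolution
        k1 = -0.08                 # assumed barrel-distortion coefficient
        cx, cy = w / 2, h / 2      # optical center

        y0 = 1000.0                               # damaged row (raw coordinates)
        xs = np.linspace(0, w, 7)
        xn, yn = (xs - cx) / cx, (y0 - cy) / cy   # normalized coordinates
        r2 = xn**2 + yn**2
        y_out = cy + yn * (1 + k1 * r2) * cy      # corrected y depends on x
        print(np.round(y_out))    # not constant -> the straight row looks curved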

  13. Two-photon absorption of focused 1550nm laser light has already been observed in silicon PIN photodiodes and image sensors:
    https://tmo.jpl.nasa.gov/progress_report/42-173/173C.pdf
    This effect should probably be taken into account in this case.

    Replies
    1. Two-photon absorption is a known issue in silicon photonics, but you need quite a high photon density to get a significant probability of exciting an electron into the conduction band; that is why two-photon microscopy achieves such high resolution. What laser intensity is needed for a realistic probability of generating enough carriers to heat up the silicon, at a realistic beam divergence and a given distance? I didn't do the math, but it sounds unrealistic that this is the issue...

    2. Also see: "Sub-bandgap laser-induced single event effects: carrier generation via two-photon absorption"
      https://doi.org/10.1109/TNS.2002.805337
      This is a non-linear effect which depends on the laser power. Note that laser light is diffraction-limited, so it can be focused into a very small spot.

  14. I agree it might look unrealistic, but I still think this hypothesis deserves a small calculation before being ruled out completely. Consider that the laser is pulsed, so the peak power can be huge even though the average power is not. The camera's optical system can also focus the laser beam into a very small spot, so the power density might be locally very large.
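
    A minimal sketch of that calculation (the peak power is an assumed illustrative value; only the diffraction-limited spot formula is standard):

      import math

      peak_power = 1e3      # W, assumed peak power of a pulsed fiber laser
      wavelength = 1.55e-6  # m
      f_number = 4.0        # camera lens at f/4

      # Diffraction-limited (Airy) spot diameter: d ~ 2.44 * lambda * N
      spot_d = 2.44 * wavelength * f_number
      spot_area = math.pi * (spot_d / 2) ** 2

      peak_irradiance = peak_power / spot_area  # W/m^2
      print(f"spot diameter ~ {spot_d * 1e6:.1f} um")
      print(f"peak irradiance ~ {peak_irradiance * 1e-4:.1e} W/cm^2")
      # -> spot diameter ~ 15.1 um, peak irradiance ~ 5.6e+08 W/cm^2,
      #    i.e. approaching the GW/cm^2 regime where two-photon effects
      #    in silicon are typically discussed.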

  15. I agree that two-photon absorption isn't realistic, but I suspect local plasmonic heating of the metal might be the cause.

  16. It would be interesting to know the model of this Sony camera; that would tell us whether the sensor is FSI or BSI. If it is BSI, the heat dissipation might not be as good (I assume), making it more prone to local heat damage.

  17. OK, just checked the news link, and it is indeed a BSI sensor.

  18. Maybe a newbie question... what type of sensor is used for a 1550nm LiDAR? Is it InGaAs-based? Standard pixels, or some kind of avalanche diode to measure the echo of the laser pulse accurately? Is there some kind of basic tech overview of these IR-laser-based LiDAR systems you could recommend?

    Replies
    1. Agree, an InGaAs APD is most probable. Sorry, I'm not aware of any LiDAR tutorial.

  19. How powerful are these lasers? I tried to look it up, but only came up with a figure of 8kW, which seems ridiculous!!! The pulses may be extremely short, so the average power may be small, but that still seems a highly dangerous amount of energy to be firing about in public. What if something goes wrong and the beam gets stuck on a single point for somewhat longer...
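
    For what it's worth, a quick sketch of how an 8kW peak could still mean a modest average power (the pulse width and repetition rate are assumed, illustrative values):

      peak_power = 8e3     # W, the 8kW figure quoted above, taken as peak power
      pulse_width = 1e-9   # s, assumed ~1ns pulse
      rep_rate = 100e3     # Hz, assumed 100k measurement pulses per second

      duty_cycle = pulse_width * rep_rate
      avg_power = peak_power * duty_cycle
      print(f"duty cycle = {duty_cycle:.0e}, average power = {avg_power:.2f} W")
      # -> duty cycle = 1e-04, average power = 0.80 W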

    Replies
    1. In the case of AEye, their tiny MEMS mirror suspended in vacuum severely limits the maximum power they can transmit. If I had to guess, their peak power is 100W or less.

    2. There are numerous safety measures one has to take to make this feasible, but it's all within the standards. So it can be done, and surely is done.

    3. It does seem this LiDAR has a shorter range than some of their others, so maybe only 100W, but that is still a lot of power to concentrate on an area which, judging from the horizontal line in the image, is not much bigger than one pixel. They may be retina-safe, because the beam is absorbed by the "water" in the rest of the eye and so doesn't reach the retina, but that doesn't mean it can't damage other things, such as the lens of the eye (cataracts?) and other sensors such as our dashcams. Maybe they need a bit of independent safety research before being allowed out in public. After all, the beams are invisible: people can't tell when they are being scanned, so if there are problems they won't normally be able to make the connection.

  20. Something about this doesn't add up. Laser damage is notoriously difficult to predict due to the statistical nature of damage thresholds, but I've burnt some optics in the past and it took a bit of doing.

    35mm lens at f/4, 1/60 sec exposure. The laser has reflected off the column, from multiple spots each with very different geometry and surface properties, into the 0.6 cm^2 aperture, through glass not optimized for 1.5um light and an IR-cut filter (which probably leaks like crazy there), onto silicon that mostly transmits at that wavelength.

    There are no published power specs that I can find for that system, but one could probably bound the power based on part costs, size, and power requirements. 1W is probably a good upper limit, and I doubt it's even that much. The laser also has to NOT destroy the optical system designed to measure it, and that system IS optimized for 1.5um and handles specular reflections from close up.

    Getting a pencil-beam backscatter off a hard target at the distances they advertise doesn't require all that much power. Measuring clouds that are kilometers away is a milliwatt problem; close-up work is more a matter of timing than power.

    Also notice the top-most artifact in the first image, the one with a crossed dead row and column and no laser spot visible, which doesn't replicate in the later images as a charge-generation site. Side note: there are enough CMOS experts here, so is that what a laser damage site looks like on a color sensor? Do they bleed like that? (The curved line is probably caused by distortion correction in the camera software.)

    All told, it's conceivable but unlikely; it would be good to see the RAW images, and also the RAW images taken just before this one.
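
    To put rough numbers on the geometry above, a sketch of how much diffusely scattered light that aperture could collect (the distance and reflectivity are assumed values):

      import math

      laser_power = 1.0    # W, the assumed upper limit from above
      reflectivity = 0.5   # assumed diffuse reflectivity of the column
      distance = 2.0       # m, assumed camera-to-column distance

      # 35mm lens at f/4: aperture diameter = focal length / f-number
      aperture_d = 0.035 / 4.0
      aperture_area = math.pi * (aperture_d / 2) ** 2   # ~0.6 cm^2, as above

      # Lambertian scatter spreads the reflected power over ~pi steradians
      collected = laser_power * reflectivity * aperture_area / (math.pi * distance**2)
      print(f"aperture area ~ {aperture_area * 1e4:.2f} cm^2")
      print(f"collected power ~ {collected * 1e6:.1f} uW")
      # -> a ~0.60 cm^2 aperture collects only ~2.4 uW of a 1W diffuse reflection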

    Replies
    1. 1. Sony mirrorless cameras keep the image sensor exposed all the time for the live preview, so the exposure time is not limited to the 1/60s shutter speed.

      2. Why do you think the laser was reflected off the column? There could have been a direct hit during the live preview time.

      3. Tx power requirements depend on the target reflectivity, FoV, resolution, and measurement speed. For cloud monitoring, you probably mean a single-point measurement with a narrow FoV and a long integration time. In automotive LiDARs, the frame time is short, the FoV is wide, the angular resolution is high, and the target might be black, so the Tx power requirements are different.

      4. Your curved-horizontal-line explanation looks very probable.

    2. Just compare different scenes: if the curved pixels are the same, then it is not done by software.

  21. Most of what one can do here is speculation, due to the lack of any sort of spec from AEye and the sparse data. I'm not convinced those artifacts don't predate that first image. I'm just trying to ground things in a back-of-the-envelope way, to ameliorate the fear of sensor (and eye) damage from what seems to me a pretty unlikely configuration.

    1) I had been assuming that the posted image was the exposure in which the damage occurred, which is certainly disputable, and I hadn't considered the continuous readout/preview of a mirrorless sensor. But then you need multiple direct damaging hits of a small beam into an 8mm-diameter aperture? If the shutter is open continuously and sustaining damage, wouldn't you see some smear from the camera moving, or lots of dots?

    2) That was my best guess at how you'd make that pattern if the damage occurred in that image, and I admit that was my interpretation. Not that it is necessarily any worse than the other speculation occurring here; in fact, it is specular speculation! The available images are of low enough quality that it's hard to judge. I'd still be somewhat surprised if a direct hit from an eye-safe 1.5um laser could do that through commercial camera optics anyway.

    3) My point about cloud measurement is that a mW is a lot of light, and the scaling properties of any lidar are pretty well known. I agree that the power requirements differ, but hard-target ranging at these distances isn't a J/cm2-class laser problem, which is where I think you'd need to be to see this kind of damage. Anything much above that is in danger of blatantly exceeding eye safety limits, even at 1.5um.

    p.s. - long-time lurker and big fan of your site

    Replies
    1. I fully agree with you on the lack of solid knowledge about AEye's iDAR; that makes the whole discussion highly speculative. On the other hand, I doubt that somebody from AEye will intervene and give us info on this sensitive matter.

  22. If they use a fiber laser, the peak power can be HUGE. A traditional laser diode cannot provide a very high peak power, simply because it's too hard to generate a huge current pulse to drive the diode.

  23. Small update: https://spectrum.ieee.org/cars-that-think/transportation/sensors/keeping-lidars-from-zapping-camera-chips

    As reported above, AEye offered to buy the damaged camera, but apparently the owner "can't find it"...


All comments are moderated to avoid spam and personal attacks.