Wednesday, June 06, 2012

Infineon Cooperates with PMD on Automotive ToF Imagers

All-Electronics.de interviewed Jochen Hanebeck, president of the Automotive Division of Infineon Technologies. According to the Google translation of the German interview, Jochen Hanebeck said:

"A world first is our 3D camera chip with 100,000 pixels, which is the result of cooperation with the firm of PMD. In principle, this sensor sends out infrared light and measures the return stroke, the phase shift. It delivers to each pixel so that not only a 2D but additionally a depth information. Other important characteristics are low latency and high frame rates of up to 100 frames/s at a very small form factor.

We have taken up the cause, enhance sensitivity and performance of this imager to significantly further and integrate. The use of high performance A/D converters on the chip we already deliver digital signals at the output.

Our 3D chip covers the entire area of ​​interest to about 50 feet away. Due to the active illumination of an infrared LED array, the system has a night vision capability, so that is possible even in the dark a pedestrian and a gesture recognition controller. Vehicles with this technology could already from 2014 the production line.
"

Jochen Hanebeck shows the new 3D imager. The picture was taken with such an imager. Depending on the distance from the camera, the colors vary from red (close) through yellow and green to blue (far away). (Source: Infineon Technologies)

Thanks to BB for the link!

9 comments:

  1. Nice achievement! What is the LED power for a 50-foot range?

  2. How does this compare to Eric Fossum's work at Samsung?

  3. Hey, it is hardly "my" work at Samsung. For sure, our published work has been a team effort. I did the architecture and pixel concepts, but the details, process, circuit design, and ISP all come from other team members.

    Anyway, looks like a nice image.

  4. I will make a technical comment since you asked. I think PMD uses 45x45 um pixels - 2025 um^2. The Samsung RGBZ sensor reported at the 2012 ISSCC uses a 2.25x9 um pixel - 20.25 um^2, exactly 1% of the area of the PMD pixel. Anyone doing image sensors knows what an impact this has on camera performance. The PMD sensor has 100k pixels, and the Samsung RGBZ sensor had 172k pixels.

    For large pixels, the problem is demodulation contrast (DC), so short transit times are critical. Large pixels also result in large chips and large optics, and hence a large overall size. That is not so good for embedded applications such as mobile or displays, and thus at this time the approach is confined to a bit more of a niche market.

    For small pixels, DC is less important at the same modulation frequency because transit time (typically) varies as L^2, and the problem is more one of QE and read noise. For small pixels, one does not have the luxury of layout room for (straightforward) in-pixel ambient-light cancellation circuitry, so sunlight suppression is more of a challenge.

    So, really, two very different design spaces between what we have worked on at Samsung and what is reported here.

  5. I still wonder whether an active 3D sensor using LEDs will become the de facto standard.

  6. Eric, I have a question on ambient-light cancellation. Basically, we can use some form of differential sensing to cancel the continuous component of the ambient light. But the shot noise from the ambient light cannot be removed. A large signal energy is still needed to overcome the shot noise from strong ambient light. What is your comment on this issue, please?

    -yang ni

  7. Yang, you are correct that you cannot suppress the shot noise from the ambient light component. The problem with ambient light is that it uses up the full-well capacity of the pixel. So, you subtract the "d.c." signal of the ambient light differentially through some switched-capacitor circuit so as to preserve pixel signal capacity. Subtracting the large signal still leaves shot noise in the pixel, not to mention additional noise and FPN from the subtraction circuit. More modulation signal intensity can improve SNR. There are other strategies for dealing with ambient light, especially since we can control the light source. Someday, perhaps, we will publish our strategy for the case of small pixels. Still, at large distances, it is hard to get enough optical power in mobile applications (not counting cars, of course).

  8. I guess controlling the light source probably just means emitting more optical power, burst-mode operation, or the like.

  9. Well, these 3D sensors have been around and under investigation in the automation and automotive industries for several years. Most of them still suffer from limitations due to ambient light or low re-emission from certain surfaces. Furthermore, the interview, which was originally in German, states a range of 50 m. Did Google Translate simply swap meters for feet? A range of just 50 ft wouldn't be impressive.

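Regarding the ambient-light discussion in comments 6 and 7: here is a rough back-of-the-envelope sketch of why subtracting the ambient "d.c." level preserves full-well capacity but not SNR, since the ambient shot noise stays in the measurement. The electron counts below are made-up illustrative numbers, not values from either sensor, and the model ignores the subtraction-circuit noise and FPN mentioned in comment 7.

    import math

    def tof_snr(signal_e, ambient_e, read_noise_e):
        # Shot-noise-limited SNR of a ToF pixel after ambient subtraction.
        # Subtracting the ambient level removes its mean, but its shot noise
        # (the square root of the collected electrons) remains in the variance.
        noise = math.sqrt(signal_e + ambient_e + read_noise_e ** 2)
        return signal_e / noise

    # Illustrative numbers only (assumed): 1000 e- of modulated signal, 10 e- read noise
    print(tof_snr(signal_e=1000, ambient_e=0,     read_noise_e=10))  # ~30 in the dark
    print(tof_snr(signal_e=1000, ambient_e=50000, read_noise_e=10))  # ~4.4 in bright ambient light

This is why more modulation power, or controlling the light source (for example, the burst operation guessed at in comment 8), remains the main lever once the ambient mean has been cancelled.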

All comments are moderated to avoid spam and personal attacks.