Friday, June 05, 2020

EETimes on iPad Pro LiDAR: Apple Sparked a Race to LiDAR Scanners

EETimes reporter Junko Yoshida publishes an article "Breaking Down iPad Pro 11’s LiDAR Scanner" derived from an interview with SystemPlus and Yole Developpement analysts:


"Apple has sparked a race to use LiDAR scanners. Apple built one into its iPad Pro 11, and now it seems everyone wants one in their products.

What makes this LiDAR scanner significant — and why other mobile device vendors, including Huawei and Vivo, appear to be going after it — is a specific technology used inside the unit to sense and measure depth.

In EE Times’ interview, Sylvain Hallereau, senior technology and cost analyst at System Plus, explained that iPad Pro 11’s “LiDAR scanner” consists of an emitter — a vertical cavity surface emitting laser (VCSEL) from Lumentum, and a receptor — near infrared (NIR) CMOS image sensor that does direct measurement of time of flight, developed by Sony.

Sony integrated the NIR CMOS image sensor with SPAD using 3D stacking for ToF sensors for the first time. In-pixel connection made it possible to put the CMOS image sensor together with the logic wafer. With the logic die integrated, the image sensor can do simple calculations of distance between the iPad and objects, Hallereau explained.

Sony has elbowed its way into the dToF segment by developing this new generation SPAD array NIR CMOS image sensor featuring 10 µm size pixels and a resolution of 30 kilopixel.
"

8 comments:

  1. So no scanning. Just depth to each point...
    Why call it lidar? This is the usual buzz and hype from Apple. From the ARKit API you can't get a depth map to evaluate the data, just a very coarse mesh. Don't believe the hype.

    ReplyDelete
    Replies
    1. What exactly is the hype? A 4-layer Ibeo lidar is also coarse, while costing thousands and being much larger than two stacked coins.

      So the API does not allow raw pointcloud access? What does that have to do with the sensor not being a LiDAR?

      Delete
  2. Why does the DOE have what seem to be electrical contacts and a structured ITO layer?

    ReplyDelete
  3. Is this using die to wafer or wafer to wafer hybrid bonding?

    ReplyDelete
    Replies
    1. Not possible using die-to-wafer. Too expensive.

      Delete
  4. Where in those pictures can you actually see proof that it is a direct ToF device using a SPAD, and not an indirect ToF (e.g., similar to the one Sony acquired from SoftKinetic)?

    ReplyDelete
  5. It is not really 30K resolution (30K depth points) in operation. If one observes the dot pattern formed by the VCSEL array (4x16) + DOE, it is 4x16x9 = 576 dots. So not all 30K SPADs are used to calculate 30K depth points; only 576 depth points are being measured here. Let us not call it a 30K-resolution 3D sensor.

    ReplyDelete
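The dot-count arithmetic in the comment above is easy to verify: a 4x16 VCSEL array replicated 9 times by the DOE gives 576 dots, a small fraction of the ~30-kilopixel SPAD array (the figures are the commenter's, not independently confirmed):

```python
# Dot-pattern arithmetic from the comment above.
# All figures are as stated by the commenter, not independently verified.
vcsel_rows, vcsel_cols = 4, 16   # claimed VCSEL emitter array
doe_replications = 9             # DOE tiles the pattern 9x
spad_pixels = 30_000             # ~30-kilopixel SPAD array

dots = vcsel_rows * vcsel_cols * doe_replications
print(dots)                          # → 576
print(f"{dots / spad_pixels:.1%}")   # → 1.9% of SPADs aligned with a dot
```

On those numbers, fewer than 2% of the SPAD pixels would sit under an illuminated dot at any one time, which is consistent with the commenter's point that the effective depth resolution is far below the pixel count.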
  6. SystemPlus is contradicting itself. In their second slide they write that structured light needs a DOE and time-of-flight uses a diffuser, yet in the next slides they "discover" a DOE and still claim it is a direct time-of-flight system...

    ReplyDelete

All comments are moderated to avoid spam and personal attacks.