Friday, March 27, 2020

Cambridge Mechatronics 3D Sensing Technology

Cambridge Mechatronics has used the Apple iPad Pro LiDAR announcement as an opportunity to emphasize the advantages of its 3D sensing technology:

"Systems using Indirect Time of Flight (iToF) technology have shipped in Android smartphones for some time, but their practical working range is only around two metres. This has limited their use to camera enhancements such as portrait photo background blurring. Apple advises that its Direct Time of Flight (dToF) technology has a useful range of five metres.

To unlock the broadest range of AR user experiences, accurately measuring depth of ten metres or more is necessary. All technologies in use today compromise system resolution and performance when increasing range. However, CML has developed technology combining optical components, actuators and software to increase working range to ten metres and more without any compromise to measurement resolution or performance. This gives a best of both worlds solution targeted at smartphones, tablets and other mobile devices.

CML's 3D sensing enhancement technology is available to license now. We are working with our global partners, including major device brands and their supply chains, to bring the most engaging and immersive next-generation AR experiences to consumers."
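The two-metre iToF limit quoted above is consistent with the phase-aliasing (wrap-around) range of a modulated light source: distance beyond half the modulation wavelength folds back onto nearer readings. A rough illustration (the modulation frequencies below are typical assumed values, not figures from the article):

```python
# Unambiguous range of an indirect ToF sensor: phase wraps every half
# wavelength of the modulation signal, so d_max = c / (2 * f_mod).
C = 299_792_458  # speed of light, m/s

def unambiguous_range(f_mod_hz: float) -> float:
    """Maximum distance an iToF sensor can report before phase aliasing."""
    return C / (2 * f_mod_hz)

# Assumed smartphone iToF modulation frequencies, for illustration only.
for f_mod in (60e6, 100e6):
    print(f"{f_mod/1e6:.0f} MHz -> {unambiguous_range(f_mod):.2f} m")
```

At 100 MHz the wrap-around distance is about 1.5 m, in line with the "around two metres" practical range the press release mentions; lowering the modulation frequency extends the range but degrades depth precision.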

Update: A PCT patent application, WO2020030916 "Improved 3D Sensing" by David Richards and Joshua Carr, describes the company's approach:

"...there is provided an apparatus for use in generating a three-dimensional representation of a scene, the apparatus comprising: a time-of-flight (ToF) imaging camera system comprising a multipixel sensor and a light source and arranged to emit illumination having a spatially-nonuniform intensity over the field of view of the sensor; and an actuation mechanism for moving the illumination across at least part of the field of view of the sensor, thereby enabling generation of the representation. This may be achieved without moving the sensor.

The non-uniform illumination may be any form of illumination, including a beam of light, a pattern of light, a striped pattern of light, a dot pattern of light."
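The claimed arrangement, a static multi-pixel sensor with actuated non-uniform illumination swept across the field of view, can be sketched with a toy model. Everything below (the resolution, the stripe geometry, the depth values) is invented for illustration; this is not CML's implementation:

```python
# Sketch of the patent's scanning idea: the sensor stays fixed while an
# actuator steps an illumination stripe across the field of view, and the
# per-step captures are composited into a full depth map.
WIDTH = 8  # sensor columns (toy resolution)

def stripe_mask(step: int, stripe_width: int = 2) -> list:
    """Columns lit by the actuator-steered illumination stripe at this step."""
    start = step * stripe_width
    return [start <= x < start + stripe_width for x in range(WIDTH)]

def capture(scene_depth: list, mask: list) -> list:
    """Static sensor returns depth only where the scene is illuminated."""
    return [d if lit else None for d, lit in zip(scene_depth, mask)]

def scan(scene_depth: list, stripe_width: int = 2) -> list:
    """Composite the per-step captures into a complete depth map."""
    depth_map = [None] * WIDTH
    for step in range(WIDTH // stripe_width):
        frame = capture(scene_depth, stripe_mask(step, stripe_width))
        for x, d in enumerate(frame):
            if d is not None:
                depth_map[x] = d
    return depth_map

scene = [1.0, 1.2, 3.5, 3.5, 7.9, 8.0, 2.1, 2.2]  # made-up depths in metres
assert scan(scene) == scene  # full map recovered without moving the sensor
```

The point of the concentrated stripe is that the emitted power is spread over a small patch at a time, so each illuminated region receives far more optical power than flood illumination would deliver, extending range without changing the sensor.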


  1. Most of the smartphone iToFs have a 1/d^4 receive-power factor and are range-limited by time-aliasing of the phase-modulated light. It looks like CML and/or Apple removed the diffuser and added a sort of coarse beam-steering mechanism enabled via mechanical actuation, IMO - correct me if I am wrong here. This gives a better range, of course, and is not new - one is a classic flash dToF and the other a beam-steered dToF. Of course, the key is to have an actuation component that is suitable for phones and pads.


    1. Yes, your interpretation is correct. Actuation motors for smartphones are the main business for CML. One difference from the classic (multi-)beam-steered dToF is that the Rx part is a static image sensor.

      It appears to be similar to Apple's approach. It would be interesting to see whether an Apple LiDAR teardown confirms that.

    2. Yes, a static image sensor (1D or 2D) is used here for reception/detection. I guess the output beam is a coarse, moderately collimated one, so a static image sensor to capture return light from multiple target points makes sense. For such small distances, there is no need to complicate the Rx.

      I referred to flash dToF and beam-steered dToF more in the sense of receive light power behavior.


    3. Once they remove the diffuser, is the VCSEL still eye-safe? Or was the diffuser required for that certification?

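The receive-power behaviour discussed in this thread can be illustrated with a simplified point-target model. The exponents below assume a target smaller than the flood-illumination footprint and a fixed collection aperture; extended targets and real optics behave differently, so treat this strictly as a sketch of the commenter's reasoning:

```python
# Simplified point-target model of return power vs distance d.
def flood_return(d: float) -> float:
    """Flood illumination: target irradiance falls as 1/d^2, and the
    collection aperture subtends a solid angle falling as 1/d^2,
    giving ~1/d^4 overall."""
    return 1.0 / d**4

def steered_return(d: float) -> float:
    """Collimated, steered beam: target irradiance is roughly constant
    with distance, so only the 1/d^2 collection term remains."""
    return 1.0 / d**2

# Relative return power at 2 m vs 10 m (arbitrary units):
for d in (2.0, 10.0):
    print(f"{d:4.0f} m  flood={flood_return(d):.2e}  steered={steered_return(d):.2e}")
```

Under this model, going from 2 m to 10 m costs a factor of 625 in return power for flood illumination but only 25 for a steered beam, which is why beam steering helps range so much.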
