Thursday, March 22, 2018

Smartphone Companies Scrambling to Match Apple 3D Camera Performance and Cost

EETimes publishes an article "Can Huawei Match Apple TrueDepth?" by Junko Yoshida. A few quotes:

Pierre Cambou, activity leader for MEMS and imaging at Yole Développement, predicts that it may take a year or longer for competitors to offer 3D sensing technologies comparable to iPhone X.

...3D sensing will be a tougher challenge for most smartphone vendors — because a 3D camera contains myriad components that need to be aligned. It also requires competent supply chain management. Cambou called the 3D camera “a bundle of sub-devices.”

As for Samsung’s Galaxy S9, some reviewers are already calling its front-facing sensing technology “a disappointment.” ...People were able to fool Samsung's technology on last year's Galaxy S8 by using photos. Apparently, that trick still works with the S9.

Huawei’s triple cameras appear to illustrate the company’s effort to enhance depth-sensing technology. While no confirmation is available, Huawei’s suspected 3D sensing partner is Qualcomm.


SystemPlus' and Yole's cost estimation of the iPhone X 3D camera

Reuters shares the same opinion:

Most Android phones will have to wait until 2019 to duplicate the 3D sensing feature behind Apple’s Face ID security, three major parts producers have told Reuters.

According to parts manufacturers Viavi Solutions Inc, Finisar Corp and Ams AG, bottlenecks on key parts will mean mass adoption of 3D sensing will not happen until next year, disappointing earlier expectations.

Tech research house Gartner predicts that by 2021, 40 percent of smartphones will be equipped with 3D cameras, which can also be used for so-called augmented reality.

3 comments:

  1. I wouldn’t take Yole reports too seriously... even their picture for structured light is the Tango tablet bar with Mantis technology, while crediting Intel...

  2. With all due respect to Yole, I can tell you that while there is some good info in there, there is also much insider knowledge of the state of the art and its history that they do not know, which is critical if you were making huge bets in this area. Caveat emptor.

  3. Glaring error: the Apple structured light is not "megapixel", and I am getting quite tired of explaining to people (including Yole) that structured-light resolution is based on the emitter pattern, not the receiver. If you have an 18-megapixel receiver and a single-dot emitter, what is the number of depth points you have? One. The number of depth points Apple puts out (after their spec relaxation) is less than 40K, and closer to 30K if you do pattern measurements.

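The arithmetic in the last comment can be put in a minimal sketch. This is an illustration added here, not taken from the article or the comment: the helper function and the dot/pixel counts are assumptions chosen only to show that the depth-map size is bounded by the emitter pattern, not by the receiver resolution.

    # Minimal sketch of the structured-light point above: each projected dot
    # yields at most one depth sample, so the depth map size is set by the
    # emitter pattern, not by the receiver's pixel count.
    # The counts below are illustrative assumptions, not measured values.

    def depth_points(emitter_dots: int, receiver_pixels: int) -> int:
        """A depth sample needs both a projected dot and a pixel to image it."""
        return min(emitter_dots, receiver_pixels)

    # Single-dot emitter with an 18-megapixel receiver: one depth point.
    print(depth_points(emitter_dots=1, receiver_pixels=18_000_000))       # 1

    # A dot projector on the order of 30,000 dots gives ~30,000 depth points,
    # no matter how many megapixels the receiving camera has.
    print(depth_points(emitter_dots=30_000, receiver_pixels=18_000_000))  # 30000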
