Saturday, July 25, 2015

Pelican Imaging Layoffs?

Somebody posted in the comments that Pelican Imaging has just laid off most of the company. Whether or not this is true, their technology page points to a big shift in direction: instead of serving as a smartphone's main camera, the company now proposes a supplementary camera array used only for a depth map:

"This unique "array + main camera" approach allows mobile handset manufacturers to choose their own primary camera module (whether it’s 8MP, 13MP, or 20MP), and benefit from excellent image quality paired with the depth data from the scene."

Update: EETimes confirms the layoffs:

"The company said that approximately half the staffing of 25 was let go."

A spokesperson for Pelican Imaging confirmed that the company has "reduced some of its staff recently, to allow the company to focus on core product development."

"Regrettably, Mark Fulks has left the team at Pelican. Mark is a very well respected leader and we wish him success in his next endeavour," the spokesperson said.


  1. So, the great bubble bursts! It will end with all the hyped-up companies, including Invisage, going down the drain.

  2. The killer app for this technology is supporting consumer 3D printers? Are they serious? That seems like an even riskier business plan than a refocusable main camera.

    1. I'm not sure. I think I'd never actually use refocusing, but a handheld 3D scanner would certainly pique my curiosity. For that, however, the whole bundle needs to be right: scanner and software would need to be reasonably priced and easy to use, and the output file format shouldn't be proprietary but compatible with all standard tools and 3D printers. Also, I think 3D printers would need to become much cheaper to reach the consumer market - >600 Euros is a bit much... But do you actually need a 3D camera for this? Since you'd have to manually rotate the camera around the object to be printed (or the other way around), you'd need to stitch together subsequent frames anyway. So I think one might be able to solve this purely in software (no need for intra-frame 3D)!?

      - Andreas Süss

  3. If it's purely a depth camera, wouldn't it compare poorly with ToF and structured-light approaches for near-field depth? With the narrow baseline, the depth range will be quite limited for measurement use cases anyway. So I'm not sure this is such a good idea.

  4. It is true. Half of the engineering staff left in disgust over the last year, and the company laid off half of the remaining staff. The only people left are the core algorithm and imaging team and, of course, the CTO and CEO.

  5. I interviewed with these guys in the early days. It was clear they were smoking crack. The level of SW integration work required on the device was a moon launch; it required all the available CPU. There were so many problems with the model that they had no hope of adoption no matter how well the technology worked, assuming it ever did.
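The narrow-baseline concern in comment 3 comes from basic stereo geometry: depth error grows with the square of distance and inversely with the baseline. A minimal sketch of that relation, using hypothetical numbers (a ~1 cm array baseline and ~1500 px focal length are assumptions for illustration, not Pelican's actual specs):

```python
def depth_error(z_m, baseline_m, focal_px, disparity_err_px=0.1):
    """Approximate stereo depth uncertainty (meters) at distance z_m,
    from dz ~ z^2 * dd / (f * B)."""
    return (z_m ** 2) * disparity_err_px / (focal_px * baseline_m)

# Assumed values: 1 cm baseline, 1500 px focal length, 0.1 px disparity noise.
for z in (0.3, 1.0, 3.0):
    err_cm = depth_error(z, baseline_m=0.01, focal_px=1500) * 100
    print(f"z = {z:.1f} m -> depth error ~ {err_cm:.2f} cm")
```

The quadratic growth is the point: error stays at millimeter level up close but balloons to centimeters within a few meters, which is why a narrow-baseline array is mostly a near-field depth sensor.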

