
Friday, February 21, 2014

Google Project Tango

IEEE Spectrum, Mashable, and the Wall Street Journal write about Google's Project Tango, which enables 3D mapping with a compact IR 3D sensor based on an unknown technology. The 3D computer vision processing uses a Movidius chip. A Google YouTube video shows Project Tango in action:



The 3D IR sensor is on the bottom of the phone

Update: CNET publishes an interview with Remi El-Ouazzane, Movidius CEO. A few quotes:

"Over multiple iterations, we developed an architecture optimized for computer vision. The architecture favors parallelism, and not frequency. We're running [the Project Tango phone] on hundreds of megahertz, using a mix of hardware and programmable resources. On today's battery, running our processor will divide your battery power by a factor of 10."

6 comments:

  1. Their website says it measures 250k 3D points per second. Assuming this is at 60 fps, that would translate to a depth camera resolution of about 65x65 (see the sketch after the comments).

    1. Which would suggest that it's a ToF camera. If you look carefully at their outputs (shown in the video), the point clouds aren't very dense, and they don't extend very far forward. It's probably a short range ToF camera.

    2. I was told that their 3D camera is based on structured light.

  2. This kind of system already exists in industrial applications. But bringing it to the consumer market with low cost, simplicity, and versatility is a real challenge! What consumer applications can we imagine?
    -yang ni

  3. If this is ToF or structured light, where is the light source? Behind the plastic cover? Possible, but it is not mentioned at all. However, it would explain the high power consumption. My first guess is a plenoptic/light-field camera approach, e.g. Pelican. In the video at 0:27 the user changes the focus of an ordinary image with a fingertip.

  4. It uses an IR ToF principle to obtain range, which means it will suffer from poor contrast in sunlight, leading to poor depth map quality in outdoor conditions. The 4MP sensor subsamples green at 25% and IR at 25% of the field. This explains why the images look like crap.

    This is another ivory-tower project from Google/Motorola with little long-term, mass-production potential. Poor overall architecture for an outdoor mapping device. My prediction is that this will be used by academic and government types for silly projects, but long-term it is a dead donkey, given there are superior architectures on the horizon.

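A quick sanity check of the resolution estimate in the first comment, as a rough sketch in Python: the 250k points/s figure is the one quoted from the Project Tango site, while the 60 fps frame rate is the commenter's assumption rather than a published spec.

    # Back-of-the-envelope estimate of per-frame depth resolution.
    # 250k points/s is the quoted figure; 60 fps is an assumed frame rate.
    points_per_second = 250_000
    frames_per_second = 60

    points_per_frame = points_per_second / frames_per_second   # ~4167 points per frame
    side = points_per_frame ** 0.5                              # assuming a square point grid
    print(f"~{points_per_frame:.0f} points/frame, roughly {side:.0f} x {side:.0f}")
    # -> ~4167 points/frame, roughly 65 x 65

At 30 fps the same figure would give roughly 91x91, so the implied resolution is low either way, consistent with the sparse point clouds visible in the video.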
