
Thursday, October 08, 2009

Gigavision Sensor Proposed

New Scientist: A team led by Edoardo Charbon, a professor at the Swiss Federal Polytechnic Institute (EPFL) in Lausanne, presented their so-called "gigavision" sensor at the OMNIVIS 2009 workshop on Oct. 4 in Kyoto, Japan.

While Charbon's idea is new and has a patent pending, the principle behind it is not. It has long been known that memory chips are extremely sensitive to light: the charge stored in each cell corresponds to whether that cell sits in a light or a dark area. Memory cells are small, so for every pixel of one of today's sensors, a memory-based sensor could have about 100 cells. A chip the size of a 10MP camera sensor would therefore have 100 times as many sensing cells if implemented in memory technology - roughly a gigapixel, hence the choice of the "gigavision" name.

Unlike the pixels in a conventional sensor, which record a greyscale value, the cells in Charbon's memory-chip sensor are simple on-off devices: they can store only a digital 0 or 1, read as either light or dark. To build a sensor that can record shades of grey, EPFL engineer Feng Yang, who presented the Kyoto paper, is developing a software algorithm that looks across an array of 100 such one-bit cells to estimate their overall greyscale value.
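
The article does not describe the reconstruction itself, but the basic idea of recovering grey levels from oversampled one-bit cells can be sketched in a few lines. The snippet below is just a simple counting estimate under an assumed exposure model; the 10x10 block size and the names are mine for illustration and are not taken from the EPFL paper.

```python
import numpy as np

def estimate_grey(binary_cells, block=10):
    """Estimate a grey-level image from an oversampled one-bit sensor.

    binary_cells: 2-D array of 0/1 values, one entry per memory cell.
    block: each output pixel is estimated from a block x block patch,
           e.g. 10 x 10 = 100 cells per pixel as in the article.

    Simple counting sketch (not the EPFL algorithm): the fraction of
    cells that flipped approximates p = 1 - exp(-lambda), the chance
    of a cell catching at least one photoelectron, so inverting that
    curve gives a value proportional to the exposure.
    """
    h, w = binary_cells.shape
    hb, wb = h // block, w // block
    patches = binary_cells[:hb * block, :wb * block].reshape(hb, block, wb, block)
    p = patches.mean(axis=(1, 3))                # fraction of cells set to 1
    return -np.log(np.clip(1.0 - p, 1e-6, 1.0))  # invert the saturation curve
```

Even without the logarithmic inversion, counting 100 one-bit cells already gives about 100 distinguishable grey levels per output pixel; the inversion only corrects for the fact that a cell cannot flip twice.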

On the surface this sounds a lot like Eric Fossum's Digital Jot idea. Once the EPFL patent application is published, we will be able to see what the difference is. I'm almost sure that the EPFL team is aware of the prior art and did something different.

Comments on the New Scientist article point out that the use of RAMs as image sensors has been known since the 70s, so this part of the work is hardly new.

6 comments:

  1. "Imaging DRAM" was the term used by Toshiba et al. Let hope they've done something different.

  2. Micron also did this a long long time ago. They had a cute name for it but I forget what it is now - something like the Micron Eye or some such.

    I have not seen the paper yet so I won't comment on what they did. My idea (also patent pending, duh) is based on single photoelectron detection and I discarded any idea that involved thresholding multiple photoelectrons (e.g. 1 bit ADC). It is easy to see the kind of result you will get by doing that if you take an ordinary 8 Mpix image sensor and digitize it to one bit and then try to recover grey scale. It is not satisfactory (see the sketch after the comments).

    I wonder what they presented. Perhaps the result gets better with more pixels, or they have a better algorithm. I also don't think DRAM sense circuits are designed for uniformity, which might introduce an FPN issue.

    I look forward to seeing the paper.

  3. Dark current is usually important.

  4. This comment has been removed by the author.

  5. I usually try to hit delete before posting, but this time I had to delete after posting.

    I have seen the paper and it contains some interesting algorithm work and results from a synthetic image. It nicely builds on the digital jot sensor concept but does not acknowledge any prior art in this field.

  6. Couldn't any FPN issue be compensated for in the reconstruction algorithm, as it is in astronomical and other scientific sensors?

    Has anyone seen measurements on quantum efficiency? How does sensitivity vary with angle of illumination? In any new technology we wouldn't expect these to be great yet, but I'm just wondering where they are.

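Comment 2's point about naive one-bit thresholding is easy to reproduce. The sketch below is only an illustration - the ramp scene, the 0.5 threshold and the 10x10 oversampling factor are arbitrary choices, not anything from the paper. Thresholding an ordinary greyscale image at its native resolution leaves nothing but black and white, while averaging many one-bit cells per output pixel brings the grey levels back.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic greyscale "scene": a smooth horizontal ramp with values in [0, 1].
scene = np.tile(np.linspace(0.0, 1.0, 256), (256, 1))

# Comment 2's experiment: digitize an ordinary sensor's output to one bit.
# Every pixel becomes pure black or pure white - the grey scale is gone.
one_bit = (scene > 0.5).astype(float)

# Oversampled-binary approach: 10 x 10 one-bit cells per output pixel.
# Each cell fires with probability equal to the local light level, and the
# fraction of fired cells in a patch serves as the grey-level estimate.
block = 10
p = np.kron(scene, np.ones((block, block)))   # per-cell firing probability
cells = rng.random(p.shape) < p               # one-bit cell values
recovered = cells.reshape(256, block, 256, block).mean(axis=(1, 3))

print("grey levels after 1-bit threshold:", np.unique(one_bit).size)   # 2
print("grey levels after oversampling:  ", np.unique(recovered).size)  # many (close to 100)
```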
