Friday, February 24, 2017

No Lens is Needed to See Simple Images

Researchers from the University of Utah, Salt Lake City, USA, show that simple 32x32 images can be seen by a bare image sensor with no lens. The open-access paper "Lensless Photography with only an image sensor" by Ganghun Kim, Kyle Isaacson, Racheal Palmer, and Rajesh Menon has been published on arxiv.org. From the abstract:

"Photography usually requires optics in conjunction with a recording device (an image sensor). Eliminating the optics could lead to new form factors for cameras. Here, we report a simple demonstration of imaging using a bare CMOS sensor that utilizes computation. The technique relies on the space variant point-spread functions resulting from the interaction of a point source in the field of view with the image sensor. These space-variant point-spread functions are combined with a reconstruction algorithm in order to image simple objects displayed on a discrete LED array as well as on an LCD screen. We extended the approach to video imaging at the native frame rate of the sensor. Finally, we performed experiments to analyze the parametric impact of the object distance. Improving the sensor designs and reconstruction algorithms can lead to useful cameras without optics."

Example images taken with the VGA sensor. The left column shows the objects displayed on the LED matrix. The 2nd column shows the raw sensor images. The 3rd column shows the reconstructed images before any processing. The right column shows the reconstructed images after binary thresholding.
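
The binary thresholding used for the right column is a simple final step. A hedged sketch follows; the threshold at half of the reconstruction's maximum is an assumption, not the paper's chosen level.

# Hedged sketch of the binary-thresholding step applied to a reconstructed image.
# The threshold (half of the image maximum) is an assumption, not the paper's value.
import numpy as np

def binarize(recon_image):
    # Map a reconstructed grey-scale image to a 0/1 image.
    return (recon_image > 0.5 * recon_image.max()).astype(np.uint8)

# Example on a toy 32x32 reconstruction:
toy = np.random.default_rng(1).random((32, 32))
print(binarize(toy).sum(), "pixels set to 1")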

6 comments:

  1. Very nice.

    I wonder if this could be combined with the other lensless techniques?

  2. Lots of light and simple scenes. If noise or grey-scale quality becomes important in the intended applications, then I guess this might not be so great. I am thinking there is a lot of subtraction with noise pile-up among the arithmetic operations, sort of like CMY filters but much worse.

    Someday I will have to understand the point of working so hard to get rid of a lens. Lenses are the ultimate in low-power computation, cost almost nothing, and give much better image quality. For this incarnation of lensless imaging, I wonder how it compares to a pinhole, or even a fat pinhole, when SNR and IQ are considered.

    Replies
    1. Maybe a phased-array radar can inspire you :)

    2. A phased array would need synchronous detection. Could be an idea with laser illumination and using the laser as well on the detection surface.

      You read it here first, so now it is not patentable. Oh yes, of course the laser is modulated, or rather the reference is modulated, to allow for differences in reflected light.

    3. The scene must be 2D, even. You can imagine the bare sensor + huge pinhole would see some kind of depth. Without the ground truth of 'everything is in a single plane' there are many more possible solutions to the observed light pattern.

  3. I thought quite a few places (Rice, Columbia) had working 'Lensless' (flat-lens) computational photography partway to perfected (large images, sometimes a bit blurry, in one case refocusable) - why work on micropixel (as opposed to megapixel) sized sensors?

    A pinhole camera is the least expensive, but dirt or a drop of water can ruin the whole shot. Maybe develop 'Pinhole Mesh Computational Photography' (where the hole positions are precalculated and then laser or electron-beam drilled into a sheet, possibly with a drop of dye to make a water-drop lens and form a Bayer filter).

