
Friday, November 20, 2009

Live Lens Expands Sensor's DR

Australian TV has a piece about LiveLens, a patented ‘active’ electro-optical filter providing a WDR solution for cameras. LiveLens combines liquid crystal on silicon (LCoS) technology with an image sensor: the liquid crystal controls its local transparency based on the local illumination level at the sensor plane. A picture from the company site explains the idea:



Live Technologies, the company behind the LiveLens idea, has manufactured a 1MP prototype sensor at X-FAB. The liquid crystal assembly was then done at the Liquid Crystal Institute's Advanced Materials Department at Kent State University, Ohio. The company now demos WDR imaging results on its web site and holds US patent 5,953,082 on its technology.
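
Conceptually, the device behaves like a spatially varying neutral-density filter driven by the scene itself. Below is a minimal numerical sketch of that idea; it is not Live Technologies' actual control algorithm, and the block size, LC contrast range and sensor saturation level are made-up numbers for illustration.

```python
import numpy as np

def livelens_toy(scene, sensor_full_scale=100.0, lc_block=8, lc_contrast=100.0):
    """Toy model of a LiveLens-style adaptive attenuator (illustrative only).

    scene             : 2-D array of scene irradiance, linear units
    sensor_full_scale : level at which the bare sensor would saturate
    lc_block          : one LC cell is assumed to cover lc_block x lc_block sensor pixels
    lc_contrast       : assumed contrast range of the LC layer (100:1 here, ~40 dB)
    """
    attenuation = np.ones_like(scene, dtype=float)
    h, w = scene.shape
    for y in range(0, h, lc_block):
        for x in range(0, w, lc_block):
            local = scene[y:y + lc_block, x:x + lc_block].mean()
            # Darken each LC cell just enough to pull its region back to full scale,
            # limited by the LC layer's own contrast range.
            factor = np.clip(local / sensor_full_scale, 1.0, lc_contrast)
            attenuation[y:y + lc_block, x:x + lc_block] = factor
    reaching_sensor = scene / attenuation
    return reaching_sensor, attenuation

# A 10,000:1 scene is compressed to 100:1 at the sensor plane in this toy model.
scene = np.ones((64, 64))
scene[:32, :] = 10_000.0
compressed, att = livelens_toy(scene)
print(compressed.max() / compressed.min())   # 100.0
```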

10 comments:

  1. There has been some discussion of this at DPR:
    http://forums.dpreview.com/forums/read.asp?forum=1000&message=33747719

    There seem to be many unanswered questions about the utility of the technology, not to mention the business proposition, especially compared to implementing WDR in the sensor itself.

    I have to say that the demo imaging results on their web site certainly show feasibility, but the resulting image quality is quite poor.

  2. One advantage of LiveLens is that it's orthogonal to all sensor-plane DR extension techniques. That is, when everything is said and done to extend DR at the sensor level, LiveLens can add another 20-40 dB on top of it.

    The technology is immature, so I would not be too critical of its image quality now. After all, the first pictures out of CCD, CMOS, Xerox copier and film looked much, much worse than the LiveLens demo.

  3. Their website says that the absorption when "on" is equivalent to 1 aperture stop. I take that to mean a factor of 50% light reduction. They also say this adds 4-6 stops of latitude (not to be confused with DR). Can you explain how you calculated 20-40 dB?

    How will they eliminate bright and dark halos from misalignment of the LCD pixels and sensor pixels?

    There are a host of other, fairly fundamental issues with this approach, I think.

    In any case, they are going to have to find at least one application on which they can build their business that you can't do with a CCD, CMOS, Xerox copier or film.

  4. I agree, there are not that many applications requiring such extreme HDR. It would be hard for these guys to justify the development cost.

    My 20-40 dB guesstimate of the DR extension is based on the 100:1 or higher contrast ratio of commonly used LCD TV panels (a back-of-envelope check of the stop/contrast-ratio/dB conversions is sketched after the comment thread).

    The alignment problem could possibly be solved by a calibration routine run upon final assembly. The routine builds a map of LCD pixels vs. sensor pixels. Then the LCD control logic and a post-processing ISP know the misalignment, possibly at the sub-pixel level, and can make intelligent use of it (one possible calibration flow is sketched after the comment thread).

  5. To me it looks like the Light Absorption and Dynamic Range Extension figures that Dr. Fossum mentions are for a preliminary product based on the technology. This preliminary product seems to be an electrically-switchable neutral density filter add-on for CCTV systems. The add-on sits between the camera body and a lens with an auto-iris system. When the camera calls for an iris change, the LiveLens module can be switched on or off, with the LiveLens device “electrically swung out of the optical path” (or into it).

    The “stops of latitude” figure may be analogous to adding a front chain ring on a bicycle. In theory you’d get as many additional gears as there are chain rings on the rear cassette, but some may be inaccessible, while others are pretty similar to what’s available already.

    The web site also shows the technology in a direct overlay on an imaging chip. The How It Works illustration depicts something that is not a neutral density filter and that seems to be a separate product.

    I think it looks and acts like an array of optical successive-approximation A/D converters, only instead of driving the signal minus the approximation to zero, it tries to drive the product of the signal and an estimate of its reciprocal to a non-zero constant.

    There aren’t specifics, but the obvious route would be to get feedback from the imaging chip, which sees the optical residue. This suggests multiple imaging cycles and multiple settling times for the LiveLens array (a toy version of such a feedback loop is sketched after the comment thread).

    Additionally, there’s likely to be fixed-pattern noise from gain matching errors in the LiveLens array that may make using it as an add-on for high dynamic range somewhat challenging.

  6. I am not sure I understand the bicycle chain ring analogy here. I also don't understand why the "on" state gives only 50% signal reduction, but anyway, it seems at best one can dim a sensor pixel (or cluster) by 50%, or basically double the scene brightness without blow-out, and gain 6 dB.

    I think the LC device will have larger pixels than the sensor pixels. To me that means that enabling the LCD will darken pixels that should not be darkened, or blow out pixels that should have been darkened, when the device is engaged because the sensor can no longer accommodate the dynamic range. Without a priori information about what the LCD is doing, it would be hard to compensate in the sensor ISP.

    Well, it is fun to contemplate a device whose details we don't really understand, and it is hard to resist fixing imagined engineering problems.

    As I said in DPR, I hope they have a camera company as a secret alpha customer who is advising them on problems in real cameras and what image sensors can and cannot do.

  7. My interpretation is that the "on" state is when the LC pixel is most transparent. So even in its most transmissive state the LC device loses 50% of the light. The "off" state should reduce the light by much more.

    To CDM: Yes, having feedback like a successive-approximation A/D is something I thought of initially as well. One should limit the algorithm to the bright parts of the image only and try to keep all the bright parts close to the full well of the pixels.

    Gain matching errors... we do not see many of them in LCD TVs. Why do you think it should be worse in Live Lens?

  8. Their website says

    LIGHT ABSORPTION:
    1 aperture stop, neutral density when on.
    Zero when off.

    This seems unambiguous to me, but maybe there is a monumental error on their website. The video looks a lot like 50% light reduction when activated, and no light absorption when off. And despite the claim of neutral density, it seems from the video that the color balance is way off. Look at the demo about 2/3 of the way through, with the guy in the blue shirt and baseball cap. With the LCD on, his face is blue and his neck is orange.

    I know it is just a prototype, but I think someone jumped the gun posting it on their website in color. They should have gone for a B&W video.

  9. The next sentence right after "1 aperture stop, neutral density when on" says that the LC device can be swung out of the optical path. In this context I understand that 1 aperture stop is the light loss of the LC device in the optical path while in its fully transparent state. But I agree with you that the web site wording leaves room for different interpretations.

    My logic says that LCD contrast ratios are much higher than 2:1, so I see no reason why the Live Lens light modulation should be limited to 50%.

    Yes, the colors are distorted. Still, for investors a distorted color video might be more convincing than a perfect B&W one, I think. After all, they are used to seeing half-baked prototypes all the time.

  10. I missed an important point. This idea was filed in 1996. In 1996 there was relatively little activity on wide dynamic range imaging. At JPL we had done something with individual-pixel-reset sensors - mostly for star trackers - as well as the work Orly Yadid-Pecht and I did on a multiple-exposures-per-frame sensor. I am sure there was probably a little work elsewhere as well, like log sensors and of course the infamous DALSA DynaSensor.

    Nevertheless, high performance, commercially competitive devices like the recent 110 dB Aptina device (see Solhusvik et al., 2009 IISW) were a dozen years away.

    So, perhaps what happened here was the long pursuit of an idea whose time had passed during its development cycle.

    Anyway, I remain unconvinced of the business proposition, even if it might have sounded tantalizing in 1996.

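
A back-of-envelope check of the f-stop / contrast-ratio / dB conversions used in comments 3, 4 and 6 above. Sensor dynamic range is conventionally quoted as 20·log10 of an intensity ratio; the figures below are the ones quoted in the comments, not vendor data.

```python
import math

def stops_to_db(stops: float) -> float:
    """f-stops (factors of two in light) to dB, using 20*log10 as for sensor DR."""
    return 20.0 * math.log10(2.0 ** stops)

def contrast_to_db(ratio: float) -> float:
    """Contrast ratio (e.g. 100 for 100:1) to dB."""
    return 20.0 * math.log10(ratio)

print(f"1 stop            = {stops_to_db(1):.1f} dB")                       # ~6 dB (comment 6)
print(f"4-6 stops         = {stops_to_db(4):.0f}-{stops_to_db(6):.0f} dB")  # ~24-36 dB (comment 3)
print(f"100:1 LC contrast = {contrast_to_db(100):.0f} dB")                  # 40 dB (comment 4's guesstimate)
```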
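
Comments 4 and 6 both touch on LC-to-sensor pixel misalignment. One hypothetical flavour of the calibration routine suggested in comment 4 is sketched below: darken one LC cell at a time under flat illumination, record which sensor pixels respond and by how much, and keep that weight map so the ISP can undo the (sub-pixel) attenuation later. The `capture_flat` and `drive_cell` hooks stand in for whatever hardware interface a real assembly line would provide; nothing here comes from Live Technologies' documentation.

```python
import numpy as np

def calibrate_lc_footprints(capture_flat, lc_cells, drive_cell):
    """Hypothetical final-assembly calibration of the LC-cell-to-sensor-pixel map.

    capture_flat : function returning a sensor frame under flat illumination
    lc_cells     : iterable of LC cell indices
    drive_cell   : function(cell, on) that darkens a single LC cell

    Returns {cell: weight_map}, where weight_map[y, x] is the fractional
    attenuation that cell applies to each sensor pixel (captures misalignment).
    """
    reference = capture_flat()                  # all LC cells transparent
    footprints = {}
    for cell in lc_cells:
        drive_cell(cell, on=True)               # darken only this cell
        frame = capture_flat()
        drive_cell(cell, on=False)
        # Pixels fully under the cell drop by the cell's full attenuation;
        # pixels straddling a cell boundary drop partially -> sub-pixel info.
        footprints[cell] = 1.0 - frame / reference
    return footprints

def compensate(raw_frame, attenuation_state, footprints):
    """ISP-side correction: divide out the estimated per-pixel attenuation."""
    est = np.ones_like(raw_frame, dtype=float)
    for cell, strength in attenuation_state.items():   # strength in [0, 1]
        est *= 1.0 - strength * footprints[cell]
    return raw_frame / np.clip(est, 1e-3, None)
```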
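
The "optical successive approximation" reading in comment 5, combined with comment 7's suggestion to touch only the bright regions, could work roughly like the toy loop below. This is a guess at one possible feedback scheme, not a description of the actual LiveLens control; the full-well level, LC contrast and doubling step are arbitrary.

```python
import numpy as np

def feedback_attenuation(scene, full_well=100.0, lc_contrast=100.0, cycles=8):
    """Hypothetical multi-cycle feedback: darken LC cells over saturated pixels.

    Each cycle the sensor is read out, and any region still clipped at full well
    gets its attenuation doubled (up to the LC layer's contrast limit).
    Dark regions are never touched, per comment 7.
    """
    att = np.ones_like(scene, dtype=float)               # start fully transparent
    for _ in range(cycles):
        reading = np.minimum(scene / att, full_well)      # sensor clips at full well
        saturated = reading >= full_well
        att[saturated] = np.minimum(att[saturated] * 2.0, lc_contrast)
    return att, np.minimum(scene / att, full_well)

scene = np.array([[1.0, 50.0, 400.0, 5000.0]])
att, reading = feedback_attenuation(scene)
print(att)       # [[ 1.  1.  8. 64.]]
print(reading)   # [[ 1.  50.  50.  78.125]]
```

As comment 5 points out, each of these cycles costs an exposure and an LC settling time, which is part of why such a scheme may be hard to use in a general-purpose camera.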

All comments are moderated to avoid spam and personal attacks.