Friday, June 19, 2020

Quanta Burst Photography

University of Wisconsin–Madison: In a dark room or a motion-heavy scene, conventional cameras face a choice: a short exposure that freezes movement but comes out dark, or a longer exposure that gathers more light but blurs anything in motion.

“That’s always been a fundamental trade-off in any kind of photography,” says Mohit Gupta, a University of Wisconsin–Madison computer sciences professor. “But we are working on overcoming that trade-off with a different kind of sensor.”

The researchers are using single-photon avalanche diodes (SPADs) for what they call quanta burst photography — taking many images in rapid bursts, then processing them to squeeze one good picture from a poorly lit or fast-moving subject. The SwissSPAD array from Edoardo Charbon's group at EPFL, used in the burst photography work, is fast enough to record 100,000 single-photon frames per second.
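
The merging step can be sketched as follows — a minimal illustration of the underlying idea, not the authors' actual pipeline (the function name and scene values here are made up for the example). Each SPAD frame is a binary image (1 = at least one photon detected), so the per-pixel photon flux can be recovered from the detection rate by inverting the Poisson detection model p = 1 − e^(−λ):

```python
import numpy as np

def merge_binary_frames(frames, exposure_s):
    """Estimate per-pixel photon flux (photons/s) from a stack of
    binary single-photon frames.

    For an ideal SPAD pixel receiving lam photons per frame, the
    probability of a detection is p = 1 - exp(-lam), so the maximum-
    likelihood flux estimate from the observed rate p_hat is
    lam_hat = -ln(1 - p_hat).
    """
    frames = np.asarray(frames, dtype=np.float64)
    p_hat = frames.mean(axis=0)               # detection rate per pixel
    p_hat = np.clip(p_hat, 0.0, 1.0 - 1e-6)   # avoid log(0) at saturation
    photons_per_frame = -np.log1p(-p_hat)     # MLE of the Poisson rate
    return photons_per_frame / exposure_s     # photons per second

# Simulate 10,000 binary frames of a tiny 2x2 scene at 100,000 fps,
# with a very wide brightness range across the four pixels.
rng = np.random.default_rng(0)
flux = np.array([[50.0, 5000.0], [500000.0, 5.0]])  # true photons/s
t = 1e-5                                            # 10 us per frame
frames = rng.random((10_000, 2, 2)) < (1 - np.exp(-flux * t))
est = merge_binary_frames(frames, t)
```

In practice the frames must also be aligned before merging — that motion-compensated alignment across thousands of binary frames is the hard part the paper addresses.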

“The result is good image quality in low light, with reduced motion blur, as well as a wide dynamic range,” says Gupta, whose work is supported by DARPA. “We have had good results even when the brightest spot in view is getting 100,000 times as much light as the darkest.”
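
For scale, that 100,000:1 intensity ratio works out to roughly 100 dB of dynamic range (using the conventional 20·log10 sensor metric), or about 16.6 photographic stops:

```python
import math

ratio = 100_000                     # brightest-to-darkest intensity ratio
db = 20 * math.log10(ratio)         # dynamic range in decibels
stops = math.log2(ratio)            # equivalent photographic stops
print(f"{db:.0f} dB, {stops:.1f} stops")  # 100 dB, 16.6 stops
```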

The paper also compares the algorithms running on SPAD- and jot-based sensors.


  1. A solid advancement in the state of the art of Quanta Image Sensor (QIS) computational imaging, and it builds on (a lot of) well-referenced prior work. Together with the concurrent work of Stanley Chan and the early demonstration of QIS concepts by the Edinburgh group, it is pretty satisfying. But, these days, especially after EPFL has achieved a 1/2 Mpixel SPAD array, I find that we have two types of QIS implementation (which I sort of predicted): the Dartmouth CMOS image sensor-based QIS, which I now refer to as CIS-QIS, and SPAD-based QIS, or SPAD-QIS. There are pros and cons to both approaches. SPAD-QIS is aiming for faster time resolution (fast enough for many commercial applications) and CIS-QIS is aiming at smaller pixels and lower DCR (good enough for many commercial applications). CIS-QIS or SPAD-QIS, it is still a revolution in how we approach digital image capture.

  2. Speedy yet dark or slow but blurred? This has been an age-old problem, and one that finally needs to be solved for good! Many new cameras now combine multiple sensors and use software to post-process the final image. Doing this at the sensor level would cut down on heavy processing and heavy bandwidth to the cloud. Nice! Thanks Vladimir. Mike

    1. The heavy processing is still there. The University of Wisconsin PR article says:

      "While the SPAD chips could drop right into smartphones, the computing power and storage required to handle large bursts of information from the blazing-fast sensors still outstrips even the highest-performance handheld devices."

      “We still need to develop engineering approaches to deal with this data deluge,” says Gupta.

    2. I think the question is whether QIS benefits justify co-innovation in focal-plane or near-focal-plane processing. Time will tell.

      From the 2011 paper "The Quanta Image Sensor (QIS): Concepts and Challenges":

      "In the QIS, subsequent sub-frames may be shifted or morphed prior to integration to allow functions such as motion compensation, TDI (in arbitrary track direction) and wavefront correction. Depending on the application, multiple sub-frames may be stored prior to shift and integration to allow motion-flow analysis to be performed. A billion jots at 1,000 scans/s leads to Tbit/s data rates. Getting this data off the sensor and into memory may be difficult in the near future so that some aggregation of data in the image sensor may be required."
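
The Tbit/s figure in that quote is straightforward to verify — at one bit per jot per scan:

```python
jots = 1_000_000_000                 # a billion jots
scans_per_s = 1_000                  # 1,000 scans per second
bits_per_s = jots * scans_per_s * 1  # 1 bit per jot per scan
print(bits_per_s / 1e12)             # 1.0 (Tbit/s)
```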

