The dpreview News Forum has an interesting discussion of a New Scientist article on low-light imaging improvements.
University of Lund (Sweden) researchers Almut Kelber and Eric Warrant, together with Jonas Ambeck-Madsen and Hiromichi Yanagihara of the Toyota Motor Europe R&D centre in Brussels, Belgium, have developed an algorithm mimicking insect vision. Insects make different trade-offs for different parts of the image, perhaps even optimising vision in each part independently. In theory, one region of an insect's retina could sum incoming photons over long periods to pick out the details of a flower head, say, while the rest could be optimised to probe the shadows for motion by increasing the number of photoreceptors from which signals are pooled. Warrant and his team call this mechanism "local adaptive spatiotemporal smoothing". It has become the inspiration for a new kind of digital image-processing algorithm, developed jointly by Warrant, mathematicians Henrik Malm and Magnus Oskarsson, also at Lund, and engineers at Toyota Motor Europe.
The algorithm works by pooling the signals of neighbouring pixels in both space and time, while automatically looking for clues to determine the optimum amount of pooling for each part of the image. Where the scene is static, the algorithm can pool in time and preserve more spatial detail; where there is motion, it must instead pool in space. Mathematically, this is equivalent to setting the best exposure time and pixel size independently for every spot in every frame.
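The idea of per-pixel spatiotemporal pooling can be illustrated with a minimal sketch. This is not the Lund/Toyota implementation; it is a toy NumPy version under simplifying assumptions: motion is estimated as a per-pixel frame difference, spatial pooling is a plain 3x3 box filter, and the function name `adaptive_smooth` and its parameters are invented for illustration.

```python
import numpy as np

def adaptive_smooth(frames, motion_thresh=10.0):
    """Per-pixel blend of temporal and spatial pooling.

    Toy illustration of local adaptive spatiotemporal smoothing:
    static pixels are averaged over time (long effective exposure),
    moving pixels are averaged over their spatial neighbourhood
    (large effective pixel) in the latest frame.
    """
    frames = np.asarray(frames, dtype=np.float64)  # shape: (T, H, W)
    current = frames[-1]

    # Temporal pooling: average each pixel over all frames.
    temporal = frames.mean(axis=0)

    # Spatial pooling: 3x3 box filter on the current frame.
    h, w = current.shape
    padded = np.pad(current, 1, mode="edge")
    spatial = sum(padded[i:i + h, j:j + w]
                  for i in range(3) for j in range(3)) / 9.0

    # Crude motion cue: absolute change between first and last frame,
    # mapped to a per-pixel weight in [0, 1] (0 = static, 1 = moving).
    motion = np.abs(frames[-1] - frames[0])
    weight = np.clip(motion / motion_thresh, 0.0, 1.0)

    # Static regions lean on the temporal average; moving regions
    # lean on the spatial average.
    return (1.0 - weight) * temporal + weight * spatial
```

A static sequence comes out as the temporal mean (noise averaged away over time), while a pixel that changes strongly between frames is instead smoothed over its neighbours, avoiding the motion blur that temporal averaging would cause.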
Toyota is exploring this approach to improve vehicle night vision systems.