Thursday, November 18, 2021

Event-Based Imaging Thesis

UniversitĂ© Grenoble Alpes publishes Maxence Bouvie's PhD thesis "Study and design of an energy efficient perception module combining event-based image sensors and spiking neural network with 3D integration technologies."

"Event-driven acquisition permits to generate sparse data, with high acquisition speed at the order of the microsecond, while conserving an exceptionally large dynamic range. Event-driven imagers are thus highly suited for deployment in situations where speed and application robustness are of high importance. However, event-based image sensors come with major drawbacks that render them nearly impracticable in embedded situations. They are noisy, poorly resolved and generate an incredible amount of data relatively to their resolution. This Ph.D. study thus focuses on understanding how they can be used, and how their drawbacks can be alleviated. The work explores bio-inspired applications for tasks where frame-based methods are already successful but present robustness flaws because classical frame-based imagers cannot be intrinsically high speed and high dynamic range. This manuscript provides leads to understand and decide why some algorithms matches more than other to their novel data type. It also tries to touch upon the reasons these sensors cannot be used as they are, but how they could be efficiently integrated into classical frame-based algorithmic pipelines and systems by deploying motion compensation of the raw data. In addition, a bio-inspired hardware-based solution to simultaneously reduce the output bandwidth and filter out noise, directly at the output of a grid of event-based pixels, is presented. It consists in the hardware implementation of a bio-inspired convolutional neural network accelerator - a neuromorphic processor – distributed near-sensor, that takes major advantages from being conceived toward a three-dimensional integration. This system was designed for minimizing its power budget, at the 28nm FDSOI node, and demonstrates a 2.86pJ per synaptic operation – or 93.0aJ per input event per pixel. On top of that, it is scalable for megapixel resolution sensors without induced overhead."

Appendix C (pp. 134-135) gives a nice comparison of the Samsung, Prophesee (Sony), and CelePixel (OmniVision) event-driven sensor approaches.

1 comment:

  1. Thank you for sharing.
    If you have any question, comment or remark, do not hesitate to let me know.
    (Maxence Bouvier, Author)

