Sunday, November 13, 2011

HDR Imaging: Sensors and Architectures

Arnaud Darmont, CEO of Aphesa, will deliver the one-day course "High Dynamic Range Imaging: Sensors and Architectures" at the IS&T/SPIE Electronic Imaging conference on Jan. 22, 2012.

The course will teach attendees how to:
  • describe various approaches to achieve high dynamic range imaging
  • predict the behavior of a given sensor or architecture on a scene
  • specify the sensor or system requirements for a high dynamic range application
  • classify a high dynamic range application into one of several standard types

The course notes will be published as a textbook.


  1. Great course, I highly recommend it!

  2. A little expensive, given that quite a lot of material on HDR is available online for free.

  3. Is there a list of all the approaches covered?

  4. PART 1
    - Applications that require HDR
    - HDR scenes
    - HDR photography

    PART 2
    - Image sensor theory (noise, spectral response, FPN, SNR, CDS, etc.)
    - Definition of dynamic range
    - Dynamic range gaps / SNR holes
    - HVS
    - Integrating linear pixels (rolling shutter and global shutter)
    - Multi-segment pixels (several approaches)
    - Multiple sampling pixels
    - Multiple sensing nodes pixels
    - Logarithmic pixels
    - Logarithmic photovoltaic pixel
    - Time to saturation pixel
    - Gradient pixel
    - Light to frequency pixel
    - Prism methods
    - Local methods
    - Other methods
    - XDR color imaging

    PART 3
    - Ideal software method
    - Debevec
    - Mann and Picard (brief)
    - Mitsunaga and Nayar (brief)
    - Robertson et al (brief)
    - Tone mapping
    - Special software method

    PART 4
    - Optical limitations
    - Flare and ghosting
    - Automatic HDR exposure
    - Color spaces
    - HDR file formats
    - HDR testing
    - Some demos
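
    To give a feel for the software merging methods listed in Part 3 (e.g., Debevec-style radiance recovery), here is a minimal Python sketch of merging bracketed exposures for a single pixel. It assumes pixel values are already linearized (camera response removed) and normalized to [0, 1]; the hat weighting function and the sample values are illustrative assumptions, not material from the course.

    ```python
    # Sketch of HDR radiance recovery from bracketed exposures, in the
    # spirit of Debevec & Malik's weighted merge. Pixel values are
    # assumed linear and normalized to [0, 1].

    def weight(z, eps=0.02):
        """Hat function: trust mid-range values, distrust near-black/near-white."""
        return max(eps, 1.0 - abs(2.0 * z - 1.0))

    def merge_radiance(exposures):
        """exposures: list of (pixel_value, exposure_time_s) for one pixel.
        Returns a weighted-average radiance estimate (value / time)."""
        num = sum(weight(z) * (z / t) for z, t in exposures)
        den = sum(weight(z) for z, t in exposures)
        return num / den

    # One pixel captured at three exposure times: nearly saturated at the
    # longest, well exposed at the middle, dark and noisy at the shortest.
    samples = [(0.99, 1.0), (0.50, 0.1), (0.06, 0.01)]
    print(round(merge_radiance(samples), 2))
    ```

    The weighting downplays the saturated and near-dark samples, so the estimate is dominated by the well-exposed middle frame; real implementations apply this per pixel across whole images and must first recover the camera response curve.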


All comments are moderated to avoid spam and personal attacks.