Friday, October 06, 2023

Review paper on long-range single-photon LiDAR

Hadfield et al. recently published a review paper titled "Single-photon detection for long-range imaging and sensing" in Optica:

Abstract: Single-photon detectors with picosecond timing resolution have advanced rapidly in the past decade. This has spurred progress in time-correlated single-photon counting applications, from quantum optics to life sciences and remote sensing. A variety of advanced optoelectronic device architectures offer not only high-performance single-pixel devices but also the ability to scale up to detector arrays and extend single-photon sensitivity into the short-wave infrared and beyond. The advent of single-photon focal plane arrays is poised to revolutionize infrared imaging and sensing. In this mini-review, we set out performance metrics for single-photon detection, assess the requirements of single-photon light detection and ranging, and survey the state of the art and prospects for new developments across semiconductor and superconducting single-photon detection technologies. Our goal is to capture a snapshot of a rapidly developing landscape of photonic technology and forecast future trends and opportunities.
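The workhorse measurement behind the imaging results below is time-correlated single-photon counting (TCSPC): photon arrival times are histogrammed relative to the laser pulse, and the round-trip time t of the histogram peak gives the target range via d = c·t/2. Here is a minimal single-pixel sketch of that idea in Python; the bin width, histogram span, timing jitter, and background level are assumptions chosen for illustration, not values from the paper.

```python
import numpy as np

# Illustrative single-pixel TCSPC range estimate (all parameters assumed, not from the paper).
C = 299_792_458.0      # speed of light, m/s
BIN_WIDTH = 50e-12     # 50 ps timing bins (assumed)
N_BINS = 40_000        # 2 us histogram span -> ~300 m unambiguous range

rng = np.random.default_rng(0)

# Simulated photon arrival bins: uniform background (dark counts + solar)
# plus a jittered signal return from a target at 150 m.
true_range_m = 150.0
true_tof_s = 2.0 * true_range_m / C
background = rng.integers(0, N_BINS, size=5_000)
signal = rng.normal(true_tof_s / BIN_WIDTH, 3.0, size=400).astype(int)  # ~150 ps jitter
arrival_bins = np.concatenate([background, signal])

# Build the TCSPC histogram and take its peak as the round-trip time.
hist = np.bincount(arrival_bins, minlength=N_BINS)
peak_bin = int(np.argmax(hist))
estimated_range_m = 0.5 * C * peak_bin * BIN_WIDTH
print(f"estimated range: {estimated_range_m:.2f} m (true: {true_range_m} m)")
```

In a scanning system this estimate is repeated for every scan position to build up a depth image; the photon-efficient reconstruction algorithms mentioned in the figures below replace the simple histogram peak with statistical estimators that pool information across pixels.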


Fig. 1. Examples of imaging LIDAR configurations. (a) Flash LIDAR configuration using an array sensor and full-field illumination (a bistatic system is shown, with source and sensor separated). (b) Scanning LIDAR approach where the source is scanned and an individual sensor is used. (In this illustration, a bistatic configuration is shown; however, a monostatic scanning configuration is often used with a common transmit and receive axis).



Fig. 2. Single-photon LIDAR depth profiles taken at a range of greater than 600 m using a 100-channel Si SPAD detector system in scanning configuration. The operational wavelength is 532 nm. (a) Visible-band photograph of scene. (b) Reconstructed depth image of the city scene. (c) Detailed depth profile of the subsection of the scene within the red rectangle in (a). Further details in Z. Li et al. [60]. Figure reproduced with permission of Optica Publishing Group.



Fig. 3. Example of data fusion of a 3D image from a CMOS SPAD detector array and passive imagery of a scene at 150 m range. (a) Retrieved depth information from a SPAD detector array. (b) Intensity information from the SPAD overlaid on top of the retrieved depth information. (c) Intensity information from a color camera overlaid on top of the retrieved depth information [65]. Figure reproduced with permission of Springer Nature publishing.
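The fusion shown in panel (c) amounts to using a conventional camera image as a texture for the SPAD-derived depth map. Below is a minimal sketch of the idea, assuming the color image has already been registered and resampled to the SPAD pixel grid; the registration step, which is the hard part in practice, is not shown, and the function and data here are purely illustrative.

```python
import numpy as np

def fuse_depth_and_color(depth_m: np.ndarray, rgb: np.ndarray) -> np.ndarray:
    """Pair each depth pixel with the color of the co-registered camera pixel,
    returning an (N, 6) array of [x, y, depth, r, g, b] points.

    Assumes depth_m is (H, W) in metres and rgb is (H, W, 3) already registered
    to the same grid; a real system needs a calibrated reprojection between
    the SPAD array and the camera."""
    h, w = depth_m.shape
    ys, xs = np.mgrid[0:h, 0:w]
    return np.column_stack([
        xs.ravel(), ys.ravel(), depth_m.ravel(),   # simple pixel-grid coordinates
        rgb.reshape(-1, 3),
    ])

# Tiny example with invented data: a flat scene at 150 m with random color.
depth = np.full((4, 4), 150.0)
color = np.random.default_rng(2).integers(0, 256, size=(4, 4, 3))
cloud = fuse_depth_and_color(depth, color)
print(cloud.shape)   # (16, 6)
```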


Fig. 4. Solar irradiance versus wavelength at sea level (red) and in the upper atmosphere (blue). MODTRAN simulation [86]. The following spectral bands beyond the visible wavelength range are denoted by the shaded regions: near infrared (NIR), yellow; short-wave infrared (SWIR), cyan; mid-wave infrared (MWIR), red.



Fig. 5. Example of scanning SWIR single-photon LIDAR imaging. (a) Visible-band image of a residential building taken with an f=200mm camera lens. (b) Depth intensity plot of the building imaged with 32×32 scan points over a range of 8.8 km. (c) Depth plot of the building imaged with 32×32 scan points over a range of 8.8 km; side view of the target [89]. Figure reproduced with permission of Optica Publishing Group.

Fig. 6. Reconstruction results of a mountain scene over a range of 201.5 km using SWIR single-photon LIDAR [91]. (a) Visible-band photograph of the scene. (b) Reconstructed depth result using the algorithm by Lindell et al. [92] for data with signal-to-background ratio ∼0.04 and mean signal photons per pixel ∼3.58. (c) 3D profile of the reconstructed result. Figure reproduced with permission of Optica Publishing Group.
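Signal-to-background ratio (SBR) and mean signal photons per pixel are the standard figures of merit quoted for photon-starved results like this one. A minimal sketch of how these metrics are commonly computed from per-pixel photon counts follows; the image size and counts are invented for illustration, and the exact definitions used in [91] may differ in detail.

```python
import numpy as np

# Illustrative computation of the two metrics quoted above (invented data, assumed image size).
rng = np.random.default_rng(1)
shape = (64, 64)

signal_counts = rng.poisson(3.58, size=shape)             # signal photons detected per pixel
background_counts = rng.poisson(3.58 / 0.04, size=shape)  # background photons per pixel

# Mean signal photons per pixel: average signal count over the image.
mean_signal_per_pixel = signal_counts.mean()

# Signal-to-background ratio: total signal photons over total background photons,
# as the metric is commonly defined in the single-photon LIDAR literature.
sbr = signal_counts.sum() / background_counts.sum()

print(f"mean signal photons per pixel ~ {mean_signal_per_pixel:.2f}, SBR ~ {sbr:.3f}")
```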


Fig. 7. Analysis of a scene with an actor holding a wooden plank across his chest and standing 1 m behind camouflage netting at a range of 230 m in daylight conditions. (a) Photograph of the scene, showing the actor holding a wooden plank behind the camouflage. (b), (c) Intensity and depth profiles of the target scene using all the collected single-photon LIDAR data. (d), (e) Intensity and depth profiles after time gating to exclude all data outside a 0.6 m range window around the target location. The pixel format used in the depth and intensity profiles is 80×160 [95]. Figure reproduced with permission of SPIE publishing.
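The time gating used for panels (d) and (e) simply discards photon events whose round-trip times fall outside the window corresponding to the range interval of interest; a 0.6 m range window corresponds to a gate of 2 × 0.6 m / c ≈ 4 ns. A minimal sketch, with the target distance and timestamp format assumed purely for illustration:

```python
import numpy as np

C = 299_792_458.0   # speed of light, m/s

def range_gate(timestamps_s: np.ndarray, target_range_m: float, window_m: float = 0.6) -> np.ndarray:
    """Keep only photon arrival times (seconds, relative to the laser pulse)
    whose implied range lies within window_m centred on target_range_m."""
    ranges_m = 0.5 * C * timestamps_s          # round-trip time -> one-way range
    keep = np.abs(ranges_m - target_range_m) <= 0.5 * window_m
    return timestamps_s[keep]

# Invented example: returns from netting at ~230 m, the actor at ~231 m, and clutter at ~245 m.
t = np.array([2 * 230.0 / C, 2 * 231.0 / C, 2 * 245.0 / C])
gated = range_gate(t, target_range_m=231.0)
print(0.5 * C * gated)   # only the return near 231 m survives the gate
```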



Fig. 8. Schematic diagram of a SWIR single-photon 3D flash imaging experiment. The scene consists of two people walking behind a camouflage net at a stand-off distance of 320 m from the LIDAR system. An RGB camera was positioned a few meters from the 3D scene and used to acquire a reference video. The proposed algorithm is able to provide real-time 3D reconstructions using a graphics processing unit (GPU). As the LIDAR has only 32×32 pixels, the point cloud was estimated at a higher resolution of 96×96 pixels. The acquired movie is shown in [101]. Figure reproduced with permission of Springer Nature publishing.

Fig. 9. Single-photon detector technologies for infrared single-photon LIDAR, with spectral coverage for each detector type indicated. (a) Schematic diagram cross section of a Si-based SPAD detector. The design is a homojunction. (b) Schematic diagram cross section of a Ge-on-Si structure, illustrating optical absorption in the Ge layer, and multiplication in the intrinsic Si layer. (c) Schematic diagram cross section of an InGaAs/InP SPAD detector; the absorption is in the narrow-gap InGaAs and the multiplication in the wider gap InP layer. In both (b) and (c), the charge sheet is used to alter the relative electric fields in the absorption and multiplication layers. (d) Schematic illustration of SNSPD architecture for near-unity efficiency at 1550 nm wavelength and optical micrograph of chip with single-pixel detector [109]; (d) reproduced with permission of Optica Publishing Group.



Link to paper (open access): https://opg.optica.org/optica/abstract.cfm?URI=optica-10-9-1124
