Wednesday, November 20, 2024

Single Photon Avalanche Diodes - Buyer's Guide

Photoniques magazine published an article titled "Single photon avalanche diodes" by Angelo Gulinatti (Politecnico di Milano).

Abstract: Twenty years ago the detection of single photons was little more than a scientific curiosity reserved for a few specialists. Today it is a flourishing field with an ecosystem that extends from university laboratories to large semiconductor manufacturers. This paradigm shift has been stimulated by the emergence of critical applications that rely on single photon detection, and by technical progress in the detector field. The single photon avalanche diode has unquestionably played a major role in this process.

Full article [free access]: https://www.photoniques.com/articles/photon/pdf/2024/02/photon2024125p63.pdf

 


Figure 1: Fluorescence lifetime measured by time-correlated single-photon counting (TCSPC). The sample is excited by a pulsed laser and the delay between the excitation pulse and the emitted photon is measured by a precision clock. By repeating the measurement many times, it is possible to build a histogram of the delays that reproduces the shape of the optical signal.
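
As a rough illustration of the TCSPC principle described in the caption, the Python sketch below draws one photon delay per detected excitation pulse from an exponential decay, bins the delays into a histogram, and recovers the lifetime from the histogram's slope. The lifetime, detection probability, and jitter values are invented for illustration and are not taken from the article.

import numpy as np

rng = np.random.default_rng(0)

lifetime_ns = 3.0      # assumed fluorescence lifetime (illustrative)
n_pulses = 200_000     # number of excitation pulses
detect_prob = 0.02     # << 1 photon per pulse, to avoid pile-up
jitter_ns = 0.1        # assumed instrument timing jitter (Gaussian sigma)

# One delay per detected photon: exponential decay plus timing jitter.
n_detected = int((rng.random(n_pulses) < detect_prob).sum())
delays = rng.exponential(lifetime_ns, n_detected) + rng.normal(0.0, jitter_ns, n_detected)

# TCSPC histogram of start-stop delays.
counts, edges = np.histogram(delays, bins=256, range=(0.0, 25.0))
centers = 0.5 * (edges[:-1] + edges[1:])

# Recover the lifetime from the log-linear slope of the decay.
mask = counts > 5
slope = np.polyfit(centers[mask], np.log(counts[mask]), 1)[0]
print(f"recovered lifetime ~ {-1.0 / slope:.2f} ns")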



Figure 3: By changing the operating conditions or the design parameters, it is possible to improve some performance metrics at the expense of others.



Conference List - February 2025

Electronic Imaging (EI 2025) - 2-6 Feb 2025 - Burlingame, California - Website

MSS Detectors and Materials, and Passive Sensors Conference (clearance may be required) - 10-14 Feb 2025 - Orlando, Florida, USA - Website

IEEE International Solid-State Circuits Conference (ISSCC) - 16-20 Feb 2025 - San Francisco, California, USA - Website

SPIE Medical Imaging - 16-20 Feb 2025 - San Diego, California, USA - Website

innoLAE (Innovations in Large-Area Electronics) - 17-19 Feb 2025 - Cambridge, UK - Website

Wafer-Level Packaging Symposium - 18-20 Feb 2025 - San Francisco, California, USA - Website



Monday, November 18, 2024

IEDM 2024 Program is Live

The 70th annual IEEE International Electron Devices Meeting (IEDM) will be held December 7-11, 2024 in San Francisco, California. Session #41 is on the topic of "Advanced Image Sensors":

https://iedm24.mapyourshow.com/8_0/sessions/session-details.cfm?scheduleid=58

Title: 41 | ODI | Advanced Image Sensors
Description:
This session includes six papers on the latest developments in image sensor technology. Notable this year are the multiple ways of stacking layers with new features. The first stack involves a dedicated AI image-processing layer based on neural networks for a 50 Mpix sensor. The second shows progress on small-pixel noise with a 2-layer pixel and an additional intermediate interconnection. The third stack is very innovative, with an organic pixel on top of a conventional Si-based iToF pixel for a true single-device RGB-Z sensor. All three papers are authored by Sony Semiconductor. InAs QD image sensors are also reported for the first time as a lead-free option for SWIR imaging, by both imec and Sony Semiconductor. Also presented, by Samsung Semiconductor, is progress in conventional IR global shutter imaging with a newly introduced MIM capacitor and optimized DTI filling for crosstalk and QE improvement.

Wednesday, December 11, 2024 - 01:35 PM
41-1 | A Novel 1/1.3-inch 50 Megapixel Three-wafer-stacked CMOS Image Sensor with DNN Circuit for Edge Processing
This study reports the first-ever 3-wafer-stacked CMOS image sensor with a DNN circuit. The sensor was fabricated using a wafer-on-wafer-on-wafer process, and the DNN circuit was placed on the bottom wafer to ensure heat dissipation. This device can incorporate an HDR function and enlarge the pixel array area to remarkably improve image recognition.


Wednesday, December 11, 2024 - 02:00 PM
41-2 | Low Dark Noise and 8.5k e− Full Well Capacity in a 2-Layer Transistor Stacked 0.8μm Dual Pixel CIS with Intermediate Poly-Si Wiring
This paper demonstrates a 2-layer transistor pixel stacked CMOS image sensor with the world's smallest 0.8 μm dual pixel. We improved the layout flexibility with an intermediate poly-Si wiring technique. Our advanced 2-layer pixel device achieved a low input-referred random noise of 1.3 e− rms and a high full well capacity of 8.5k e−.
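
As a quick sanity check on what those two figures imply (my own back-of-the-envelope arithmetic, not a number from the paper), the quoted full well capacity and read noise correspond to a single-exposure dynamic range of roughly 76 dB:

import math

full_well_e = 8_500     # e-, full well capacity quoted in the abstract
read_noise_e = 1.3      # e- rms, input-referred noise quoted in the abstract

dynamic_range_db = 20 * math.log10(full_well_e / read_noise_e)
print(f"dynamic range ~ {dynamic_range_db:.1f} dB")   # about 76.3 dB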


Wednesday, December 11, 2024 - 02:25 PM
41-3 | A High-Performance 2.2μm 1-Layer Pixel Global Shutter CMOS Image Sensor for Near-Infrared Applications
A high-performance, low-cost 2.2 μm 1-layer pixel near-infrared (NIR) global shutter (G/S) CMOS image sensor (CIS) was demonstrated. To improve quantum efficiency (QE), thick silicon with high-aspect-ratio full-depth deep trench isolation (FDTI) and backside scattering technology are implemented. Furthermore, a thicker sidewall oxide for the deep trench isolation and oxide-filled FDTI were applied to enhance the modulation transfer function (MTF). In addition, 3-dimensional metal-insulator-metal capacitors were introduced to suppress temporal noise (TN). As a result, we have demonstrated an industry-leading NIR G/S CIS with 2.71 e− TN, dark current of 8.8 e−/s, 42% QE, and 58% MTF.


Wednesday, December 11, 2024 - 03:15 PM
41-4 | First Demonstration of 2.5D Out-of-Plane-Based Hybrid Stacked Super-Bionic Compound Eye CMOS Chip with Broadband (300-1600 nm) and Wide-Angle (170°) Photodetection
We propose a hybrid stacked CMOS bionic chip. The surface employs a fabrication process involving binary-pore anodic aluminum oxide (AAO) templates and integrates monolayer graphene (Gr) to mimic the compound eyes, thereby enhancing detection capabilities in the ultraviolet and visible ranges. Utilizing a 2.5D out-of-plane architecture, it achieves a wide-angle detection effect (170°) equivalent to curved surfaces while enhancing absorption in the 1550 nm communication band to nearly 100%. Additionally, through-silicon via (TSV) technology is integrated for wafer-level fabrication, and a CMOS 0.18-µm integrated readout circuit is developed, achieving the super-bionic compound eye chip based on hybrid stacked integration.


Wednesday, December 11, 2024 - 03:40 PM
41-5 | Pseudo-direct LiDAR by deep-learning-assisted high-speed multi-tap charge modulators
A virtually direct LiDAR system based on an indirect ToF image sensor and charge-domain temporal compressive sensing combined with deep learning is demonstrated. The scheme has high spatio-temporal sampling efficiency and offers advantages such as high pixel count, high photon-rate tolerance, immunity to multipath interference, constant power consumption regardless of the incident photon rate, and freedom from motion artifacts. The importance of increasing the number of taps of the charge modulator is suggested by simulation.
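
To make the compressive-sensing idea concrete, the sketch below models a K-tap charge modulator that integrates the returning photon transient against K binary gating codes and then recovers the transient from those few charge measurements. The bin counts, tap codes, and plain least-squares decoder are illustrative assumptions on my part; the paper uses a deep-learning-assisted reconstruction, but the same trend (more taps, better recovery) is what the sketch shows.

import numpy as np

rng = np.random.default_rng(1)
n_bins = 64                       # fine time bins per modulation period (assumed)

# Sparse "pseudo-direct" transient: one laser return on top of ambient light.
x = np.full(n_bins, 0.05)
x[20] += 4.0

for n_taps in (4, 16, 64):
    C = rng.integers(0, 2, size=(n_taps, n_bins)).astype(float)   # binary tap codes
    y = C @ x                                                     # charge collected per tap
    # Minimum-norm least squares stands in for the paper's learned decoder.
    x_hat = np.linalg.lstsq(C, y, rcond=None)[0]
    err = np.linalg.norm(x_hat - x) / np.linalg.norm(x)
    print(f"{n_taps:2d} taps: relative reconstruction error {err:.2f}")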


Wednesday, December 11, 2024 - 04:05 PM
41-6 | A Color Image Sensor Using 1.0-μm Organic Photoconductive Film Pixels Stacked on 4.0-μm Si Pixels for Near-Infrared Time-of-Flight Depth Sensing
We have developed an image sensor capable of simultaneously acquiring high-resolution RGB images with good color reproduction and parallax-free ranging information, using 1.0-μm organic photoconductive film RGB pixels stacked on 4.0-μm NIR silicon pixels for iToF depth sensing.


Wednesday, December 11, 2024 - 04:30 PM
41-7 | Pb-free Colloidal InAs Quantum Dot Image Sensor for Infrared
We developed an image sensor using colloidal InAs quantum dots (QDs) for photoconversion. After spin-coating the QDs on a wafer and standard semiconductor processing, the sensor exhibited infrared sensitivity and imaging capability. This approach facilitates easier production of lead-free infrared sensors for consumer use.


Wednesday, December 11, 2024 - 04:55 PM
41-8 | Lead-Free Quantum Dot Photodiodes for Next Generation Short Wave Infrared Optical Sensors
Colloidal quantum dot sensors are disrupting imaging beyond the spectral limits of silicon. In this paper, we present imagers based on InAs QDs as an alternative to 1st-generation Pb-based stacks. A new synthesis method yields 9 nm QDs optimized for 1400 nm, and solution-phase ligand exchange results in uniform 1-step coating. Initial EQE is 17.4% at 1390 nm on glass and 5.8% EQE on silicon (detectivity of 7.4 × 10^9 Jones). Metal-oxide transport layers and >300 hour air stability enable compatibility with fab manufacturing. These results are a starting point towards the 2nd generation of quantum dot SWIR imagers.
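
For context on the quoted EQE (my own arithmetic, not a figure from the paper), external quantum efficiency converts to responsivity as R = EQE × qλ/hc, so 17.4% EQE at 1390 nm corresponds to roughly 0.2 A/W:

# Responsivity implied by the quoted EQE at 1390 nm: R = EQE * q * wavelength / (h * c)
q = 1.602e-19          # elementary charge, C
h = 6.626e-34          # Planck constant, J*s
c = 2.998e8            # speed of light, m/s

eqe = 0.174            # 17.4% EQE on glass, from the abstract
wavelength_m = 1390e-9

responsivity_a_per_w = eqe * q * wavelength_m / (h * c)
print(f"responsivity ~ {responsivity_a_per_w:.2f} A/W")   # about 0.20 A/W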


Also of interest is the following talk in Tuesday's session "Major Consumer Image Sensor Innovations Presented at IEDM":

Authors: Albert Theuwissen, Harvest Imaging
Image sensors: past, and progress made over the years

Sunday, November 17, 2024

Job Postings - Week of 17 November 2024

Apple

Camera Sensing Systems Engineer

Cupertino, California, USA

Link

Camera Image Sensor Digital Design Engineer Lead

Austin, Texas, USA

Link

Fairchild Imaging

Director of Sensor Product Engineering

San Jose, California, USA

Link

NASA

Development of infrared detectors and focal plane arrays for space instruments - Postdoc

Pasadena, California, USA

Link

Ouster

Silicon Photonics Packaging Engineer

San Francisco, California, USA

Link

DESY

PhD Student in cryogenic detector systems

Hamburg, Germany

Link

Teledyne

Senior Program Manager - Thermal Cameras

Montreal, Quebec, Canada

Link

Sandia National Laboratories

Postdoctoral Appointee - Integrated Photonics for Imaging and Remote Sensing Applications

Albuquerque, New Mexico, USA

Link

Shanghai Jiao Tong University

Postdoc-Experiment, Particle and Nuclear Division

Shanghai, China

Link

Wednesday, November 13, 2024

Videos from EPIC Neuromorphic Cameras Online Meeting

Presentations from the recent EPIC Online Technology Meeting on Neuromorphic Cameras are available on YouTube:

 

IKERLAN – DVS Pre-processor & Light-field DVS – Xabier Iturbe, NimbleAI EU Project Coordinator
IKERLAN has been a leading technology centre providing competitive value to industry since 1974. It offers integral solutions in three main areas: digital technologies and artificial intelligence, embedded electronic systems and cybersecurity, and mechatronic and energy technologies. It currently has a team of more than 400 people and offices in Arrasate-Mondragón, Donostialdea, and Bilbao. As a cooperative member of the MONDRAGON Corporation and the Basque Research and Technology Alliance (BRTA), IKERLAN represents a sustainable, competitive business model in permanent transformation.


FlySight – Neuromorphic Sensor for Security and Surveillance – Niccolò Camarlinghi, Head of Research
FlySight S.r.l. (a single-member company) is the Defense and Security subsidiary of Flyby Group, a satellite remote sensing solutions company.
The FlySight team offers integrated solutions for data exploitation, image processing, and avionic data/sensor fusion. Our products are mainly dedicated to the exploitation of data captured by many sensor types, and our solutions are intended for both the on-ground and the on-board segments.
Through its experience in C4ISR (Command, Control, Communications, Computers, Intelligence, Surveillance and Reconnaissance), FlySight offers innovative software development and geospatial (GIS) application technology programs customized for the best results.
Our staff can apply the right COTS solutions for your specific mission.
The instruments and products developed for this sector can also find application as dual-use tools in many civil fields such as Environmental Monitoring, Oil & Gas, Precision Farming, and Maritime/Coastal Planning.


VoxelSensors – Active Event Sensors: an Event-based Approach to Single-photon Sensing of Sparse Optical Signals – Ward van der Tempel, CTO
VoxelSensors is at the forefront of 3D perception, providing cutting-edge sensors and solutions for seamless integration of the physical and digital worlds. Our patented Switching Pixels® Active Event Sensor (SPAES) technology represents a novel category of efficient 3D perception systems, delivering exceptionally low latency with ultra-low power consumption by capturing a new Voxel with fewer than 10 photons. SPAES is a game-changing innovation that unlocks the true potential of fully immersive experiences for both consumer electronics and enterprise AR/VR/MR wearables.


PROPHESEE – Christoph Posch, Co-Founder and CTO
Prophesee is the inventor of the world's most advanced neuromorphic vision systems. Prophesee's patented sensors and AI algorithms introduce a new computer vision paradigm based on how the human eye and brain work. Like human vision, they see events: essential, actionable motion information in the scene, not a succession of conventional images.


SynSense – Neuromorphic Processing and Applications – Dylan Muir, VP, Global Research Operations
SynSense is a leading-edge neuromorphic computing company. It provides dedicated mixed-signal/fully digital neuromorphic processors which overcome the limitations of legacy von Neumann computers to provide an unprecedented combination of ultra-low power consumption and low-latency performance. SynSense was founded in March 2017 based on advances in neuromorphic computing hardware developed at the Institute of Neuroinformatics of the University of Zurich and ETH Zurich. SynSense is developing “full-stack” custom neuromorphic processors for a variety of artificial-intelligence (AI) edge-computing applications that require ultra-low-power and ultra-low-latency features, including autonomous robots, always-on co-processors for mobile and embedded devices, wearable health-care systems, security, IoT applications, and computing at the network edge.


Thales – Eric Belhaire, Senior Expert in the Technical Directorate
Thales (Euronext Paris: HO) is a global leader in advanced technologies specialized in three business domains: Defence & Security, Aeronautics & Space, and Cybersecurity & Digital identity. It develops products and solutions that help make the world safer, greener and more inclusive.

Monday, November 11, 2024

"Photon inhibition" to reduce SPAD camera power consumption

In a paper titled "Photon Inhibition for Energy-Efficient Single-Photon Imaging," presented at the European Conference on Computer Vision (ECCV) 2024, Lucas Koerner et al. write:

Single-photon cameras (SPCs) are emerging as sensors of choice for various challenging imaging applications. One class of SPCs based on the single-photon avalanche diode (SPAD) detects individual photons using an avalanche process; the raw photon data can then be processed to extract scene information under extremely low light, high dynamic range, and rapid motion. Yet, single-photon sensitivity in SPADs comes at a cost — each photon detection consumes more energy than that of a CMOS camera. This avalanche power significantly limits sensor resolution and could restrict widespread adoption of SPAD-based SPCs. We propose a computational-imaging approach called photon inhibition to address this challenge. Photon inhibition strategically allocates detections in space and time based on downstream inference task goals and resource constraints. We develop lightweight, on-sensor computational inhibition policies that use past photon data to disable SPAD pixels in real-time, to select the most informative future photons. As case studies, we design policies tailored for image reconstruction and edge detection, and demonstrate, both via simulations and real SPC captured data, considerable reduction in photon detections (over 90% of photons) while maintaining task performance metrics. Our work raises the question of “which photons should be detected?”, and paves the way for future energy-efficient single-photon imaging.
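
As a rough illustration of the general idea (this is not the authors' actual policy, and every parameter below is invented), a minimal inhibition rule might gate a pixel off once it has already collected enough photons for a stable brightness estimate, so avalanche energy is not spent on well-measured bright pixels:

import numpy as np

rng = np.random.default_rng(2)

h, w = 64, 64
flux = rng.uniform(0.02, 0.8, size=(h, w))   # true per-frame detection probability
n_frames = 400
budget = 30                                   # detections allowed before a pixel is inhibited

counts = np.zeros((h, w))
frames_active = np.zeros((h, w))

for _ in range(n_frames):
    active = counts < budget                  # inhibit pixels that reached the budget
    frames_active += active
    counts += active & (rng.random((h, w)) < flux)

flux_hat = counts / frames_active             # brightness estimate from surviving photons
spent = int(counts.sum())
baseline = int((rng.random((n_frames, h, w)) < flux).sum())   # detections without inhibition

print(f"detections: {spent} with inhibition vs {baseline} without "
      f"({100 * (1 - spent / baseline):.0f}% saved)")
print(f"median relative error of the flux estimate: {np.median(np.abs(flux_hat - flux) / flux):.1%}")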


Lucas Koerner, Shantanu Gupta, Atul Ingle, and Mohit Gupta. "Photon Inhibition for Energy-Efficient Single-Photon Imaging." In European Conference on Computer Vision, pp. 90-107 (2024)
[preprint link]

Wednesday, November 06, 2024

Hamamatsu acquires BAE Systems Imaging [Update: Statement from Fairchild Imaging]

Press release: https://www.hamamatsu.com/us/en/news/announcements/2024/20241105000000.html

Acquisition of BAE Systems Imaging Solutions, Inc. Strengthening the Opto-semiconductor segment and accelerating value-added growth

2024/11/05
Hamamatsu Photonics K.K.

Photonics Management Corp. (Bridgewater, New Jersey, USA), a subsidiary of Hamamatsu Photonics K.K. (Hamamatsu City, Japan), has purchased the stock of BAE Systems Imaging Solutions, Inc., a subsidiary of BAE Systems, Inc. (Falls Church, Virginia, USA). In recognition of the company's deep roots, which began in 1920 with the Fairchild Aerial Camera Corporation, the company will return to the name first used in 2001, Fairchild Imaging.

Fairchild Imaging is a semiconductor manufacturer specializing in high-performance CMOS image sensors in the visible to near-infrared and X-ray regions, and it has the world’s best low-noise CMOS image sensor design technology. Fairchild Imaging’s core products include scientific CMOS image sensors for scientific measurement applications that simultaneously realize high sensitivity, high-speed readout, and low noise, as well as X-ray CMOS image sensors for dental and medical diagnostic applications.

Fairchild Imaging’s core products are two-dimensional CMOS image sensors that take pictures in dark conditions where low noise is essential. These products complement Hamamatsu Photonics’ one-dimensional CMOS image sensors, which are used for analytical instruments and factory automation applications such as displacement meters and encoders. Therefore, Fairchild Imaging’s technologies will enhance Hamamatsu’s CMOS image sensor product line.

Through the acquisition of shares, we expect the following:

1. Promote sales activities of Fairchild Imaging’s products by utilizing the global sales network currently established by Hamamatsu Photonics Group.
2. While Hamamatsu Photonics’ dental business serves the European and the Asian regions including Japan, Fairchild Imaging serves North America. This will lead to the expansion and strengthening of our worldwide dental market share.
3. Fairchild Imaging will become Hamamatsu’s North American design center for 2D, low-noise image sensors. This will strengthen CMOS image sensor design resources and utilize our North American and Japanese locations to provide worldwide marketing and technical support.
4. Create new opportunities and products by combining Fairchild Imaging’s CMOS image sensor design technology with Hamamatsu Photonics’ MEMS technology to support a wider range of custom CMOS image sensors and provide higher value-added products.

BAE Systems is retaining the aerospace and defense segment of the BAE Systems Imaging Solutions portfolio, which was transferred to the BAE Systems, Inc. Electronic Systems sector prior to the closing of this stock purchase transaction.

Fairchild Imaging will continue its operating structure and focus on developing and providing superior products and solutions to its customers.
 
 
[Update Nov 6, 2024: statement from Fairchild Imaging]
 
We are very happy to announce a new chapter in the storied history of Fairchild Imaging! BAE Systems, Inc., which had owned the stock of Fairchild Imaging, Inc. for the past 13 years, has completed a stock sale to Photonics Management Corporation, a subsidiary of Hamamatsu Photonics K.K. Resuming its identity as Fairchild Imaging, Inc., we will operate as an independent, yet wholly owned, US entity.

Fairchild Imaging is a CMOS image sensor design and manufacturing company, specializing in high-performance image sensors. Our x-ray and visible-spectrum sensors provide class-leading performance in x-ray and from ultraviolet through visible to near-infrared wavelengths. Fairchild Imaging's core products include medical x-ray sensors for superior diagnostics, as well as scientific CMOS (sCMOS) sensors for measurement applications that simultaneously realize high sensitivity, fast readout, high dynamic range, and ultra-low noise at 4K resolution.
 
Marc Thacher, CEO of Fairchild Imaging, said:
“Joining the Hamamatsu family represents a great opportunity for Fairchild Imaging. Building upon decades of imaging excellence, we look forward to bringing new innovations and technologies to challenging imaging applications like scientific, space, low-light, machine vision, inspection, and medical diagnostics. The acquisition by Hamamatsu will help drive growth and agility as we continue as a design leader for our customers, partners, and employees.”
 
As part of this new chapter, Fairchild Imaging is unveiling its latest evolution of sCMOS sensors: sCMOS 3.1. These patented, groundbreaking imagers redefine the limits of what is possible in CMOS sensors for the most demanding of imaging applications.

Monday, November 04, 2024

Lynred announces 8.5um pitch thermal sensor

Link: https://ala.associates/wp-content/uploads/2024/09/241001-Lynred-8.5-micron-EN-.pdf

Lynred demonstrates smallest thermal imaging sensor for future Automatic Emergency Braking Systems (AEB) at AutoSens Europe 

The prototype 8.5 µm pixel pitch technology, which shrinks the volume of thermal cameras by 50%, is designed to help automotive OEMs meet tougher future AEB system requirements, particularly at night.

Grenoble, France, October 1, 2024 – Lynred, a leading global provider of high-quality infrared sensors for the aerospace, defense and commercial markets, today announces it will demonstrate a prototype 8.5 µm pixel pitch sensor during AutoSens Europe, a major international event for automotive engineers, in Barcelona, Spain, October 8 – 10, 2024. The 8.5 µm pixel pitch technology is the smallest infrared sensor candidate for future Automatic Emergency Braking (AEB) and Advanced Driver Assistance Systems (ADAS).

The prototype, featuring half the surface area of current 12 µm thermal imaging sensors for automotive applications, will enable system developers to build much smaller cameras for integration into AEB systems.

Following a recent ruling by the US National Highway Traffic Safety Administration (NHTSA), AEB systems will be mandatory in all light vehicles by 2029. The ruling also sets tougher rules for road safety at night.

The NHTSA sees driver assistance technologies and the deployment of sensors and subsystems as holding the potential to reduce traffic crashes and save thousands of lives per year. The European Traffic Safety Council (ETSC) also recognizes that AEB systems need to work better in wet, foggy and low-light conditions.

Thermal imaging sensors can detect and identify objects in total darkness. As automotive OEMs need to upgrade the performance of AEB systems in all light vehicles, Lynred is preparing a full roadmap of solutions to help achieve this compliance. Currently gearing up for high-volume production of its automotive-qualified 12 µm product offering, Lynred is ready to deliver the key component enabling Pedestrian Automatic Emergency Braking (PAEB) systems to work in adverse conditions, particularly at night, when more than 75% of pedestrian fatalities occur.

Lynred is among the first companies to demonstrate a longwave infrared (LWIR) pixel pitch technology for ADAS and PAEB systems that will optimize the size-to-performance ratio of future-generation cameras. The 8.5 µm pixel pitch technology will halve the volume of a thermal imaging camera, resulting in easier integration for OEMs while maintaining the same performance standards as larger-pitch LWIR models.

Friday, November 01, 2024

Pixelplus new product videos

 

PKA210 Seamless RGB-IR Image Sensor

PG7130KA Global shutter Image Sensor