Wednesday, November 13, 2024

Videos from EPIC Neuromorphic Cameras Online Meeting

Presentations from the recent EPIC Online Technology Meeting on Neuromorphic Cameras are available on YouTube:

 

IKERLAN – DVS Pre-processor & Light-field DVS – Xabier Iturbe, NimbleAI EU Project Coordinator
IKERLAN has been a leading technology centre providing competitive value to industry since 1974. They offer integral solutions in three main areas: digital technologies and artificial intelligence, embedded electronic systems and cybersecurity, and mechatronic and energy technologies. They currently have a team of more than 400 people and offices in Arrasate-Mondragón, Donostialdea and Bilbao. As a cooperative member of the MONDRAGON Corporation and the Basque Research and Technology Alliance (BRTA), IKERLAN represents a sustainable, competitive business model in permanent transformation.


FlySight – Neuromorphic Sensor for Security and Surveillance – Niccolò Camarlinghi, Head of Research
FlySight S.r.l. (a single-member company) is the Defense and Security subsidiary of Flyby Group, a satellite remote sensing solutions company.
The FlySight team offers integrated solutions for data exploitation, image processing, and avionic data/sensor fusion. Our products are mainly dedicated to the exploitation of data captured by many sensor types, and our solutions are intended for both the on-ground and on-board segments.
Through its experience in C4ISR (Command, Control, Communications, Computers, Intelligence, Surveillance and Reconnaissance), FlySight offers innovative software development and geospatial (GIS) application technology customized for the best results.
Our staff can apply the right COTS for your specific mission.
The instruments and products developed for this sector can find application as dual use tools also in many civil fields like Environmental Monitoring, Oil & Gas, Precision Farming and Maritime/Coastal Planning.


VoxelSensors – Active Event Sensors: An Event-based Approach to Single-photon Sensing of Sparse Optical Signals – Ward van der Tempel, CTO
VoxelSensors is at the forefront of 3D perception, providing cutting-edge sensors and solutions for seamless integration of the physical and digital worlds. Our patented Switching Pixels® Active Event Sensor (SPAES) technology represents a novel category of efficient 3D perception systems, delivering exceptionally low latency with ultra-low power consumption by capturing a new Voxel with fewer than 10 photons. SPAES is a game-changing innovation that unlocks the true potential of fully immersive experiences for both consumer electronics and enterprise AR/VR/MR wearables.


PROPHESEE – Christoph Posch, Co-Founder and CTO
Prophesee is the inventor of the world’s most advanced neuromorphic vision systems. Prophesee’s patented sensors and AI algorithms introduce a new computer vision paradigm based on how the human eye and brain work. Like human vision, they see events: essential, actionable motion information in the scene, not a succession of conventional images.
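
For readers new to event-based sensing, the paradigm can be sketched in a few lines of Python (an illustrative log-intensity threshold model with assumed names and parameters, not Prophesee's actual pixel circuit or API):

import numpy as np

def events_from_frames(frames, timestamps, threshold=0.2):
    # Illustrative DVS-style event generation from a frame sequence.
    # Emits (x, y, t, polarity) whenever a pixel's log-intensity changes by
    # more than `threshold` since its last event. Toy model for exposition only.
    log_ref = np.log(frames[0].astype(np.float64) + 1e-6)
    events = []
    for frame, t in zip(frames[1:], timestamps[1:]):
        log_i = np.log(frame.astype(np.float64) + 1e-6)
        diff = log_i - log_ref
        ys, xs = np.nonzero(np.abs(diff) >= threshold)
        for x, y in zip(xs, ys):
            events.append((x, y, t, 1 if diff[y, x] > 0 else -1))  # ON/OFF polarity
            log_ref[y, x] = log_i[y, x]  # reset the per-pixel reference
    return events

In a real event camera each pixel performs this comparison asynchronously in analog circuitry, so only the changing parts of the scene produce data.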


SynSense – Neuromorphic Processing and Applications – Dylan Muir, VP, Global Research Operations
SynSense is a leading-edge neuromorphic computing company. It provides dedicated mixed-signal/fully digital neuromorphic processors which overcome the limitations of legacy von Neumann computers to provide an unprecedented combination of ultra-low power consumption and low-latency performance. SynSense was founded in March 2017 based on advances in neuromorphic computing hardware developed at the Institute of Neuroinformatics of the University of Zurich and ETH Zurich. SynSense is developing “full-stack” custom neuromorphic processors for a variety of artificial-intelligence (AI) edge-computing applications that require ultra-low-power and ultra-low-latency features, including autonomous robots, always-on co-processors for mobile and embedded devices, wearable health-care systems, security, IoT applications, and computing at the network edge.


Thales – Eric Belhaire, Senior Expert in the Technical Directorate
Thales (Euronext Paris: HO) is a global leader in advanced technologies specialized in three business domains: Defence & Security, Aeronautics & Space, and Cybersecurity & Digital identity. It develops products and solutions that help make the world safer, greener and more inclusive.

Monday, November 11, 2024

"Photon inhibition" to reduce SPAD camera power consumption

In a paper titled "Photon Inhibition for Energy-Efficient Single-Photon Imaging," presented at the European Conference on Computer Vision (ECCV) 2024, Lucas Koerner et al. write:

Single-photon cameras (SPCs) are emerging as sensors of choice for various challenging imaging applications. One class of SPCs based on the single-photon avalanche diode (SPAD) detects individual photons using an avalanche process; the raw photon data can then be processed to extract scene information under extremely low light, high dynamic range, and rapid motion. Yet, single-photon sensitivity in SPADs comes at a cost — each photon detection consumes more energy than that of a CMOS camera. This avalanche power significantly limits sensor resolution and could restrict widespread adoption of SPAD-based SPCs. We propose a computational-imaging approach called photon inhibition to address this challenge. Photon inhibition strategically allocates detections in space and time based on downstream inference task goals and resource constraints. We develop lightweight, on-sensor computational inhibition policies that use past photon data to disable SPAD pixels in real-time, to select the most informative future photons. As case studies, we design policies tailored for image reconstruction and edge detection, and demonstrate, both via simulations and real SPC captured data, considerable reduction in photon detections (over 90% of photons) while maintaining task performance metrics. Our work raises the question of “which photons should be detected?”, and paves the way for future energy-efficient single-photon imaging.
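
As a rough illustration of the inhibition idea only (the paper's actual policies are task-specific; the window length, threshold, and names below are assumptions), a policy can disable pixels whose recent photon history already constrains the estimate well enough:

import numpy as np

def inhibition_mask(photon_history, max_detections=8):
    # Toy photon-inhibition policy (illustrative, not the authors' method).
    # photon_history: (T, H, W) binary array of per-frame photon detections.
    # Returns an (H, W) mask: True keeps the SPAD pixel enabled for the next
    # frame, False inhibits it to save avalanche energy.
    counts = photon_history.sum(axis=0)  # detections per pixel so far
    # Bright, well-measured pixels are disabled; dark or rarely firing pixels
    # stay enabled to keep gathering information.
    return counts < max_detections

# Example: a uniformly bright scene with ~30% detection probability per frame.
rng = np.random.default_rng(0)
history = rng.random((50, 64, 64)) < 0.3
mask = inhibition_mask(history)
print("fraction of pixels inhibited:", 1.0 - mask.mean())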

Lucas Koerner, Shantanu Gupta, Atul Ingle, and Mohit Gupta. "Photon Inhibition for Energy-Efficient Single-Photon Imaging." In European Conference on Computer Vision, pp. 90-107 (2024)
[preprint link]

Wednesday, November 06, 2024

Hamamatsu acquires BAE Systems Imaging [Update: Statement from Fairchild Imaging]

Press release: https://www.hamamatsu.com/us/en/news/announcements/2024/20241105000000.html

Acquisition of BAE Systems Imaging Solutions, Inc.: Strengthening the Opto-semiconductor segment and accelerating value-added growth

2024/11/05
Hamamatsu Photonics K.K.

Photonics Management Corp. (Bridgewater, New Jersey, USA), a subsidiary of Hamamatsu Photonics K.K. (Hamamatsu City, Japan), has purchased the stock of BAE Systems Imaging Solutions, Inc., a subsidiary of BAE Systems, Inc. (Falls Church, Virginia, USA). In recognition of the company’s deep roots starting in 1920 as the Fairchild Aerial Camera Corporation, the company will return to the name first used in 2001, Fairchild Imaging.

Fairchild Imaging is a semiconductor manufacturer specializing in high-performance CMOS image sensors in the visible to near-infrared and X-ray regions, and it has the world’s best low-noise CMOS image sensor design technology. Fairchild Imaging’s core products include scientific CMOS image sensors for scientific measurement applications that simultaneously realize high sensitivity, high-speed readout, and low noise, as well as X-ray CMOS image sensors for dental and medical diagnostic applications.

Fairchild Imaging’s core products are two-dimensional CMOS image sensors that take pictures in dark conditions where low noise is essential. These products complement Hamamatsu Photonics’ one-dimensional CMOS image sensors, which are used for analytical instruments and factory automation applications such as displacement meters and encoders. Therefore, Fairchild Imaging’s technologies will enhance Hamamatsu’s CMOS image sensor product line.

Through the acquisition of shares, we expect the following:

1. Promote sales activities of Fairchild Imaging’s products by utilizing the global sales network currently established by Hamamatsu Photonics Group.
2. While Hamamatsu Photonics’ dental business serves the European and the Asian regions including Japan, Fairchild Imaging serves North America. This will lead to the expansion and strengthening of our worldwide dental market share.
3. Fairchild Imaging will become Hamamatsu’s North American design center for 2D, low-noise image sensors. This will strengthen CMOS image sensor design resources and utilize our North American and Japanese locations to provide worldwide marketing and technical support.
4. Create new opportunities and products by combining Fairchild Imaging’s CMOS image sensor design technology with Hamamatsu Photonics’ MEMS technology to support a wider range of custom CMOS image sensors and provide higher value-added products.

BAE Systems is retaining the aerospace and defense segment of the BAE Systems Imaging Solutions portfolio, which was transferred to the BAE Systems, Inc. Electronic Systems sector prior to the closing of this stock purchase transaction.

Fairchild Imaging will continue their operating structure and focus on developing and providing superior products and solutions to their customers.
 
 
[Update Nov 6, 2024: statement from Fairchild Imaging]
 
We are very happy to announce a new chapter in the storied history of Fairchild Imaging! BAE Systems, Inc., which had owned the stock of Fairchild Imaging, Inc. for the past 13 years, has processed a stock sale to Photonics Management Corporation, a subsidiary of Hamamatsu Photonics K.K. Resuming the identity as Fairchild Imaging, Inc., we will operate as an independent, yet wholly owned, US entity.

Fairchild Imaging is a CMOS imaging sensor design and manufacturing company, specializing in high-performance image sensors. Our x-ray and visible spectrum sensors provide class-leading performance in x-ray, and from ultraviolet through visible and into near-infrared wavelengths. Fairchild Imaging’s core products include medical x-ray sensors for superior diagnostics, as well as scientific CMOS (sCMOS) sensors for measurement applications that simultaneously realize high sensitivity, fast readout, high dynamic range, and ultra-low noise in 4K resolution.
 
Marc Thacher, CEO of Fairchild Imaging, said:
“Joining the Hamamatsu family represents a great opportunity for Fairchild Imaging. Building upon decades of imaging excellence, we look forward to bringing new innovations and technologies to challenging imaging applications like scientific, space, low-light, machine vision, inspection, and medical diagnostics. The acquisition by Hamamatsu will help drive growth and agility as we continue as a design leader for our customers, partners, and employees.”
 
As part of this new chapter, Fairchild Imaging is unveiling its latest evolution of sCMOS sensors: sCMOS 3.1. These patented, groundbreaking imagers redefine the limits of what is possible in CMOS sensors for the most demanding of imaging applications.

Monday, November 04, 2024

Lynred announces 8.5 µm pitch thermal sensor

Link: https://ala.associates/wp-content/uploads/2024/09/241001-Lynred-8.5-micron-EN-.pdf

Lynred demonstrates smallest thermal imaging sensor for future Automatic Emergency Braking Systems (AEB) at AutoSens Europe 

Prototype 8.5 µm pixel pitch technology that shrinks the volume of thermal cameras by 50% is designed to help automotive OEMs meet tougher future AEB system requirements, particularly at night.

Grenoble, France, October 1, 2024 – Lynred, a leading global provider of high-quality infrared sensors for the aerospace, defense and commercial markets, today announces it will demonstrate a prototype 8.5 µm pixel pitch sensor during AutoSens Europe, a major international event for automotive engineers, in Barcelona, Spain, October 8 – 10, 2024. The 8.5 µm pixel pitch technology is the smallest infrared sensor candidate for future Automatic Emergency Braking (AEB) and Advanced Driver Assistance Systems (ADAS).

The prototype, featuring half the surface area of current 12 µm thermal imaging sensors for automotive applications, will enable system developers to build much smaller cameras for integration in AEB systems.
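
The 50% figure follows from pixel-area scaling; a quick back-of-the-envelope check (assuming equal pixel counts, so focal-plane area scales with the square of the pitch):

pitch_new, pitch_old = 8.5, 12.0            # pixel pitch in micrometres
area_ratio = (pitch_new / pitch_old) ** 2   # focal-plane area ratio at equal resolution
print(f"area ratio: {area_ratio:.2f}")      # ~0.50, i.e. half the surface of a 12 µm array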

Following a recent ruling by the US National Highway Traffic Safety Administration (NHTSA), AEB systems will be mandatory in all light vehicles by 2029. The ruling also sets tougher rules for road safety at night.

The NHTSA sees driver assistance technologies and the deployment of sensors and subsystems as holding the potential to reduce traffic crashes and save thousands of lives per year. The European Traffic Safety Council (ETSC) also recognizes that AEB systems need to work better in wet, foggy and low-light conditions.

Thermal imaging sensors can detect and identify objects in total darkness. As automotive OEMs need to upgrade the performance of AEB systems within all light vehicles, Lynred is preparing a full roadmap of solutions set to help achieve this compliance. Currently gearing up for high-volume production of its automotive-qualified 12 µm product offering, Lynred is ready to deliver the key component enabling Pedestrian Automatic Emergency Braking (PAEB) systems to work in adverse conditions, particularly at night, when more than 75% of pedestrian fatalities occur.

Lynred is among the first companies to demonstrate a longwave infrared (LWIR) pixel pitch technology for ADAS and PAEB systems that will optimize the size-to-performance ratio of future generation cameras. The 8.5 µm pixel pitch technology will halve the volume of a thermal imaging camera, resulting in easier integration for OEMs, while maintaining the same performance standards as larger-sized LWIR models.

Friday, November 01, 2024

Pixelplus new product videos

 

PKA210 Seamless RGB-IR Image Sensor

PG7130KA Global shutter Image Sensor


Thursday, October 31, 2024

IISW 2025 Final Call for Papers is out

The 2025 International Image Sensor Workshop (IISW) provides a biennial opportunity to present innovative work in the area of solid-state image sensors and share new results with the image sensor community. The event is intended for image sensor technologists; in order to encourage attendee interaction and a shared experience, attendance is limited, with strong acceptance preference given to workshop presenters. As is the tradition, the 2025 workshop will emphasize an open exchange of information among participants in an informal, secluded setting on Awaji Island in Hyōgo, Japan.

The scope of the workshop includes all aspects of electronic image sensor design and development. In addition to regular oral and poster papers, the workshop will include invited talks and announcement of International Image Sensors Society (IISS) Award winners.

Submission of abstracts:
An abstract should consist of a single page of text (maximum 500 words) with up to two pages of illustrations (3 pages maximum), and include the authors’ name(s), affiliation, mailing address, telephone number, and e-mail address.


The deadline for abstract submission is 11:59pm, Thursday Dec 19, 2024 (GMT).
To submit an abstract, please go to: https://cmt3.research.microsoft.com/IISW2025 

 

Wednesday, October 30, 2024

Space & Scientific CMOS Image Sensors Workshop

The preliminary program for the Space & Scientific CMOS Image Sensors Workshop, to be held on 26 & 27 November in Toulouse Labège, is available.

Registration: https://evenium.events/space-and-scientific-cmos-image-sensors-2024/

Tuesday, October 29, 2024

Call for Nominations for the 2025 Walter Kosonocky Award

International Image Sensor Society calls for nominations for the 2025 Walter Kosonocky Award for Significant Advancement in Solid-State Image Sensors.
 
The Walter Kosonocky Award is presented biennially for THE BEST PAPER presented in any venue during the prior two years representing significant advancement in solid-state image sensors. The award commemorates the many important contributions made by the late Dr. Walter Kosonocky to the field of solid-state image sensors. Personal tributes to Dr. Kosonocky appeared in the IEEE Transactions on Electron Devices in 1997. Founded in 1997 by his colleagues in industry, government and academia, the award is also funded by proceeds from the International Image Sensor Workshop.
 
The award is selected from nominated papers by the Walter Kosonocky Award Committee, announced and presented at the International Image Sensor Workshop (IISW), and sponsored by the International Image Sensor Society (IISS). The winner is presented with a certificate, complimentary registration to the IISW, and an honorarium.
 
Please send us an email nomination for this year's award, with a pdf file of the nominated paper (that you judge is the best paper published/presented in calendar years 2023 and 2024) as well as a brief description (less than 100 words) of your reason for nominating the paper. Nomination of a paper from your company/institute is also welcome.
 
The deadline for receiving nominations is January 15th, 2025.
 
Your nominations should be sent to Yusuke Oike (2025nominations@imagesensors.org), Secretary of the IISS Award Committee.

Monday, October 28, 2024

Single Photon Workshop 2024 Program Available

The 11th Single Photon Workshop will be held at the Edinburgh International Conference Centre (EICC) over the five-day period of 18-22 November 2024.

The full program is available here: https://fitwise.eventsair.com/2024singlephotonworkshop/programme

Here are some image-sensor specific sessions and talks:

Wednesday Nov 20, 2024 Session Title: Superconducting Photon Detectors 1
Chair: Dmitry Morozov
4:40 PM - 5:10 PM
Demonstration of a 400,000 pixel superconducting single-photon camera
Invited Speaker - Adam McCaughan - National Institute of Standards and Technology (NIST)
5:10 PM - 5:15 PM
Company Symposium: Photon Spot Platinum Sponsor Speaker: Vikas Anant
5:15 PM - 5:30 PM
Development of Superconducting Wide Strip Photon Detector Paper Number: 112 Speaker: Shigehito Miki - National Institute of Information and Communications Technology (NICT)
5:30 PM - 5:45 PM
Superconducting nanowire single photon detectors arrays for quantum optics Paper Number: 34 Speaker: Val Zwiller - KTH Royal Institute of Technology
5:45 PM - 6:00 PM
Single photon detection up to 2 µm in pair of parallel microstrips based on NbRe ultrathin films
Paper Number: 80 Speaker: Loredana Parlato - University of Naples Federico II
6:00 PM - 6:15 PM
Reading out SNSPDs with Opto-Electronic Converters Paper Number: 87 Speaker: Frederik Thiele - Paderborn University
6:15 PM - 6:30 PM
Development of Mid to Far-Infrared Superconducting Nanowire Single Photon Detectors Paper Number: 195 Speaker: Sahil Patel - California Institute Of Technology

Thursday Nov 21, 2024 Session Title: Superconducting Photon Detectors 2
Chair: Martin J Stevens
8:30 AM - 8:45 AM
Opportunities and challenges for photon-number resolution with SNSPDs Paper Number: 148 Speaker: Giovanni V Resta - ID Quantique
8:45 AM - 9:00 AM
Detecting molecules at the quantum yield limit for mass spectroscopy with arrays of NbTiN superconducting nanowire detectors Paper Number: 61 Speaker: Ronan Gourgues - Single Quantum
9:00 AM - 9:30 AM
Current state of SNSPD arrays for deep space optical communication Invited Speaker - Emma E Wollman - California Institute Of Technology
9:30 AM - 9:35 AM
Company Symposium: Quantum Opus/MPD presentation Platinum Sponsors
9:35 AM - 9:50 AM
Novel kinetic inductance current sensor for transition-edge sensor readout Paper Number: 238 Speaker: Paul Szypryt - National Institute of Standards and Technology (NIST)
9:50 AM - 10:05 AM
Quantum detector tomography for high-Tc SNSPDs Paper Number: 117 Speaker: Mariia Sidorova - Humboldt University of Berlin
10:05 AM - 10:20 AM
Enhanced sensitivity and system integration for infrared waveguide-integrated superconducting nanowire single-photon detectors Paper Number: 197 Speaker: Adan Azem - University Of British Columbia

 

Thursday Nov 21, 2024 Session Title: SPADs 1
Chair: Chee Hing Tan
11:00 AM - 11:30 AM
A 3D-stacked SPAD Imager with Pixel-parallel Computation for Diffuse Correlation Spectroscopy
Invited Speaker - Robert Henderson - University of Edinburgh
11:30 AM - 11:45 AM
High temporal resolution 32 x 1 SPAD array module with 8 on-chip 6 ps TDCs
Paper Number: 182 Speaker: Chiara Carnati - Politecnico Di Milano
11:45 AM - 12:00 PM
A 472 x 456 SPAD Array with In-Pixel Temporal Correlation Capability and Address-Based Readout for Quantum Ghost Imaging Applications
Paper Number: 186 Speaker: Massimo Gandola - Fondazione Bruno Kessler
12:00 PM - 12:15 PM
High Performance Time-to-Digital Converter for SPAD-based Single-Photon Counting applications
Paper Number: 181 Speaker: Davide Moschella - Politecnico Di Milano
12:15 PM - 12:30 PM
A femtosecond-laser-written programmable photonic circuit directly interfaced to a silicon SPAD array
Paper Number: 271 Speaker: Francesco Ceccarelli - Istituto di Fotonica e Nanotecnologie (CNR-IFN)

Thursday Nov 21, 2024 Session Title: SPADs 2
Chair: Alberto Tosi
2:00 PM - 2:30 PM
Ge-on-Si Technology Enabled SWIR Single-Photon Detection
Invited Speaker - Neil Na - Artilux
2:30 PM - 2:45 PM
The development of pseudo-planar Ge-on-Si single-photon avalanche diode detectors for photon detection in the short-wave infrared spectral region
Paper Number: 254 Speaker: Lisa Saalbach - Heriot-Watt University
2:45 PM - 3:00 PM
Hybrid integration of InGaAs/InP single photon avalanche diodes array and silicon photonics chip
Paper Number: 64 Speaker: Xiaosong Ren - Tsinghua University
3:00 PM - 3:15 PM
Dark Current and Dark Count Rate Dependence on Anode Geometry of InGaAs/InP Single-Photon Avalanche Diodes
Paper Number: 248 Speaker: Rosemary Scowen - Toshiba Research Europe
3:15 PM - 3:30 PM
Compact SAG-based InGaAs/InP SPAD for 1550nm photon counting
Paper Number: 111 Speaker: Ekin Kizilkan - École Polytechnique Fédérale de Lausanne (EPFL)

Thursday Nov 21, 2024 Session Title: Single-photon Imaging and Sensing 1
Chair: Aurora Maccarone
4:15 PM - 4:45 PM
Single Photon LIDAR goes long Range
Invited Speaker - Feihu Xu - USTC China
4:45 PM - 5:00 PM
The Deep Space Optical Communication Photon Counting Camera
Paper Number: 11 Speaker: Alex McIntosh - MIT Lincoln Laboratory
5:00 PM - 5:15 PM
Human activity recognition with Single-Photon LiDAR at 300 m range
Paper Number: 232 Speaker: Sandor Plosz - Heriot-Watt University
5:15 PM - 5:30 PM
Detection Times Improve Reflectivity Estimation in Single-Photon Lidar
Paper Number: 273 Speaker: Joshua Rapp - Mitsubishi Electric Research Laboratories
5:30 PM - 5:45 PM
Bayesian Neuromorphic Imaging for Single-Photon LiDAR
Paper Number: 57 Speaker: Dan Yao - Heriot-Watt University
5:45 PM - 6:00 PM
Single Photon FMCW LIDAR for Vibrational Sensing and Imaging
Paper Number: 23 Speaker: Theodor Staffas - KTH Royal Institute of Technology

Friday Nov 22, 2024 Session Title: Single-photon Imaging 2
9:00 AM - 9:15 AM
Quantum-inspired Rangefinding for Daytime Noise Resistance
Paper Number: 208 Speaker: Weijie Nie - University of Bristol
9:15 AM - 9:30 AM
High resolution long range 3D imaging with ultra-low timing jitter superconducting nanowire single-photon detectors
Paper Number: 296 Speaker: Aongus McCarthy - Heriot-Watt University
9:30 AM - 9:45 AM
A high-dimensional imaging system based on an SNSPD spectrometer and computational imaging
Paper Number: 62 Speaker: Mingzhong Hu - Tsinghua University
9:45 AM - 10:00 AM
Single-photon detection techniques for real-time underwater three-dimensional imaging
Paper Number: 289 Speaker: Aurora Maccarone - Heriot-Watt University
10:00 AM - 10:15 AM
Photon-counting measurement of singlet oxygen luminescence generated from PPIX photosensitizer in biological media
Paper Number: 249 Speaker: Vikas - University of Glasgow
10:15 AM - 10:30 AM
A Plug and Play Algorithm for 3D Video Super-Resolution of single-photon data
Paper Number: 297 Speaker: Alice Ruget - Heriot-Watt University

Friday Nov 22, 2024 Session Title: Single-photon Imaging and Sensing 2
11:00 AM - 11:30 AM
Hyperspectral Imaging with Mid-IR Undetected Photons
Invited Speaker - Sven Ramelow - Humboldt University of Berlin
11:30 AM - 11:45 AM
16-band Single-photon imaging based on Fabry-Perot Resonance
Paper Number: 35 Speaker: Chufan Zhou - École Polytechnique Fédérale de Lausanne (EPFL)
11:45 AM - 12:00 PM
High-frame-rate fluorescence lifetime microscopy with megapixel resolution for dynamic cellular imaging
Paper Number: 79 Speaker: Euan Millar - University of Glasgow
12:00 PM - 12:15 PM
Beyond historical speed limitation in time correlated single photon counting without distortion: experimental measurements and future developments
Paper Number: 237 Speaker: Giulia Acconcia - Politecnico Di Milano
12:15 PM - 12:30 PM
Hyperspectral mid-infrared imaging with undetected photons
Paper Number: 268 Speaker: Emma Pearce - Humboldt University of Berlin
12:30 PM - 12:45 PM
Determination of scattering coefficients of brain tissues by wide-field time-of-flight measurements with single photon camera.
Paper Number: 199 Speaker: André Stefanov - University Of Bern

Wednesday, October 23, 2024

Image sensor basics

These lecture slides by Prof. Yuhao Zhu at U. Rochester are a great first introduction to how an image sensor works. A few selected slides are shown below. For the full slide deck visit: https://www.cs.rochester.edu/courses/572/fall2022/decks/lect10-sensor-basics.pdf