Thursday, November 27, 2025

ISSCC 2026 Image Sensors session

ISSCC 2026 will be held Feb 15-19, 2026 in San Francisco, CA.

The advance program is now available: https://submissions.mirasmart.com/ISSCC2026/PDF/ISSCC2026AdvanceProgram.pdf 

Session 7: Image Sensors and Ranging (Feb 16)

Session Chair: Augusto Ximenes, CogniSea, Seattle, WA
Session Co-Chair: Andreas Suess, Google, Mountain View, CA

54×42 LiDAR 3D-Stacked System-On-Chip with On-Chip Point
Cloud Processing and Hybrid On-Chip/Package-Embedded 25V
Boost Generation

VoxCAD: A 0.82-to-81.0mW Intelligent 3D-Perception dToF SoC
with Sector-Wise Voxelization and High-Density Tri-Mode eDRAM
CIM Macro

A Multi-Range, Multi-Resolution LiDAR Sensor with
2,880-Channel Modular Survival Histogramming TDC and Delay
Compensation Using Double Histogram Sampling

A 480×320 CMOS LiDAR Sensor with Tapering 1-Step
Histogramming TDCs and Sub-Pixel Echo Resolvers

A 26.0mW 30fps 400×300-pixel SWIR Ge-SPAD dToF Range
Sensor with Programmable Macro-Pixels and Integrated
Histogram Processing for Low-Power AR/VR Applications

A 128×96 Multimodal Flash LiDAR SPAD Imager with Object
Segmentation Latency of 18μs Based on Compute-Near-Sensor
Ising Annealing Machine

A Fully Reconfigurable Hybrid SPAD Vision Sensor with 134dB
Dynamic Range Using Time-Coded Dual Exposures

A 55nm Intelligent Vision SoC Achieving 346TOPS/W System
Efficiency via Fully Analog Sensing-to-Inference Pipeline

A 1.09e--Random-Noise 1.5μm-Pixel-Pitch 12MP Global-Shutter-
Equivalent CMOS Image Sensor with 3μm Digital Pixels Using
Quad-Phase-Staggered Zigzag Readout and Motion
Compensation

A 200MP 0.61μm-Pixel-Pitch CMOS Imager with Sub-1e- Readout
Noise Using Interlaced-Shared Transistor Architecture and
On-Chip Motion Artifact-Free HDR Synthesis for 8K Video
Applications

Tuesday, November 25, 2025

Ubicept releases toolkit for SPAD and CIS

Ubicept Extends Availability of Perception Technology to Make Autonomous Systems Using Conventional Cameras More Reliable

Computer vision processing unlocks higher quality, more trustworthy visual data for machines whether they use advanced sensors from Pi Imaging Technology or conventional vision systems

BOSTON--(BUSINESS WIRE)--Ubicept, the computer vision startup operating at the limits of physics, today announced the release of the Ubicept Toolkit, which will bring its physics-based imaging to any modern vision system. Whether for single-photon avalanche diode (SPAD) sensors in next-generation vision systems or immediate image quality improvements with existing hardware, Ubicept provides a unified, physics-based approach that delivers high quality, trustworthy data.

“Ubicept’s technology revolutionizes how machines see the world by unlocking the full potential of today's and tomorrow's image sensors. Our physics-based approach captures the full complexity of motion, even in low-light or high-dynamic-range conditions, providing more trustworthy data than AI-based video enhancement,” said Sebastian Bauer, CEO of Ubicept. “With the Ubicept Toolkit, we’re now making our advanced single-photon imaging more accessible for a broad range of applications from robotics to automotive to industrial sensing.”

Ubicept’s solution is designed for the most advanced sensors to maximize image data quality and reliability. Now, the Toolkit will support any widely available CMOS camera with raw uncompressed output, giving perception developers immediate quality gains.

“Autonomous systems need a better way to understand the world. Our mission is to turn raw photon data into outputs that are specifically designed for computer vision, not human consumption,” said Tristan Swedish, CTO of Ubicept. “By making our technology available for more conventional vision systems, we are giving engineers the opportunity to experience the boost in reliability now while creating an easier pathway to SPAD sensor adoption.”

SPAD sensors – traditionally used in 3D systems – are poised to reshape the image sensor and computer vision landscape. While the CMOS sensor market is projected to grow to $30B by 2029 at 7.5% CAGR, the SPAD market is growing nearly three times faster, expected to reach $2.55B by 2029 at 20.1% CAGR.
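As a quick back-of-the-envelope check on those figures (a minimal sketch; the 5-year horizon and the back-computed base-year values are assumptions for illustration, not numbers from the release):

```python
# CAGR (compound annual growth rate) relates a start value V0 and an
# end value V_n over n years:  V_n = V0 * (1 + r)**n.
# The release gives 2029 market sizes and CAGRs; the base-year values
# below are back-computed from those figures and are illustrative only.

def value_after(v0: float, cagr: float, years: int) -> float:
    """Compound a starting value forward at a fixed annual rate."""
    return v0 * (1.0 + cagr) ** years

def base_value(v_end: float, cagr: float, years: int) -> float:
    """Invert the compounding to recover the implied base-year value."""
    return v_end / (1.0 + cagr) ** years

# Figures from the release: CMOS ~$30B by 2029 at 7.5% CAGR,
# SPAD ~$2.55B by 2029 at 20.1% CAGR (5-year horizon assumed).
cmos_base = base_value(30.0, 0.075, 5)   # implied starting CMOS market, $B
spad_base = base_value(2.55, 0.201, 5)   # implied starting SPAD market, $B

print(f"implied base-year CMOS market: ${cmos_base:.1f}B")
print(f"implied base-year SPAD market: ${spad_base:.2f}B")
print(f"growth-rate ratio: {0.201 / 0.075:.2f}x")  # the "nearly three times faster"
```

The 20.1% vs. 7.5% rates give a ratio of about 2.7, consistent with the "nearly three times faster" claim.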

Pi Imaging Technology is a leader in the field with its SPAD Alpha, a next-generation 1-megapixel single-photon camera that delivers zero read noise, nanosecond-level exposure control, and frame rates up to 73,000 fps. Designed for demanding scientific applications, it offers researchers and developers extreme temporal precision and light sensitivity. The Ubicept Toolkit builds on these strengths by transforming the SPAD Alpha’s raw photon data into clear, ready-to-use imagery for perception and analysis.
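For readers wondering why raw single-photon data needs processing at all: a SPAD pixel in photon-counting mode reports one bit per exposure (1 if at least one photon was detected), so an image only emerges after accumulating and inverting the statistics of many binary frames. The sketch below is a generic quanta-imaging baseline for a single pixel, emphatically not Ubicept's UPF algorithm, whose details are not public.

```python
import math
import random

# For mean flux lam (photons per exposure), a binary SPAD pixel fires
# with probability p = 1 - exp(-lam). Averaging many binary frames
# estimates p, and inverting recovers the flux:
#   lam_hat = -ln(1 - p_hat)
# Generic quanta-imaging baseline; real pipelines also handle motion,
# dark counts, and dead time.

def simulate_binary_frames(flux: float, n_frames: int, rng: random.Random):
    """Yield 0/1 detections for one pixel under Poisson photon arrivals."""
    p_detect = 1.0 - math.exp(-flux)
    return [1 if rng.random() < p_detect else 0 for _ in range(n_frames)]

def estimate_flux(bits) -> float:
    """Invert the Bernoulli statistic: lam_hat = -ln(1 - mean(bits))."""
    p_hat = sum(bits) / len(bits)
    return -math.log1p(-p_hat)

rng = random.Random(42)
true_flux = 0.5                                   # photons per exposure
bits = simulate_binary_frames(true_flux, 200_000, rng)
print(f"estimated flux: {estimate_flux(bits):.3f} (true: {true_flux})")
```

With enough frames the estimate converges on the true flux, which is why high frame rates like the SPAD Alpha's 73,000 fps matter for this class of sensor.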

“Ubicept shares our deep commitment to advancing perception technology,” said Michel Antolović, CEO of Pi Imaging Technology. “By combining our SPAD Alpha’s state-of-the-art hardware with Ubicept’s real-time processing, perception engineers can get the most from what single-photon imaging has to offer.”

The Toolkit provides engineering teams with everything they need to visualize, capture, and process video data efficiently with the Ubicept Photon Fusion (UPF) algorithm. The SPAD Toolkit also includes Ubicept’s FLARE (Flexible Light Acquisition and Representation Engine) firmware for optimized photon capture. In addition, the Toolkit includes white-glove support for early adopters, for a highly personalized and premium experience.

The Ubicept Toolkit will be available in December 2025. To learn how it can elevate perception performance and integrate into existing workflows, contact Ubicept here.

Monday, November 24, 2025

Job Postings - Week of November 23 2025


ByteDance

Image Sensor Digital Design Lead- Pico

San Jose, California, USA

Link

ST Microelectronics

Silicon Photonics Product Development Engineer

Grenoble, France

Link

DigitalFish

Senior Systems Engineer, Cameras/Imaging

Sunnyvale, California, USA [Remote]

Link

Imasenic

Digital IC Design Engineer

Barcelona, Spain

Link

Meta

Technical Program Manager, Camera Systems

Sunnyvale, California, USA

Link

Westlake University

Ph.D. Positions in Dark Matter & Neutrino Experiments

Hangzhou, Zhejiang, China

Link

General Motors

Advanced Optical Sensor Test Engineer

Warren, Michigan, USA [Hybrid]

Link

INFN

Post-Doc senior research grant in experimental physics

Frascati, Italy

Link

Northrop Grumman

Staff EO/IR Portfolio Technical Lead

Melbourne, Florida, USA

Link

Friday, November 21, 2025

"Camemaker" image sensors search tool

An avid reader of the blog shared this handy little search tool for image sensors: 

https://www.camemaker.com/shop

Although it isn't comprehensive (it only covers a few companies), you can filter by various sensor specs. Try it out!

Monday, November 17, 2025

Event cameras: applications and challenges

Gregor Lenz (roboticist and cofounder of Open Neuromorphic and Neurobus) has written a two-part blog post that readers of ISW might find enlightening:

https://lenzgregor.com/posts/event-cameras-2025-part1/

https://lenzgregor.com/posts/event-cameras-2025-part2/ 

Gregor goes into various application domains where event cameras have been tried but have faced challenges, technical and otherwise.

Wide adoption will depend less and less on technical merit and more on how well the new sensor modality fits into existing pipelines for X, where X can be supply chain, hardware, software, manufacturing, assembly, testing... pick your favorite!

Saturday, November 15, 2025

Conference List - May 2026

Quantum Photonics Conference, Networking and Trade Exhibition - 5-6 May 2026 - Erfurt, Germany - Website

Sensors Converge - 5-7 May 2026 - Santa Clara, California, USA -  Website

LOPS 2026 - 8-9 May 2026 - Chicago, Illinois, USA - Website

Embedded Vision Summit - 11-13 May 2026 - Santa Clara, California, USA - Website

CLEO - Congress on Lasers and Electro-Optics - 17-20 May 2026 - Charlotte, North Carolina, USA 

IEEE International Symposium on Robotic and Sensors Environments - 18-19 May 2026 - Norfolk, Virginia, USA - Website

IEEE International Symposium on Integrated Circuits and Systems - 24-27 May 2026 - Shanghai, China - Website

ALLSENSORS 2026 - 24-28 May 2026 - Venice, Italy - Website

Robotics Summit and Expo - 27-28 May 2026 - Boston, Massachusetts, USA - Website


If you know about additional local conferences, please add them as comments.

Return to Conference List index

Thursday, November 13, 2025

Metalenz announces face ID solution

Metalenz and UMC Bring Breakthrough Face Authentication Solution Polar ID to Mass Production

Boston, MA and Hsinchu, TAIWAN, November 12, 2025 - Metalenz, the leader in metasurface innovation and commercialization, and United Microelectronics Corporation (“UMC”, NYSE: UMC, TWSE: 2303), a leading global semiconductor foundry, today announced that Metalenz’s breakthrough face authentication solution, Polar ID, is now ready for mass production through UMC.

Polar ID is a compact, polarization-based biometric solution that leverages Metalenz’s metasurface technology to bring payment-grade security and advanced sensing capabilities to any device, even the most challenging of form factors. Using a polarization sensitive meta-optic and advanced algorithms, Polar ID extracts additional information sets such as material and contour information to provide secure face authentication in a single image, dramatically reducing cost and complexity over existing secure face unlock solutions.

Metalenz has already demonstrated the product, featuring a polarization sensitive meta-optic directly integrated onto an image sensor, on a smartphone reference platform powered by Snapdragon® mobile processors. UMC manufactures the meta-optic layer using its 40nm process and achieves sensor integration utilizing its wafer-on-wafer bonding technology. Leveraging UMC’s 300mm wafer manufacturing capabilities, as well as the qualification of this supply chain, Metalenz is ready to ramp into volume, positioning Polar ID for widespread adoption across consumer electronics, mobile, and IoT platforms.

“By combining our metasurface innovation with UMC’s manufacturing scale and process maturity, Polar ID is ready to meet the demands of high-volume consumer electronics, and to bring secure, affordable face authentication to billions of devices,” said Rob Devlin, CEO and Co-Founder of Metalenz. “Metalenz is the critical enabler of the metasurface market. With the first generation of our technology already at work in the market replacing lens stacks in existing sensing solutions, we are now leveraging the unique capabilities of our technology to bring new forms of sensing to mass markets for the first time. With demand for secure and convenient biometrics rapidly expanding across consumer devices and IoT, Polar ID delivers secure face authentication in the smallest, simplest form factor, making advanced sensing accessible beyond premium tiers and in places it wasn’t previously possible.”

“Our state-of-the-art 12-inch facilities and comprehensive portfolio of semiconductor manufacturing process technologies have made us the foundry partner of choice for some of the most advanced fabless semiconductor companies in the world. We have worked with Metalenz on commercializing their metasurface technology since 2021, and we are pleased to be their key manufacturing partner to support the high-volume production of next-generation polarization imaging modules,” said Steven Hsu, Vice President of Technology Development, UMC. “This collaboration will enable UMC to expand our offering into sensor integrated metasurfaces and play a pioneering role in delivering this disruptive imaging technology to market.”

Tuesday, November 11, 2025

Event-Driven Vision Summer School

Unfortunately the registration deadline has already passed, but I'm posting the program here if it's of interest to the blog readers.

https://edpr.iit.it/events/2026-evs

May 17th to May 23rd 2026
Hotel Punta San Martino, Arenzano (GE), Italy

Adoption of event-driven cameras in real-world applications is steadily growing, thanks to their low power, low latency, high temporal resolution, high dynamic range, and highly compressive encoding. The EVS school is the first event focused on teaching computer vision with event cameras. It aims to offer students in-depth knowledge of state-of-the-art methods for processing event-camera data and to teach them the practical skills required to develop their own applications.
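For context, the compressive encoding comes from the standard event-camera pixel model: a pixel emits an event only when its log-intensity changes by more than a contrast threshold since the last event. The toy single-pixel sketch below illustrates this; the threshold value and intensity traces are illustrative, and real sensors add noise, refractory periods, and per-pixel threshold variation.

```python
import math

# Standard event-camera model: emit an event of polarity +1/-1 whenever
# log-intensity drifts by more than a contrast threshold C from the
# reference level set at the last event. Static scenes emit nothing,
# which is what makes the encoding so compressive.

def generate_events(intensities, threshold=0.2):
    """Return (sample_index, polarity) events from an intensity trace."""
    events = []
    ref = math.log(intensities[0])          # reference log-intensity
    for i, value in enumerate(intensities[1:], start=1):
        diff = math.log(value) - ref
        # A large jump can cross the threshold several times at once.
        while abs(diff) >= threshold:
            polarity = 1 if diff > 0 else -1
            events.append((i, polarity))
            ref += polarity * threshold     # advance the reference level
            diff = math.log(value) - ref
    return events

# A brightness ramp produces ON (+1) events; a static signal produces none.
ramp = [1.1 ** k for k in range(20)]        # ~9.5% brighter each step
print(generate_events(ramp))
print(generate_events([1.0] * 20))          # static scene: no events
```

This is why the school's assignments (optical flow on a pan-tilt unit, closed-loop robot control) work directly on sparse event streams rather than dense frames.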

Keynote Speakers

Davide Scaramuzza
University of Zurich, Zurich (Switzerland) 

Kynan Eng
SynSense, Zurich (Switzerland) 

Monday 18th
9:00 – 13:00 Lectures on fundamentals of Event-Driven Vision
14:00 – 18:00 Assignments on algorithmic approaches in Event-Driven Vision

Tuesday 19th
9:00 – 13:00 Lectures on algorithmic approaches in Event-Driven Vision
14:00 – 18:00 Assignments on algorithmic approaches in Event-Driven Vision
Application: optical flow tested on pan-tilt unit
21:30 – 22:30 Keynote, D. Scaramuzza

Wednesday 20th
9:00 – 13:00 Lectures on AI-based approaches for Event-Driven Vision
14:00 – 18:00 Assignments on AI-based approaches for Event-Driven Vision

Thursday 21st
9:00 – 13:00 Lectures on biologically inspired methods and implementation on neuromorphic hardware
14:00 – 18:00 Assignments on biologically inspired methods and implementation on neuromorphic hardware
21:30 – 22:30 Keynote, K. Eng

Friday 22nd
9:00 – 13:00 Lectures on event-based vision for robot control
14:00 – 18:00 Assignments on event-based vision for robot control
Application: closing the loop with robots