Friday, December 05, 2025

Teradyne blog on automated test equipment for image sensors

Link: https://www.teradyne.com/2025/11/11/invisible-interfaces/

Invisible Interfaces: The Hidden Challenge Behind Every Great Image Sensor

Flexible, future-ready test strategies are crucial to the irregular cycle of sensor design and standards development.

Alexander Metzdorf, Teradyne Inc. 

When you snap a photo on your phone or rely on a car’s camera for lane detection, you’re trusting an unseen network of technologies to deliver or interpret image data flawlessly. But behind the scenes, the interface between the image sensor and its processor is doing the heavy lifting, moving megabytes of data without error or delay.

While much of the industry conversation focuses on advances in resolution and sensor technology, another challenging aspect of modern imaging innovation lies in the interfaces: the invisible pathways that connect these sensors to the systems around them, including the processors tasked with interpreting their data. One of the most pressing and underappreciated imaging challenges is whether these interfaces can handle growing demands for speed, bandwidth, and reliability. The challenge isn’t one-size-fits-all. Smartphone cameras may need ultra-high resolution over short distances, while automotive sensors prioritize robust transmission over longer distances.

As image sensors and the technologies used to interpret the data evolve to deliver higher resolutions and even integrate artificial intelligence directly onto the chip, these interfaces are under more pressure than ever before. The challenge is both technical and practical: how do you design and test interfaces that must support vastly different applications, from the low-power demands of smartphones to the rugged, long-distance requirements of automotive systems?

And even more critically, how do you keep up when the rules change every few months?

The Growing Challenge in Image Sensor Development

The industry’s insatiable appetite for higher resolutions is well known, but what often goes unnoticed is the corresponding explosion in data traffic. A single image sensor on a smartphone might capture 500 megabytes of data in one shot. In automotive systems, that sensor could be sending critical visual information across several meters of cabling to a centralized processor, where decisions like emergency braking or obstacle detection happen in real time. Industrial imaging is pushing resolutions even higher (up to 500 megapixels in some cases) to support inspection and automation systems, creating enormous data handling and processing demands.
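
To see why the data traffic, and not just the resolution, strains these interfaces, a back-of-envelope bandwidth calculation helps. The Python sketch below uses assumed resolutions, bit depths, and frame rates (illustrative values only, not figures from this article) to show how quickly raw sensor output outgrows a serial link.

```python
# Back-of-envelope sensor bandwidth estimate. All numbers below are
# illustrative assumptions, not figures quoted in this article.

def raw_data_rate_gbps(megapixels: float, bits_per_pixel: int, fps: float) -> float:
    """Raw (uncompressed) sensor output rate in gigabits per second."""
    return megapixels * 1e6 * bits_per_pixel * fps / 1e9

# A hypothetical 50 MP smartphone sensor, 12-bit RAW, 30 fps:
print(f"Smartphone: {raw_data_rate_gbps(50, 12, 30):.0f} Gb/s")   # 18 Gb/s

# A hypothetical 500 MP industrial sensor, 12-bit, even at just 10 fps:
print(f"Industrial: {raw_data_rate_gbps(500, 12, 10):.0f} Gb/s")  # 60 Gb/s
```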

Each of these scenarios represents wildly different demands on the interfaces connecting sensors to the rest of the system. In smartphones, the processor is typically located just millimeters away from the image sensor. Power efficiency is paramount, and interfaces must support blisteringly fast data rates to process high-resolution images without draining the battery. In an automotive application, a vehicle’s safety system might require those same sensors to transmit data over longer distances and deliver real-time information for decision-making in harsh environments, all while meeting stringent reliability and safety standards.

It’s a challenge compounded by the fact that image sensor manufacturers rarely control these interface requirements. Industry-wide, sensor manufacturers are generally forced to adopt a growing variety of interface standards and proprietary solutions, each with unique requirements for bandwidth, distance, latency, and power consumption.

This creates a relentless cycle of adaptation, where manufacturers are forced to develop and validate new interfaces almost as quickly as they can design the sensors themselves. It’s not uncommon for entirely new interface requirements to be handed down with lead times as short as six months. Unpredictability follows for both image sensor designers and the teams responsible for testing these devices.

The Shift Toward Proprietary Interfaces

While MIPI remains the dominant open standard for image sensor interfaces, proprietary protocols are gaining ground. These custom protocols are typically developed privately by major technology companies to meet unique product requirements, for example, to achieve specific performance advantages. They are closely guarded secrets and often remain entirely undocumented outside the companies that develop them, making it extremely difficult for test equipment vendors to keep pace.
Even a full teardown of a high-end smartphone won’t reveal how its camera interfaces are engineered. Yet, despite having no access to these underlying specifications, test teams are still expected to validate sensor performance against them.

For manufacturers and test engineers, this creates a near-constant state of uncertainty. New protocols can emerge rapidly and without warning, and must be supported almost immediately, which can cause test equipment providers to scramble to retool systems.

Teradyne’s Approach: Flexibility as a Strategic Imperative

Teradyne has set out to solve this challenge, developing a modular, future-ready approach that gives manufacturers the flexibility they need to thrive in unpredictable environments.

At the hardware level, Teradyne’s UltraSerial20G capture instrument for the UltraFLEXplus is designed for adaptability. Its modular architecture allows key components and software to be updated quickly to accommodate new protocols.

Teradyne’s IG-XL software adds further flexibility, empowering customers to develop highly customized test strategies and control every detail of the testing process, from voltage and timing to signal slopes and data handling.
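
As a purely hypothetical illustration of what controlling "every detail" of an interface test can mean (the names and fields below are invented for this post; this is not IG-XL syntax or its actual API), a parametric test setup might expose knobs like these:

```python
from dataclasses import dataclass

# Hypothetical sketch only: invented names, not Teradyne's IG-XL API.

@dataclass
class InterfaceTestConfig:
    vdd_volts: float           # supply voltage applied during the test
    bit_rate_gbps: float       # serial lane data rate under test
    edge_rate_v_per_ns: float  # driver slew rate ("signal slope")
    sample_delay_ps: int       # capture timing offset
    capture_frames: int        # frames of image data to record

# Example: a low-power mobile-style corner vs. a stressed automotive corner.
mobile_corner = InterfaceTestConfig(1.05, 4.5, 2.0, 35, 8)
auto_corner = InterfaceTestConfig(1.20, 6.4, 1.2, 50, 32)
```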

The Path Ahead: Staying Competitive in a Fragmented, Fast-moving Market

For image sensor makers, the message is clear: choose test platforms that are prepared for proprietary protocols, evolving standards, and ever-tighter time-to-market demands.

In this landscape, Teradyne’s modular hardware and powerful, agile software ensure that manufacturers are meeting current demands and are prepared for whatever comes next. With early interface testing capabilities and scalable solutions that can adapt on the fly, Teradyne customers stay ahead of integration risks, control costs, and accelerate time-to-market.

In an industry where speed, innovation, and reliability are everything, that kind of flexibility is more than just a technical feature. It’s a strategic necessity that offers manufacturers the freedom to innovate, knowing they have the flexibility they need in their test solutions.

Wednesday, December 03, 2025

A-SSCC Circuit Insights CMOS Image Sensor

A-SSCC 2025 - Circuit Insights #4: Introduction to CMOS Image Sensors - Prof. Chih-Cheng Hsieh

About Circuit Insights: Circuit Insights features internationally renowned researchers in circuit design, who will deliver engaging and accessible lectures on fundamental circuit concepts and diverse application areas, tailored to a level suitable for senior undergraduate students and early graduate students. The event will provide a valuable and inspiring opportunity for those who are considering or pursuing a career in circuit design.

About the Presenter: Chih-Cheng Hsieh received the B.S., M.S., and Ph.D. degrees from the Department of Electronics Engineering, National Chiao Tung University, Hsinchu, Taiwan, in 1990, 1991, and 1997, respectively. From 1999 to 2007, he was with the IC design house PixArt Imaging Inc., Hsinchu, where he led the Mixed-Mode IC Department as a Senior Manager and was involved in the development of CMOS image sensor ICs for PC, consumer, and mobile phone applications. In 2007, he joined the Department of Electrical Engineering, National Tsing Hua University, Hsinchu, where he is currently a Full Professor. His current research interests include low-voltage low-power smart CMOS image sensor ICs, ADCs, and mixed-mode IC development for artificial intelligence (AI), Internet of Things (IoT), biomedical, space, robot, and customized applications. Dr. Hsieh serves as a TPC member of ISSCC and A-SSCC, and as an Associate Editor of IEEE Solid-State Circuits Letters (SSC-L) and IEEE Circuits and Systems Magazine (CASM). He was the SSCS Taipei Chapter Chair and the Student Branch Counselor of NTHU, Taiwan.

Monday, December 01, 2025

Time-mode CIS paper

In a recent paper titled "An Extended Time-Mode Digital Pixel CMOS Image Sensor for IoT Applications," Kim et al. from Yonsei University write:

Time-mode digital pixel sensors have several advantages in Internet-of-Things applications, which require a compact circuit and low-power operation under poorly illuminated environments. Although the time-mode digitization technique can theoretically achieve a wide dynamic range by overcoming the supply voltage limitation, its practical dynamic range is limited by the maximum clock frequency and device leakage. This study proposes an extended time-mode digitization technique and a low-leakage pixel circuit to accommodate a wide range of light intensities with a small number of digital bits. The prototype sensor was fabricated in a 0.18 μm standard CMOS process, and the measurement results demonstrate its capability to accommodate a 0.03 lx minimum light intensity, providing a dynamic range figure-of-merit of 1.6 and a power figure-of-merit of 37 pJ/frame·pixel. 
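
To make the clock-frequency limit on dynamic range concrete, here is a minimal conceptual sketch of fixed-reference time-mode digitization (a simplified model under assumed parameters, not the authors' circuit or their extended E-TMD scheme): the digital code is the number of clock cycles until the pixel voltage, discharging at a rate proportional to light intensity, crosses a reference. Dim scenes need many cycles, so the counter depth and clock rate set the floor of the usable range.

```python
# Conceptual model of fixed-reference time-mode digitization (TMD).
# Assumed parameters for illustration; not the circuit from the paper.

def tmd_code(intensity: float, n_bits: int = 6,
             v_reset: float = 3.0, v_ref: float = 1.0,
             volts_per_cycle_per_unit: float = 1.0) -> int:
    """Clock count until the discharging pixel crosses v_ref, clipped to n_bits."""
    max_code = 2 ** n_bits - 1
    v = v_reset
    for count in range(1, max_code + 1):
        v -= volts_per_cycle_per_unit * intensity  # drop per clock cycle ∝ intensity
        if v <= v_ref:
            return count        # bright scene crosses early: small code
    return max_code             # too dark to cross in time: counter saturates

for intensity in (2.0, 0.5, 0.1, 0.01):
    print(intensity, tmd_code(intensity))  # 0.01 saturates: the dynamic-range limit
```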

Sensors 2025, 25(23), 7228; https://doi.org/10.3390/s25237228

Figure 1. Operation principle of conventional CISs: (a) voltage mode; (b) fixed reference; and (c) ramp-down TMD.
Figure 2. Theoretical photo-transfer curve of conventional 6-bit TMDs.
Figure 3. The operation principle of the proposed E-TMD technique.
Figure 4. Theoretical photo-transfer curve of the proposed E-TMD: (a) TS = TU = TD = 2000tCK, Δ = 0; (b) TS = TU = TD = 100tCK, Δ = 0; (c) TS = TU = 0, TD = 45tCK, Δ = 0; and (d) TS = 0, TU = 25tCK, TD = 45tCK, Δ = 0.7.
Figure 5. The conventional time-mode digital pixel CIS adapted from [11]: (a) architecture; (b) pixel schematic diagram.
Figure 6. Architecture and schematic diagram of the proposed time-mode digital pixel CIS.
Figure 7. Operation of the proposed time-mode digital pixel CIS with α representing VDD − VREF − VT: (a) six operation phases and (b) timing diagram.
Figure 8. Transistor-level simulated photo-transfer curve comparison.
Figure 9. Chip micrograph.
Figure 10. Captured sample images: (a) 190 lx, TS = 17 ms, tCK = 50 µs; (b) 1.9 lx, TS = 400 ms, tCK = 2 µs.
Figure 11. Captured sample images and their histograms: (a) 20.5 lx, TS = 32.6 ms; (b) 200.6 lx, TS = 4.6 ms; (c) 2106 lx, TS = 0.64 ms; (d) 2106 lx, TS = 0.64 ms, TU = 0.74 ms, TD = 1.84 ms, Δ = 0.5.

Thursday, November 27, 2025

ISSCC 2026 Image Sensors session

ISSCC 2026 will be held Feb 15-19, 2026 in San Francisco, CA.

The advance program is now available: https://submissions.mirasmart.com/ISSCC2026/PDF/ISSCC2026AdvanceProgram.pdf 

Session 7: Image Sensors and Ranging (Feb 16)

Session Chair: Augusto Ximenes, CogniSea, Seattle, WA
Session Co-Chair: Andreas Suess, Google, Mountain View, CA

54×42 LiDAR 3D-Stacked System-On-Chip with On-Chip Point Cloud Processing and Hybrid On-Chip/Package-Embedded 25V Boost Generation

VoxCAD: A 0.82-to-81.0mW Intelligent 3D-Perception dToF SoC with Sector-Wise Voxelization and High-Density Tri-Mode eDRAM CIM Macro

A Multi-Range, Multi-Resolution LiDAR Sensor with 2,880-Channel Modular Survival Histogramming TDC and Delay Compensation Using Double Histogram Sampling

A 480×320 CMOS LiDAR Sensor with Tapering 1-Step Histogramming TDCs and Sub-Pixel Echo Resolvers

A 26.0mW 30fps 400×300-pixel SWIR Ge-SPAD dToF Range Sensor with Programmable Macro-Pixels and Integrated Histogram Processing for Low-Power AR/VR Applications

A 128×96 Multimodal Flash LiDAR SPAD Imager with Object Segmentation Latency of 18μs Based on Compute-Near-Sensor Ising Annealing Machine

A Fully Reconfigurable Hybrid SPAD Vision Sensor with 134dB Dynamic Range Using Time-Coded Dual Exposures

A 55nm Intelligent Vision SoC Achieving 346TOPS/W System Efficiency via Fully Analog Sensing-to-Inference Pipeline

A 1.09e⁻-Random-Noise 1.5μm-Pixel-Pitch 12MP Global-Shutter-Equivalent CMOS Image Sensor with 3μm Digital Pixels Using Quad-Phase-Staggered Zigzag Readout and Motion Compensation

A 200MP 0.61μm-Pixel-Pitch CMOS Imager with Sub-1e⁻ Readout Noise Using Interlaced-Shared Transistor Architecture and On-Chip Motion Artifact-Free HDR Synthesis for 8K Video Applications

Tuesday, November 25, 2025

Ubicept releases toolkit for SPAD and CIS

Ubicept Extends Availability of Perception Technology to Make Autonomous Systems Using Conventional Cameras More Reliable

Computer vision processing unlocks higher-quality, more trustworthy visual data for machines, whether they use advanced sensors from Pi Imaging Technology or conventional vision systems

BOSTON--(BUSINESS WIRE)--Ubicept, the computer vision startup operating at the limits of physics, today announced the release of the Ubicept Toolkit, which will bring its physics-based imaging to any modern vision system. Whether for single-photon avalanche diode (SPAD) sensors in next-generation vision systems or immediate image quality improvements with existing hardware, Ubicept provides a unified, physics-based approach that delivers high quality, trustworthy data.

“Ubicept’s technology revolutionizes how machines see the world by unlocking the full potential of today's and tomorrow's image sensors. Our physics-based approach captures the full complexity of motion, even in low-light or high-dynamic-range conditions, providing more trustworthy data than AI-based video enhancement,” said Sebastian Bauer, CEO of Ubicept. “With the Ubicept Toolkit, we’re now making our advanced single-photon imaging more accessible for a broad range of applications from robotics to automotive to industrial sensing.”

Ubicept’s solution is designed for the most advanced sensors to maximize image data quality and reliability. Now, the Toolkit will support any widely available CMOS camera with raw uncompressed output, giving perception developers immediate quality gains.

“Autonomous systems need a better way to understand the world. Our mission is to turn raw photon data into outputs that are specifically designed for computer vision, not human consumption,” said Tristan Swedish, CTO of Ubicept. “By making our technology available for more conventional vision systems, we are giving engineers the opportunity to experience the boost in reliability now while creating an easier pathway to SPAD sensor adoption.”

SPAD sensors – traditionally used in 3D systems – are poised to reshape the image sensor and computer vision landscape. While the CMOS sensor market is projected to grow to $30B by 2029 at 7.5% CAGR, the SPAD market is growing nearly three times faster, expected to reach $2.55B by 2029 at 20.1% CAGR.
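
For readers who want to sanity-check those projections, CAGR compounds as future = present × (1 + rate)^years. The quick sketch below assumes a 2024 base year (i.e., five years of growth to 2029); the base values are back-calculated from the release's projections, not quoted data.

```python
# Back-calculate the implied 2024 market sizes from the quoted 2029
# projections and CAGRs. The five-year horizon is an assumption.

def present_value(future_usd_b: float, cagr: float, years: int) -> float:
    """Invert future = present * (1 + cagr) ** years."""
    return future_usd_b / (1 + cagr) ** years

print(f"CMOS implied 2024 base: ~${present_value(30.0, 0.075, 5):.1f}B")  # ~$20.9B
print(f"SPAD implied 2024 base: ~${present_value(2.55, 0.201, 5):.2f}B")  # ~$1.02B
```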

Pi Imaging Technology is a leader in the field with its SPAD Alpha, a next-generation 1-megapixel single-photon camera that delivers zero read noise, nanosecond-level exposure control, and frame rates up to 73,000 fps. Designed for demanding scientific applications, it offers researchers and developers extreme temporal precision and light sensitivity. The Ubicept Toolkit builds on these strengths by transforming the SPAD Alpha’s raw photon data into clear, ready-to-use imagery for perception and analysis.

“Ubicept shares our deep commitment to advancing perception technology,” said Michel Antolović, CEO of Pi Imaging Technology. “By combining our SPAD Alpha’s state-of-the-art hardware with Ubicept’s real-time processing, perception engineers can get the most from what single-photon imaging has to offer.”

The Toolkit provides engineering teams with everything they need to visualize, capture, and process video data efficiently with the Ubicept Photon Fusion (UPF) algorithm. The SPAD Toolkit also includes Ubicept’s FLARE (Flexible Light Acquisition and Representation Engine) firmware for optimized photon capture. In addition, the Toolkit includes white-glove support for early adopters, providing a highly personalized and premium experience.

The Ubicept Toolkit will be available in December 2025. To learn how it can elevate perception performance and integrate into existing workflows, contact Ubicept.

Monday, November 24, 2025

Job Postings - Week of November 23 2025


ByteDance

Image Sensor Digital Design Lead- Pico

San Jose, California, USA

Link

STMicroelectronics

Silicon Photonics Product Development Engineer

Grenoble, France

Link

DigitalFish

Senior Systems Engineer, Cameras/Imaging

Sunnyvale, California, USA [Remote]

Link

Imasenic

Digital IC Design Engineer

Barcelona, Spain

Link

Meta

Technical Program Manager, Camera Systems

Sunnyvale, California, USA

Link

Westlake University

Ph.D. Positions in Dark Matter & Neutrino Experiments

Hangzhou, Zhejiang, China

Link

General Motors

Advanced Optical Sensor Test Engineer

Warren, Michigan, USA [Hybrid]

Link

INFN

Post-Doc senior research grant in experimental physics

Frascati, Italy

Link

Northrop Grumman

Staff EO/IR Portfolio Technical Lead

Melbourne, Florida, USA

Link

Friday, November 21, 2025

"Camemaker" image sensors search tool

An avid reader of the blog shared this handy little search tool for image sensors: 

https://www.camemaker.com/shop

Although it isn’t comprehensive (it only covers a few companies), you can filter by various sensor specs. Try it out!

Monday, November 17, 2025

Event cameras: applications and challenges

Gregor Lenz (roboticist and cofounder of Open Neuromorphic and Neurobus) has written a two-part blog post that readers of ISW might find enlightening:

https://lenzgregor.com/posts/event-cameras-2025-part1/

https://lenzgregor.com/posts/event-cameras-2025-part2/ 

Gregor surveys various application domains where event cameras have been tried but have faced challenges, technical and otherwise.

Wide adoption will depend less and less on technical merit and more on how well the new sensor modality fits into existing pipelines for X, where X can be supply chain, hardware, software, manufacturing, assembly, testing... pick your favorite!

Saturday, November 15, 2025

Conference List - May 2026

Quantum Photonics Conference, Networking and Trade Exhibition - 5-6 May 2026 - Erfurt, Germany - Website

Sensors Converge - 5-7 May 2026 - Santa Clara, California, USA -  Website

LOPS 2026 - 8-9 May 2026 - Chicago, Illinois, USA - Website

Embedded Vision Summit - 11-13 May 2026 - Santa Clara, California, USA - Website

CLEO - Conference on Lasers and Electro-Optics - 17-20 May 2026 - Charlotte, North Carolina, USA

IEEE International Symposium on Robotic and Sensors Environments - 18-19 May 2026 - Norfolk, Virginia, USA - Website

IEEE International Symposium on Integrated Circuits and Systems - 24-27 May 2026 - Shanghai, China - Website

ALLSENSORS 2026 - 24-28 May 2026 - Venice, Italy - Website

Robotics Summit and Expo - 27-28 May 2026 - Boston, Massachusetts, USA - Website


If you know about additional local conferences, please add them as comments.

Return to Conference List index

Thursday, November 13, 2025

Metalenz announces face ID solution

Metalenz and UMC Bring Breakthrough Face Authentication Solution Polar ID to Mass Production

Boston, MA and Hsinchu, TAIWAN, November 12, 2025 - Metalenz, the leader in metasurface innovation and commercialization, and United Microelectronics Corporation (“UMC”; NYSE: UMC, TWSE: 2303), a leading global semiconductor foundry, today announced that Metalenz’s breakthrough face authentication solution, Polar ID, is now ready for mass production through UMC.

Polar ID is a compact, polarization-based biometric solution that leverages Metalenz’s metasurface technology to bring payment-grade security and advanced sensing capabilities to any device, even the most challenging of form factors. Using a polarization-sensitive meta-optic and advanced algorithms, Polar ID extracts additional information, such as material and contour cues, to provide secure face authentication from a single image, dramatically reducing cost and complexity over existing secure face unlock solutions.
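
Metalenz has not published how Polar ID processes this data, so as context only, here is the textbook computation of polarization state from four polarization-filtered channels (0°, 45°, 90°, 135°), the kind of raw signal a polarization-sensitive optic can deliver. This is a generic sketch, not Metalenz’s algorithm.

```python
import numpy as np

# Generic Stokes-parameter math for a four-channel polarization image.
# Context only: not Metalenz's Polar ID algorithm, which is unpublished.

def stokes_dolp_aolp(i0, i45, i90, i135):
    """Per-pixel degree and angle of linear polarization from four channels."""
    s0 = 0.5 * (i0 + i45 + i90 + i135)                    # total intensity
    s1 = i0 - i90                                          # horizontal vs. vertical
    s2 = i45 - i135                                        # +45° vs. -45° diagonal
    dolp = np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-9)   # 0 = unpolarized light
    aolp = 0.5 * np.arctan2(s2, s1)                        # orientation, radians
    return dolp, aolp
```

Skin, screens, and printed photos reflect light with different polarization signatures, which is why this kind of extra per-pixel information can help distinguish a live face from a flat spoof in a single capture.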

Metalenz has already demonstrated the product, featuring a polarization-sensitive meta-optic directly integrated onto an image sensor, on a smartphone reference platform powered by Snapdragon® mobile processors. UMC manufactures the meta-optic layer using its 40nm process and achieves sensor integration utilizing its wafer-on-wafer bonding technology. Leveraging UMC’s 300mm wafer manufacturing capabilities, as well as the qualification of this supply chain, Metalenz is ready to ramp into volume, positioning Polar ID for widespread adoption across consumer electronics, mobile, and IoT platforms.

“By combining our metasurface innovation with UMC’s manufacturing scale and process maturity, Polar ID is ready to meet the demands of high-volume consumer electronics, and to bring secure, affordable face authentication to billions of devices,” said Rob Devlin, CEO and Co-Founder of Metalenz. “Metalenz is the critical enabler of the metasurface market. With the first generation of our technology already at work in the market replacing lens stacks in existing sensing solutions, we are now leveraging the unique capabilities of our technology to bring new forms of sensing to mass markets for the first time. With demand for secure and convenient biometrics rapidly expanding across consumer devices and IoT, Polar ID delivers secure face authentication in the smallest, simplest form factor, making advanced sensing accessible beyond premium tiers and in places it wasn’t previously possible.”

“Our state-of-the-art 12-inch facilities and comprehensive portfolio of semiconductor manufacturing process technologies have made us the foundry partner of choice for some of the most advanced fabless semiconductor companies in the world. We have worked with Metalenz on commercializing their metasurface technology since 2021, and we are pleased to be their key manufacturing partner to support the high-volume production of next-generation polarization imaging modules,” said Steven Hsu, Vice President of Technology Development, UMC. “This collaboration will enable UMC to expand our offering into sensor-integrated metasurfaces and play a pioneering role in delivering this disruptive imaging technology to market.”