Friday, June 20, 2025

Sony IMX479 520-pixel SPAD LiDAR sensor

Press release: https://www.sony-semicon.com/en/news/2025/2025061001.html

Sony Semiconductor Solutions to Release Stacked SPAD Depth Sensor for Automotive LiDAR Applications, Delivering High-Resolution, High-Speed Performance

High-resolution, high-speed distance measuring performance contributes to safer, more reliable future mobility

Atsugi, Japan — Sony Semiconductor Solutions Corporation (SSS) today announced the upcoming release of the IMX479 stacked, direct Time of Flight (dToF) SPAD depth sensor for automotive LiDAR systems, delivering both high-resolution and high-speed performance.

The new sensor employs a dToF pixel unit composed of 3×3 (horizontal × vertical) SPAD pixels as the minimum element to enhance measurement accuracy using a line-scan methodology. In addition, SSS's proprietary device structure enables a frame rate of up to 20 fps, the fastest for a high-resolution SPAD depth sensor with 520 dToF pixels.

The new product delivers the high-resolution, high-speed distance measuring performance demanded of automotive LiDAR in advanced driver assistance systems (ADAS) and automated driving (AD), contributing to safer and more reliable future mobility.

LiDAR technology is crucial for the high-precision detection and recognition of road conditions and of the position and shape of objects such as vehicles and pedestrians. Demand is growing for further technical advancement in LiDAR toward Level 3 automated driving, which allows for autonomous control. SPAD depth sensors use the dToF measurement method, one of the LiDAR ranging methods, which measures the distance to an object by detecting the time of flight (time difference) of light emitted from a source until it returns to the sensor after being reflected by the object.
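The dToF principle described above reduces to a one-line relation: the measured round-trip time maps to distance via d = c·t/2. A minimal illustrative sketch (not Sony code; the function name is ours):

```python
# Illustrative sketch of direct time-of-flight (dToF) ranging:
# distance is half the round-trip path of the light pulse, d = c * t / 2.

C = 299_792_458.0  # speed of light, m/s

def dtof_distance_m(time_of_flight_s: float) -> float:
    """Distance to a target from the measured round-trip time of a light pulse."""
    return C * time_of_flight_s / 2.0

# A pulse returning after ~2 microseconds corresponds to a target near 300 m:
print(round(dtof_distance_m(2.0e-6), 1))  # 299.8
```

The ~2 µs figure shows why long-range automotive LiDAR frames are timing-bounded: every 300 m of range costs 2 µs of listening time per pulse.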

The new sensor harnesses SSS’s proprietary technologies acquired in the development of CMOS image sensors, including the back-side illuminated, stacked structure and Cu-Cu (copper-copper) connections. By integrating the newly developed distance measurement circuits and dToF pixels on a single chip, the new product has achieved a high-speed frame rate of up to 20 fps while delivering a high resolution of 520 dToF pixels with a small pixel size of 10 μm square.

Main Features
■ Up to 20 fps frame rate, the fastest for a 520 dToF pixel SPAD depth sensor

This product stacks a pixel chip (top) with back-illuminated dToF pixels on a logic chip (bottom) equipped with newly developed distance measurement circuits, joined by Cu-Cu connections into a single chip. This design enables a small pixel size of 10 μm square, achieving a high resolution of 520 dToF pixels. The new distance measurement circuits handle multiple processes in parallel for even faster processing.

These technologies achieve a frame rate of up to 20 fps, the fastest for a 520 dToF pixel SPAD depth sensor. They also deliver a vertical angular resolution equivalent to 0.05 degrees, improving vertical detection accuracy to 2.7 times that of conventional products. These elements allow detection of the three-dimensional objects that are vital to automotive LiDAR, including objects as small as 25 cm tall (such as a tire or other debris in the road) at a distance of 250 m.
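A quick geometry check (our back-of-the-envelope, not from the press release) shows why a 0.05-degree vertical resolution supports detecting a 25 cm object at 250 m: one resolution element subtends slightly less than the object's height at that range.

```python
import math

# Height subtended by one 0.05-degree vertical resolution element at 250 m.
angular_res_deg = 0.05
range_m = 250.0

subtended_height_m = range_m * math.tan(math.radians(angular_res_deg))
print(round(subtended_height_m * 100, 1))  # 21.8 (cm)
```

At ~22 cm per resolution element, a 25 cm object always spans at least one full vertical sample at 250 m.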

■ Excellent distance resolution of 5 cm intervals
The proprietary circuits SSS developed to enhance the distance resolution of this product individually process the data of each SPAD pixel and calculate the distance. Doing so improved the LiDAR distance resolution to 5 cm intervals.
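In dToF terms, a distance resolution translates directly into a round-trip timing resolution via Δt = 2·Δd/c. A back-of-the-envelope check (ours, not from the release) of what 5 cm implies:

```python
# Round-trip timing resolution implied by a 5 cm dToF distance resolution:
# dt = 2 * dd / c (the factor 2 accounts for the out-and-back path).

C = 299_792_458.0  # speed of light, m/s
distance_res_m = 0.05

timing_res_s = 2.0 * distance_res_m / C
print(round(timing_res_s * 1e12))  # 334 (picoseconds)
```

So the on-chip measurement circuits must resolve arrival times on the order of a few hundred picoseconds.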

■ High photon detection efficiency of 37%, enabling detection of objects up to 300 m away
This product features an uneven texture on both the incident plane and the bottom of the pixels, along with an optimized on-chip lens shape. Incident light is diffracted to enhance the absorption rate, achieving a high photon detection efficiency of 37% at the 940 nm wavelength commonly used in automotive LiDAR laser light sources. This allows the system to detect and recognize objects with high precision up to 300 m away, even in bright conditions where the background light is 100,000 lux or higher.

 


 


Wednesday, June 18, 2025

Artilux and VisEra metalens collaboration

News release: https://www.artiluxtech.com/resources/news/1023

Artilux, the leader of GeSi (germanium-silicon) photonics technology and pioneer of CMOS (complementary metal-oxide-semiconductor) based SWIR (short-wavelength infrared) optical sensing, imaging and communication, today announced its collaboration with VisEra Technologies (TWSE: 6789) on the latest Metalens technology. The newly unveiled Metalens technology differs from traditional curved lens designs by directly fabricating, on a 12” silicon substrate, fully-planar and high-precision nanostructures for precise control of light waves. By synergizing Artilux’s core GeSi technology with VisEra’s advanced processing capabilities, the demonstrated mass-production-ready Metalens technology significantly enhances optical system performance, production efficiency and yield. This cutting-edge technology is versatile and can be broadly applied in areas such as optical sensing, optical imaging, optical communication, and AI-driven commercial applications.

Scaling the Future: Opportunities and Breakthroughs in Metalens Technology
With the rise of artificial intelligence, robotics, and silicon photonics applications, silicon chips implemented for optical sensing, imaging, and communication are set to play a pivotal role in advancing these industries. As an example, smartphones and wearables having built-in image sensing, physiological signal monitoring, and AI-assistant capabilities will become increasingly prevalent. Moreover, with high bandwidth, long reach, and power efficiency advantages, silicon photonics is poised to become a critical component for supporting future AI model training and inference in AI data centers. As hardware designs require greater miniaturization at the chip-level, silicon-based “Metalens” technology will lead and accelerate the deployment of these applications.

Metalens technology offers the benefits of single-wafer process integration and compact optical module design, paving the way for silicon chips to gain growth momentum in the optical field. According to Valuates Reports, the global market for Metalens was valued at US$41.8 million in 2024 and is projected to reach a revised size of US$2.4 billion by 2031, growing at a CAGR of up to 80% during the 2025-2031 forecast period.
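The quoted growth rate can be sanity-checked from the two market figures (this is our arithmetic on the cited numbers, not part of the release): US$41.8M in 2024 to US$2.4B in 2031 is a 7-year span.

```python
# Implied compound annual growth rate (CAGR) from the cited market figures:
# CAGR = (end / start) ** (1 / years) - 1
start_usd, end_usd, years = 41.8e6, 2.4e9, 7

cagr = (end_usd / start_usd) ** (1 / years) - 1
print(f"{cagr:.0%}")  # 78%
```

An implied ~78% CAGR is consistent with the release's "up to 80%" figure.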

Currently, most optical systems rely on traditional optical lenses, which utilize parabolic or spherical surface structures to focus light and control its amplitude, phase, and polarization properties. However, this approach is constrained by physical limitations, and requires precise mechanical alignment. Additionally, the curved designs of complex optical components demand highly accurate coating and lens-formation processes. These challenges make it difficult to achieve wafer-level integration with CMOS-based semiconductor processes and optical sensors, posing a significant hurdle to the miniaturization and integration of optical systems.

Innovative GeSi and SWIR Sensing Technology Set to Drive Application Deployment via Ultra-Thin Optical Modules
Meta-Surface technology is redefining optical innovation by replacing traditional curved microlenses with ultra-thin, fully planar optical components. This advancement significantly reduces chip size and thickness, increases design freedom for optical modules, minimizes signal interference, and enables precise optical wavefront control. Unlike the emitter-end DOE (Diffraction Optical Element) technology, Artilux’s innovative Metalens technology directly manufactures silicon-based nanostructures on 12” silicon substrates with ultra-high precision. By seamlessly integrating CMOS processes and core GeSi technology on a silicon wafer, this pioneering work enhances production efficiency and yield rates, supporting SWIR wavelengths. With increased optical coupling efficiency, this technology offers versatile solutions for AI applications in optical sensing, imaging, and communication, catering to a wide range of industries such as wearables, biomedical, LiDAR, mixed reality, aerospace, and defense.

Neil Na, Co-Founder and Chief Technology Officer of Artilux, stated, "Artilux has gained international recognition for its innovations in semiconductor technology. We are delighted to once again share our independently designed Meta-Surface solution, integrating VisEra's leading expertise in 12” wafer-level optical manufacturing processes. This collaboration successfully creates ultra-thin optical components that can precisely control light waves, and enables applications across SWIR wavelengths for optical sensing, optical imaging, optical communication and artificial intelligence. We believe this technology not only holds groundbreaking value in the optical field but will also accelerate the development and realization of next-generation optical technologies."

JC Hsieh, Vice President in Research and Development Organization of VisEra, emphasized, "At VisEra, we continuously engage in global CMOS imaging and optical sensor industry developments while utilizing our semiconductor manufacturing strengths and key technologies R&D and partnerships to enhance productivity and efficiency. We are pleased that our business partner, Artilux, has incorporated VisEra’s silicon-based Metalens process technology to advance micro-optical elements integration. This collaboration allows us to break through conventional form factor limitations in design and manufacturing. We look forward to our collaboration driving more innovative applications in the optical sensing industry and accelerating the adoption of Metalens technology."

Metalens technology demonstrates critical potential in industries related to silicon photonics, particularly in enabling miniaturization, improved integration, and enhanced performance of optical components. As advancements in materials and manufacturing processes continue to refine the technology, many existing challenges are gradually being overcome. Looking ahead, Metalenses are expected to become standard optical components in silicon photonics and sensing applications, driving the next wave of innovation in optical chips and expanding market opportunities.

 

Monday, June 16, 2025

Zaber application note on image sensors for microscopy

Full article link: https://www.zaber.com/articles/machine-vision-cameras-in-automated-microscopy

When to Use Machine Vision Cameras in Microscopy
Situation #1: High Throughput Microscopy Applications with Automated Image Analysis Software
Machine vision cameras are ideally suited to applications which require high throughput, are not limited by low light, and where a human will not look at the raw data. Designers of systems where the acquisition and analysis of images will be automated must change their perspective of what makes a “good” image. Rather than optimizing for images that look good to humans, the goal should be to capture the “worst” quality images which can still yield unambiguous results as quickly as possible when analyzed by software. If you are using “AI”, a machine vision camera is worth considering.
A common example is imaging consumables to which fluorescent markers will hybridize to specific sites. To read these consumables, one must check each possible hybridization site for the presence or absence of a fluorescent signal.
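The presence/absence read-out described above amounts to comparing the signal at each known site against a decision threshold. A minimal sketch (the function name, ROI size, and threshold are all hypothetical, not from the Zaber article):

```python
import numpy as np

# Hypothetical read-out of fluorescent hybridization sites: for each known
# site location, average a small ROI and call the site "present" if the mean
# intensity clears a fixed threshold.

def read_sites(image: np.ndarray, sites: list[tuple[int, int]],
               roi: int = 3, threshold: float = 50.0) -> list[bool]:
    """Return True for each (row, col) site whose ROI mean exceeds threshold."""
    half = roi // 2
    calls = []
    for r, c in sites:
        patch = image[max(r - half, 0):r + half + 1,
                      max(c - half, 0):c + half + 1]
        calls.append(float(patch.mean()) > threshold)
    return calls

# Synthetic frame: dark background with one bright fluorescent spot.
img = np.zeros((16, 16))
img[4:7, 4:7] = 200.0
print(read_sites(img, [(5, 5), (12, 12)]))  # [True, False]
```

This illustrates the "worst acceptable image" point: the exposure only needs to separate the two intensity populations enough for an unambiguous threshold, not to look good to a human.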

Situation #2: When a Small Footprint is Important
The small size, integration-friendly features, and cost effectiveness of machine vision cameras make them an attractive option for OEM devices where minimizing the device footprint and retail price are important considerations.

How are machine vision cameras different from scientific cameras?
The distinction between machine vision and scientific cameras is not as clear as it once was. The term “Scientific CMOS” (sCMOS) was introduced in the mid-2010s as advancements in CMOS image sensor technology led to the development of the first CMOS image sensor cameras that could challenge the performance of then-dominant CCD image sensor technology. These new “sCMOS” sensors delivered improved performance relative to the CMOS sensors that were prevalent in MV cameras of the time. Since then, thanks to the rapid pace of CMOS image sensor development, the current generation of MV-oriented CMOS sensors boasts impressive performance. There are now many scientific cameras with MV sensors, and many MV cameras with scientific sensors.

 




Sunday, June 15, 2025

Conference List - December 2025

18th International Conference on Sensing Technology (ICST2025) - 1-3 December 2025 - Utsunomiya City, Japan - Website

International Technical Exhibition on Image Technology and Equipment (ITE) - 3-5 December 2025 - Yokohama, Japan - Website

7th International Workshop on New Photon-Detectors (PD2025) - 3-5 December 2025 - Bologna, Italy - Website

IEEE International Electron Devices Meeting - 6-10 December 2025 - San Francisco, CA, USA - Website


If you know about additional local conferences, please add them as comments.

Return to Conference List index

Friday, June 13, 2025

Videos of the day: UArizona and KAIST

 

UArizona Imaging Technology Laboratory's sensor processing capabilities

 


KAIST: Design parameters of freeform color splitters for image sensors

Thursday, June 12, 2025

Panasonic single-photon vertical APD pixel design

In a paper titled "Robust Pixel Design Methodologies for a Vertical Avalanche Photodiode (VAPD)-Based CMOS Image Sensor" Inoue et al. from Panasonic Japan write:

We present robust pixel design methodologies for a vertical avalanche photodiode-based CMOS image sensor, taking account of three critical practical factors: (i) “guard-ring-free” pixel isolation layout, (ii) device characteristics “insensitive” to applied voltage and temperature, and (iii) stable operation subject to intense light exposure. The “guard-ring-free” pixel design is established by resolving the tradeoff relationship between electric field concentration and pixel isolation. The effectiveness of the optimization strategy is validated both by simulation and experiment. To realize insensitivity to voltage and temperature variations, a global feedback resistor is shown to effectively suppress variations in device characteristics such as photon detection efficiency and dark count rate. An in-pixel overflow transistor is also introduced to enhance the resistance to strong illumination. The robustness of the fabricated VAPD-CIS is verified by characterization of 122 different chips and through a high-temperature and intense-light-illumination operation test with 5 chips, conducted at 125 °C for 1000 h subject to 940 nm light exposure equivalent to 10 kLux. 

 

Open access link to full paper: https://www.mdpi.com/1424-8220/24/16/5414

Cross-sectional views of a pixel: (a) a conventional SPAD and (b) a VAPD-CIS. N-type and P-type regions are drawn in blue and red, respectively.
 

(a) A chip photograph of VAPD-CIS overlaid with circuit block diagrams. (b) A circuit diagram of the VAPD pixel array. (c) A schematic timing diagram of the pixel circuit illustrated in (b).
 
(a) An illustrative time-lapse image of the sun. (b) Actual images of the sun taken at each time after starting the experiment. The test lasted for three hours, and as time passed, the sun, initially visible on the left edge of the screen, moved to the right.

Monday, June 09, 2025

Image Sensor Opening at Apple in Japan

Apple Japan

Image Sensor Technical Program Manager - Minato, Tokyo-to, Japan - Link

Friday, June 06, 2025