Friday, September 26, 2025
ISSW 2026 call for papers
Tuesday, September 16, 2025
Conference List - March 2026
If you know about additional local conferences, please add them as comments.
Return to Conference List index
Happening Today: Swiss Photonics Lunch Chat
Sunday, September 14, 2025
Job Postings - Week of 14 September 2025
Amazon | ASIC SoC Manager, Amazon Camera ASIC Team | Sunnyvale, California, USA
GlobalFoundries | Principal Engineer Device Engineering (FRCMOS): SPAD/CIS Development | Singapore
Vital Chemicals | IC Designer | Cupertino, California, USA
Sony | European Graduate Program - Image Sensor Designer | Oslo, Norway
Qualcomm | Camera AF/EIS Algorithm System Engineer | Shanghai, China
Teledyne FLIR | Sr. Principal ASIC/Analog Design Engineer | Goleta, California, USA
HEPHY | PhD Student, Semiconductor Detector Development and Readout Electronics | Vienna, Austria
Fairchild Imaging | Senior Quality Engineer | San Jose, California, USA
Johannes Gutenberg University | Particle Detectors for Future Experiments from Concept to Operation | Mainz, Germany
Friday, September 12, 2025
Image sensors workshop at IEEE Sensors 2025
Wednesday, September 10, 2025
International Image Sensor Workshop (IISW) 2025 proceedings available
Each article has also been assigned a DOI for easy future referencing, just like all other papers published by the IISS since 2007.
Thank you to all the organizers and volunteers who made this workshop possible!
Monday, September 08, 2025
VoxelSensors Qualcomm collab
VoxelSensors to Advance Next-Generation Depth Sensing Technology with 10x Power Savings for XR Applications
Brussels, Aug. 28, 2025 (GLOBE NEWSWIRE) -- VoxelSensors, a company developing novel intelligent sensing and data insights technology for Physical AI, today announced a collaboration with Qualcomm Technologies, Inc. to jointly optimize VoxelSensors’ sensing technology with Snapdragon® XR Platforms.
Technology & Industry Challenges
VoxelSensors has developed Single Photon Active Event Sensor (SPAES™) 3D sensing, a breakthrough technology that addresses critical depth-sensing performance limitations for robotics and XR. The SPAES™ architecture delivers 10x power savings and lower latency while maintaining robust performance across varied lighting conditions. This innovation is set to enable machines to understand both the physical world and human behavior from the user's point of view, advancing Physical AI.
Physical AI processes data from human perspectives to learn about the world around us, predict needs, create personalized agents, and adapt continuously through user-centered learning. This enables new and exciting applications that were previously unattainable. At the same time, Physical AI pushes the boundaries of operation into wider environments that pose challenging conditions such as variable lighting and power constraints.
VoxelSensors' technology addresses both challenges: it expands the operating limits of current-day sensors while collecting human point-of-view data to better train Physical AI models. Overcoming these challenges will define the future of human-machine interaction.
Collaboration
VoxelSensors is working with Qualcomm Technologies to jointly optimize VoxelSensors’ SPAES™ 3D sensing technology with Snapdragon AR2 Gen 1 Platform, allowing a low-latency and flexible 3D active event data stream. The optimized solution will be available to select customers and partners by December 2025.
“We are pleased to collaborate with Qualcomm Technologies,” said Johannes Peeters, CEO of VoxelSensors. “After five years of developing our technology, we see our vision being realized through optimizations with Snapdragon XR Platforms. With our sensors that are ideally suited for next-generation 3D sensing and eye-tracking systems, and our inference engine for capturing users’ egocentric data, we see great potential in enabling truly personal AI agent interactions only available on XR devices.”
“For the XR industry to expand, Qualcomm Technologies is committed to enabling smaller, faster, and more power-efficient devices,” said Ziad Asghar, SVP & GM of XR at Qualcomm Technologies, Inc. “We see great potential for small, lightweight AR smart glasses that consumers can wear all day. VoxelSensors’ technology offers the potential to deliver higher performance rates with significantly lower power consumption, which is needed to achieve this vision.”
Market Impact and Future Outlook
As VoxelSensors continues to miniaturize their technology, the integration into commercial products is expected to significantly enhance the value proposition of next-generation XR offerings. Collaborating with Qualcomm Technologies, a leader in XR chipsets, emphasizes VoxelSensors’ commitment to fostering innovation to advance the entire XR ecosystem, bringing the industry closer to mainstream adoption of all-day wearable AR devices.
Friday, September 05, 2025
SMPTE awards Dr. Peter Centen
https://www.smpte.org/about/awards-programs/camera-winners
2025 - Dr. Peter G. M. Centen
For pioneering innovations in image sensor technology that transformed electronic cinematography and broadcast imaging. Over a career spanning more than four decades, Dr. Centen played a pivotal role in the industry’s transition from CCD to CMOS image sensors, serving as chief architect of the Xensium family that enabled HD, 4K, HDR, and HFR imaging. During the transitions from SD to HD, narrow-screen to widescreen, and film to digital cinematography, his development of Dynamic Pixel Management—a groundbreaking sub-pixel-control technology—allowed a single sensor to support multiple resolutions and aspect ratios, including ultra-wide formats (~2.4:1), without compromise. This innovation, first implemented in the Viper FilmStream camera, eliminated the need for format-specific imaging systems and laid the foundation for today’s flexible, high-performance camera designs.
The Camera Origination and Imaging Medal, established in 2012, recognizes significant technical achievements related to inventions or advances in imaging technology, including sensors, image processing electronics, and the overall embodiment and application of image capture devices.
Wednesday, September 03, 2025
Prophesee announces GenX320 starter kits for Raspberry Pi
Prophesee Brings Event-Based Vision to Raspberry Pi 5 with GenX320 Starter Kit
New starter kit provides developers an efficient, cost-effective way to leverage low-power, high-speed neuromorphic vision for IoT, drones, robotics, security and surveillance, using one of the world's most popular embedded development platforms
PARIS, Aug 26, 2025
Prophesee, the inventor and leader in event-based neuromorphic vision systems, today announces the launch of the GenX320 Starter Kit for Raspberry Pi® 5, making its breakthrough frameless sensing technology available to the Raspberry Pi developer community for the first time. Built around Prophesee’s ultra-compact, ultra-efficient GenX320 event-based vision sensor, the kit connects directly to the Raspberry Pi 5 camera connector to allow development of real-time applications that leverage the advantages of event-based vision for drones, robotics, industrial automation, surveillance, and more.
The kit provides efficient, cost-effective and easy-to-use access to developing solutions based on Prophesee's advanced Metavision® event-based vision platform, through the company's OpenEB, the open-source core of its award-winning Metavision SDK. The Raspberry Pi ecosystem is one of the largest and most active hardware communities in the world, with more than 60 million units sold and millions of developers engaged across open-source and maker platforms.
Event-based vision is a paradigm shift from traditional frame-based approaches. Rather than capturing entire images at once, it detects changes in brightness, known as "events," at each pixel. This makes sensors much faster (responding in microseconds), lets them operate with far less data and processing power, and makes them more power-efficient than traditional sensors.
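To make the contrast with frame-based capture concrete, here is a toy Python sketch (not Prophesee's code) that derives per-pixel "events" from the brightness change between two consecutive frames. The log transform, contrast threshold, and (x, y, polarity, timestamp) layout are illustrative assumptions; a real event sensor generates events asynchronously inside each pixel rather than by differencing frames.

```python
import numpy as np

def frames_to_events(prev_frame, curr_frame, threshold=0.15, t_us=0):
    """Toy event generation: report pixels whose log-brightness changed
    by more than `threshold` between two frames, with ON/OFF polarity."""
    prev = np.log1p(prev_frame.astype(np.float32))
    curr = np.log1p(curr_frame.astype(np.float32))
    delta = curr - prev
    ys, xs = np.nonzero(np.abs(delta) > threshold)
    polarity = (delta[ys, xs] > 0).astype(np.int8)  # 1 = brighter (ON), 0 = darker (OFF)
    return [(int(x), int(y), int(p), t_us) for x, y, p in zip(xs, ys, polarity)]

# A static scene produces no events; only the changing pixel does.
prev = np.zeros((4, 4), dtype=np.uint8)
curr = prev.copy()
curr[1, 2] = 255  # a bright spot appears
print(frames_to_events(prev, curr, t_us=1000))  # -> [(2, 1, 1, 1000)]
```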
The kit is purpose-built to enable real-world, real-time applications where traditional frame-based vision struggles:
- Drones & Robotics: Obstacle avoidance, drone-to-drone tracking, real-time SLAM
- Industrial IoT: 3D scanning, defect detection, and predictive maintenance
- Surveillance & Safety: Intrusion detection, fall detection, and motion analytics
ABOUT THE KIT
The GenX320 Starter Kit is built around the Prophesee GenX320 sensor, the smallest and most power-efficient event-based vision sensor available. With a 320×320 resolution, >140 dB dynamic range, event rate equivalent to ~10,000 fps, and sub-millisecond latency, the sensor provides the performance needed for demanding real-time applications on an embedded platform.
Key Features:
- Compact event-based camera module with MIPI CSI-2 interface
- Native integration with Raspberry Pi 5 (board sold separately)
- Power-efficient operation (<50 mW sensor-only consumption)
- OpenEB support with Python and C++ APIs (see the minimal read-loop sketch after the Software Resources list below)
Software Resources:
- Developers will be able to access drivers, data recording, replay and visualization tools on GitHub.
- Access to the Prophesee Knowledge Center, a centralized location for users to access various resources, including: a download repository, user guides, and FAQs; a community forum to share ideas; a support ticket system; and additional resources such as application notes, product manuals, training videos, and more than 200 academic papers.
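For orientation, the loop below is a minimal sketch of how an event stream is typically consumed through the OpenEB / Metavision Python bindings, assuming the EventsIterator interface documented in recent SDK releases; the file path is a placeholder, and an empty input_path is assumed to open a live camera instead.

```python
from metavision_core.event_io import EventsIterator  # OpenEB Python bindings

# Placeholder path to a RAW recording; "" is assumed to open a connected camera.
mv_iterator = EventsIterator(input_path="recording.raw", delta_t=10000)  # 10 ms slices
height, width = mv_iterator.get_size()

for evs in mv_iterator:
    # Each slice is a numpy structured array with 'x', 'y', 'p' (polarity) and 't' (µs) fields.
    if evs.size > 0:
        print(f"{evs.size} events, t = {evs['t'][0]}..{evs['t'][-1]} µs on {width}x{height} sensor")
```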
AVAILABILITY
The Prophesee GenX320 Starter Kit for Raspberry Pi 5 is available for pre-order starting August 26, 2025, through Prophesee’s website and authorized distributors. For more information or to order, visit: www.prophesee.ai/event-based-starter-kit-genx320-raspberry-pi-5/
Monday, September 01, 2025
Galaxycore 50MP 0.61um CIS
Translated from Baidu news: https://baijiahao-baidu-com.translate.goog/s?id=1839605263838551524&wfr=spider&for=pc&_x_tr_sl=zh-CN&_x_tr_tl=de&_x_tr_hl=de&_x_tr_pto=wapp
Gelonghui, August 5th | GalaxyCore (688728.SH) announced that it has recently achieved mass production and shipment of its 0.61-micron 50-megapixel image sensor. This product, the world's first single-chip 0.61-micron-pixel image sensor, is based on the company's unique Galaxy Cell 2.0 process platform and is manufactured in the company's own wafer fab, significantly improving small-pixel performance. The sensor uses a 1/2.88-inch optical format, reducing camera module thickness and making it widely applicable to smartphone rear main cameras, ultra-wide-angle cameras, and front-facing cameras. Furthermore, it integrates single-frame high dynamic range (DAGHDR) technology, achieving wider dynamic range coverage in a single exposure and effectively addressing overexposure and underexposure in backlit scenes. It also supports phase-detection autofocus (PDAF), ensuring a fast and accurate shooting experience.
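The announcement does not detail how DAGHDR works internally, but the general single-exposure dual-gain HDR idea can be sketched as follows: the same exposure is read out through a high-gain and a low-gain path, and the merge keeps the cleaner high-gain signal in the shadows while falling back to the rescaled low-gain signal where the high-gain path clips. The gain ratio and saturation threshold below are illustrative assumptions, not GalaxyCore's parameters.

```python
import numpy as np

def merge_dual_gain(high_gain, low_gain, gain_ratio=4.0, sat_level=0.9):
    """Toy single-exposure dual-gain HDR merge (illustrative only).

    high_gain, low_gain: the same exposure read out at two analog gains,
    each normalized to [0, 1]. Output is in high-gain units: shadows come
    from the high-gain readout, clipped highlights from the rescaled
    low-gain readout.
    """
    high = high_gain.astype(np.float32)
    low = low_gain.astype(np.float32) * gain_ratio  # rescale low-gain path to match
    keep_high = high < sat_level                    # trust high gain where it is not clipped
    return np.where(keep_high, high, low)
```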
The company's 0.61-micron 50-megapixel image sensor has entered mass production and shipment and has been adopted for the rear main cameras of branded smartphones. This marks further market recognition of the company's innovative high-pixel single-chip integration technology and demonstrates the efficiency of its Fab-Lite model. To date, the company has brought 0.7-micron, 1.0-micron, and 0.61-micron 50-megapixel image sensors based on single-chip integration technology into mass production. Building on this technology platform, the company will further enhance the performance of high-pixel products such as its 32-megapixel and 50-megapixel sensors, and will also launch products exceeding 100 megapixels, continuously strengthening its core competitiveness, increasing market share, and extending its leading position.