Prophesee and Qualcomm recently showcased their "blur free" mobile photography technology at the Mobile World Congress in Barcelona.
February 27, 2024 – Paris, France - Prophesee SA, inventor of the most advanced neuromorphic vision systems, today announced that the progress achieved through its collaboration with Qualcomm Technologies, Inc. has now reached production stage. A live demo during Mobile World Congress Barcelona is showcasing Prophesee’s native compatibility with premium Snapdragon® mobile platforms, bringing the speed, efficiency, and quality of neuromorphic-enabled vision to cameras in mobile devices.
Prophesee’s event-based Metavision sensors and AI, optimized for use with Snapdragon platforms, now bring motion blur cancellation and overall image quality to unprecedented levels, especially in the scenarios most challenging for conventional frame-based RGB sensors: fast-moving and low-light scenes.
“We have made significant progress since we announced this collaboration in February 2023, achieving the technical milestones that demonstrate the impressive impact on image quality our event-based technology has in mobile devices containing Snapdragon mobile platforms. As a result, our Metavision Deblur solution has now reached production readiness,” said Luca Verre, CEO and co-founder of Prophesee. “We look forward to unleashing the next generation of smartphone photography and video with Prophesee's Metavision.”
“Qualcomm Technologies is thrilled to continue our strong collaboration with Prophesee, joining efforts to efficiently optimize Prophesee’s event-based Metavision technology for use with our flagship Snapdragon 8 Gen 3 Mobile Platform. This will deliver significant enhancements to image quality and bring new features enabled by event cameras’ shutter-free capability to devices powered by Snapdragon mobile platforms,” said Judd Heape, VP of Product Management at Qualcomm Technologies, Inc.
How it works
Prophesee’s breakthrough sensors add a new sensing dimension to mobile photography. They change the paradigm in traditional image capture by focusing only on changes in a scene, pixel by pixel, continuously, at extreme speeds.
Each pixel in the Metavision sensor embeds a logic core, enabling it to act as a neuron.
Each pixel activates itself intelligently and asynchronously depending on the number of photons it senses. A pixel activating itself is called an event. In essence, events are driven by the scene’s dynamics rather than an arbitrary clock, so the acquisition speed always matches the actual scene dynamics.
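For illustration, here is a minimal Python sketch of how such per-pixel, threshold-driven event generation can be modeled in software. The contrast threshold, function name, and dense-sampling discretization are illustrative assumptions, not details of Prophesee's sensor, which performs this comparison in analog circuitry at each pixel.

```python
# Minimal sketch of per-pixel event generation in an event-based sensor.
# Assumption (not from the press release): an event fires when the change in
# log intensity at a pixel exceeds a contrast threshold; the 0.25 value and
# all names are illustrative only.
import numpy as np

def generate_events(log_intensity_frames, timestamps, contrast_threshold=0.25):
    """Emit (t, x, y, polarity) events from a dense sequence of log-intensity frames.

    log_intensity_frames: array of shape (T, H, W), log pixel intensity over time
    timestamps: array of shape (T,), sample times in seconds
    """
    events = []
    # Per-pixel memory of the log intensity at the last emitted event
    reference = log_intensity_frames[0].copy()
    for t_idx in range(1, len(timestamps)):
        delta = log_intensity_frames[t_idx] - reference
        # Pixels whose change crossed the threshold fire, independently of any frame clock
        ys, xs = np.nonzero(np.abs(delta) >= contrast_threshold)
        for y, x in zip(ys, xs):
            polarity = 1 if delta[y, x] > 0 else -1
            events.append((timestamps[t_idx], x, y, polarity))
            reference[y, x] = log_intensity_frames[t_idx, y, x]  # reset the reference
    return events
```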
High-performance event-based deblurring is achieved by synchronizing a frame-based sensor with Prophesee’s event-based sensor. The system then fills the gaps between and inside the frames with microsecond events to algorithmically extract pure motion information and repair motion blur.
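As a rough illustration of how events recorded during the exposure can repair a blurry frame, the sketch below follows the event-based double integral idea from the research literature (Pan et al., CVPR 2019): the blurry frame is treated as the time average of a latent sharp image modulated by the log-intensity changes encoded in the events. This is only a simplified sketch under those assumptions, not Prophesee's production Metavision Deblur algorithm; the function name, threshold, and binning are illustrative.

```python
# Sketch of event-assisted deblurring (event-based double integral style).
# Not Prophesee's production algorithm; parameters are illustrative assumptions.
import numpy as np

def deblur_with_events(blurry_frame, events, exposure_start, exposure_end,
                       contrast_threshold=0.25, num_bins=50):
    """Recover a sharper latent image at exposure_start from a blurry frame.

    blurry_frame: (H, W) intensity image, modeled as the average of the latent
                  image over the exposure window
    events: iterable of (t, x, y, polarity) with t inside [exposure_start, exposure_end]
    """
    H, W = blurry_frame.shape
    bin_edges = np.linspace(exposure_start, exposure_end, num_bins + 1)

    # Accumulate signed event counts per pixel per time bin
    polarity_sum = np.zeros((num_bins, H, W))
    for t, x, y, p in events:
        b = int(np.clip(np.searchsorted(bin_edges, t) - 1, 0, num_bins - 1))
        polarity_sum[b, y, x] += p

    # E(t): cumulative log-intensity change since the start of the exposure
    cumulative = np.cumsum(polarity_sum, axis=0) * contrast_threshold

    # Blurry frame = latent image * mean(exp(E(t))) over the exposure,
    # so dividing by that per-pixel factor recovers the latent image
    blur_factor = np.exp(cumulative).mean(axis=0)
    latent = blurry_frame / np.maximum(blur_factor, 1e-6)
    return np.clip(latent, 0.0, None)
```

In this formulation the events act as a per-pixel record of how the scene moved during the exposure, which is exactly the information a single long exposure destroys.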
Learn more: https://www.prophesee.ai/event-based-vision-mobile/
Interesting. What's the cost?
What light level is needed for the "... with microsecond events to algorithmically extract ..." step? If the light level is that strong, why is the image from the main camera blurred?
The pixels of the event camera and of the smartphone's sensor are different. The event camera continuously monitors per-pixel variations above or below a predefined analog threshold; each crossing triggers a new event of known polarity, which is essentially binary information. Only a few photons are needed for this, which is why a few microseconds are enough for event detection and generation. Meanwhile, the traditional sensor integrates photons over a longer exposure time, providing good image quality but suffering from blur. The event information captures all the motion details (tracking of edges) and is used to remove the blur from the images coming from the traditional sensor.
Alpsentek has announced something similar?
Useful stuff.
It will be interesting to see whether the concept of event-based deblurring gets traction. Any OEM will need to benchmark this against restoring images and videos using emerging generative AI techniques, which are becoming practical on mobile SoCs.
They are understandably coy about the pixel size or pixel count. Based on the vendor product page, this was likely a 4.86 µm, 1280x720 imager (https://www.prophesee.ai/event-based-sensors/), which is hardly a winning proposition for mobile photography. Prophesee should be respected for making the effort; getting this architecture to 1280x720 is a meaningful improvement that may open new doors. It is an interesting technology that has been looking for a killer use case for decades. Mobile photography probably isn't it, and betting too much on it may get them to the same place where other elegant and promising technologies ended up.