
Wednesday, December 18, 2019

IEDM 2019: Samsung Presents its Event-Based Sensor

Samsung presented a paper "Low-Latency Interactive Sensing for Machine Vision" by Paul K. J. Park, Jun-Seok Kim, Chang-Woo Shin, Hyunku Lee, Weiheng Liu, Qiang Wang, Yohan Roh, Jeonghan Kim, Yotam Ater, Evgeny Soloveichik, and Hyunsurk Eric Ryu at IEDM last week.

"In this paper, we introduce the low-latency interactive sensing and processing solution for machine vision applications. The event-based vision sensor can compress the information of moving objects in a costeffective way, which in turn, enables the energy-efficient and real-time processing in various applications such as person detection, motion recognition, and Simultaneous Localization and Mapping (SLAM). Our results show that the proposed technique can achieve superior performance than conventional methods in terms of accuracy and latency.

For this, we had previously proposed a 640x480 VGA-resolution DVS with a 9-um pixel pitch supporting a data rate of 300Meps by employing a fully synthesized word-serial group address-event representation (G-AER), which handles massive events in parallel by binding 8 neighboring pixels into a group [3]. The chip consumes a total of only 27mW at a data rate of 100Keps and 50mW at 300Meps."
