Still a lot of people don't know how to do (A-B) in image algebra and light algebra :)
The same people won't understand this comment.
A lot more people do not understand the potential of neuromorphics, meaning sparse, event-driven, asynchronous semiconductor devices. This is true for computing and sensing, and it goes way beyond simple algebra at every frame, i.e. roughly every 30 ms for most image sensors. Interacting with this technology will mean playing with time and energy, and with events generated down to 1 ns increments, the potential of event-driven sensing and computing is tremendous.
I think it's you who lacks some understanding. Doing (A-B) with frames from a normal image sensor is not the same. Event-driven sensors capture the changes in light asynchronously and timestamp them per pixel. @emiliano explains it pretty well.
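To make the distinction concrete, here is a minimal sketch in Python. The function names, thresholds, and the log-intensity event model are my own illustration, not any particular sensor's API: frame differencing collapses everything within a frame period into one dense mask with no timing inside it, while an event pixel emits its own timestamped (x, y, t, polarity) tuple the instant its change crosses a contrast threshold.

```python
import numpy as np

# Sketch only: names, thresholds, and the event model below are
# illustrative assumptions, not a real sensor interface.

def frame_difference(frame_a, frame_b, threshold=10):
    """Frame differencing: one dense mask per frame period,
    with no timing information inside that period."""
    return np.abs(frame_a.astype(int) - frame_b.astype(int)) > threshold

def simulate_events(log_intensity, times, contrast_threshold=0.2):
    """Toy event-pixel model: each pixel fires independently, with its
    own timestamp, when its log-intensity change crosses the threshold."""
    events = []
    reference = log_intensity[0].copy()  # per-pixel value at the last event
    for t, sample in zip(times[1:], log_intensity[1:]):
        delta = sample - reference
        fired = np.abs(delta) > contrast_threshold
        for y, x in zip(*np.nonzero(fired)):
            polarity = 1 if delta[y, x] > 0 else -1
            events.append((x, y, t, polarity))  # timestamped per pixel
            reference[y, x] = sample[y, x]      # reset only that pixel
    return events
```

With the same moving scene as input, frame_difference returns one binary mask per frame pair, while simulate_events returns a sparse stream in which every change carries its own timestamp; that per-pixel timing is the "playing with time" the comment above refers to.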
I think most applications need an ordinary camera alongside an event-driven camera. If so, is the event-driven camera necessary?