Comments on Image Sensors World: Chronocam Startup Presents Event-Driven Sensor
(blog by Vladimir Koifman; 20 comments)

Peter (Anonymous), 2016-08-10:
Hi,
does anyone know what the difference between the Chronocam sensor and the Dynamic Vision Sensor from iniLabs is? I mean, the DVS is the sensor used in the Chronocam, is that right?
Thanks,
Peter

Raphael Berner, 2016-03-30:
There is no direct relation between Chronocam and Insightness, except that they use closely related vision sensors and that the founders used to collaborate in their research. Insightness is a spin-off from the Institute of Neuroinformatics in Zurich, and it involves Tobi Delbruck, the co-inventor of the Dynamic Vision Sensor.

Jack Hayes, 2016-03-29:
What is the relation between Chronocam and Insightness?

Raphael Berner, 2015-12-04:
Dear Yang Ni, is there more information on this sensor available?
Thanks,
Raphael

Raphael Berner, 2015-10-27:
You are right, there is no subscription. Something like that would have to be implemented off-chip, in an FPGA or the like.
The earlier sensor had a way to shut off rows, but that's not really the same thing, and since it was never used we abandoned it for later chips.
By the way, there is a new startup that will try to use these sensors for visual positioning, navigation, etc.: http://www.insightness.com/

bostonc, 2015-10-18:
I was using event-driven computing's terms. Basically, the subscribe model means that downstream applications only get the specific events they are interested in, e.g. changes of pixels in the center area but not at the edges. Based on your reply, this type of sensor does not do that; it streams out all events, asynchronously.

Raphael Berner, 2015-10-16:
I don't really understand your question, but I'll try to answer anyway. I can only speak for the sensors coming out of INI, and this holds for the vision as well as the audition sensors. At the output of the chip you get the addresses of the pixels, with very short latency after a change in illumination in the corresponding pixel.
The cameras sold by iniLabs have a USB interface, and at the driver you get packets of events, where an event is an address and a timestamp with microsecond resolution.
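A subscribe-style filter of the kind discussed in this thread could be sketched off-chip. This is a minimal illustration, not any vendor's API; the `(x, y, timestamp_us, polarity)` event layout is an assumption loosely modeled on the address-plus-timestamp packets described above:

```python
# Hypothetical off-chip "subscribe" filter for an address-event stream.
# Assumed event layout: (x, y, timestamp_us, polarity) tuples, as a DVS
# driver might deliver them in packets; all names here are illustrative.

def subscribe(events, x_range, y_range):
    """Yield only events whose pixel address falls inside the region of interest."""
    x_lo, x_hi = x_range
    y_lo, y_hi = y_range
    for x, y, t_us, pol in events:
        if x_lo <= x <= x_hi and y_lo <= y <= y_hi:
            yield (x, y, t_us, pol)

packet = [
    (10, 12, 1_000_001, 1),   # inside the ROI
    (200, 5, 1_000_005, -1),  # outside the ROI: dropped
    (15, 20, 1_000_042, 1),   # inside the ROI
]
roi = subscribe(packet, x_range=(0, 63), y_range=(0, 63))
print(list(roi))  # → [(10, 12, 1000001, 1), (15, 20, 1000042, 1)]
```

Because the generator filters lazily, downstream code only ever touches the events it "subscribed" to, while the sensor itself still streams everything.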
bostonc, 2015-10-16:
Thanks a whole bunch for all this information!
I bookmarked this page and will check out the iniLabs stuff. It is good to see universities in Europe getting out of their ivory towers and actively turning their discoveries into good technology and products. I have a naive question: does the asynchronous nature of the pixel allow you to subscribe to changes as they happen, at the individual-pixel level?

Yang Ni, 2015-10-15:
We have developed a commercially available logarithmic differential sensor which operates correctly down to 1 lux. But we are always looking for applications for this sensor; if you have any ideas, please let me know. Here is an example: https://www.youtube.com/watch?v=XaV_ZP1CQ1c
Thanks!
-yang ni

Raphael Berner, 2015-10-15:
Well, we followed your axiom and built single-bit AD converters into the pixel! ;-)
The ATIS by Posch et al. is really a fully digital pixel, where intensity information is also transmitted off-pixel (not just off-chip) in a digital manner, at the cost of a complex pixel.
Our dynamic range numbers of 120 dB or so actually mean that the sensors work quite decently in low light; however, I don't know how well they compare to the latest image sensors from the Sony A7S and the like. But I am curious to see the output of the latest sensor from Tobi's group, which will be BSI.
In combination with the relatively big pixel, that makes for a big photodiode...

Scanned sensors surely take nicer pictures and are better for lots of applications, but the event-based sensors do have a latency and data-reduction advantage, which could make them useful especially in robotics. The future will tell whether event-based sensors catch on or not...

Eric R. Fossum, 2015-10-15:
Thanks for the information. Historical event-driven and data-driven detectors can be found via a Google Scholar search, but I understand your area of interest. Also, it is easy to make a decent sensor that operates under relatively bright light conditions. What separates the wheat from the chaff is the low-light performance, which involves fill factor, QE, and read noise. Anyway, the sensors are interesting, but I still think scanned sensors are better. Another axiom: never do in analog what can be done digitally, despite the allure of analog solutions!

Raphael Berner, 2015-10-15:
About DR: we never encountered a scene which surpassed the DR of our sensors, so the DR numbers run from scene illumination in full sunlight down to the scene illumination at which the sensor still perceives 50% contrast, without changing any bias parameter. The numbers are academic; let's say these sensors have enough DR...

Pixel response time is light dependent and usually fast enough to react to flicker, which can cause quite a lot of headache.
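For scale, the 120 dB figure quoted in this exchange can be translated into a straight illumination ratio using the common image-sensor convention DR(dB) = 20·log10(I_max/I_min); a quick sketch of the arithmetic:

```python
import math

# Convert between a dynamic-range figure in dB and an illumination ratio,
# assuming the common image-sensor convention DR(dB) = 20 * log10(ratio).
def dr_to_ratio(db):
    return 10 ** (db / 20)

def ratio_to_dr(ratio):
    return 20 * math.log10(ratio)

print(dr_to_ratio(120))  # a million-to-one span of usable illumination
print(ratio_to_dr(1e6))  # → 120.0
```

So "120 dB" means the brightest usable scene illumination is about a million times the darkest, which is why full sunlight and dim indoor scenes can coexist in one shot.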
The lowest light level in electrons, I have to admit, we never assessed.

The Austrian Institute of Technology tried to market traffic sensors (essentially car or people counters) using such a sensor. I think they sold some, but I have no clue how many. Otherwise I am not aware of products using such a sensor, I think largely because there were no algorithms available that could really make use of the new kind of output data. Recently there have been more and more publications about algorithms, so I am curious whether we will see products in the near future.

Yeah, Kramer's latest work, to which I referred above, is from 2003 I think, but as you mentioned there have been silicon retinas since the '80s. As far as I know, though, Kramer's, and much more so Lichtsteiner's, sensors were the first ones that were really usable in real-world situations.

I don't know about event-based readout in detectors. During my PhD I developed the readout circuits used in Tobi's group nowadays; they are based on Boahen's work and use word-serial address events, but compared to Boahen they use fewer transistors in the pixel, to make it smaller. And it was hard enough to get it to work reliably...

So, my timeline is the last 10-15 years, and in our work there is not so much biomimetic left... The early silicon retinas were much more biomimetic.

Eric R. Fossum, 2015-10-14:
What is the definition of DR for an asynchronous sensor?
Can all pixels change simultaneously, constantly, as with flicker?
Also, what is the lowest light-level change a pixel will respond to (in electrons)?

Is such a sensor used in any product today?
Or anticipated to be widely used in any product?

Lastly, the work of Jurg Kramer is from the early 2000s, right?
Carver Mead's group (incl. Tobi) looked at changes in signal before then.
And even Bell Labs/JPL published a paper on a motion-sensitive CMOS APS at the 1995 ISSCC.
Finally, event-driven readout was well understood for sure by 1988, as discussed by several groups at a conference in Leuven on detectors for physics experiments.
I am just not sure what timeline Berner is on. Is it just the biomimetic work?

Raphael Berner, 2015-10-13:
True, Kramer initiated the event-based sensor, but I would argue that Patrick (Lichtsteiner)'s pixel design is what really made it work. After all, all subsequent publications I am aware of use his differentiator circuit, Posch included.

Whether adding a conventional APS pixel is a step backward is debatable; I think both approaches have their advantages and disadvantages. Which one is more suitable will depend on the application...

Anonymous, 2015-10-13:
It was actually initiated by a guy called J. Kramer, who passed away, and then carried forward by Delbruck, Lichtsteiner, and Posch.
Posch took it further by integrating time-based coding of gray levels, making the sensor a fully asynchronous camera, while Delbruck followed a more conventional path, adding conventional APS pixels to the initial chip, which looks like a step backward for the technology...

Raphael Berner, 2015-10-13:
Maybe you would be interested to know that you can buy prototypes of event-based vision and audition sensors at iniLabs (www.inilabs.com), a spin-off of the Institute of Neuroinformatics (INI) in Zurich, where the first version of this event-based sensor originated.
Chronocam's sensor was originally developed at the Austrian Institute of Technology by Posch et al., and is itself based on the sensor developed by Tobi and Patrick at INI.

Li's IISW paper is different from Chronocam's sensor because it combines an asynchronous change-detection pixel with a standard (synchronous) APS pixel, while Chronocam combines an asynchronous change-detection pixel with a likewise asynchronous 'time-to-first-spike' pixel.

For reference, I am a former PhD student of Tobi's and a co-author of Li's paper.

magskyic, 2015-10-13:
Congrats on your work on log sensors, Dr. Delbruck!

bostonc, 2015-10-12:
Well, I am the founder of an event-driven computing technology company that's based on software.
For the longest time, I thought it all should be done at the hardware level: the further toward the very front end, the better. This sensor is a step in the right direction.

Anonymous, 2015-10-12:
Tobi of ETH has designed a sensor for the University of Vienna in a European project.

Anonymous, 2015-10-12:
This looks a lot like the presentation we had at the IISW this year from Chenghan Li: http://www.imagesensors.org/Past%20Workshops/2015%20Workshop/2015%20Papers/Sessions/Session_13/13-05_Li_Delbruck.pdf

Really interesting technology.