Friday, April 29, 2022
Color sensing for nano-vision-sensors
“This work is the first step toward our final destination: to develop a micro-scale camera for microrobots,” says assistant professor of Physics Sidong Lei, who led the research. “We illustrate the fundamental principle and feasibility of constructing this new type of image sensor, with an emphasis on miniaturization.”
Thursday, April 28, 2022
Samsung making a new larger ISOCELL camera sensor?
Wednesday, April 27, 2022
90-min Tutorial on Single Photon Detectors
Tuesday, April 26, 2022
Embedded Vision Summit 2022
- Keynote speaker Prof. Ryad Benosman of the University of Pittsburgh and the CMU Robotics Institute will speak on “Event-based Neuromorphic Perception and Computation: The Future of Sensing and AI”
- General session speakers include:
  - Zach Shelby, co-founder and CEO of Edge Impulse, speaking on “How Do We Enable Edge ML Everywhere? Data, Reliability, and Silicon Flexibility”
  - Ziad Asghar, Vice President of Product Management at Qualcomm, speaking on “Powering the Connected Intelligent Edge and the Future of On-Device AI”
- 90+ sessions across four tracks—Fundamentals, Technical Insights, Business Insights, and Enabling Technologies
- 60+ exhibitors including Premier Sponsors Edge Impulse and Qualcomm, Platinum Sponsors FlexLogix and Intel, and Gold Sponsors Arm, Arrow, Avnet, BDTi, City of Oulu, Cadence, Hailo, Lattice, Luxonis, Network Optics, Nota, Perceive, STMicroelectronics, Synaptics and AMD Xilinx
- Deep Dive Sessions — offering opportunities to explore cutting-edge topics in-depth — presented by Edge Impulse, Qualcomm, Intel, and Synopsys
- “We are delighted to return to being in-person for the Embedded Vision Summit after two years of online Summits,” said Jeff Bier, founder of the Edge AI and Vision Alliance. “Innovation in visual and edge AI continues at an astonishing pace, so it’s more important than ever to be able to see, in one place, the myriad of practical applications, use cases and building-block technologies. Attendees with diverse technical and business backgrounds tell us this is the one event where they get a complete picture and can rapidly sort out the hype from what’s working. A whopping 98% of attendees would recommend attending to a colleague.”
Half a billion years ago something remarkable occurred: an astonishing, sudden increase in new species of organisms. Paleontologists call it the Cambrian Explosion, and many of the animals on the planet today trace their lineage back to this event.

A similar thing is happening in processors for embedded vision and artificial intelligence (AI) today, and nowhere will that be more evident than at the Embedded Vision Summit, an in-person event held in Santa Clara, California, May 16–19. The Summit focuses on practical know-how for product creators incorporating AI and vision into their products. These products demand AI processors that balance conflicting needs: high performance, low power, and cost sensitivity. The staggering number of embedded AI chips on display at the Summit underscores the industry’s response to this demand. While the sheer number of processors targeting computer vision and ML is overwhelming, there are some natural groupings that make the field easier to comprehend. Here are some themes we’re seeing.
First, some processor suppliers are thinking about how best to serve applications that simultaneously apply machine learning (ML) to data from diverse sensor types, for example audio and video. Synaptics’ Katana low-power processor, for instance, fuses inputs from a variety of sensors, including vision, sound, and environmental. Xperi’s talk on smart toys for the future touches on this as well.

Second, a subset of processor suppliers are focused on driving power and cost down to a minimum. This is interesting because it enables new applications. For example, Cadence will be presenting on additions to their Tensilica processor portfolio that enable always-on AI applications. Arm will be presenting low-power vision and ML use cases based on their Cortex-M series of processors. And Qualcomm will be covering tools for creating low-power computer vision apps on their Snapdragon family.

Third, although many processor suppliers are focused mainly or exclusively on ML, a few are addressing other kinds of algorithms typically used in conjunction with deep neural networks, such as classical computer vision and image processing. A great example is quadric, whose new q16 processor is claimed to excel at a wide range of algorithms, including both ML and conventional computer vision.

Finally, an entirely new species seems to be coming to the fore: neuromorphic processors. Neuromorphic computing refers to approaches that mimic the way the brain processes information. For example, biological vision systems process events in the field of view, as opposed to classical computer vision approaches that typically capture and process all the pixels in a scene at a fixed frame rate that has no relation to the source of the visual information. The Summit’s keynote talk, “Event-based Neuromorphic Perception and Computation: The Future of Sensing and AI” by Prof. Ryad Benosman, will give an overview of the advantages to be gained from neuromorphic approaches. Opteran will be presenting on their neuromorphic processing approach, inspired by insect brains, to enable vastly improved vision and autonomy.

Whatever your application and requirements, somewhere out there is an embedded AI or vision processor that’s the best fit for you. At the Summit, you’ll be able to learn about many of them and speak with the innovative companies developing them. Come check them out, and be sure to check back in 10 years, when we will see how many of 2032’s AI processors trace their lineage to this modern-day Cambrian Explosion!

Jeff Bier is the president of consulting firm BDTI, founder of the Edge AI and Vision Alliance, and the general chair of the Embedded Vision Summit.

Founded in 2011, the Edge AI and Vision Alliance is a worldwide industry partnership that brings together technology providers who are enabling innovative and practical applications for edge AI and computer vision. Its 100+ Member companies include suppliers of processors, sensors, software and services. The Alliance’s mission includes:
- Inspiring and empowering product creators to incorporate AI and vision technology into new products and applications
- Helping Member companies achieve success with edge AI and vision technology by:
  - Building a vibrant AI and vision ecosystem by bringing together suppliers, end-product designers, and partners
  - Delivering timely insights into AI and vision market research, technology trends, standards and application requirements
  - Assisting in understanding and overcoming the challenges of incorporating AI in their products and businesses
Monday, April 25, 2022
Perspective article on solar-blind UV photodetectors
Friday, April 22, 2022
Videos du jour - CICC, PhotonicsNXT and EPIC
IEEE CICC 2022 best paper candidates present their work
PhotonicsNXT Fall Summit keynote discusses automotive lidar
This keynote session by Pierrick Boulay of Yole Developpement at the PhotonicsNXT Fall Summit held on October 28, 2021 provides an overview of the lidar ecosystem and shows how lidar is being used within the auto industry for ranging and imaging.
EPIC Online Technology Meeting on Single Photon Sources and Detectors
The power hidden in one single photon is unprecedented. But we need to find ways to harness that power. This meeting will discuss cutting-edge technologies paving the way for versatile and efficient pure single-photon sources and detection schemes with low dark count rates, high saturation levels, and high detection efficiencies. This meeting will gather the key players in the photonic industry pushing the development of these technologies towards commercializing products that harness the intrinsic properties of photons.
Thursday, April 21, 2022
Wide field-of-view imaging with a metalens
Wide-angle imaging is an important function in photography and projection, but it also places high demands on the design of the imaging components of a camera. To eliminate the coma caused by the focusing of large-angle incident light, traditional wide-angle camera lenses are composed of complex optical components. Here, we propose a planar camera for wide-angle imaging with a silicon nitride metalens array mounted on a CMOS image sensor. By carefully designing proper phase profiles for metalenses with intentionally introduced shifted phase terms, the whole lens array is capable of capturing a scene with a large viewing angle and negligible distortion or aberrations. After a stitching process, we obtained a large viewing angle image with a range of >120 degrees using a compact planar camera. Our device demonstrates the advantages of metalenses in flexible phase design and compact integration, and the prospects for future imaging technology.
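The abstract’s “shifted phase terms” can be illustrated numerically. The sketch below (Python with NumPy) is a toy model of ours, not the authors’ actual design: a hyperbolic term focuses normally incident light, and an added linear shift term steers the focus laterally. The wavelength, focal length, aperture size, and grid sampling are arbitrary illustrative assumptions.

```python
import numpy as np

def metalens_phase(x, y, wavelength, focal_length, x_shift=0.0):
    """Hyperbolic focusing phase plus a linear 'shifted' phase term.

    The hyperbolic term focuses normally incident light at focal_length;
    the linear term displaces the focus laterally by roughly x_shift,
    which is the basic idea behind a shifted-phase metalens array.
    """
    k = 2 * np.pi / wavelength
    focusing = -k * (np.sqrt(x**2 + y**2 + focal_length**2) - focal_length)
    shift = -k * x * x_shift / np.sqrt(x_shift**2 + focal_length**2)
    return (focusing + shift) % (2 * np.pi)  # wrapped into [0, 2*pi)

# Coarse illustration: a 100 um square aperture, f = 200 um, green light
coords = np.linspace(-50e-6, 50e-6, 101)
X, Y = np.meshgrid(coords, coords)
phase = metalens_phase(X, Y, wavelength=532e-9, focal_length=200e-6)
print(phase.shape)  # one wrapped phase value per sampled position
```

In the paper’s scheme, each metalens in the array carries a different shift term so that it captures a different part of the field of view, and the sub-images are then stitched into the final >120 degree image.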
Wednesday, April 20, 2022
PhD Thesis on Analog Signal Processing for CMOS Image Sensors
The very first PhD thesis that came out of Albert Theuwissen's group at TU Delft is now freely available as a pdf. This seems like a great educational resource for people interested in image sensors.
Direct download link: https://repository.tudelft.nl/islandora/object/uuid:2fbc1f51-7784-4bcd-85ab-70fc193c5ce9/datastream/OBJ/download
This thesis describes the development of low-noise, power-efficient analog interface circuitry for CMOS image sensors. It focuses on improving two aspects of the interface circuitry: firstly, lowering the noise in the front-end readout circuit, and secondly, realizing more power-efficient analog-to-digital converters (ADCs) that are capable of reading out high-resolution imaging arrays.

Chapter 2 provides an overview of the analog signal processing chain in conventional, commercially available CMOS imagers. First of all, the different photo-sensitive elements that form the input to the analog signal chain are briefly discussed. This is followed by a discussion of the analog signal processing chain itself, which is divided into two parts. First, the analog front-end, consisting of in-pixel circuitry and column-level circuitry, is discussed. Second, the analog back-end, consisting of variable gain amplification and A/D conversion, is discussed. Finally, a brief overview of advanced readout circuit techniques is provided.
In chapter 3, the performance of the analog front-end is analyzed in detail. It is shown that noise performance is the most important parameter of the front-end. An overview of front-end noise sources is given and their relative importance is discussed; 1/f noise turns out to be the limiting noise source in current CMOS imagers. A relatively unknown 1/f noise reduction technique, called switched-biasing or large signal excitation (LSE), is introduced and its applicability to CMOS imagers is explored. Measurement results for this 1/f noise reduction technique are presented. Finally, a preliminary conclusion on CMOS imager noise performance is presented.
The main function of the back-end analog signal chain is analog-to-digital conversion, which is described in chapter 4. First of all, the conventional approach of a single chip-level ADC is compared to a massively parallel, column-level ADC, and the advantages of the latter are shown. Next, the existing column-level ADC architectures are briefly discussed, in particular the column-parallel single-slope ADC. Furthermore, a new architecture, the multiple-ramp single-slope ADC, is proposed. Finally, two circuit techniques are introduced that can improve ADC performance. First, it is shown that the presence of photon shot noise in an imager can be used to significantly decrease ADC power consumption. Second, a column FPN reduction technique, called Dynamic Column Switching (DCS), is introduced.
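To make the column-parallel single-slope principle concrete, here is a minimal behavioral model in Python. It is our sketch, not circuitry from the thesis; the 10-bit resolution and 1 V reference are illustrative assumptions.

```python
import numpy as np

def single_slope_convert(pixel_voltages, v_ref=1.0, n_bits=10):
    """Behavioral model of a column-parallel single-slope ADC.

    A single ramp of 2**n_bits steps is swept once per row readout.
    Each column needs only a comparator: when the shared ramp passes the
    column's sampled voltage, the current counter value is latched as
    that column's digital code. All columns convert in parallel.
    """
    steps = 2 ** n_bits
    ramp = np.linspace(0.0, v_ref, steps, endpoint=False)  # shared ramp DAC
    # Latched code = number of ramp steps below the sampled voltage,
    # clipped to the full-scale code for over-range inputs.
    codes = np.searchsorted(ramp, np.clip(pixel_voltages, 0.0, v_ref))
    return np.minimum(codes, steps - 1)

# Four columns converted simultaneously against one shared ramp
codes = single_slope_convert(np.array([0.0, 0.25, 0.5, 1.0]))
print(codes)  # zero-, quarter-, half-, and full-scale codes
```

Because every column shares the one ramp and counter, conversion time is fixed at 2**n_bits clock cycles regardless of the number of columns, which is what makes this architecture attractive for wide, high-resolution imaging arrays.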
Chapters 5 and 6 present two realisations of imagers with column-level ADCs. In chapter 5, a CMOS imager with a single-slope ADC is presented that consumes only 3.2 µW per column. The circuit details of the comparator achieving this low power consumption are described, as well as the digital column circuitry. The ADC uses the dynamic column switching technique introduced in chapter 4 to reduce the perceptual effects of column FPN. Chapter 6 presents an imager with a multiple-ramp single-slope architecture, as proposed in chapter 4. The column comparator used in this design is taken from a commercially available CMOS imager. The multiple ramps are generated on chip with a low-power ladder DAC structure. The ADC uses an auto-calibration scheme to compensate for the offset and delay of the ramp drivers.
Tuesday, April 19, 2022
Google AI Blog article on Lidar-Camera Fusion
Monday, April 18, 2022
Quantum Dot Photodiodes for SWIR Cameras
A research team from Ghent University in Belgium has published an article titled "Colloidal III–V Quantum Dot Photodiodes for Short-Wave Infrared Photodetection".
Abstract: Short-wave infrared (SWIR) image sensors based on colloidal quantum dots (QDs) are characterized by low cost, small pixel pitch, and spectral tunability. Adoption of QD-SWIR imagers is, however, hampered by a reliance on restricted elements such as Pb and Hg. Here, QD photodiodes, the central element of a QD image sensor, made from non-restricted In(As,P) QDs that operate at wavelengths up to 1400 nm are demonstrated. Three different In(As,P) QD batches that are made using a scalable, one-size-one-batch reaction and feature a band-edge absorption at 1140, 1270, and 1400 nm are implemented. These QDs are post-processed to obtain In(As,P) nanocolloids stabilized by short-chain ligands, from which semiconducting films of n-In(As,P) are formed through spincoating. For all three sizes, sandwiching such films between p-NiO as the hole transport layer and Nb:TiO2 as the electron transport layer yields In(As,P) QD photodiodes that exhibit best internal quantum efficiencies at the QD band gap of 46±5% and are sensitive for SWIR light up to 1400 nm.
Full article (open access): https://onlinelibrary.wiley.com/doi/10.1002/advs.202200844