Saturday, August 24, 2019

LiDAR Approaches Overview

Universitat Politècnica de Catalunya, Spain, publishes a paper "An overview of imaging lidar sensors for autonomous vehicles" by Santiago Royo and Maria Ballesta.

"Imaging lidars are one of the hottest topics in the optronics industry. The need to sense the surroundings of every autonomous vehicle has pushed forward a career to decide the final solution to be implemented. The diversity of state-of-the art approaches to the solution brings, however, a large uncertainty towards such decision. This results often in contradictory claims from different manufacturers and developers. Within this paper we intend to provide an introductory overview of the technology linked to imaging lidars for autonomous vehicles. We start with the main single-point measurement principles, and then present the different imaging strategies implemented in the different solutions. An overview of the main components most frequently used in practice is also presented. Finally, a brief section on pending issues for lidar development has been included, in order to discuss some of the problems which still need to be solved before an efficient final implementation. Beyond this introduction, the reader is provided with a detailed bibliography containing both relevant books and state of the art papers."

Friday, August 23, 2019

AUO Presents Large Area Optical Fingerprint Sensor Embedded in LCD Display

Devicespecifications: Taiwan-based AU Optronics unveils a 6-inch full-screen optical in-cell fingerprint LTPS LCD, the world's first of its kind to have an optical sensor installed within the LCD structure. Equipped with AHVA (Advanced Hyper-Viewing Angle) technology, full HD+ (1080 x 2160) resolution, and a high pixel density of 403 PPI, the panel has a 30 ms sensor response time for accurate fingerprint sensing. For identification purposes such as law enforcement and customs inspection, AUO’s 4.4-inch fingerprint sensor module features high resolution (1600 x 1500) and a high pixel density of 503 PPI. Its large sensor area allows multi-finger detection, enhancing both accuracy and security. The module can produce accurate images outdoors even under strong sunlight.

Samsung Presents Green-Selective Organic PD

OSA Optics Express publishes Samsung paper "Green-light-selective organic photodiodes for full-color imaging" by Gae Hwang Lee, Xavier Bulliard, Sungyoung Yun, Dong-Seok Leem, Kyung-Bae Park, Kwang-Hee Lee, Chul-Joon Heo, In-Sun Jung, Jung-Hwa Kim, Yeong Suk Choi, Seon-Jeong Lim, and Yong Wan Jin.

"OPDs using narrowband organic semiconductors in the active layers are highly promising candidates for novel stacked-type full-color photodetectors or image sensors [11–17]. In a stacked-type architecture, the light detection area is increased and thanks to the organic layer, the spectral response can be controlled without the use of a top filter. Recently, highly attractive filterless narrowband organic photodetectors based on charge collection narrowing for tuning the internal quantum efficiency (IQE) have been reported [18,19], but this concept might be hardly applied to real full-color imaging due to the significant mismatch between the absorption spectra and the external quantum efficiency (EQE) spectra of the devices as well as the low EQEs. To improve the EQE of OPDs, in addition to the development of narrowband and high absorption organic semi-conductors, other optical manipulations can be brought to the device.

"...we presented a high-performing green-selective OPD that reached an EQE over 70% at an operating voltage of 3 V, while the dark current was only 6 e/s/µm². Those performances result from the use of newly synthesized dipolar p-type compounds having high absorption coefficient and from light manipulation within the device.

Thursday, August 22, 2019

ResearchInChina on Automotive Vision Difficulties

ResearchInChina publishes quite a long list of Toyota camera-based Pre-collision System (PCS) difficulties in various situations:

Toyota’s Pre-Collision System (PCS) uses an in-vehicle camera and laser to detect pedestrians and other vehicles in front of the vehicle. If it determines there is a possibility of a frontal collision, the system will prompt the driver to take action and avoid it with audio and visual alerts. If the driver notices the potential collision and applies the brakes, the Pre-Collision System with Pedestrian Detection (PCS w/PD) may apply additional force using Brake Assist (BA). If the driver fails to brake in time, it may automatically apply the brakes to reduce the vehicle’s speed, helping to minimize the likelihood of a frontal collision or reduce its severity.
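The escalation described above (alert, then brake assist, then automatic braking) can be sketched in code. This is a conceptual illustration only, not Toyota's implementation; the function name and all thresholds are hypothetical.

```python
# Conceptual sketch of the PCS escalation: alert -> brake assist ->
# automatic braking. Thresholds are hypothetical illustrations.

def pcs_response(time_to_collision_s: float, driver_braking: bool) -> list:
    """Return the escalating actions PCS would take for a predicted frontal collision."""
    actions = []
    if time_to_collision_s > 4.0:
        return actions                      # no imminent collision risk
    actions.append("audio/visual alert")    # prompt the driver first
    if driver_braking:
        actions.append("brake assist")      # add force to the driver's braking
    elif time_to_collision_s < 1.5:
        actions.append("automatic braking") # driver failed to brake in time
    return actions
```

For example, `pcs_response(1.0, False)` would escalate all the way to automatic braking, while `pcs_response(2.0, True)` would assist the driver's own braking.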

In some situations (such as the following), a vehicle/pedestrian may not be detected by the radar and camera sensors, thus preventing the system from operating properly:

  • When an oncoming vehicle approaches
  • When the preceding vehicle is a motorcycle or a bicycle
  • When approaching the side or front of a vehicle
  • If a preceding vehicle has a small rear end, such as an unloaded truck
  • If a preceding vehicle has a low rear end, such as a low bed trailer
  • When the preceding vehicle has high ground clearance
  • When a preceding vehicle is carrying a load which protrudes past its rear bumper
  • If a vehicle ahead is irregularly shaped, such as a tractor or sidecar
  • If the sun or other light is shining directly on the vehicle ahead
  • If a vehicle cuts in front of your vehicle or emerges from beside a vehicle
  • If a preceding vehicle makes an abrupt maneuver (such as sudden swerving, acceleration or deceleration)
  • When a sudden cut-in occurs behind a preceding vehicle
  • When a preceding vehicle is not right in front of your vehicle
  • When driving in bad weather such as heavy rain, fog, snow or a sandstorm
  • When the vehicle is hit by water, snow, dust, etc. from a vehicle ahead
  • When driving through steam or smoke
  • When amount of light changes dramatically, such as at a tunnel exit/entrance
  • When a very bright light, such as the sun or the headlights of an oncoming vehicle, shines directly into the camera sensor
  • When driving in low light (dusk, dawn, etc.) or when driving without headlights at night or in a tunnel
  • After the hybrid system has started and the vehicle has not been driven for a certain period of time
  • While making a left/right turn and within a few seconds after making a left/right turn
  • While driving on a curve, and within a few seconds after driving on a curve
  • If your vehicle is skidding
  • If the front of the vehicle is raised or lowered
  • If the wheels are misaligned
  • If the camera sensor is blocked (by a wiper blade, etc.)
  • If your vehicle is wobbling
  • If your vehicle is being driven at extremely high speeds
  • While driving up or down a slope
  • When the camera sensor or radar sensor is misaligned

PCS should be disabled in the following circumstances, in which the radar and camera sensors may not recognize a pedestrian:
  • When a pedestrian is 1m or shorter or 2m or taller
  • When a pedestrian wears oversized clothing (a rain coat, long skirt, etc.), obscuring the pedestrian’s silhouette
  • When a pedestrian carries large baggage, holds an umbrella, etc., hiding part of the body
  • When a pedestrian leans forward or squats
  • When a pedestrian pushes a pram, wheelchair, bicycle or other vehicle
  • When pedestrians are walking in a group or are close together
  • When a pedestrian is wearing white clothing that reflects sunlight and looks extremely bright
  • When a pedestrian is in darkness, such as at night or while in a tunnel
  • When a pedestrian's clothing is similar in brightness/color to the scenery and blends into the background
  • When a pedestrian is staying close to or walking alongside a wall, fence, guardrail, vehicle or other obstacle
  • When a pedestrian is walking on top of metal on the road surface
  • When a pedestrian walks fast
  • When a pedestrian abruptly changes walking speed
  • When a pedestrian runs out from behind a vehicle or a large object
  • When a pedestrian is very close to a side (external rearview mirror) of the vehicle

ADAS suppliers and OEMs are working together on product and technology development to make breakthroughs in these many failure scenarios, so that ADAS can improve and become safer. All players still have a long way to go before autonomous driving becomes reality.

Sony vs Top 14 Semiconductor Companies

IC Insights: Sony is the only top-15 semiconductor supplier to register YoY growth in 1H19. In total, the top-15 semiconductor companies’ sales dropped by 18% in 1H19 compared to 1H18, 4 points worse than the total worldwide semiconductor industry 1H19/1H18 decline of 14%. Most of Sony's semiconductor sales come from CMOS image sensors.

Wednesday, August 21, 2019

Automotive News: Porsche-Trieye, Koito-Daihatsu

Globes: Porsche has invested in Israeli SWIR sensor startup Trieye. “We see great potential in this sensor technology that paves the way for the next generation of driver assistance systems and autonomous driving functions. SWIR can be a key element: it offers enhanced safety at a competitive price,” says Michael Steiner, Member of the Executive Board for R&D at Porsche AG.

Porsche's $1.5M investment is part of a Series A round extension from $17M to $19M. TriEye's SWIR technology is CMOS-based, said to enable scalable mass production of SWIR sensors and to reduce the cost by a factor of 1,000 compared to current InGaAs-based technology. As a result, the company can produce an affordable HD SWIR camera in a compact format, facilitating easy in-vehicle mounting behind the car’s windshield.

Nikkei: Daihatsu's low-cost car Tanto, released on July 9, 2019, features adaptive headlight technology from Koito. This is the first appearance of such a technology in a low-cost car, probably signalling the beginning of broad market adoption:

"When a light-distribution-changeable headlight detects an oncoming or preceding vehicle at the time of using high beam, it blocks part of light so that the light is not directed at the area in which the vehicle exists.

In general, it uses multiple horizontally-arranged LED chips for high beam and controls light distribution by turning off LED light that is directed at an area to which light should not be applied.

With a stereo camera set up in the location of the rear-view mirror, it recognizes oncoming and preceding vehicles. By recognizing the color and intensity of light captured by the camera, it judges whether the source of the light is a headlight or taillight.

When it recognizes a vehicle, LEDs irradiating the area of the vehicle are turned off. It can recognize a headlight about 500m (approx 0.31 miles) away."

Cambridge Mechatronics Extends ToF Range by 5-10x

AF and OIS actuators company Cambridge Mechatronics Ltd. (CML) comes up with an interesting statement:

"CML has developed technology capable of extending the range of ToF depth sensing for next-generation AR experiences.

Because current technology can only detect objects at a maximum distance of around two metres, its usage in smartphones is limited to photographic enhancements and a small number of Augmented Reality (AR) applications. However, for widespread adoption, these experiences should be enriched by significantly increasing the working range.

CML has developed technology, now ready to be licensed, capable of increasing this range of interaction to ten metres, unlocking the full potential of the smartphone AR market."

And another statement:

"Mobile AR uses sensors to create a depth map of the surrounding landscape in order to overlay images on top of the real world. Current 3D sensors have a limited range of interaction of around 4 metres. CML is developing SMA actuator technology to improve the range and accuracy of augmented experiences tenfold, providing a more immersive experience."

Tuesday, August 20, 2019

Smartsens Partners with MEMS Drive to Deliver Chip-Level OIS

PRNewswire: SmartSens has signed a cooperation agreement with MEMS image stabilization company MEMS Drive. The two will collaborate on a series of research projects in the fields of CMOS sensing chip and chip-level OIS technologies to open up new areas of applications.

Through this collaboration, SmartSens now introduces chip-level anti-vibration technology directly into CMOS sensors, made available in non-mobile applications such as security and machine vision. In comparison with conventional OIS technology, sensors with chip-level stabilization offer greater engineering simplicity, without losing any of the robust capabilities unique to OIS in traditional VCMs. Additionally, these sensors will add stabilization control for sensor rotation, achieving optimal stabilization over 5 axes. For applications in AI-enabled systems, which often require HDR imaging, high frame rate video capture, or sensing in ultra-low light conditions, SmartSens will be able to provide a solution in the form of efficient stabilization.

Besides its conventional application in mobile cameras, image stabilization is equally important in non-mobile areas such as security monitoring, AI machine vision and autonomous vehicles. For example, in AI machine vision, CMOS image sensors need to be able to respond to factors such as uneven roads and air turbulence that could result in blurred images. This requires that the image sensor itself possesses exceptional stabilization capabilities, improving the efficiency of both the identification task and the overall AI system.

SmartSens CMO Chris Yiu expresses her optimism, saying, "Optical image stabilizing technology is one of the hottest areas of research and development in the fields of DSLR and mobile cameras, and is unprecedented in non-mobile applications such as AI video capturing. SmartSens's collaboration with MEMS Drive in the area of non-mobile image stabilization opens up new possibilities in this field. SmartSens and MEMS Drive are both companies that rely on innovation to drive growth, and we are very pleased to use this partnership to bring forth further innovation in the field of image sensing technology."

MEMS Drive CEO Colin Kwan notes, "Since its founding, MEMS Drive has dedicated itself to replacing the limited voice coil motor (VCM) technology with its own innovative MEMS OIS technology. Our close collaboration with SmartSens will further improve the efficiency of MEMS OIS development for image stabilization in non-mobile fields, and together we will achieve many more technological breakthroughs."

Artilux Ge-on-Si Imager Info

Artilux kindly agreed to answer some of my questions on their technology and progress:

Q: What is your QE vs wavelength?

A: "For QE, it is a rather smooth transition from 70%@940nm to 50%@1550nm, and we are working on further optimization at wavelengths that silicon cannot detect."
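Artilux only quotes the two endpoints of the curve. As a rough illustration, one can interpolate between them; assuming a simple linear roll-off is my own simplification here, since the company describes the transition only as "rather smooth".

```python
# Rough QE estimate between the two quoted points (70% @ 940 nm,
# 50% @ 1550 nm), assuming a linear roll-off. Illustrative only:
# the actual curve is described merely as a "rather smooth transition".

def qe_estimate(wavelength_nm):
    lo_wl, lo_qe = 940.0, 0.70
    hi_wl, hi_qe = 1550.0, 0.50
    t = (wavelength_nm - lo_wl) / (hi_wl - lo_wl)
    return lo_qe + t * (hi_qe - lo_qe)
```

Under this assumption, QE at 1245 nm (the midpoint) would come out around 60%.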

Q: What is your imager spatial resolution, pixel size, and dark current?

A: "For pitch, resolution, and Idark vs temperature, what we can disclose now is that we have been working with a few LiDAR companies to accommodate different pitch and resolution requirements, as well as the dark current vs temperature trend up to 125C. As for the consumer products, we will have a more complete product portfolio line-up over the following quarters."

Monday, August 19, 2019