The AutoSens Detroit conference, to be held on May 14-17, 2018, announces its agenda with rich image sensing content:
Near-Infrared QE Enhancing Technology for Automotive Applications
Boyd Fowler
CTO, OmniVision Technologies, Inc.
• Why is near-infrared sensitivity important in automotive machine vision applications?
• Combining thicker EPI, deep trench isolation, and surface scattering to improve quantum efficiency in CMOS image sensors while still retaining excellent spatial resolution (a simple absorption model is sketched after this list).
• Improving the performance of CMOS image sensors for in-cabin monitoring and external nighttime imaging.
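Not part of the programme, but as a rough illustration of why thicker EPI matters at near-infrared wavelengths: silicon absorbs 850-940 nm light weakly, so photons penetrate deep and a thicker photodiode region collects more of them (deep trench isolation then keeps those deep-generated carriers from drifting into neighboring pixels). A minimal Beer-Lambert sketch; the absorption coefficients are approximate textbook values, not OmniVision figures:

```python
import math

# Approximate absorption coefficients of silicon (1/um) - illustrative textbook values
alpha = {"650nm": 0.30, "850nm": 0.05, "940nm": 0.015}

def collection_efficiency(alpha_per_um, epi_um):
    """Fraction of photons absorbed in an epi layer of given thickness
    (simple Beer-Lambert model; ignores reflection and recombination)."""
    return 1.0 - math.exp(-alpha_per_um * epi_um)

for wl, a in alpha.items():
    thin = collection_efficiency(a, 3.0)    # ~3 um epi, typical visible-light sensor
    thick = collection_efficiency(a, 10.0)  # thicker epi aimed at NIR sensitivity
    print(f"{wl}: 3 um epi -> {thin:.0%}, 10 um epi -> {thick:.0%}")
```

Under these assumed numbers the visible-band collection barely changes, while the 850-940 nm collection roughly triples with the thicker epi, which is the trade-off the talk addresses.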
Challenges, opportunities and deep learning for thermal cameras in ADAS and autonomous vehicle applications
Mike Walters, VP of Product Management for Uncooled Thermal Cameras, FLIR Systems
• Deep learning analytic techniques, including full scene segmentation, an AI technique that enables ADAS developers to classify every pixel in the thermal image (a minimal sketch follows this item).
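For readers unfamiliar with the term, per-pixel classification of a thermal frame can be sketched as a fully convolutional network whose output keeps the input resolution but has one channel per class. This is only an illustrative toy in PyTorch; the layer sizes, input resolution, and class names are placeholders, not FLIR's actual network:

```python
import torch
import torch.nn as nn

# Hypothetical classes for a thermal ADAS scene; a real taxonomy may differ.
CLASSES = ["road", "pedestrian", "vehicle", "cyclist", "background"]

class TinyThermalSegNet(nn.Module):
    """Minimal fully convolutional net: a 1-channel thermal image in,
    one class score per pixel out (same spatial resolution)."""
    def __init__(self, num_classes=len(CLASSES)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
        )
        self.classifier = nn.Conv2d(32, num_classes, kernel_size=1)

    def forward(self, x):
        return self.classifier(self.features(x))

# One synthetic thermal frame: batch of 1, 1 channel, 512 x 640 pixels
frame = torch.rand(1, 1, 512, 640)
logits = TinyThermalSegNet()(frame)   # shape: (1, 5, 512, 640)
labels = logits.argmax(dim=1)         # per-pixel class index, shape (1, 512, 640)
print(labels.shape)
```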
The emerging field of free-form optics in cameras and its use in automotive applications
Li Han Chan, CEO, DynaOptics
Panel discussion: how many cameras are enough?
Tom Toma, Global Product Manager, Magna Electronics
Sven Fleck, Managing Director, SmartSurv Vision Systems GmbH
Patrick Denny, Senior Expert, Valeo
• OEM design engineer – can we make sensors a cool feature not an ugly bolt-on?
• Retail side – how to make ADAS features sexy?
• Tier 1 – minimal technical requirements
• Outside perspective – learning from an industry where safety sells (B2C market)
A review of relevant existing IQ challenges
Uwe Artmann
CTO/Partner, Image Engineering
Addressing LED flicker
Brian Deegan, Senior Expert - Vision Research Engineer, Valeo Vision Systems
• Definition, root cause, and manifestations of LED flicker (see the timing illustration after this list)
• Impact of LED flicker for viewing and machine vision applications
• Initial proposals for test setup and KPIs, as defined by the P2020 working group
• Preliminary benchmarking results from a number of cameras
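A back-of-the-envelope illustration of the root cause (the numbers are illustrative, not taken from the P2020 benchmarks): a PWM-driven LED is only on for a fraction of each modulation period, so a camera exposure shorter than that period can land entirely in the off phase, and the LED appears dark or blinks from frame to frame.

```python
# Illustrative LED flicker simulation: a PWM-driven light sampled by a sequence
# of short camera exposures. All values are made up for clarity.
led_freq_hz = 100.0   # assumed LED PWM frequency
duty_cycle = 0.25     # LED on 25% of each period
exposure_s = 0.001    # 1 ms exposure (bright daytime scene)
frame_rate_hz = 30.0

period_s = 1.0 / led_freq_hz

def captured_fraction(exposure_start, exposure, period, duty):
    """Fraction of the exposure during which the LED is on (fine time steps)."""
    steps = 1000
    dt = exposure / steps
    on = sum(1 for i in range(steps)
             if ((exposure_start + i * dt) % period) < duty * period)
    return on / steps

for frame in range(8):
    t0 = frame / frame_rate_hz
    f = captured_fraction(t0, exposure_s, period_s, duty_cycle)
    print(f"frame {frame}: LED energy captured = {f:.0%}")
# Some frames capture the full pulse, others none -> the LED flickers in the video,
# which is benign for viewing but can break machine vision (e.g. traffic light state).
```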
CDP – contrast detection probability
Marc Geese, System Architect for Optical Capturing Systems, Robert Bosch
Moving from legacy LiDAR to Next Generation iDAR
Barry Behnken, VP of Engineering, AEye
• How can OEMs and Tier 1s leverage iDAR to not just capture a scene, but to dynamically perceive it?
• Learn how iDAR optimizes data collection, allowing for situational configurability at the hardware level that enables the system to emulate legacy systems, define regions of interest, focus on threat detection and/or be programmed for variable environments.
• Learn how this type of configurability will optimize data collection, reduce bandwidth, improve vision perception and intelligence, and speed up motion planning for autonomous vehicles.
Enhanced Time-Of-Flight – a full CMOS solution for automotive LIDAR
Nadav Haas, Product Manager, Newsight Imaging
• The need for a true 3D solid-state lidar solution to overcome the challenges associated with current lidar approaches.
• Enabling very wide dynamic range with standard processing tools, amplifying very weak signals to achieve high SNR and accurately detect objects at high resolution and long range.
• Eliminating blinding by mitigating or blocking background sunlight, random light from sources in other cars, and secondary reflections.
• Enabling very precise timing of the transmitted and received pulses, essential to obtain the desired overall performance (a worked timing example follows this list).
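A worked example of why pulse timing precision dominates range accuracy (generic time-of-flight arithmetic, not Newsight's implementation): range is half the round-trip time multiplied by the speed of light, so every nanosecond of timing error corresponds to roughly 15 cm of range error.

```python
C = 299_792_458.0  # speed of light, m/s

def range_from_round_trip(t_round_trip_s):
    """Time-of-flight range: the pulse travels to the target and back."""
    return C * t_round_trip_s / 2.0

# A target at 150 m returns the pulse after roughly 1 microsecond.
t = 2 * 150.0 / C
print(f"round trip for a 150 m target: {t * 1e9:.1f} ns")

# Range error caused by timing error in the transmit/receive electronics.
for timing_error_ns in (0.1, 1.0, 10.0):
    err_m = range_from_round_trip(timing_error_ns * 1e-9)
    print(f"{timing_error_ns:>5.1f} ns timing error -> {err_m * 100:.1f} cm range error")
```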
Panel discussion: do we have a lidar bubble?
Abhay Rai, Director Product Marketing: Automotive Imaging, Sony Electronics
• Do we even need lidar in AV?
• Which is the right combination: lidar + cornering radar, or no lidar, just radar + camera?
• What is the minimum number of sensors for autonomous driving?
• Are image sensors and cameras fit for autonomous driving?
All-weather vision for automotive safety: which spectral band?
Emmanuel Bercier, Project Manager, AWARE Project
• The AWARE (All Weather All Roads Enhanced vision) French publicly funded project aims to develop a low-cost sensor that meets automotive requirements and enables vision in all poor-visibility conditions.
• Evaluation of the relevance of four different spectral bands: visible RGB, visible RGB extended to Near-Infrared (NIR), Short-Wave Infrared (SWIR), and Long-Wave Infrared (LWIR).
• Outcome of two test campaigns, in outdoor natural conditions and in an artificial fog tunnel, with four cameras recording simultaneously.
• Presentation of the detailed results of this comparative study, focusing on the detection of pedestrians, vehicles, traffic signs, and lanes.
Automotive Sensor Design Enablement: a discussion of multiple design enablement tools/IP to achieve smart Lidar
Ian Dennison, Senior Group Director R&D, Cadence Design Systems
• Demands of advanced automotive sensors driving the design of silicon photonics, MEMS, microwave/RF, advanced-node SoCs, and advanced SiP.
• Examining design enablement requirements for automotive sensors that utilize advanced design fabrics, and their integration.
Role of Specialty Analog Foundry in Enabling Advanced Driver Assistance Systems (ADAS) and Autonomous Driving
Amol Kalburge, Head of the Automotive Program, TowerJazz
• Driving improvements in device level figures of merit to meet the technical requirements of key ADAS sensors such as automotive radar, LiDAR and camera systems.
• Optimizing the Rdson vs. breakdown voltage trade-off to enable the higher bus voltages of future hybrid/EV systems (see the sketch after this list).
• Presenting an overview of advanced design enablement and design services capabilities required for designers to build robust products: design it once, design it right.
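As a rough illustration of the Rdson-versus-breakdown-voltage trade-off mentioned above: for a unipolar silicon device, specific on-resistance rises super-linearly with breakdown voltage (the classic "silicon limit", often quoted as roughly Ron,sp ∝ BV^2.5). The exponent and coefficient below are approximate literature values, not TowerJazz data, and are used only to show why supporting higher hybrid/EV bus voltages without paying heavily in conduction loss requires device-level optimization.

```python
# Approximate unipolar silicon limit: specific on-resistance vs. breakdown voltage.
# Ron_sp ~ k * BV^2.5 is a commonly quoted scaling; k here is an illustrative value.
K_OHM_CM2 = 8.3e-9   # illustrative coefficient, ohm*cm^2 per V^2.5
EXPONENT = 2.5

def ron_specific(bv_volts):
    """Ideal-limit specific on-resistance (ohm*cm^2) for a given breakdown voltage."""
    return K_OHM_CM2 * bv_volts ** EXPONENT

for bv in (40, 100, 200, 650):   # from 12 V board-net devices up to HV traction-bus ratings
    print(f"BV = {bv:>4d} V -> Ron,sp ~ {ron_specific(bv) * 1e3:.3f} mohm*cm^2")
# Doubling the voltage rating multiplies the ideal on-resistance by ~5.7x, which is
# why higher bus voltages push designers toward optimized device architectures.
```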