Wednesday, May 18, 2016

Mobileye & ST Partner on Self-Driving Processor

GlobeNewsWire: Mobileye and STMicroelectronics announce that they are co-developing the next (5th) generation of Mobileye's SoC, the EyeQ5, to act as the central computer performing sensor fusion for Fully Autonomous Driving (FAD) vehicles starting in 2020.

To meet its power-consumption and performance targets, the EyeQ5 will be designed in an advanced FinFET technology node at 10nm or below and will feature eight multithreaded CPU cores coupled with eighteen cores of Mobileye's next-generation vision processors. These enhancements will increase performance 8x over the current 4th-generation EyeQ4. The EyeQ5 will deliver more than 12 Tera operations per second while keeping power consumption below 5W, allowing passive cooling despite the high performance. Engineering samples of the EyeQ5 are expected to be available by the first half of 2018.
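Taken at face value, the quoted figures imply a power efficiency of about 2.4 TOPS/W. A back-of-the-envelope check, using only the press-release numbers (these are bounds, not measured specs):

```python
# Back-of-the-envelope check of the press-release figures (not official specs).
eyeq5_tops = 12.0   # ">12 Tera operations per second" (a lower bound)
eyeq5_watts = 5.0   # "power consumption below 5W" (an upper bound)

efficiency = eyeq5_tops / eyeq5_watts   # TOPS per watt
eyeq4_implied_tops = eyeq5_tops / 8     # implied by the claimed 8x speedup

print(f"EyeQ5 efficiency: {efficiency:.1f} TOPS/W")
print(f"Implied EyeQ4 throughput: {eyeq4_implied_tops:.1f} TOPS")
```

Since 12 TOPS is a floor and 5W a ceiling, 2.4 TOPS/W is the worst case implied by the announcement.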

"EyeQ5 is designed to serve as the central processor for future fully-autonomous driving for both the sheer computing density, which can handle around 20 high-resolution sensors and for increased functional safety," said Prof. Amnon Shashua, cofounder, CTO and Chairman of Mobileye. "The EyeQ5 continues the legacy Mobileye began in 2004 with EyeQ1, in which we leveraged our deep understanding of computer vision processing to develop highly optimized architectures to support extremely intensive computations at power levels below 5W to allow passive cooling in an automotive environment."

"Each generation of the EyeQ technology has proven its value to drivers and ST has proven its value to Mobileye as a manufacturing, design, and R&D partner since beginning our cooperation on the EyeQ1," said Marco Monti, EVP and GM of Automotive and Discrete Group, STM. "With our joint commitment to the 5th-generation of the industry's leading Advanced Driver Assistance System (ADAS) technology, ST will continue to provide a safer, more convenient smart driving experience."

EyeQ5's proprietary accelerator cores are optimized for a wide variety of computer-vision, signal-processing, and machine-learning tasks, including deep neural networks. Autonomous driving requires fusion processing of dozens of sensors, including high-resolution cameras, radars, and LiDARs. The sensor-fusion process must capture and process all the sensors' data simultaneously. For this purpose, the EyeQ5's dedicated I/Os support at least 40Gbps of data bandwidth.
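As a rough sanity check on that bandwidth figure, the raw data rate of a camera suite can be estimated from resolution, bit depth, and frame rate. The per-sensor parameters below are illustrative assumptions, not Mobileye specifications; only the sensor count comes from the article:

```python
# Rough aggregate-bandwidth estimate for a multi-camera sensor suite.
# Per-sensor parameters are illustrative assumptions, not EyeQ5 specs.
width, height = 1920, 1280   # assumed camera resolution (pixels)
bits_per_pixel = 12          # assumed raw sensor bit depth
fps = 30                     # assumed frame rate
num_sensors = 20             # "around 20 high-resolution sensors" (from the article)

per_sensor_bps = width * height * bits_per_pixel * fps
total_bps = per_sensor_bps * num_sensors

print(f"Per sensor: {per_sensor_bps / 1e9:.2f} Gbps")
print(f"Aggregate:  {total_bps / 1e9:.1f} Gbps")
```

Under these assumptions the aggregate comes to roughly 18 Gbps, comfortably inside the quoted 40Gbps I/O budget, leaving headroom for radar, LiDAR, and higher-resolution or higher-frame-rate cameras.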

First development hardware with the full suite of applications and the SDK is expected by the second half of 2018.
