
Thursday, June 16, 2016

Mobileye Chief Engineer on ADAS Market

EETimes publishes an article "Mobileye Chief Engineer Explains EyeQ5" by Junko Yoshida. A few quotes:

"12 to 18 months ago, automakers were more inclined to develop an autonomous car that allows a driver to take his mind off driving on the highway, according to Mobileye. That would be a Level 3 autonomous car – according to the SAE standard -- defined as “within known, limited environments (such as freeways), the driver can safely turn their attention away from driving tasks.”

Now, automakers want autonomous cars that can operate without a driver – much sooner than later, according to Mobileye.

In keeping with its customers’ more aggressive timetable, Mobileye has also “slightly pulled forward” — to 2020 — its plan for EyeQ5.


[Elchanan Rushinek, Mobileye’s SVP of Engineering] said that EyeQ5 was designed to support more than 16 cameras in addition to multiple radars and LIDARs, including the low-level processing of all sensors.

At a more technical level, he explained, “There are 16 virtual MIPI channels and more than 16 sensors can be supported by multiplexing several physical sensors on a single virtual MIPI channel.”
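To make the virtual channel multiplexing concrete, below is a minimal C sketch (not Mobileye code) of how more than 16 physical sensors could be routed onto 16 MIPI CSI-2 virtual channels by letting several sensors share one channel. The routing table, the names, and the two-sensors-per-channel split are illustrative assumptions; on real hardware the demultiplexing key is the virtual channel ID carried in each CSI-2 packet header, with the slot typically resolved by frame timing or an aggregator device.

```c
/* Minimal sketch (not Mobileye code): routing frames from more than 16
 * physical sensors over 16 MIPI CSI-2 virtual channels, as described above.
 * All names and the sensors-per-channel split are illustrative assumptions. */
#include <stdint.h>
#include <stdio.h>

#define NUM_VIRTUAL_CHANNELS 16

typedef struct {
    uint8_t virtual_channel;   /* CSI-2 virtual channel ID (0..15)       */
    uint8_t slot;              /* time-multiplex slot within that channel */
} sensor_route_t;

/* Static routing rule: physical sensor index -> (virtual channel, slot) */
static sensor_route_t route_for_sensor(unsigned sensor_id)
{
    sensor_route_t r = {
        .virtual_channel = (uint8_t)(sensor_id % NUM_VIRTUAL_CHANNELS),
        .slot            = (uint8_t)(sensor_id / NUM_VIRTUAL_CHANNELS),
    };
    return r;
}

int main(void)
{
    /* Example: 24 physical sensors multiplexed onto 16 virtual channels */
    for (unsigned s = 0; s < 24; s++) {
        sensor_route_t r = route_for_sensor(s);
        printf("sensor %2u -> VC %2u, slot %u\n",
               s, (unsigned)r.virtual_channel, (unsigned)r.slot);
    }
    return 0;
}
```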

“Camera processing is the most computationally intensive, so any solution would include dedicated vision processing ECUs in addition to sending some of the raw data to the central ECU. Both EyeQ4 and EyeQ5 can support both goals -- specific vision processing, as well as master ECU,” he added. “The EyeQ controls each sensor via I2C bus on a frame basis in order to get the optimized output, which is deeply aligned with real time algorithms.”
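As an illustration of the per-frame I2C sensor control Rushinek describes, here is a minimal C sketch under assumed names: the sensor address, the exposure register, and the helper functions are hypothetical placeholders, not the EyeQ interface. The idea is simply that a new setting is written to the sensor once per frame, so the sensor output stays aligned with what the real-time algorithms expect for the next frame.

```c
/* Minimal sketch (not the EyeQ API): per-frame sensor control over I2C,
 * e.g. updating exposure for the next frame. The I2C address, register
 * and both helper functions below are illustrative assumptions. */
#include <stdint.h>
#include <stdio.h>

#define SENSOR_I2C_ADDR  0x36    /* hypothetical 7-bit sensor address */
#define REG_EXPOSURE     0x3500  /* hypothetical exposure register    */

/* Placeholder for a platform-specific I2C register write. */
static int i2c_write_reg(uint8_t dev_addr, uint16_t reg, uint16_t value)
{
    printf("I2C write: dev 0x%02X reg 0x%04X <= 0x%04X\n",
           (unsigned)dev_addr, (unsigned)reg, (unsigned)value);
    return 0;
}

/* Placeholder: the real-time algorithm decides the next exposure setting. */
static uint16_t compute_next_exposure(unsigned frame)
{
    return (uint16_t)(0x0400 + (frame % 8) * 0x10);
}

int main(void)
{
    /* One I2C update per frame, keeping the sensor configuration in step
     * with the algorithms that will consume the following frame. */
    for (unsigned frame = 0; frame < 5; frame++) {
        uint16_t exposure = compute_next_exposure(frame);
        i2c_write_reg(SENSOR_I2C_ADDR, REG_EXPOSURE, exposure);
    }
    return 0;
}
```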
