Tuesday, September 13, 2016

Tesla Abandons Camera as the Primary Sensor for Self-Driving

Tesla has announced a change of direction in its self-driving car development:

"The radar was added to all Tesla vehicles in October 2014 as part of the Autopilot hardware suite, but was only meant to be a supplementary sensor to the primary camera and image processing system.

After careful consideration, we now believe it can be used as a primary control sensor without requiring the camera to confirm visual image recognition. This is a non-trivial and counter-intuitive problem, because of how strange the world looks in radar. Photons of that wavelength travel easily through fog, dust, rain and snow, but anything metallic looks like a mirror. The radar can see people, but they appear partially translucent. Something made of wood or painted plastic, though opaque to a person, is almost as transparent as glass to radar.

On the other hand, any metal surface with a dish shape is not only reflective, but also amplifies the reflected signal to many times its actual size. A discarded soda can on the road, with its concave bottom facing towards you can appear to be a large and dangerous obstacle, but you would definitely not want to slam on the brakes to avoid it.

Therefore, the big problem in using radar to stop the car is avoiding false alarms. ...The first part of solving that problem is having a more detailed point cloud.

...The second part consists of assembling those radar snapshots, which take place every tenth of a second, into a 3D "picture" of the world.

...The third part is a lot more difficult. When the car is approaching an overhead highway road sign positioned on a rise in the road or a bridge where the road dips underneath, this often looks like a collision course. The navigation data and height accuracy of the GPS are not enough to know whether the car will pass under the object or not. By the time the car is close and the road pitch changes, it is too late to brake.

This is where fleet learning comes in handy. Initially, the vehicle fleet will take no action except to note the position of road signs, bridges and other stationary objects, mapping the world according to radar. The car computer will then silently compare when it would have braked to the driver action and upload that to the Tesla database. If several cars drive safely past a given radar object, whether Autopilot is turned on or off, then that object is added to the geocoded whitelist.
"
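The fleet-learning "whitelist" described in the quoted statement can be sketched roughly as follows. This is a minimal illustrative reconstruction, not Tesla's actual implementation: the class name, the pass-count threshold, the matching radius, and the location-only matching (a real system would presumably also match the radar signature of the object) are all assumptions.

```python
import math

# Illustrative sketch of a geocoded whitelist: stationary radar returns that
# the fleet has repeatedly driven past without incident are recorded, and
# future detections near those coordinates are not treated as braking targets.
# All names and thresholds below are assumed for the example.

SAFE_PASS_THRESHOLD = 5   # fleet passes needed before whitelisting (assumed)
MATCH_RADIUS_M = 10.0     # how close a detection must be to a stored object

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r = 6_371_000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

class GeocodedWhitelist:
    def __init__(self):
        # Each entry: lat, lon, and how many cars passed it safely.
        self._objects = []

    def record_safe_pass(self, lat, lon):
        """A car drove past a stationary radar object here without incident."""
        for obj in self._objects:
            if haversine_m(obj["lat"], obj["lon"], lat, lon) < MATCH_RADIUS_M:
                obj["safe_passes"] += 1
                return
        self._objects.append({"lat": lat, "lon": lon, "safe_passes": 1})

    def is_whitelisted(self, lat, lon):
        """True if enough cars have safely passed an object at this location."""
        return any(
            obj["safe_passes"] >= SAFE_PASS_THRESHOLD
            and haversine_m(obj["lat"], obj["lon"], lat, lon) < MATCH_RADIUS_M
            for obj in self._objects
        )
```

In this sketch, a stationary return that matches a whitelisted location would be ignored by the braking logic, while any return elsewhere would still be treated as a potential obstacle, which is also why the question raised in the comments below (a real obstacle appearing at a whitelisted location) is the hard part that location matching alone does not solve.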

9 comments:

  1. So why not combine methods to create a more accurate and complete model of the world, with no "primary" sensor? Each sensing method has strengths and weaknesses, so combining different sensing technologies into a single understanding of the environment around the vehicle should mitigate the weaknesses and combine the strengths. Maybe it's possible to have at least three or four methods of sensing the environment around the vehicle, plus targeted technologies to fuse the sensing information into a complete model, in a timely fashion.

  2. When the next accident happens, Tesla will set human eyes as the primary sensor... :)

  3. Next step: big data and machine learning only, no sensors. Machine learning and big data can do anything.

    Replies
    1. Until a kid steps out in front of the car.

    2. One can always blame the kid - he suffered from too many sensors and too little machine intelligence. And big data? Don't be silly, what big data - he was still so small... // sarcasm mode off

      I don't understand this direction. How would radar-only even see things like lane markers and traffic light colors? Or is Musk planning to get his investors to pay to network all traffic lights and repaint all highway markings with some radar-reflective stuff? The notion that cars around you are driving themselves on radar vision is rather disturbing when you are on the road.

      What happens when one day a real obstacle shows up at about the same location as the "whitelisted" one?

    3. One would presume the whitelisted object is shape-matched against the presented obstacle for verification, as opposed to a blind blanket over the area.

  4. I don't really think Tesla ever claimed that they are abandoning cameras as the primary sensor, just that they are increasing the importance of the radar.

    Replies
    1. Please see the official Tesla statement quoted from the link in the post:

      "After careful consideration, we now believe it [radar] can be used as a primary control sensor without requiring the camera to confirm visual image recognition. This is a non-trivial and counter-intuitive problem, because of how strange the world looks in radar."

  5. JERUSALEM, Sept. 16, 2016 /PRNewswire/ -- (NYSE: MBLY) – In response to inquiries received this morning, Mobileye N.V. notes that the allegations recently attributed to a spokesperson for Tesla regarding Mobileye's position in respect of Tesla internal computer visions efforts are incorrect and can be refuted by the facts.

    It has long been Mobileye's position that Tesla's Autopilot should not be allowed to operate hands-free without proper and substantial technological restrictions and limitations. In communications dating back to May 2015 between Mobileye Chairman and Tesla's CEO, Mobileye expressed safety concerns regarding the use of Autopilot hands-free. After a subsequent face to face meeting, Tesla's CEO confirmed that activation of Autopilot would be "hands on." Despite this confirmation, Autopilot was rolled out in late 2015 with a hands-free activation mode. Mobileye has made substantial efforts since then to take more control on how this project can be steered to a proper functional safety system.

    Tesla's response to the May 7 crash, wherein the company shifted blame to the camera, and later corrected and shifted blame to the radar, indicated to Mobileye that Mobileye's relationship with Tesla could not continue. Failing agreement on necessary changes in the relationship, Mobileye terminated its association with Tesla. As for Tesla's claim that Mobileye was threatened by Tesla's internal computer vision efforts, the company has little knowledge of these efforts other than an awareness that Tesla had put together a small team.

    In any event, it is Mobileye's policy not to respond to rumors or other spurious claims in the press. Mobileye has commented fully on its relationship with Tesla and will not provide further comment. Mobileye's deeply held view is that the long-term potential for vehicle automation to reduce traffic injuries and fatalities significantly is too important to risk consumer and regulatory confusion or to create an environment of mistrust that puts in jeopardy technological advances that can save lives.

