The leading edge of vehicle autonomy recently took a significant step forward with the unveiling of the world’s first hands-free autonomous vehicle – the Audi A8 – and Nissan’s ProPILOT Assist, an advanced semi-autonomous system that reduces the need for driver input in a variety of situations.
These new technology releases by Audi and Nissan boast a litany of advanced safety and autonomous driving capabilities, made possible in large part by environmental information provided by Mobileye’s EyeQ® system-on-chip (SoC) – the “brain” that interprets raw data from a forward-facing camera, enabling vehicles to “see” the road and objects ahead and plot a safe path forward. Many talented companies collaborated on these systems, but the “sight” provided by Mobileye is the foundational element.
In a world increasingly filled with press releases, the only true evidence of progress toward autonomous vehicles is what’s being sold to the public today. “Cars on the road,” given all the associated risks, do not happen without rigorous testing of the systems and validation to the highest accuracy level. Mobileye is proud to have its SoCs inside 20 million cars on the road today, with rapid innovations to the software continuously pushing the industry forward.
Audi’s A8, unveiled with a historic splash in Barcelona, features Traffic Jam Pilot, the world’s first-to-market Level 3 automated driving system, which enables hands-free navigation of stop-and-go traffic up to 35 mph on divided highways. This, along with the most advanced active safety systems (automatic emergency braking, lane keeping support, etc.), is enabled by Mobileye’s proprietary, artificial intelligence–powered EyeQ®. The vertically integrated Mobileye system-on-chip is the “eyes” of the vehicle, detecting vehicles and pedestrians, road boundaries, and free space, and reading traffic lights and traffic signs. Redundant information from radar and lidar adds to the robustness of the “eyes.” Two videos by Audi (here and here) illustrate the system well.
Traffic Jam Pilot is a collaborative endeavor. The system uses Delphi’s Multi-Domain Controller, which contains three processors in addition to EyeQ®3. Software by others, including Audi, runs on these chips. For example, an Nvidia processor hosts the driver monitoring and self-parking software, while processors by Altera and Infineon host the sensor fusion and vehicle control software.
A week after the Audi A8 debuted, Nissan invited reporters to Farmington Hills, Michigan, to experience its cutting-edge ProPILOT Assist technology. A similar system won Technology of the Year in Japan in 2016, but the first U.S. version will hit the streets in the next-generation Nissan Leaf launching this fall. Journalists from a variety of outlets hailed the smooth handling of the system, which can brake, accelerate, and steer in traffic but requires drivers to keep their hands on the wheel. Mobileye’s EyeQ® processor enables ProPILOT Assist to “see” the road and the vehicles ahead and around it. ProPILOT Assist automatically maintains a safe following distance behind other vehicles, adjusts speed accordingly, and controls steering to keep the vehicle centered in its lane. ProPILOT Assist will also bring the vehicle to a full stop in busy traffic.
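To make the behavior described above concrete, here is a deliberately simplified sketch of the two control problems such a system solves: holding a safe time gap to a lead vehicle (longitudinal control) and centering the car in its lane (lateral control). Every function name, gain, and threshold below is a hypothetical illustration, not Nissan’s or Mobileye’s actual algorithm; production systems are vastly more sophisticated.

```python
# Toy longitudinal/lateral control loop in the spirit of a driver-assist
# system. All gains and limits are illustrative assumptions.

def longitudinal_command(ego_speed, lead_distance, lead_speed,
                         set_speed, time_gap=2.0, k_gap=0.5, k_speed=0.3):
    """Return an acceleration command (m/s^2) that holds a safe time gap
    to the lead vehicle, or tracks the driver's set speed if the road is clear."""
    if lead_distance is None:                      # no vehicle ahead: plain cruise
        return k_speed * (set_speed - ego_speed)
    desired_gap = max(ego_speed * time_gap, 5.0)   # never target closer than 5 m
    gap_error = lead_distance - desired_gap
    speed_error = lead_speed - ego_speed
    accel = k_gap * gap_error / max(desired_gap, 1.0) + k_speed * speed_error
    return max(-3.0, min(accel, 1.5))              # clamp to comfortable limits


def steering_command(lane_offset, heading_error, k_offset=0.8, k_heading=1.2):
    """Return a steering correction (rad) that drives the lateral offset
    from lane center and the heading error toward zero."""
    return -(k_offset * lane_offset + k_heading * heading_error)
```

The interesting design point is the interplay: the camera-derived measurements (lead distance, lane offset) feed simple feedback laws, and clamping the acceleration is what produces the “smooth handling” journalists noticed rather than abrupt braking.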
Mobileye’s EyeQ® monitors the road using sophisticated computer vision algorithms, currently used to power Advanced Driver Assistance Systems (ADAS). Mobileye’s ADAS technology is used by most of the world’s automakers. ADAS alerts drivers to dangerous situations and can act autonomously to prevent collisions.
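One classic example of an ADAS alert is forward collision warning based on time-to-collision (TTC): how many seconds remain before the ego vehicle reaches the car ahead at the current closing speed. The sketch below shows the standard textbook computation; the warning threshold and function names are illustrative assumptions, not Mobileye’s actual implementation.

```python
# Textbook time-to-collision (TTC) computation for forward collision
# warning. Threshold value is an illustrative assumption.

def time_to_collision(distance_m, ego_speed_mps, lead_speed_mps):
    """TTC in seconds; infinite if the gap is constant or opening."""
    closing_speed = ego_speed_mps - lead_speed_mps
    if closing_speed <= 0:
        return float('inf')
    return distance_m / closing_speed


def forward_collision_warning(distance_m, ego_speed_mps, lead_speed_mps,
                              warn_threshold_s=2.7):
    """Alert the driver when TTC drops below the warning threshold."""
    ttc = time_to_collision(distance_m, ego_speed_mps, lead_speed_mps)
    return ttc < warn_threshold_s
```

The camera’s role is to supply the inputs: monocular vision systems estimate range and range-rate to the lead vehicle, and the warning logic itself is then a simple comparison.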
The set of situations that EyeQ® is able to identify continually grows, thanks to constantly improving software, higher computational processing power, and more powerful cameras. Eventually, we believe cameras will be the primary source of information used to monitor the entire driving environment, resulting in driverless cars. In the meantime, the quest for autonomous vehicles drives continuous improvement to ADAS, rapidly expanding the set of dangerous situations monitored.
New features in the latest EyeQ®3 systems include:
- Software performance advancements, including significant improvements in the distance at which vehicles and pedestrians can be detected and in night-time performance.
- Road Surface Profile: This function provides the ability to detect subtle variations in the structure of the road. The information is used by sophisticated suspension systems to provide a “magic-carpet-like” feel.
- Freespace Detection: Using semantic cues beyond simple lane markings, this function provides detailed information on the space that is available to drive. Critical for future autonomous vehicles, this function is being used today to enhance the ability of the car to take evasive action to avoid collisions, i.e., better information on free space means more options for evading a collision.
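The evasive-action point in the last item can be illustrated with a toy model. Suppose the planner represents drivable free space as a set of lateral corridors; with only lane markings it sees just the ego lane, whereas semantic free-space detection might also reveal a drivable shoulder. The representation and numbers below are illustrative assumptions, chosen only to show why richer free-space information widens the set of escape options; real systems output far richer geometry.

```python
# Toy model: free space as drivable lateral corridors, each given as
# (left_edge_m, right_edge_m) bounds relative to the ego vehicle.
# All values are illustrative assumptions.

def evasive_options(freespace_corridors, vehicle_width=1.9, margin=0.3):
    """Return the corridors wide enough to fit the vehicle plus a
    safety margin on each side."""
    needed = vehicle_width + 2 * margin
    return [c for c in freespace_corridors if (c[1] - c[0]) >= needed]


# Lane markings alone: the planner knows only the 3.6 m ego lane.
lane_only = [(-1.8, 1.8)]

# Semantic free-space detection also exposes a 2.8 m drivable shoulder.
with_freespace = [(-1.8, 1.8), (1.8, 4.6)]
```

Running `evasive_options` on the two inputs shows the effect directly: the lane-marking-only view yields a single escape corridor, while the free-space view yields two, doubling the planner’s options when an obstacle suddenly blocks the ego lane.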