
April 24, 2023

Autonomous Vehicle Questions Get the Answers They Deserve

Today we’re answering some of your most pressing questions on our self-driving technology, as we advance towards the future of autonomous vehicles.

Though the advanced technologies behind autonomous vehicles are highly complex, Mobileye strives to make them as transparent as possible.

Mobileye periodically releases footage of our autonomous vehicles (AVs) out testing around the world. We’ve published numerous videos of our camera-only developmental AVs driving in locations across Asia, Europe, and North America, and most recently released another showing our Mobileye Drive™ test vehicle (with cameras, radars, and lidars) navigating the complex streets of Jerusalem at night.

The videos show vehicles equipped with our self-driving technologies maneuvering in real-world conditions, alongside actual traffic, in densely packed city centers, tackling the same challenges a human driver would face. The unvarnished glimpse they provide into our technologies at work has made them some of our most popular videos. They’ve also sparked some excellent questions across various online platforms, and today we’re answering some of them.

Question: In what locations and under what conditions is Mobileye testing its autonomous-vehicle technologies?

Answer: Our testing of autonomous vehicles began (and continues) in our hometown of Jerusalem, Israel. The city presents a particularly challenging set of driving conditions, including narrow streets, heavy stop-and-go traffic, a large volume of pedestrians (including lots of baby strollers), frequent jaywalking, ongoing roadworks, and often-aggressive, hurried drivers.

These conditions make Jerusalem an excellent testing environment, but we’re not limiting ourselves to that one location. Over the past few years, we’ve expanded testing to other environments around the world, including Tokyo, Shanghai, Paris, Munich, Stuttgart, Detroit, New York, and Miami. We’re testing on a variety of road types, both during the day and at night, and in a multitude of driving conditions.

A Mobileye autonomous development vehicle opposite the Eiffel Tower, pictured while undergoing testing in Paris.

Key to our geographic scalability is our Road Experience Management™ (REM™) technology. REM crowdsources data from millions of vehicles around the globe equipped with Mobileye technology, and creates a map of all the roads they’re traveling to inform the AV about what to “expect” on those roads.
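As a rough illustration of the crowdsourcing idea (the data format and names below are hypothetical, for explanation only, and not the actual REM map structure), many individual observations of the same road segment can be aggregated into a consensus value that an AV can query before it arrives:

    # Toy sketch of a crowdsourced map layer: drive data harvested from many
    # production vehicles is aggregated per road segment, and the AV queries
    # the result to know what to "expect" ahead. Names are illustrative.

    from collections import defaultdict

    # Lane-marking observations reported by different vehicles over time.
    observations = [
        ("segment_42", "lane_count", 2),
        ("segment_42", "lane_count", 2),
        ("segment_42", "lane_count", 3),   # one noisy report
    ]

    aggregated = defaultdict(list)
    for segment, attribute, value in observations:
        aggregated[(segment, attribute)].append(value)

    # The map stores the consensus value for each attribute of each segment.
    road_map = {key: max(set(vals), key=vals.count) for key, vals in aggregated.items()}

    # An AV approaching segment_42 primes itself with the expected lane count.
    print(road_map[("segment_42", "lane_count")])   # -> 2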

By undertaking such rigorous testing, we aim to better prepare our self-driving solutions to handle whatever conditions they may face out there on the road – regardless of the location, driving environment, weather, or other conditions in which they’re operating.

Q: How does the autonomous vehicle handle narrow streets where visibility may be limited?

A: Just as a human driver would instinctively drive more cautiously on a narrow urban street where pedestrians might suddenly jump onto the road, so must an AV make similar assumptions and be cautious in such areas.

As you can see in the video below, the AV is following the fourth rule of our Responsibility-Sensitive Safety™ model (RSS™) – the framework on which its driving policy is based. RSS rule #4 instructs the vehicle to always be cautious in areas of limited visibility.
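To make the idea concrete, here is a minimal illustrative sketch (our own simplification for this post, not Mobileye’s actual driving-policy code) of what “be cautious in areas of limited visibility” can mean in practice: the vehicle caps its speed so that it could still come to a stop within the distance it can actually see, using assumed response-time and braking values.

    # Illustrative sketch only: cap speed so the vehicle can stop within the
    # distance it can currently see. Parameter values are assumptions for the
    # example, not calibrated figures.

    def max_safe_speed(visible_distance_m: float,
                       response_time_s: float = 0.5,
                       max_braking_decel_mps2: float = 4.0) -> float:
        """Highest speed (m/s) from which the vehicle can still stop
        within visible_distance_m, allowing for its response time."""
        t, a, d = response_time_s, max_braking_decel_mps2, visible_distance_m
        # Solve d = v*t + v^2 / (2*a) for v.
        return a * (-t + (t * t + 2.0 * d / a) ** 0.5)

    # Parked cars occlude the curb 12 m ahead on a narrow street:
    print(round(max_safe_speed(12.0) * 3.6, 1), "km/h")  # ~28.8 km/h

The shorter the unobstructed sight line, the lower the allowable speed, which matches the cautious behavior you can see in the narrow-street clips.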

Q: How does the AV handle inclement weather conditions?

A: Visibility on the road can be compromised by a long list of weather events, including heavy rain, snow, fog, strong crosswinds, and more. We prepare our autonomous vehicles for such conditions by equipping them with a full array of sensor types so that they’ll be able to detect and handle whatever weather they encounter. The sensor inputs allow an AV to determine, for example, whether it should reduce speed and proceed with greater caution due to weather conditions.

Importantly, the AV featured in the Jerusalem night-drive video incorporates a radar/lidar subsystem in addition to a camera-based subsystem. We call this sensing approach True Redundancy™: the two subsystems operate independently of one another. So, if bad weather limits the cameras’ visibility, the vehicle can still operate safely and effectively on radar and lidar alone, since these active sensors are not impeded by bad weather in the same way cameras are.
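The fallback logic can be pictured roughly as follows. This is a conceptual sketch with hypothetical names, not Mobileye’s software: each subsystem independently builds its own model of the surroundings, and if one reports degraded performance, the driving policy can continue on the other.

    # Conceptual sketch of independent sensing subsystems (names and structure
    # are assumptions for illustration, not Mobileye's API).

    from dataclasses import dataclass, field

    @dataclass
    class EnvironmentModel:
        source: str                                  # "camera" or "radar_lidar"
        objects: list = field(default_factory=list)  # detected road users
        healthy: bool = True                         # subsystem self-reports reliability

    def select_world_model(cam: EnvironmentModel,
                           radar_lidar: EnvironmentModel) -> EnvironmentModel:
        """Pick the environment model the driving policy should act on."""
        if cam.healthy and radar_lidar.healthy:
            # Normal case: two independently built models of the same scene.
            return cam  # (in practice both are consumed; simplified here)
        # Degraded case, e.g. heavy rain limiting the cameras:
        return radar_lidar if radar_lidar.healthy else cam

    # Example: cameras report low confidence in a downpour.
    cam = EnvironmentModel("camera", objects=[], healthy=False)
    rl = EnvironmentModel("radar_lidar", objects=["pedestrian", "car"])
    print(select_world_model(cam, rl).source)   # -> radar_lidar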

A compressed-air system also keeps the lenses of our AV’s cameras free of the dirt, grime, rain, snow, and ice they might pick up from the driving environment.

A Zeekr 001 with Mobileye SuperVision testing our camera-only self-driving system in the snow in Detroit.

The REM maps mentioned earlier also provide a rich additional layer of information on the driving environment, supplementing what the vehicle’s onboard sensors pick up, which is especially useful if any of the sensors are impaired by reduced-visibility conditions.

In the unlikely event that the AV determines for any reason that it cannot proceed safely, it’s programmed to either pull over to the side of the road and stop (if equipped with an eyes-off/hands-off or driverless system), or slow down and remain within its lane (in a camera-only, eyes-on/hands-off system).
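In pseudocode-like form (again, an illustrative sketch rather than the production logic), that decision looks roughly like this:

    # Illustrative sketch of the fallback behavior described above; the logic
    # and names are assumptions, not Mobileye code.

    from enum import Enum, auto

    class SystemType(Enum):
        DRIVERLESS_OR_EYES_OFF = auto()   # e.g. full camera/radar/lidar suite
        CAMERA_ONLY_EYES_ON = auto()      # driver remains responsible

    def fallback_maneuver(system: SystemType) -> str:
        """Choose the minimal-risk maneuver when the AV cannot proceed safely."""
        if system is SystemType.DRIVERLESS_OR_EYES_OFF:
            return "pull over to the roadside and come to a stop"
        return "reduce speed and hold the current lane while the driver takes over"

    print(fallback_maneuver(SystemType.CAMERA_ONLY_EYES_ON))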


These are some of the excellent questions we’ve received, and we hope the answers we’ve provided here have helped you better understand how our self-driving technologies work. For a closer look at each of the unedited autonomous drive videos captured in locations around the world, watch the videos in the playlist below.
