[{"data":1,"prerenderedAt":2464},["ShallowReactive",2],{"postDataBlog_three-reasons-camera-first-adas-enables-scalable-automated-driving":3,"featuredPosts":18},[4],{"type":5,"url":6,"title":7,"description":8,"meta_description":8,"primary_tag":9,"author_name":10,"is_hidden":11,"lang":12,"image":13,"img_alt":14,"content":15,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"publish_date":17},"blog","three-reasons-camera-first-adas-enables-scalable-automated-driving","Three reasons camera-first ADAS enables scalable automated driving","Many of today’s automated driving capabilities begin with something surprisingly simple: a camera.",4,"",0,"en","https://static.mobileye.com/website/us/corporate/images/573bcfd215b643fabfb9d60a78d2d4cf_1776346572665.png","Cameras are cost-effective sensors to produce","\u003Cp>Today's advanced driver assistance systems (ADAS) have evolved in leaps and bounds, and more drivers are beginning to get a taste of autonomous driving. But it can be easy to forget that much of the driver assist technology Mobileye provides begins with a camera.\u003C/p>\n\u003Cp>While Mobileye products are integrated with Lidar and radar as part of a comprehensive sensing system, its camera-first approach is the foundation for scalable driver-assistance and autonomous capabilities.\u003C/p>\n\u003Cp>We dive into three reasons why Mobileye's camera-first approach is foundational to robust and scalable driver-assistance and autonomous technology and its many use cases.\u003C/p>\n\u003Ch3>\u003Cstrong>1. Camera sensors mimic human vision in ADAS systems\u003C/strong>\u003C/h3>\n\u003Cp>Road infrastructure is inherently designed for the human eye. 
We're the ones driving, after all. Automotive cameras are designed to capture traffic lights, road signs, lane markings, and road users, visual information that cameras are uniquely positioned to interpret, complementing what lidar and radar sensors provide.\u003C/p>\n\u003Cp>In fact, with as little as a single forward-facing camera and optimized architecture, a vehicle can support ADAS features such as lane departure warning and front pedestrian detection, and even \u003Ca href=\"https://www.mobileye.com/blog/what-is-fmvss-127-and-how-can-mobileye-help-automakers-comply/\">meet core safety requirements\u003C/a>.\u003C/p>\n\u003Cp>Multiple-camera configurations go a step further, designed to support heightened detection and form the basis of a broader automotive perception system capable of detecting \u003Ca href=\"https://www.euroncap.com/en/car-safety/the-ratings-explained/vulnerable-road-user-vru-protection/aeb-pedestrian/\">surrounding cyclists, pedestrians, and merging vehicles\u003C/a>.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/27a28e92c80a34ee136d43235b8018ab_1776356029022.png\" alt=\"\" width=\"1200\" height=\"675\" />\u003C/p>\n\u003Cp>Mobileye platforms support different camera setups, from a single forward-facing camera in basic driver assistance systems to multi-camera surround systems in more advanced platforms. A typical surround setup includes a high-resolution front camera that looks ahead, along with additional cameras placed around the vehicle. Together, these cameras help support highway driving, blind-spot monitoring, lane changes, and low-speed maneuvers such as parking.\u003C/p>\n\u003Ch3>\u003Cstrong>2. Cameras are affordable and suited for mass-scale production\u003C/strong>\u003C/h3>\n\u003Cp>Vehicle production is a cost-sensitive business. 
Cameras are cost-effective to produce, especially compared with other sensor types, making them well suited for mass production.\u003C/p>\n\u003Cp>Cameras offer a relatively simple path to high-volume deployment across millions of vehicles, even in multi-camera configurations like \u003Ca href=\"https://www.mobileye.com/blog/how-surround-adas-delivers-the-new-standard-of-safety-and-tech/\">Mobileye Surround ADAS&trade;\u003C/a> and \u003Ca href=\"https://www.mobileye.com/solutions/super-vision/\">SuperVision&trade;\u003C/a>, or even fully autonomous multi-sensor systems like \u003Ca href=\"https://www.mobileye.com/solutions/chauffeur/\">Chauffeur&trade;\u003C/a> and \u003Ca href=\"https://www.mobileye.com/solutions/drive/\">Drive&trade;\u003C/a>.\u003C/p>\n\u003Cp>By adding cameras and the needed compute, automakers can scale capability across vehicle segments while maintaining a cost structure suitable for mass-market vehicles.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/3c17b9e26e2dbf729d99070b6d1dc101_1776356049904.jpg\" alt=\"\" width=\"1200\" height=\"565\" />\u003C/p>\n\u003Ch3>\u003Cstrong>3. Vision-first perception creates a strong system foundation efficiently\u003C/strong>\u003C/h3>\n\u003Cp>Advanced driver-assistance technology requires an extensive technology stack, and an important question is how to build those capabilities in the most effective and efficient way. Mobileye designs its systems so that vision sits at the center of how the vehicle understands the road.\u003C/p>\n\u003Cp>Cameras capture the driving environment, and AI-based perception systems interpret that visual data to identify vehicles, pedestrians, lane markings, traffic signals, and other critical road elements, creating a rich understanding of the driving scene. 
Combined with advanced AI, the system can reconstruct three-dimensional depth using cameras positioned around the vehicle, even when those cameras are not configured as traditional stereoscopic pairs.\u003C/p>\n\u003Cp>Once the scene is understood, other parts of the system, such as mapping, driving decisions, and safety models, use that information to help guide safe and intelligent driving.\u003C/p>\n\u003Ch3>\u003Cstrong>Real-world experience above all\u003C/strong>\u003C/h3>\n\u003Cp>What transforms raw visual input into a coherent understanding of the driving environment is Mobileye's proprietary \u003Ca href=\"https://www.mobileye.com/technology/eyeq-chip/\">EyeQ&trade;\u003C/a> system-on-chip.\u003C/p>\n\u003Cp>But in the end, scalable driver-assistance systems begin with perception. By placing vision at the center of the system, Mobileye&rsquo;s camera-first architecture provides a practical path from basic safety features to increasingly capable automated 
driving.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>",null,"2026-04-16T07:00:00.000Z",[19,22,35,46,57,67,79,88,98,108,117,127,135,146,156,166,176,185,196,207,216,227,237,246,256,266,275,285,295,305,314,323,333,344,353,363,372,382,392,403,413,423,435,445,455,465,475,484,494,504,514,524,534,544,554,564,574,584,594,604,613,623,633,642,652,664,674,684,694,703,713,723,733,743,753,763,772,782,791,801,810,820,830,840,850,860,870,880,890,899,909,919,929,940,949,960,969,979,988,998,1007,1017,1027,1037,1047,1057,1067,1076,1086,1096,1105,1115,1124,1134,1145,1155,1165,1176,1185,1195,1204,1214,1224,1233,1243,1253,1262,1271,1280,1289,1299,1308,1318,1326,1336,1345,1353,1362,1373,1381,1389,1398,1406,1415,1424,1434,1443,1452,1462,1471,1480,1490,1499,1508,1518,1528,1537,1547,1557,1567,1576,1584,1592,1601,1608,1617,1626,1636,1644,1654,1663,1673,1682,1692,1701,1709,1719,1728,1738,1747,1757,1766,1776,1785,1794,1804,1813,1823,1831,1839,1849,1859,1869,1879,1888,1898,1907,1916,1925,1934,1943,1952,1962,1971,1981,1989,1999,2008,2017,2026,2035,2044,2053,2063,2072,2081,2091,2101,2109,2119,2128,2138,2148,2158,2168,2177,2185,2195,2202,2211,2219,2229,2238,2247,2254,2265,2275,2284,2294,2304,2313,2323,2332,2340,2350,2359,2369,2378,2386,2396,2403,2409,2418,2428,2438,2445,2455],{"id":20,"type":5,"url":6,"title":7,"description":8,"primary_tag":9,"author_name":10,"is_hidden":11,"lang":12,"meta_description":8,"image":13,"img_alt":14,"content":15,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":17,"tags":21},322,"ADAS, 
Industry",{"id":23,"type":24,"url":25,"title":26,"description":27,"primary_tag":28,"author_name":10,"is_hidden":11,"lang":12,"meta_description":27,"image":29,"img_alt":30,"content":31,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":32,"publish_date":33,"tags":34},321,"news","mobileye-secures-major-dms-production-program-with-leading-us-automaker","Mobileye Secures Major DMS Production Program with Leading U.S. Automaker","New win extends Mobileye’s in-cabin sensing momentum as platform gains traction with global OEMs ",16,"https://static.mobileye.com/website/us/corporate/images/8bfc2d66bf2b488ba43e4ea74cf956e3_1774268681384.png","Mobileye DMS™ correlates driver gaze and attention with real-time road context to detect distraction more intelligently ","\u003Cp>\u003Cstrong>\u003Cspan data-contrast=\"auto\">JERUSALEM,&nbsp;March&nbsp;23, 2026\u003C/span>\u003C/strong>\u003Cspan data-contrast=\"auto\"> &mdash; Mobileye (Nasdaq: MBLY) today announced&nbsp;that a leading U.S.&nbsp;automaker will integrate&nbsp;the&nbsp;Mobileye&nbsp;Driver Monitoring System\u003C/span>\u003Cspan data-contrast=\"none\">&trade;\u003C/span>\u003Cspan data-contrast=\"auto\">&nbsp;(Mobileye&nbsp;DMS)&nbsp;into future vehicles&nbsp;equipped with Mobileye's EyeQ6L system-on-chip, with start of production targeted for 2027.&nbsp;The&nbsp;newly awarded win&nbsp;expands the scope and feature set of an&nbsp;existing ADAS&nbsp;program&nbsp;and&nbsp;is&nbsp;expected&nbsp;to&nbsp;span&nbsp;millions of vehicles across multiple models and model years.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Mobileye&rsquo;s in-cabin sensing platform includes both DMS and Occupant Monitoring (OMS), running alongside ADAS&nbsp;perception&nbsp;on a single chip. 
By unifying interior sensing with exterior road perception, the platform&nbsp;is designed to&nbsp;evaluate&nbsp;driver engagement in the context of the driving environment&nbsp;&ndash;&nbsp;in order to&nbsp;assess&nbsp;not just whether a&nbsp;driver is&nbsp;alert, but&nbsp;where they are looking and&nbsp;whether their attention corresponds with what is happening on the road.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">The&nbsp;new&nbsp;program builds on previously secured wins, including&nbsp;Mobileye&nbsp;DMS and OMS integrated into EyeQ6H-based&nbsp;SuperVision\u003C/span>\u003Cspan data-contrast=\"none\">&trade;\u003C/span>\u003Cspan data-contrast=\"auto\">&nbsp;and Surround ADAS\u003C/span>\u003Cspan data-contrast=\"none\">&trade;\u003C/span>\u003Cspan data-contrast=\"auto\">&nbsp;programs&nbsp;with a global automaker.&nbsp;Together, these programs reflect growing OEM demand for&nbsp;the&nbsp;consolidation&nbsp;of&nbsp;driver monitoring, occupant&nbsp;safety, and advanced driving functions &ndash;&nbsp;eliminating&nbsp;the cost and complexity of a separate DMS ECU.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">&ldquo;The next generation of intelligent driving demands richer context from every part of the vehicle &ndash; the road ahead, the cabin, and the interplay between them. At the same time, automakers are looking to scale advanced driving features across their lineups without the cost penalty of additional hardware or complex system integration. Mobileye DMS delivers on both &ndash; running context-aware driver monitoring on a single ADAS chip and ECU platform. 
This combination is something Mobileye is uniquely positioned for, and we look forward to helping our customers deploy at scale.&rdquo; &ndash; Nimrod Nehushtan, EVP, Business Development and Strategy.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">As hands-off driving expands beyond premium vehicles,&nbsp;ensuring a driver is genuinely engaged with the road&nbsp;is increasingly important&nbsp;for&nbsp;safe deployment. Mobileye DMS&nbsp;is designed to&nbsp;correlate&nbsp;driver&nbsp;gaze with real-world road conditions from ADAS&nbsp;cameras,&nbsp;to&nbsp;catch&nbsp;distraction that&nbsp;cabin-only systems&nbsp;may miss and&nbsp;recognize&nbsp;when the driver is already aware.&nbsp;The&nbsp;intended&nbsp;result is fewer false alerts, more precise interventions, and for higher levels of autonomy, smarter takeover requests tuned to driver engagement.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">The&nbsp;platform is&nbsp;intended&nbsp;to support&nbsp;Euro NCAP 2026 scoring requirements and&nbsp;is designed to address&nbsp;the&nbsp;potential evolution of the&nbsp;Euro NCAP&nbsp;2029&nbsp;protocol,&nbsp;which&nbsp;is&nbsp;expected to raise&nbsp;the benchmark from eye tracking to meaningful engagement detection.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong>\u003Cspan data-contrast=\"auto\">About Mobileye\u003C/span>\u003C/strong>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Mobileye (Nasdaq: MBLY) leads the mobility revolution with our 
autonomous driving and driver-assistance technologies, harnessing world-renowned&nbsp;expertise&nbsp;in artificial intelligence, computer vision and integrated software and hardware. Since our founding in 1999, Mobileye has enabled the global adoption of advanced driver-assistance systems that save countless lives and reduce crashes, while pioneering groundbreaking technologies such as REM&trade; crowdsourced road intelligence, Imaging Radar and Compound AI. These technologies drive the ADAS and AV fields towards the future of mobility &ndash; enabling self-driving vehicles and mobility solutions at scale, and powering industry-leading ADAS products. Through 2025, about 230 million vehicles worldwide have been built with Mobileye&rsquo;s&nbsp;EyeQ&nbsp;technology inside, and in 2026 Mobileye&nbsp;acquired&nbsp;Mentee Robotics to pursue the future of physical AI and humanoid robots. Since 2022, Mobileye has been listed independently from Intel (Nasdaq: INTC), which&nbsp;retains&nbsp;majority ownership. For more information, visit&nbsp;\u003C/span>\u003Ca href=\"https://cts.businesswire.com/ct/CT?id=smartlink&amp;url=https%3A%2F%2Fwww.mobileye.com&amp;esheet=54421863&amp;newsitemid=20260210772101&amp;lan=en-US&amp;anchor=https%3A%2F%2Fwww.mobileye.com&amp;index=1&amp;md5=cfa04b95770b8044fedb93639c4fc56e\">\u003Cspan data-contrast=\"auto\">https://www.mobileye.com\u003C/span>\u003C/a>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Mobileye's name, product names, product&nbsp;marks&nbsp;and logos are trademarks or registered trademarks of Mobileye Vision Technologies Limited. 
Other names and brands are the property of their respective owners.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Any reference to \"Mobileye\" in this document means Mobileye Vision Technologies Ltd., Mobileye Global&nbsp;Inc.&nbsp;or any of their subsidiaries.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:240}\">&nbsp;\u003C/span>\u003C/p>",1,"2026-03-23T07:00:00.000Z","ADAS, News, Autonomous Driving, AV Safety",{"id":36,"type":5,"url":37,"title":38,"description":39,"primary_tag":40,"author_name":10,"is_hidden":11,"lang":12,"meta_description":39,"image":41,"img_alt":42,"content":43,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":44,"tags":45},320,"from-neighborhoods-to-networks-how-grorud-valleys-maas-enriched-public-transit","From neighborhoods to networks: MaaS in Oslo's Grorud Valley","A look at how autonomous MaaS could help strengthen existing public transport networks and support first  and last mile connectivity.",6,"https://static.mobileye.com/website/us/corporate/images/2d61d6b749adeef7c328028aea90f7d2_1773837732827.jpg","Mobileye enabled the technology by delivering the autonomous driving functionality.","\u003Cp>Residents of Oslo&rsquo;s Grorud Valley, a quieter residential district within Oslo&rsquo;s metropolitan area, have experienced firsthand what autonomous \u003Ca href=\"https://www.mobileye.com/solutions/drive/\" target=\"_blank\" rel=\"noopener\">Mobility-as-a-Service (MaaS)\u003C/a> looks like in practice through participation in a limited pilot program.\u003C/p>\n\u003Cp>Through a small-scale \u003Ca href=\"https://ruter.no/en/projects-and-new-development/selvkjoringspiloten\" target=\"_blank\" rel=\"noopener\">pilot\u003C/a>, Mobileye, in collaboration 
with key regional partners, explored how autonomous vehicle (AV) solutions can help expand mobility access for suburban residents to major transit hubs, operate under harsh winter conditions, and support Europe&rsquo;s broader AV mobility vision.\u003C/p>\n\u003Ch3>\u003Cstrong>Building upon existing public transit infrastructure\u003C/strong>\u003C/h3>\n\u003Cp>Like many European cities, Oslo has a vast and robust public transit network. Sustainable and innovative mobility is strong across the region, but step outside the city center and the story begins to shift. In smaller villages and outer suburbs, buses run less frequently, and connections become limited. In some areas, this has made \u003Ca href=\"https://www.polisnetwork.eu/article/transport-poverty-on-the-agenda/\" target=\"_blank\" rel=\"noopener\">freedom of mobility more challenging\u003C/a>. Active modes of transportation like walking or biking aren't always safe or practical, and shared mobility solutions are often lacking, which can increase the cost and complexity of commuting to nearby transit hubs.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/c548d129deb6b4fb9ac1a3d8342e07f3_1773838037929.jpg\" alt=\"\" width=\"1200\" height=\"565\" />\u003C/p>\n\u003Cp>To tackle this obstacle, Mobileye collaborated with key bodies to bring a pilot‑scale AV ecosystem to the area and integrate it into the city&rsquo;s existing transport network, marking an exploratory step toward Europe&rsquo;s broader automated mobility vision.\u003C/p>\n\u003Cp>The pilot aimed to connect residents more seamlessly to public transit, enabling travel to work, schools, and shopping centers through an efficient, autonomous, and sustainable ride‑pooling service.\u003C/p>\n\u003Ch3>\u003Cstrong>What does a Mobileye, Holo and Ruter ride‑sharing platform look like?\u003C/strong>\u003C/h3>\n\u003Cp>The ride-pooling platform 
\u003Ca href=\"https://www.autonews.com/suppliers/nio-mobileye-autonomous-driving-tech-nears-norway-debut/\" target=\"_blank\" rel=\"noopener\">pilot\u003C/a>, launched in January 2023 in collaboration with Oslo's public transit authority Ruter and fleet management provider \u003Ca href=\"https://www.letsholo.com/\" target=\"_blank\" rel=\"noopener\">Holo\u003C/a>, operated between designated pick-up and drop-off (PUDO) points.\u003C/p>\n\u003Cp>Residents used an app-based platform to request shared rides to key mobility hubs, including train stations, schools, supermarkets, and community centers. Each vehicle operated autonomously within a defined operational design domain under the supervision of a safety driver. The pilot deployed six NIO first‑generation development vehicles powered by Mobileye Drive&trade;.\u003C/p>\n\u003Ch3>\u003Cstrong>Autonomous driving in Nordic winter conditions\u003C/strong>\u003C/h3>\n\u003Cp>One significant differentiator, particularly in the Nordic region, is the presence of extremely harsh winters, characterized by long periods of snow, ice, and low visibility, not always conducive to safe driving. 
However, during the pilot period, Mobileye Drive&trade;, with its sensing and AI capabilities, kept the service running throughout the winter months, with no PUDOs deactivated due to weather conditions.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/af0a6131d2fe4999f482329344e831d0_1773848447547.jpg\" alt=\"\" width=\"1200\" height=\"565\" />\u003C/p>\n\u003Ch3>\u003Cstrong>Demonstrating the potential of scalable AV MaaS\u003C/strong>\u003C/h3>\n\u003Cp>While Holo and Ruter focused on service design and PUDO distribution to ensure dense neighborhood coverage, Mobileye enabled the technology by delivering the autonomous driving functionality required to support pickup and drop‑off operations in real‑world conditions.\u003C/p>\n\u003Cp>The service was used by participating residents to connect to existing transit, and usage was notably observed among students traveling between education centers, train stations, supermarkets, and shopping hubs.\u003C/p>\n\u003Cp>This initial pilot hence demonstrated two key findings: introducing AV MaaS into an existing public transport network may help improve service quality for passengers, and, as the technology continues to develop, the approach has the potential to scale to other areas and cities where improvements in public transport are being explored.\u003C/p>\n\u003Ch3>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/a9b19cc068613ed91093d9e710cc5f9a_1773848480238.jpg\" alt=\"\" width=\"1200\" height=\"565\" />\u003C/h3>\n\u003Ch3>\u003Cstrong>How does this fit into the broader European vision?\u003C/strong>\u003C/h3>\n\u003Cp>The pilot project has informed discussions around further AV deployments in the area. 
Recently, Ruter and Holo announced a \u003Ca href=\"https://www.mobileye.com/news/ruter-and-holo-choose-moia-with-mobileye-drive-for-next-stage-of-avs/\" target=\"_blank\" rel=\"noopener\">collaboration with MOIA\u003C/a> to bring the ID. Buzz AD to Oslo&mdash;an autonomous vehicle powered by Mobileye Drive&trade;.\u003C/p>\n\u003Cp>The Mobileye‑powered vehicle is expected to be equipped with four \u003Ca href=\"https://www.mobileye.com/technology/eyeq-chip/\" target=\"_blank\" rel=\"noopener\">EyeQ\u003C/a> processors, \u003Ca href=\"https://www.mobileye.com/technology/rem/\" target=\"_blank\" rel=\"noopener\">REM&trade;\u003C/a>, and a comprehensive sensing suite to support autonomous operation and continuous road data generation.\u003C/p>\n\u003Cp>Beyond Oslo's Grorud Valley early‑stage pilot and any upcoming phases, Mobileye&rsquo;s broader vision is to support autonomous services that aim to benefit communities and contribute to the transformation of mobility in areas where enhanced transportation access is needed.\u003C/p>","2026-03-18T07:00:00.000Z","Driverless MaaS, Autonomous Driving",{"id":47,"type":5,"url":48,"title":49,"description":50,"primary_tag":51,"author_name":10,"is_hidden":11,"lang":12,"meta_description":50,"image":52,"img_alt":53,"content":54,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":55,"tags":56},319,"what-are-some-common-misconceptions-in-the-av-industry","What are some common misconceptions in the AV industry? ","This blog aims to bring clarity in this rapidly evolving field.  
",13,"https://static.mobileye.com/website/us/corporate/images/2f4915cf46ad5205cd886cc1e4b2da45_1771416842474.jpg","Autonomous driving: three quick truths","\u003Cp>\u003Cspan data-contrast=\"none\">From competing technical approaches to autonomous driving, differing driving styles, and questions around mass-market viability, misleading narratives can circulate&nbsp;fast.&nbsp;This blog lays&nbsp;out&nbsp;what we see as common misconceptions&nbsp;in the&nbsp;industry and&nbsp;aims&nbsp;to bring clarity&nbsp;in&nbsp;this&nbsp;rapidly evolving field.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:270,&quot;335559739&quot;:270}\">&nbsp;\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>\u003Cspan data-contrast=\"auto\">Question:&nbsp;Is there a single &ldquo;correct&rdquo; AI approach for autonomous driving?\u003C/span>\u003C/strong>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:270,&quot;335559739&quot;:270}\">&nbsp;\u003C/span>\u003C/h3>\n\u003Cul>\n\u003Cli aria-setsize=\"-1\" data-leveltext=\"\" data-font=\"Symbol\" data-listid=\"2\" data-list-defn-props=\"{&quot;335552541&quot;:1,&quot;335559685&quot;:720,&quot;335559991&quot;:360,&quot;469769226&quot;:&quot;Symbol&quot;,&quot;469769242&quot;:[8226],&quot;469777803&quot;:&quot;left&quot;,&quot;469777804&quot;:&quot;&quot;,&quot;469777815&quot;:&quot;hybridMultilevel&quot;}\" data-aria-posinset=\"1\" data-aria-level=\"1\">\u003Cstrong>\u003Cspan data-contrast=\"auto\">Myth: There is a single AI approach for AVs&nbsp;\u003C/span>\u003C/strong>\u003C/li>\n\u003C/ul>\n\u003Cul>\n\u003Cli aria-setsize=\"-1\" data-leveltext=\"\" data-font=\"Symbol\" data-listid=\"2\" data-list-defn-props=\"{&quot;335552541&quot;:1,&quot;335559685&quot;:720,&quot;335559991&quot;:360,&quot;469769226&quot;:&quot;Symbol&quot;,&quot;469769242&quot;:[8226],&quot;469777803&quot;:&quot;left&quot;,&quot;469777804&quot;:&quot;&quot;,&quot;469777815&quot;:&quot;hybridMultilevel&quot;}\" data-aria-posinset=\"2\" 
data-aria-level=\"1\">\u003Cstrong>\u003Cspan data-contrast=\"auto\">Fact: There's a lot of value in a blended approach. \u003C/span>\u003C/strong>\u003Cspan data-contrast=\"auto\">Artificial intelligence is at the core of certain advanced ADAS features and autonomous driving technologies. However, as technology evolves, so does the debate around which approach serves autonomous driving. \u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:270,&quot;335559739&quot;:270}\">&nbsp;\u003C/span>\u003C/li>\n\u003C/ul>\n\u003Cp>\u003Cspan data-contrast=\"auto\">One school of thought is that AI needs direct, raw sensor inputs and data (like camera or radar data) to learn driving functions. One such approach is end-to-end AI. This means that the more data that is fed into the system, the better the result. This approach to AI has the potential to enable highly adaptive and human-like driving behavior when trained on large volumes of diverse driving data. However, it also comes with challenges, as the decision-making process can be difficult to interpret, validate, or modify, complicating deployment across different geographies and regulatory frameworks. \u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:270,&quot;335559739&quot;:270}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Others adopt a modular, or &ldquo;compound AI,&rdquo; architecture that separates perception, decision-making, and planning into distinct layers. When designed for safety and predictability, this redundancy-based approach tends to align more naturally with regulatory requirements. 
\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">In this way,&nbsp;Mobileye&rsquo;s&nbsp;compound&nbsp;AI&nbsp;approach breaks autonomy into clearly defined components,&nbsp;with&nbsp;modular design, multiple independent sensing modalities, and layered redundancy,&nbsp;\u003C/span>\u003Cspan data-contrast=\"none\">while utilizing the right tool and optimizing each discrete task\u003C/span>\u003Cspan data-contrast=\"auto\">.&nbsp;The result is AI behavior that&nbsp;remains&nbsp;explainable, tunable,&nbsp;and designed&nbsp;for validation.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:270,&quot;335559739&quot;:270}\">&nbsp;\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>\u003Cspan data-contrast=\"auto\">Question:&nbsp;Do all autonomous vehicles drive the same?\u003C/span>\u003C/strong>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/h3>\n\u003Cul>\n\u003Cli aria-setsize=\"-1\" data-leveltext=\"\" data-font=\"Symbol\" data-listid=\"3\" data-list-defn-props=\"{&quot;335552541&quot;:1,&quot;335559685&quot;:720,&quot;335559991&quot;:360,&quot;469769226&quot;:&quot;Symbol&quot;,&quot;469769242&quot;:[8226],&quot;469777803&quot;:&quot;left&quot;,&quot;469777804&quot;:&quot;&quot;,&quot;469777815&quot;:&quot;hybridMultilevel&quot;}\" data-aria-posinset=\"1\" data-aria-level=\"1\">\u003Cstrong>\u003Cspan data-contrast=\"auto\">Myth: There is no distinction between AVs\u003C/span>\u003C/strong>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/li>\n\u003C/ul>\n\u003Cul>\n\u003Cli aria-setsize=\"-1\" data-leveltext=\"\" data-font=\"Symbol\" data-listid=\"3\" 
data-list-defn-props=\"{&quot;335552541&quot;:1,&quot;335559685&quot;:720,&quot;335559991&quot;:360,&quot;469769226&quot;:&quot;Symbol&quot;,&quot;469769242&quot;:[8226],&quot;469777803&quot;:&quot;left&quot;,&quot;469777804&quot;:&quot;&quot;,&quot;469777815&quot;:&quot;hybridMultilevel&quot;}\" data-aria-posinset=\"2\" data-aria-level=\"1\">\u003Cstrong>\u003Cspan data-contrast=\"auto\">Fact: Driving behavior can be customized and&nbsp;tuned,&nbsp;allowing&nbsp;carmakers to&nbsp;reflect brand identity, regional driving styles, and design intent\u003C/span>\u003C/strong>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/li>\n\u003C/ul>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Not all AVs are created equal, nor should they be. While core functions of autonomous driving like safety and traffic law compliance are foundational, autonomous driving policy shapes how a vehicle behaves in real-world scenarios. AVs and hands-free systems are expected not only to stop smoothly at a red light or navigate a highway, but also to reflect brand personality and local driving culture. \u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">The way a vehicle drives can and should represent a brand&rsquo;s signature style: some might design for frequent lane changes, others might prioritize smoother merging or opt for more assertive driving in dense traffic, all while adhering to regulatory requirements and end-user expectations.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Some car makers might believe that they&nbsp;must&nbsp;develop AV&nbsp;technology&nbsp;and&nbsp;driving&nbsp;policy in-house&nbsp;in order to&nbsp;have control over the driving experience.\u003C/span>\u003Cstrong>\u003Cspan data-contrast=\"auto\">&nbsp;\u003C/span>\u003C/strong>\u003Cspan data-contrast=\"auto\">However, by doing so, they might risk missing their time-to-market expectations and require a higher investment. 
Mobileye DXP&trade; bridges that gap, allowing automakers to fine-tune key aspects of the driving experience, while leveraging Mobileye&rsquo;s modular and scalable stack, making it possible for OEMs to develop advanced functions fast. \u003C/span>&nbsp;\u003C/p>\n\u003Ch3>\u003Cstrong>\u003Cspan data-contrast=\"auto\">Question:&nbsp;Is hands-off driving&nbsp;only&nbsp;for luxury vehicles?\u003C/span>\u003C/strong>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/h3>\n\u003Cul>\n\u003Cli aria-setsize=\"-1\" data-leveltext=\"\" data-font=\"Symbol\" data-listid=\"4\" data-list-defn-props=\"{&quot;335552541&quot;:1,&quot;335559685&quot;:720,&quot;335559991&quot;:360,&quot;469769226&quot;:&quot;Symbol&quot;,&quot;469769242&quot;:[8226],&quot;469777803&quot;:&quot;left&quot;,&quot;469777804&quot;:&quot;&quot;,&quot;469777815&quot;:&quot;hybridMultilevel&quot;}\" data-aria-posinset=\"1\" data-aria-level=\"1\">\u003Cstrong>\u003Cspan data-contrast=\"auto\">Myth: High-level autonomy is not available for everyday cars. \u003C/span>\u003C/strong>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/li>\n\u003C/ul>\n\u003Cul>\n\u003Cli aria-setsize=\"-1\" data-leveltext=\"\" data-font=\"Symbol\" data-listid=\"4\" data-list-defn-props=\"{&quot;335552541&quot;:1,&quot;335559685&quot;:720,&quot;335559991&quot;:360,&quot;469769226&quot;:&quot;Symbol&quot;,&quot;469769242&quot;:[8226],&quot;469777803&quot;:&quot;left&quot;,&quot;469777804&quot;:&quot;&quot;,&quot;469777815&quot;:&quot;hybridMultilevel&quot;}\" data-aria-posinset=\"2\" data-aria-level=\"1\">\u003Cstrong>\u003Cspan data-contrast=\"auto\">Fact:&nbsp;Hands-off capabilities are scaling into mass-market models.\u003C/span>\u003C/strong>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/li>\n\u003C/ul>\n\u003Cp>\u003Cspan data-contrast=\"auto\">The belief that hands-off driving is limited to premium vehicles is increasingly becoming outdated. 
As ADAS technologies evolve and consumer expectations grow, autonomous experiences are entering volume-production platforms, becoming more mainstream and meeting the rising need for safety, comfort and convenience. \u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">According to a report by IDTechEx, the global market for L2+ functionalities could reach US$17.98 billion by 2045, \u003Ca href=\"https://www.idtechex.com/en/research-report/passenger-car-adas-market-2025-2045-technology-market-analysis-and-forecasts/1080\">driven by both premium and mass-market adoption and the rise of hands-free driving features.\u003C/a>\u003C/span>\u003Cspan data-contrast=\"auto\">&nbsp;This growth&nbsp;is fueled, among other things, by regulatory changes,&nbsp;product readiness,&nbsp;advancements in computer vision and sensing technologies, and&nbsp;AI&nbsp;scalability and performance.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Mass-market vehicles equipped with hands-off capabilities for supported highway conditions are becoming more common. 
We see this with \u003Ca href=\"https://www.mobileye.com/news/volkswagen-group-cooperates-with-valeo-and-mobileye-to-enhance-driver-assistance-in-future-mqb-vehicles/\">Mobileye's recent collaboration with Volkswagen Group and Valeo\u003C/a>.\u003C/span>\u003Cspan data-contrast=\"auto\"> The Mobileye Surround ADAS&trade; platform, powered by the EyeQ&trade;6 High, offers features like hands-off highway driving in specific conditions on approved highway sections, hazard detection, and traffic jam assist, delivering a premium-grade experience at mass-production scale.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>","2026-02-18T08:00:00.000Z","Autonomous Driving, Industry",{"id":58,"type":24,"url":59,"title":60,"description":61,"primary_tag":32,"author_name":10,"is_hidden":11,"lang":12,"meta_description":61,"image":62,"img_alt":63,"content":64,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":65,"tags":66},318,"prof-amnon-shashua-elected-to-us-national-academy-of-engineering","Prof. Amnon Shashua Elected to US National Academy of Engineering","Mobileye President and CEO recognized for contributions to computer vision and its applications to autonomous driving","https://static.mobileye.com/website/us/corporate/images/fb56d3e56621d10b6f0acc4fde141090_1770890393267.png","Mobileye CEO and President Prof. Amnon Shashua","\u003Cp>\u003Cspan data-contrast=\"auto\">Mobileye&nbsp;is proud to share that&nbsp;President&nbsp;and CEO Prof. 
Amnon&nbsp;Shashua&nbsp;has been elected to the&nbsp;US&nbsp;National Academy of Engineering (NAE) as an international member of its Class of 2026.&nbsp;Election to the NAE is among&nbsp;the highest professional distinctions awarded to&nbsp;an&nbsp;engineer, recognizing outstanding contributions to engineering research,&nbsp;practice&nbsp;or education.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Prof.&nbsp;Shashua&nbsp;was nominated and&nbsp;elected&nbsp;by peers for his contributions to computer vision and its application to autonomous driving technology.&nbsp;He is one of 28 international members in this year&rsquo;s class, joining a body of 356 international members worldwide.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Prof. Shashua co-founded Mobileye in 1999 with the conviction that AI could make driving fundamentally safer. Since then, about 230 million vehicles have been equipped with Mobileye technology, turning foundational computer vision and machine learning research into one of the most widely deployed physical AI systems in the world. His work has spanned industry-leading perception systems, formal safety frameworks, and scalable AI architectures, contributing to the proliferation of ADAS and the evolution of autonomous driving from science project to commercial reality. \u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">The foundation Prof. Shashua established in computer vision and machine learning today extends beyond vehicles. 
In January, Mobileye announced the acquisition of Mentee Robotics and with it a new chapter &ndash; Mobileye 3.0 &ndash; applying breakthroughs in AI to the physical world and leveraging core strengths in perception, simulation, and safety models to lead in humanoid robotics and autonomous driving.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">The NAE honor adds to a series of recognitions of Prof. Shashua&rsquo;s contributions, including the 2020 Dan David Prize in artificial intelligence, the 2022 Automotive Hall of Fame Mobility Innovator Award, and the 2019 Electronic Imaging Scientist of the Year. \u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Read the full NAE announcement and view the entire list of new members \u003Ca href=\"https://www.nae.edu/19579/31222/20095/343222/345149/NAENewClass2026#:~:text=The%20National%20Academy%20of%20Engineering,of%20international%20members%20to%20356\" target=\"_blank\" rel=\"noopener\">here\u003C/a>.&nbsp;\u003C/span>\u003C/p>","2026-02-12T08:00:00.000Z","Amnon Shashua",{"id":68,"type":69,"url":70,"title":71,"description":72,"primary_tag":73,"author_name":10,"is_hidden":11,"lang":12,"meta_description":72,"image":74,"img_alt":75,"content":76,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":77,"tags":78},317,"press-kit","mobileye-at-the-adas-show-2026-india","Mobileye at The ADAS Show 2026","Visit our press kit throughout the show for the latest news, multimedia, press resources and more.",15,"https://static.mobileye.com/website/us/corporate/images/871e06d2a7a8fb75a2341a70351192b7_1770726805252.jpg","Bring world-class safety and driving intelligence to India’s roads","\u003Cp>India, the world&rsquo;s third-largest automotive market, is entering a phase of accelerated advanced driver-assistance systems (ADAS) adoption, driven by increasing safety awareness, 
broader ADAS integration across vehicle segments, and supportive regulatory developments. These dynamics present unprecedented opportunities for market players. Local OEMs are intensifying efforts to introduce and scale ADAS technologies to strengthen their competitive edge in this promising market. As a global leader in ADAS and autonomous driving technologies, Mobileye is uniquely positioned to support India&rsquo;s mobility transformation with globally proven and locally adapted solutions.\u003C/p>\n\u003Cp>With this mission and vision, Mobileye is joining The ADAS Show on February 12, 2026, as the exclusive &ldquo;Powered By&rdquo; sponsor, alongside the Automotive Research Association of India and the Ministry of Heavy Industries India, to discuss key industry topics aimed at benefiting hundreds of millions of road users across the country.\u003C/p>\n\u003Cp>\u003Cbr />Visit us at\u003Cstrong> Booth #B17-18\u003C/strong>, ADAS Test City, Takwe, Pune, and hear insights from Mobileye spokespersons Elie Luskin, Vice President India and China, and Dhairyashil Gaekwad, Director of Business Development &amp; Strategy India, to learn how our technological expertise and real-world deployment experience can help expedite ADAS adoption at scale in India and pave the way toward higher levels of driving automation.&nbsp;\u003C/p>\n\u003Cp>\u003Cbr />Throughout the show, this press kit will be updated with Mobileye-related sessions, blogs, visual assets, and additional resources as they become available.\u003C/p>\n\u003Cp>\u003Cstrong>Events:\u003C/strong>\u003C/p>\n\u003Cp>\u003Ca href=\"https://adasshow.com/\" target=\"_blank\" rel=\"noopener\">The ADAS Show Edition 3 Powered by Mobileye\u003C/a>\u003Cbr />12\u003Csup>th\u003C/sup> February 2026\u003Cbr />ADAS Test City &ndash; Pune &ndash; ARAI\u003C/p>\n\u003Cp>\u003Cstrong>Technology Presentation\u003C/strong>\u003C/p>\n\u003Cp>\u003Cstrong>Mobileye: Go-To ADAS Partner for India &amp; 
Physical AI Powerhouse covering ADAS - Autonomous Driving and Humanoid Robots\u003C/strong>\u003C/p>\n\u003Cp>10:40 &ndash; 11:00 AM, February 12, 2026\u003C/p>\n\u003Cp>\u003Cstrong>Elie Luskin, VP India and China, Mobileye\u003C/strong>, will share insights into India&rsquo;s transformation toward safer and smarter mobility, demonstrating why Mobileye, a leader in ADAS and autonomous driving technologies and now a Physical AI Powerhouse, is the trusted go-to partner for scaling ADAS in the Indian market.&nbsp;\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>Panel Discussion\u003C/strong>\u003C/p>\n\u003Cp>\u003Cstrong>ADAS for the Indian Market &ndash; From Premium to Mass Adoption\u003C/strong>\u003C/p>\n\u003Cp>11:00 AM &ndash; 12:00 PM, February 12, 2026\u003Cbr />\u003Cstrong>Dhairyashil Gaekwad, Director of Business Development &amp; Strategy India\u003C/strong>, will join the panel and share insights on how Mobileye&rsquo;s scalable technologies are enabling automakers to bring advanced safety systems across all vehicle tiers &ndash; from premium models to affordable segments in India.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>Mobileye in the Indian \u003C/strong>\u003Cstrong>Press\u003C/strong>\u003C/p>\n\u003Cp>ETAuto: \u003Ca href=\"https://auto.economictimes.indiatimes.com/news/auto-technology/mobileye-and-vvdn-technologies-partner-to-localize-advanced-driver-assistance-systems-in-india/124890370\" target=\"_blank\" rel=\"noopener\">Mobileye VVDN Technologies Partnership For ADAS: Mobileye and VVDN Technologies Partner to Localize Advanced Driver Assistance Systems in India\u003C/a>\u003Cbr />Acko Drive: \u003Ca href=\"https://ackodrive.com/news/2026-2027-to-be-defining-years-for-adas-in-india-mobileye/\" target=\"_blank\" rel=\"noopener\">2026, 2027 to be Defining Years for ADAS in India: Mobileye\u003C/a>\u003Cbr />Future Mobility Media: \u003Ca 
href=\"https://futuremobilitymedia.com/thought-leaders/interview/global-tech-leadership-meets-local-traffic-conditions/\" target=\"_blank\" rel=\"noopener\">Global Tech Leadership Meets Local Traffic Conditions: Mobileye Steers Towards Road Safety for All in India\u003C/a>\u003Cbr />Motorindia: \u003Ca href=\"https://www.motorindiaonline.in/mobileye-to-power-indias-adas-revolution-at-inauguration-of-arais-adas-test-city-in-pune/\" target=\"_blank\" rel=\"noopener\">Mobileye to Power India&rsquo;s ADAS Revolution at Inauguration of ARAI&rsquo;s ADAS TEST CITY in Pune\u003C/a>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>Galleries\u003C/strong>\u003C/p>\n\u003Cp>\u003Cstrong>Mobileye technology and solutions\u003C/strong>\u003Cstrong> \u003C/strong>\u003C/p>\n\u003Cp>[**]gallery:mobileye-technology-and-solutions[**]\u003C/p>\n\u003Cp>\u003Cstrong>Mobileye ECU Series\u003C/strong>\u003Cstrong> \u003C/strong>\u003C/p>\n\u003Cp>[**]gallery:mobileye-ecu-series[**]\u003C/p>\n\u003Cp>\u003Cstrong>Mobileye SuperVision on the Road\u003C/strong>\u003Cstrong> \u003C/strong>\u003C/p>\n\u003Cp>[**]gallery:mobileyes-advanced-platforms-in-the-drivers-seat[**]\u003C/p>\n\u003Cp>\u003Cstrong>Mentee images&nbsp;\u003C/strong>\u003C/p>\n\u003Cp>[**]gallery:menteebot[**]\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>Video\u003C/strong>\u003C/p>\n\u003Cp contenteditable=\"false\">[**]vimeo-press:1151510258[**]\u003C/p>\n\u003Cp contenteditable=\"false\">[**]vimeo-press:1151510659[**]\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>Get to Know Mobileye\u003C/strong>\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/blog/from-driver-assisting-to-self-driving/\">From driver assisting to self-driving: Mobileye&rsquo;s most FAQs\u003C/a>\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/solutions/\">Mobileye Solutions | From Driver Assistance to Self-Driving\u003C/a>\u003C/p>\n\u003Cp>\u003Ca 
href=\"https://www.mobileye.com/blog/mobileyes-adas-poised-to-power-indias-safety-shift/\">Mobileye&rsquo;s ADAS: Poised to power India's safety shift\u003C/a>\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/technology/\">Mobileye Technology | Rethinking the Autonomous Future\u003C/a>\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/blog/the-fast-lane-to-higher-levels-of-autonomy-with-the-eyeq6-soc/\">The fast lane to higher levels of autonomy with the EyeQ&trade;6 SoC\u003C/a>\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/blog/nimrod-nehushtan-rem-cloud-enhanced-adas-autotech-detroit/\">The road to the future of mobility is being mapped by REM&trade;\u003C/a>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>Mobileye Insights\u003C/strong>\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/blog/takeaways-from-the-mobileye-press-conference-with-ceo-prof-amnon-shashua-at-ces-2026/\">Prof. Amnon Shashua at CES 2026: Robotaxi updates, breakthroughs in AI, and robotics | Mobileye Blog\u003C/a>\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/blog/hands-off-eyes-off-taxonomy-for-automated-driving/\">Autonomous Driving Levels: Hands Off, Eyes Off - A New Taxonomy\u003C/a>\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/blog/compound-ai-the-framework-powering-scalable-autonomy/\">Compound AI: The framework powering scalable autonomy\u003C/a>\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/blog/tackling-global-regulations-and-safety-standards/\">Tackling global regulations and safety standards\u003C/a>\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/blog/the-mobileye-safety-methodology-for-fully-autonomous-driving/\">The Mobileye safety methodology for fully autonomous driving\u003C/a>\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/blog/pedestrian-safety-month-protection-detection/\">Pedestrian Safety: Protection Begins with 
Detection\u003C/a>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>&nbsp;\u003C/p>","2026-02-10T08:00:00.000Z","Press Kit, News",{"id":80,"type":5,"url":81,"title":82,"description":83,"primary_tag":51,"author_name":10,"is_hidden":11,"lang":12,"meta_description":83,"image":84,"img_alt":85,"content":86,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":87,"tags":21},316,"how-premium-level-hands-off-driving-became-scalable-for-the-mass-market","The era of scalable premium-level, hands-off driving ","Discover how Mobileye Surround ADAS™ is a cost-efficient and scalable ADAS platform that makes premium hands-off driving accessible across mass-market vehicles.","https://static.mobileye.com/website/us/corporate/images/71528f4db6eba81be4cbd3545f0e9998_1769588395078.png","Made for reduced hardware overhead, simplified software integration, and  overall system architecture. ","\u003Cp>\u003Cspan data-contrast=\"auto\">Advanced ADAS capabilities have proliferated across vehicle segments, with an increasing number of modern vehicles falling under the eyes-on, hands-off&nbsp;category\u003C/span>\u003Cspan data-contrast=\"auto\">.\u003C/span>\u003Cspan data-contrast=\"auto\"> But as these technologies reach maturity, industry focus shifts away from individual feature enhancement. Instead, it moves toward feature consolidation and the development of advanced, cohesive, and affordable premium ADAS systems that are ready for production.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335557856&quot;:16777215,&quot;335559739&quot;:120}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">With \u003Ca href=\"https://www.mobileye.com/news/mobileye-surround-adas-adds-second-top-10-automaker/\">two major automaker wins, \u003C/a>\u003C/span>\u003Cspan data-contrast=\"auto\">Mobileye Surround ADAS&trade; is entering a new phase. 
This progress signals a clear path toward broader industry adoption worldwide.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>From standalone ADAS features to scalable integrated platforms\u003C/strong>\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"auto\">It&rsquo;s clear that ADAS has moved beyond early adoption and into a mature phase. In fact, a recent \u003Ca href=\"https://www.autopacific.com/autopacific-insights/2025/7/24/autopacifics-newest-future-attribute-demand-study-fads-shows-increase-in-demand-for-autonomous-driving-and-adas-features\">study \u003C/a>\u003C/span>\u003Cspan data-contrast=\"auto\">revealed that hands-off, driver-supervised highway driving is the single most wanted vehicle feature,&nbsp;desired&nbsp;by 43% of new-vehicle intenders.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">New cars today have capabilities&nbsp;that&nbsp;range&nbsp;from&nbsp;more basic ADAS&nbsp;features&nbsp;such as automatic emergency braking and lane-keeping assist&nbsp;to more advanced features like&nbsp;assisted lane change, and among premium vehicles, hands-off driving on select highways.&nbsp;&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">With the provision of some features moving toward standard fitment, the challenge lies in how they come\u003C/span>\u003Cspan data-contrast=\"auto\"> together to ensure low latency across inter-ECU communication, reliable operation in fragmented environments, and the ability to meet increasingly high validation pressures.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>\u003Cspan 
data-contrast=\"auto\">The consolidation of parking for cost efficiency\u003C/span>\u003C/strong>\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"auto\">In a software-defined era, there&rsquo;s a concerted effort to \u003Ca href=\"https://www.mobileye.com/blog/the-shift-towards-centralized-intelligence/\">minimize ECUs \u003C/a>\u003C/span>\u003Cspan data-contrast=\"auto\">where possible, pushing automakers to re-evaluate their core architectures&nbsp;to&nbsp;reduce costs, wiring and integration.&nbsp;\u003C/span>\u003Cspan data-contrast=\"auto\">But too much consolidation can potentially introduce risk and dramatically reshape existing architectures. So where is the balance? Where can automakers consolidate and reduce hardware and software cost in the most effective way? One answer lies in the parking application.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Historically, parking and ADAS have run on separate ECUs, existing in silos within the same vehicle. Yet because both systems rely on the same sensors, maintaining duplicate data paths becomes an unnecessary cost.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003Cspan data-contrast=\"auto\">That model is now changing. 
In&nbsp;a&nbsp;recent Mobileye production program, ADAS and parking&nbsp;have now been&nbsp;consolidated&nbsp;to&nbsp;run on a singl\u003C/span>\u003Cspan data-contrast=\"auto\">e \u003Ca href=\"https://www.mobileye.com/technology/eyeq-chip/\">EyeQ6H\u003C/a>\u003C/span>\u003Cspan data-contrast=\"auto\">&nbsp;within one ECU.\u003C/span>\u003Cspan data-contrast=\"auto\">&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">This setup&nbsp;leverages&nbsp;the same&nbsp;available&nbsp;sensors for both applications and removes the need for an extra dedicated&nbsp;parking&nbsp;ECU, thus&nbsp;reducing hardware costs and increasing&nbsp;system&nbsp;simplicity.\u003C/span>\u003Cspan data-contrast=\"auto\">&nbsp;\u003C/span>\u003Cspan data-contrast=\"auto\">&nbsp;\u003C/span>\u003Cspan data-contrast=\"auto\">Through this architecture,&nbsp;automakers&nbsp;can deliver richer functionality without adding costs.&nbsp;\u003C/span>\u003Cspan data-contrast=\"auto\">Consolidating parking is therefore less about adding capability and more about making the economics of a unified L2 platform work.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/599de49148978271f5e84e58a42b0461_1770196613152.png\" alt=\"\" width=\"1200\" height=\"675\" />\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>Pairing driver monitoring systems with ADAS\u003C/strong>\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Another efficient consolidation opportunity comes from pairing ADAS and the Driver Monitoring System (DMS). 
In the Mobileye Surround ADAS platform, the\u003Ca href=\"https://www.mobileye.com/blog/presenting-the-mobileye-driver-monitoring-system-fusing-road-safety-inside-the-cabin/\"> Mobileye DMS&trade;\u003C/a> \u003C/span>\u003Cspan data-contrast=\"auto\">works directly alongside the ADAS stack, allowing both systems to&nbsp;operate&nbsp;in concert. By&nbsp;leveraging&nbsp;the vehicle&rsquo;s external sensing system, Mobileye&rsquo;s DMS&nbsp;is designed to&nbsp;eliminate&nbsp;the need for a separate processor, reducing&nbsp;cost&nbsp;and supporting broader SoC and ECU consolidation goals.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">This integration is not just about hardware savings. Running ADAS and DMS together on a single chip also enables tighter coordination between internal and external sensing, allowing the system to function more cohesively and respond more intelligently in real time.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">By cross-referencing the driver's gaze with real-time road conditions captured by the external ADAS cameras, the system&nbsp;is designed to&nbsp;assess&nbsp;the driver&nbsp;more accurately&nbsp;and respond more intelligently.&nbsp;The result is lower latency and tighter coupling than would be possible if ADAS and DMS were running on separate ECUs communicating over a vehicle network\u003C/span>\u003Cspan data-contrast=\"auto\">.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">This integration supports&nbsp;highway&nbsp;hands-off/eyes-on&nbsp;driving&nbsp;in designated areas&nbsp;while reducing redundancy and simplifying system 
validation. It also reinforces a broader industry trend:&nbsp;hands-off driving&nbsp;is no longer built from loosely connected subsystems, but from unified platforms.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>Why two big automakers have decided that Surround is the new standard for ADAS\u003C/strong>\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"auto\">With automakers increasingly concerned with building architectures designed for scale,&nbsp;Surround ADAS&nbsp;responds directly to these design needs.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003Cspan data-contrast=\"auto\">Now, with&nbsp;two major global automakers adopting Surround ADAS, the solution is becoming&nbsp;an&nbsp;industry reference architecture,&nbsp;demonstrating&nbsp;that the move is not merely a point, but a&nbsp;vector in&nbsp;a new&nbsp;direction.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Through&nbsp;the Surround ADAS solution, which&nbsp;leverages&nbsp;the&nbsp;latest advances in AI, automakers&nbsp;can meet consumer&nbsp;and industry demands with far more&nbsp;simplicity and&nbsp;bring&nbsp;advanced&nbsp;and scalable&nbsp;ADAS&nbsp;technology&nbsp;to vehicles across different&nbsp;segments, on a mass-market scale.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Ch5>\u003Cem>\u003Cstrong>*Above specifications may be modified and updated by 
Mobileye.\u003C/strong>\u003C/em>\u003C/h5>\n\u003Cp>&nbsp;\u003C/p>","2026-02-04T08:00:00.000Z",{"id":89,"type":5,"url":90,"title":91,"description":92,"primary_tag":32,"author_name":10,"is_hidden":11,"lang":12,"meta_description":92,"image":93,"img_alt":94,"content":95,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":96,"tags":97},315,"takeaways-from-the-mobileye-press-conference-with-ceo-prof-amnon-shashua-at-ces-2026","Prof. Amnon Shashua at CES 2026: Robotaxi updates, breakthroughs in AI, and robotics","Major milestones, meaningful collaborations, advances in robotaxis, novel training approaches and humanoid robotics, Mobileye’s CEO revealed several key updates as the company looks ahead to 2026 and beyond.","https://static.mobileye.com/website/us/corporate/images/749a8a359e3b53c6c41e870f8770aa0f_1767863157245.jpg","These announcements mark a new phase for the company.","\u003Cp>\u003Cspan data-contrast=\"auto\">Mobileye kicked off CES 2026 with its annual address by Prof. Amnon Shashua, President and CEO of Mobileye.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Taking to the stage, Mobileye&rsquo;s CEO revealed several key updates as the company looks ahead to 2026 and beyond. The address spanned the full ADAS to AV product spectrum and major milestones, advances in robotaxis with Volkswagen and MOIA, progress in vision-language models and novel training approaches, and finally, Mobileye&rsquo;s big news of the day, expansion into humanoid robotics through the acquisition of Mentee Robotics Ltd. 
\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Together, these announcements mark a new phase for the company, Mobileye 3.0, in which its leadership in Physical AI is expanded across two compelling frontiers.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>Company performance and&nbsp;solutions&nbsp;deployment&nbsp;\u003C/strong>\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Starting with an overview of Mobileye&rsquo;s market position, expected revenue, and deployment numbers, Prof. Amnon Shashua addressed significant growth points. \u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Entering 2026, Mobileye has a projected $24.5 billion revenue pipeline over the next eight years, representing approximately 42% growth from 2023&rsquo;s projected revenue pipeline of $17.3 billion. \u003C/span>\u003Cspan data-contrast=\"auto\">During&nbsp;2025, the company secured design wins with two new OEMs&nbsp;that&nbsp;Mobileye had&nbsp;not partnered with in the past&nbsp;decade and&nbsp;saw nominated volumes for EyeQ&trade;6L grow 3.5x&nbsp;compared to 2024.&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">As of&nbsp;the end of&nbsp;Q3&nbsp;2025, Mobileye technology is deployed in more than 230 million vehicles worldwide.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559740&quot;:257}\">&nbsp;\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>Surround ADAS is shaping the&nbsp;evolution&nbsp;of driver&nbsp;assist&nbsp;for&nbsp;mass market vehicle production&nbsp;&nbsp;\u003C/strong>\u003C/h3>\n\u003Cp>\u003Cspan class=\"TextRun SCXW265358974 BCX8\" lang=\"EN-US\" xml:lang=\"EN-US\" data-contrast=\"auto\">\u003Cspan class=\"NormalTextRun SCXW265358974 BCX8\">The keynote highlighted a significant design win, with a major \u003Ca 
href=\"https://www.mobileye.com/news/mobileye-surround-adas-adds-second-top-10-automaker/\">U.S.-based OEM\u003C/a> selecting\u003C/span>\u003Cspan class=\"NormalTextRun SCXW265358974 BCX8\"> the Mobileye Surround ADAS&trade; platform for upcoming mass-market vehicle production. This win reinforces a trend we see of basic ADAS evolving towards Surround ADAS.\u003C/span>\u003C/span>\u003Cspan class=\"EOP CommentHighlightPipeRest SCXW265358974 BCX8\" data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559740&quot;:257}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Surround ADAS extends perception and assistance around the entire vehicle using multiple cameras and radars, all processed on a single EyeQ&trade;6H system-on-chip within one ECU. 
This centralized architecture is designed to enable advanced driver assistance, integrated parking, and hands-off, eyes-on driving capabilities within defined highway operational domains and conditions, while reducing system complexity and cost for OEMs.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559740&quot;:257}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559740&quot;:257}\">\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/04e658007e1bece0efff765b2e911adb_1767869270652.jpg\" alt=\"\" width=\"1200\" height=\"675\" />\u003C/span>\u003C/p>\n\u003Ch3>MOIA plans to deploy more than 100,000 AVs globally by 2033\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Bringing robotaxis safely onto public roads requires an end-to-end ecosystem that supports continuous operation, fleet management, and real-world readiness.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559740&quot;:257}\">&nbsp;\u003C/span>\u003Cspan data-contrast=\"auto\">Highlighting the ID. Buzz program, Volkswagen Autonomous Mobility CEO Christian Senger joined Prof. Shashua on stage to discuss how large-scale deployment comes together in practice. \u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Volkswagen brings industry-scale vehicle production, Mobileye delivers Level 4 autonomous driving through Mobileye Drive&trade;, and MOIA provides the fleet operations and service layer, together forming a complete operational ecosystem around the ID.&nbsp;Buzz&nbsp;platform.&nbsp;&nbsp;Initial U.S. 
deployments are planned for 2026, followed by broader rollouts in Europe.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Together, this project brings significant scale to the robotaxi effort, with testing occurring in multiple locations under differing weather and climate conditions, from sunshine to rain to snow. According to Christian Senger, by 2033 MOIA aims to deploy over 100,000 self-driving vehicles based on Mobileye's self-driving system.\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>How AI is being integrated into autonomy\u003C/strong>\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"auto\">While collaborations bring business to life, the innovation itself continues to accelerate, and AI is very much at the center of how autonomy is evolving. New and more powerful models are emerging, not just for digital-only applications, but for physical systems operating in the real world. 
But even with extremely powerful models, core AI challenges do not disappear, including known obstacles such as hallucinations, the difficulty of achieving safety assurances, and the need to train at massive scale.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Prof. Shashua pointed to Mobileye&rsquo;s architectural response to these constraints: how to harness modern AI in compute- and power-constrained real-time systems while maintaining exceptionally high accuracy, through fast and slow thinking, vision-language-semantic-action models, and more.\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>Fast and slow thinking\u003C/strong>\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"auto\">An important aspect of the approach involves looking at driving tasks through the lens of fast thinking and slow thinking. Fast thinking refers to the system responsible for 'reflexive' decisions made at a high frequency, including the safety layer. The slow-thinking system, on the other hand, is responsible for driving decisions that require reasoning about the entire scene but do not affect safety, and can therefore run at a lower frequency.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559740&quot;:257}\">\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/5a6ff54ec40f4d0803a26a2e566f688c_1767863258664.png\" alt=\"\" width=\"1200\" height=\"676\" />\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>Emergent driving policy with ACI\u003C/strong>\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"auto\">At the center of this approach is ACI, Artificial Community Intelligence, a self-play-based framework built on sensing-state simulation rather than photorealistic imagery for driving policy. Using HD maps generated through REM&trade;&mdash;Mobileye&rsquo;s AV mapping technology&mdash;ACI places agents such as cars, pedestrians, buses, and other road users onto real road layouts, each with millions of possible driving behaviors. This allows rare, high-risk scenarios to be injected at high density and enables billions of simulated driving hours to be generated overnight.\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>Vision-Language-Semantic-Action (VLSA)\u003C/strong>\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Vision-language-semantic-action (VLSA) acts as a slow-thinking, vision-language-based model that processes deep scene semantics, almost like an adult accompanying a young driver in complex driving situations. 
Rather than controlling the vehicle or outputting trajectories, VLSA provides structured semantic guidance that feeds into planning, while safety-critical control remains in the fast-thinking system governed by formal safety layers. \u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Together, this separation between fast and slow thinking, combined with large-scale simulation-driven training, creates a path to scaling autonomy without placing generative models in the safety loop or relying on human teleoperation to resolve every edge case.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559685&quot;:720,&quot;335559740&quot;:257,&quot;335559991&quot;:720}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cvideo autoplay=\"autoplay\" loop=\"loop\" muted=\"\" width=\"100%\" height=\"100%\">\u003Csource src=\"https://static.mobileye.com/website/us/corporate/videos/ces_2026_slide28.mp4\" type=\"video/mp4\" />\u003C/video>\u003C/p>\n\u003Ch3>\u003Cstrong>From eyes-on to mind-off: the future of ADAS, AVs, and robotics&nbsp;\u003C/strong>\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Prof.&nbsp;Shashua&nbsp;outlined&nbsp;the&nbsp;industry's&nbsp;evolution&nbsp;across three autonomous driving categories through the 2030s, spanning ADAS, consumer AVs, and robotaxis, all framed around how intelligence can safely scale. L2++ eyes-on systems, currently designed for higher-end consumer vehicles, will continue to undergo cost optimization to enable deployment as a standard feature across broader vehicle segments.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559740&quot;:257}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Consumer L3 systems are expected to move beyond today&rsquo;s eyes-off operation toward L4 mind-off driving, with lower intervention rates and expansion beyond current highway operational design domains. 
Robotaxi systems, while already commercially deployable, will accelerate through sensor and cost reduction, alongside dramatically lower teleoperation-to-vehicle ratios.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Looking toward 2030, the next major shift in autonomy is the move from eyes-off to mind-off driving, where intervention becomes rare enough to support large-scale deployment.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559740&quot;:257}\">\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/bad8d11c129052a484c41c13a57ae2e8_1767863433955.jpg\" alt=\"\" width=\"1200\" height=\"675\" />\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>Mobileye and Mentee Robotics: A new frontier in Physical AI\u003C/strong>\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Closing his keynote, Prof. Shashua announced that Mobileye is set to \u003Ca href=\"https://www.mobileye.com/news/mobileye-to-acquire-mentee-robotics-to-accelerate-physical-ai-leadership/\">acquire Mentee Robotics\u003C/a>, a company developing vertically integrated hardware and software, simulation-first learning, few-shot generalization, and highly dexterous humanoid hands, all with zero teleoperation. 
\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">With this step, Mobileye extends its reach beyond vehicles and into a broader class of intelligent, physical AI systems also built for the real world.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">The move reflects a deeper convergence between autonomous driving and robotics\u003C/span>\u003Cspan data-contrast=\"auto\">: both domains rely on a common Physical Artificial Intelligence stack that spans multimodal perception, world modeling, intent-aware planning, precision control, and decision-making under uncertainty.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">For a more in-depth look, watch the entire press conference \u003Ca href=\"https://www.youtube.com/watch?v=VUI85RtI3O0\">here\u003C/a>.\u003C/span>\u003C/p>","2026-01-10T08:00:00.000Z","Amnon Shashua, Events",{"id":99,"type":69,"url":100,"title":101,"description":102,"primary_tag":73,"author_name":10,"is_hidden":11,"lang":12,"meta_description":102,"image":103,"img_alt":104,"content":105,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":106,"tags":107},312,"mobileye-agrees-to-acquire-mentee-robotics","Mobileye Agrees to Acquire Mentee Robotics","Combination brings world-class AI talent together to scale humanoid robots and autonomous vehicles globally","https://static.mobileye.com/website/us/corporate/images/afa43d0bb6d179df2aea3f407dd39b28_1767531378872.jpg","Mobileye and Mentee: Unlocking the next era of physical AI, together.","\u003Cp>\u003Cspan data-teams=\"true\">Today at CES 2026, Mobileye announced an agreement to acquire Mentee Robotics, combining Mobileye&rsquo;s advanced AI technology and global 
production expertise with Mentee&rsquo;s breakthrough humanoid platform and deep AI talent, creating a global leader in physical AI across two transformative markets: autonomous driving and humanoid robotics.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan class=\"EOP SCXW165912919 BCX0\" data-ccp-props=\"{}\">\u003Cspan class=\"TextRun SCXW137863347 BCX0\" lang=\"EN-US\" xml:lang=\"EN-US\" data-contrast=\"auto\">\u003Cspan class=\"NormalTextRun SCXW137863347 BCX0\">\u003Cspan data-teams=\"true\">Here you will find background information, official press materials, and visual assets.&nbsp;&nbsp;\u003C/span>\u003C/span>\u003C/span>\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong>About Mentee Robotics\u003C/strong>\u003C/p>\n\u003Cp>Mentee Robotics is building an AI-first, vertically integrated humanoid robot designed for real-world usefulness across logistics, industrial, and household environments. The MenteeBot combines camera-only sensing, integrated AI, proprietary electric motors, and Sim2Real-based learning to enable precise locomotion, scene understanding, manipulation, and natural-language guidance. Its behaviors are trained in simulation and efficiently adapted to the physical world, supporting reliability, scalability, and continuous improvement over time. 
Mentee&rsquo;s mission is to create a universal, human-centric robotic system with dexterity and perception that can adapt to different environments and tasks through natural interaction.&nbsp;\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>News\u003C/strong>\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/news/mobileye-to-acquire-mentee-robotics-to-accelerate-physical-ai-leadership/?utm_source=website&amp;utm_medium=press-kit-pt&amp;utm_campaign=ces-2026\" target=\"_blank\" rel=\"noopener\">Mobileye To Acquire Mentee Robotics to Accelerate Physical AI Leadership\u003C/a>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cspan class=\"EOP SCXW165912919 BCX0\" data-ccp-props=\"{}\">\u003Cspan class=\"EOP SCXW137863347 BCX0\" data-ccp-props=\"{}\">\u003Cstrong>\u003Cspan class=\"TextRun SCXW160431637 BCX0\" lang=\"EN-US\" xml:lang=\"EN-US\" data-contrast=\"auto\">\u003Cspan class=\"NormalTextRun SCXW160431637 BCX0\">Mentee images\u003C/span>\u003C/span>\u003C/strong>\u003Cspan class=\"EOP SCXW160431637 BCX0\" data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/span>\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan class=\"EOP SCXW165912919 BCX0\" data-ccp-props=\"{}\">[**]gallery:menteebot[**]\u003C/span>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>Videos\u003C/strong>\u003C/p>\n\u003Cp contenteditable=\"false\">[**]vimeo-press:1151517854[**]\u003C/p>\n\u003Cp contenteditable=\"false\">[**]vimeo-press:1150802982[**]\u003C/p>\n\u003Cp contenteditable=\"false\">[**]vimeo-press:1150802966[**]\u003C/p>\n\u003Cp contenteditable=\"false\">[**]vimeo-press:1151510659[**]\u003C/p>\n\u003Cp>&nbsp;\u003C/p>","2026-01-06T08:00:00.000Z","News, Press 
Kit",{"id":109,"type":24,"url":110,"title":111,"description":112,"primary_tag":28,"author_name":10,"is_hidden":11,"lang":12,"meta_description":112,"image":113,"img_alt":114,"content":115,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":106,"tags":116},314,"mobileye-to-acquire-mentee-robotics-to-accelerate-physical-ai-leadership","Mobileye To Acquire Mentee Robotics to Accelerate Physical AI Leadership","Combination brings world-class AI talent together to scale autonomous vehicles and humanoid robots globally","https://static.mobileye.com/website/us/corporate/images/e6d126b4d74e0fce47e379c3fff057e6_1767710135503.jpg","Mobileye and Mentee Robotics","\u003Cstyle>\r\n.blog-body ul {\r\n  padding-left: 30px;  \r\n  list-style-type: disc;\r\n}\r\n\r\n.blog-body ul li{\r\n  margin-bottom: 10px\r\n}\r\n\r\n\r\n.blog-body ul ul {\r\n  margin-top: 10px;\r\n  padding-left: 30px;      \r\n  list-style-type: circle; \r\n}\r\n\r\n\r\n.blog-body ul ul li {\r\n  margin-bottom: 10px\r\n}\r\n\u003C/style>\r\n\u003Cp>LAS VEGAS, Jan. 6, 2026 &mdash; Mobileye today announced entry into a definitive agreement to acquire Mentee Robotics Ltd., an AI-first humanoid robotics company with a third-generation, vertically integrated humanoid robot. 
This transaction would combine Mobileye&rsquo;s advanced AI technology and global production expertise with Mentee&rsquo;s breakthrough humanoid platform and deep AI talent, creating a global leader in physical AI across two transformative markets: autonomous driving and humanoid robotics.\u003C/p>\r\n\u003Cp>Strong traction in advanced vehicle autonomy and core ADAS technology has resulted in a current automotive revenue pipeline of $24.5 billion over the next eight years, up more than 40 percent compared to January 2023.\u003Csup>1 \u003C/sup>&nbsp;This acquisition broadens the scope of the business with a decisive step toward Physical Artificial Intelligence in general: systems designed to understand context, infer intent, interact naturally with humans and act safely and effectively in the physical world in an economically scalable manner. The evolution of Mobileye&rsquo;s autonomy stack over the last few years beyond goal-driven navigation toward holistic, context-aware and intent-aware reasoning, provides a natural foundation for general-purpose robots designed to operate productively alongside humans while meeting uncompromising safety requirements.\u003C/p>\r\n\u003Cp>This acquisition will accelerate Mentee&rsquo;s go-to-market strategy, with first on-site proof-of-concept deployments with customers expected in 2026. These deployments are intended to operate autonomously without teleoperation, and series production and commercialization are targeted for 2028.\u003C/p>\r\n\u003Cp>The total consideration for the acquisition will be $900 million\u003Cem> \u003C/em>(subject to certain adjustments), comprising approximately $612 million in cash and up to about 26.2 million shares of Mobileye Class A common stock subject to adjustment based on the vesting of any Mentee options prior to the closing. The foregoing amounts are not final and are subject to adjustment pursuant to the terms of the Share Purchase Agreement. 
The transaction is subject to customary closing conditions and is expected to close in\u003Cem> \u003C/em>the first quarter of 2026.\u003C/p>\r\n\u003Cp>\u003Cstrong>Remarkable Progress Made in Startup Phase\u003C/strong>\u003C/p>\r\n\u003Cp>Mentee has made remarkable progress in the four years since its founding, designing and prototyping a cost-efficient humanoid platform engineered for scalable real-world deployment. The platform combines in-house hardware and software design, with an AI architecture built around human-to-robot mentoring, few-shot learning, and simulation-first training. Unlike systems relying on large-scale real-world data collection or continuous teleoperation, Mentee&rsquo;s approach is designed to enable robots to acquire new skills from natural demonstrations and intent cues over time, to deliver predictable, safe interactions with humans and objects while preserving an optimized price-to-usefulness ratio.\u003C/p>\r\n\u003Cp>\u003Cstrong>Mentee&rsquo;s Core Moat: Rapid Learning with Robust, Cost-Efficient Utility\u003C/strong>\u003C/p>\r\n\u003Cp>\u003Cimg style=\"float: right;margin-left:20px;\" src=\"https://static.mobileye.com/website/us/corporate/images/127b18f2ac06dc118ee1f59837cd4167_1767711405382.png\" alt=\"Mentee Robotics is developing an AI-first, vertically integrated humanoid robot designed for real-world usefulness and adaptability.\" width=\"350\" height=\"560\" />Mentee humanoids are engineered to deliver robust out-of-the-box functionality, including the integration of advanced scene understanding and natural instruction following, end-to-end autonomous task execution without teleoperation, and reliable locomotion, navigation, and safe manipulation of rigid objects. Development is progressing rapidly towards &ldquo;few-shot generalization&rdquo; which is designed to enable robots to learn and execute new skills and tasks after only a few human demonstrations. 
This capability will enable rapid deployment of humanoid robots across a wide range of real-world tasks, as both a labor multiplier and a collaborative presence alongside people.\u003C/p>\r\n\u003Cp>\u003Cstrong>Two Pillars Driving Mentee&rsquo;s Technology Advantage\u003C/strong>\u003C/p>\r\n\u003Cul>\r\n\t\u003Cli>\r\n\t\t\u003Cstrong>Mentee&rsquo;s Scalable AI Advantage:\u003C/strong> The Mentee platform is built on two foundational AI pillars that create a strong technology moat:\r\n\t\t\u003Cul>\r\n\t\t\t\u003Cli>An integrated AI solution that combines advanced foundation models with reinforcement-learning-based motion models.\u003C/li>\r\n\t\t\t\u003Cli>Simulation-only training with breakthrough technologies that minimize the Sim2Real gap. Mentee&rsquo;s approach reduces reliance on large-scale real-world data collection and enables efficient skill acquisition through simulation, accelerating scalability and cost efficiency.\u003C/li>\r\n\t\t\u003C/ul>\r\n\t\u003C/li>\r\n\t\u003Cli>\r\n\t\t\u003Cstrong>Vertically Integrated Hardware for Scalable Deployment:\u003C/strong> Mentee develops critical hardware and embedded software technologies in-house, including:\r\n\t\t\u003Cul>\r\n\t\t\t\u003Cli>Proprietary actuators for superior torque density and compact form factor\u003C/li>\r\n\t\t\t\u003Cli>Precision motor drivers for superior control and behavior transparency\u003C/li>\r\n\t\t\t\u003Cli>Practical and strong robotic hands, with motor-based tactile sensing to enhance modularity and reduce complexity\u003C/li>\r\n\t\t\t\u003Cli>Hot-swappable batteries for continuous uptime\u003C/li>\r\n\t\t\u003C/ul>\r\n\t\tThis deep vertical integration minimizes the Sim2Real gap, enables 24/7 operational availability, provides versatility for a wide range of applications, and supports cost-effective volume manufacturing.\r\n\t\u003C/li>\r\n\u003C/ul>\r\n\r\n\u003Cp>\u003Cstrong>The Convergence of Vehicle Autonomy and Humanoid 
Robotics\u003C/strong>\u003C/p>\r\n\u003Cp>Autonomous driving and humanoid robotics share the same fundamental challenges: operating reliably and usefully in a world built by humans, for humans. Success requires meeting strict performance requirements, delivering proven and verifiable safety, operating efficiently on edge-compute platforms, and achieving scalable, economically viable deployment.\u003C/p>\r\n\u003Cp>Consequently, both domains rely on a common Physical Artificial Intelligence stack that spans multimodal perception, world modeling, intent-aware planning, precision control, and decision-making under uncertainty.\u003C/p>\r\n\u003Cp>\u003Cstrong>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/a6cab3bf8701a75779ffd976269f0830_1767711531677.png\" alt=\"Two MenteeBots demonstrate stability, accuracy, and adaptive control transferring 32 boxes across multiple racks.\" width=\"1000\" height=\"561\" />\u003C/strong>\u003C/p>\r\n\u003Cp>\u003Cstrong>Acquisition of Mentee Enables Key Strategic Synergies\u003C/strong>\u003C/p>\r\n\u003Cp>The acquisition of Mentee by Mobileye is expected to catalyze technological synergies that advance Physical AI across robotics and autonomous vehicles.\u003C/p>\r\n\r\n\u003Cul>\r\n\t\u003Cli>Enhanced Autonomy Stack: Mentee&rsquo;s advancements in vision-language-action technologies and large-scale simulation with novel Sim2Real transfer techniques are direct complements to Mobileye&rsquo;s autonomy stack. These capabilities strengthen autonomous driving systems through improved generalization of long-tail scenarios, faster adaptation to new environments, and more efficient development and validation cycles.&nbsp;\u003C/li>\r\n\t\u003Cli>Safety Leadership for Humanoids: Humanoid robots operating near humans, other machines, and dynamic everyday environments require a level of verifiable safety that goes beyond reactive collision avoidance. 
Unlike fixed automation, humanoids must reason in real time about human behavior, shared spaces, movable objects, and fragile surroundings, while producing predictable and auditable outcomes. Mobileye brings a proven safety-first approach developed for autonomous driving, including formal safety models such as Responsibility-Sensitive Safety (RSS), mathematically grounded decision-making under uncertainty, and system-level redundancy architectures validated at scale. Together, these technologies provide a foundation for defining, verifying, and enforcing safe behavior, thereby building the trust, reliability, and regulatory readiness required for humanoid robots to become economically viable at scale.\u003C/li>\r\n\t\u003Cli>Accelerated Commercialization: Mobileye&rsquo;s two decades of expertise in bringing advanced technologies to market, leveraging tools and infrastructure that adhere to strict safety standards, AI training infrastructure, and deep relationships with high-volume precision manufacturers, will accelerate deployment of humanoid solutions in factories, warehouses, and industrial environments globally.\u003C/li>\r\n\u003C/ul>\r\n\r\n\r\n\u003Cp>By unifying breakthroughs across humanoid robotics and autonomous vehicles, Mobileye and Mentee create a compounding advantage in Physical AI, where progress in one domain systematically reinforces the other.\u003C/p>\r\n\u003Cp>&ldquo;Today marks a new chapter for robotics and automotive AI, and the beginning of Mobileye 3.0,&rdquo; said Prof. Amnon Shashua, President and CEO of Mobileye. &ldquo;By combining Mentee&rsquo;s breakthroughs in humanoid robotics with Mobileye&rsquo;s expertise in automotive autonomy, and its proven ability to productize advanced AI, we have a unique opportunity to lead the evolution of physical AI across robotics and autonomous vehicles on a global scale.&rdquo;\u003C/p>\r\n\u003Cp>Prof. 
Lior Wolf, CEO of Mentee Robotics, said: &ldquo;I am immensely proud of what Mentee&rsquo;s multidisciplinary team has accomplished in just four years. We set out to build a platform that combines cutting-edge AI with deeply integrated hardware to make humanoid robots truly useful in real-world environments. Joining forces with Mobileye gives us access to unparalleled AI infrastructure and commercialization expertise, accelerating our mission to bring scalable, safe, and cost-effective humanoid solutions to market.&rdquo;\u003C/p>\r\n\u003Cp>The acquisition was approved by Mobileye&rsquo;s Board of Directors, following the recommendation of a strategic transaction committee consisting of independent directors, and Intel Corp., Mobileye&rsquo;s largest shareholder. Intel also approved the acquisition as the sole Class B shareholder of Mobileye pursuant to Mobileye&rsquo;s Amended and Restated Certificate of Incorporation. Prof. Shashua, who also serves as the Chairman, Co-Founder and a significant shareholder of Mentee, recused himself from the Mobileye Board&rsquo;s consideration and approval of the transaction.\u003C/p>\r\n\u003Cp>Mentee will operate as an independent unit within Mobileye, preserving continuity while leveraging Mobileye&rsquo;s advanced AI training infrastructure to accelerate integration of AI software and hardware capabilities. The transaction is expected to modestly increase Mobileye&rsquo;s operating expenses in 2026 by a low-single-digit percentage.\u003C/p>\r\n\u003Cp>Prof. 
Shashua will share more on Mobileye&rsquo;s vision for physical AI during his presentation at CES 2026, which will be \u003Ca href=\"https://www.youtube.com/mobileye\">livestreamed here\u003C/a> at 1:45 PM PST/4:45 PM EST today, January 6\u003Csup>th\u003C/sup>, and available for replay thereafter.\u003C/p>\r\n\u003Cp>For further information regarding the terms and conditions contained in the definitive agreement for the acquisition, please see Mobileye&rsquo;s Current Report on Form 8-K, which will be filed with the Securities and Exchange Commission in connection with this transaction.\u003C/p>\r\n\u003Cp>Goldman Sachs &amp; Co. LLC serves as financial advisor and Erdinast Ben Nathan Toledano and Davis Polk &amp; Wardwell LLP serve as Israeli and US legal counsel to Mobileye, respectively. Shibolet &amp; Co. and Paul Hastings LLP serve as Israeli and US legal counsel to Mentee, respectively.\u003C/p>\r\n\u003Cp>&nbsp;\u003C/p>\r\n\u003Cp>\u003Cstrong>Media Contact: \u003C/strong>Justin Hyde, justin.hyde@mobileye.com\u003C/p>\r\n\u003Cp>\u003Cspan style=\"font-size: 8pt;\">\u003Csup>1 \u003C/sup>Mobileye&rsquo;s revenue for the periods presented represents estimated volumes based on projections of future production volumes that were provided by our current and prospective OEMs at the time of sourcing the design wins for the models related to those design wins. See the disclaimer under the heading &ldquo;Forward-Looking Statements&rdquo; below for important limitations applicable to these estimates.\u003C/span>\u003C/p>\r\n\u003Cp>___________________________________\u003C/p>\r\n\u003Cp>\u003Cspan style=\"font-size: 12pt;\">Mobileye (Nasdaq: MBLY) leads the mobility revolution with our autonomous driving and driver-assistance technologies, harnessing world-renowned expertise in artificial intelligence, computer vision, mapping and integrated software and hardware. 
Since our founding in 1999, Mobileye has enabled the wide adoption of advanced driver-assistance systems that bolster driving safety, while pioneering such groundbreaking technologies as REM&trade; crowdsourced mapping, True Redundancy&trade; sensing, and Responsibility Sensitive Safety&trade; (RSS). These technologies drive the ADAS and AV fields towards the future of mobility &ndash; enabling self-driving vehicles and mobility solutions at scale, and powering industry-leading advanced driver-assistance systems. Through 2024, more than 200 million vehicles worldwide have been built with Mobileye&rsquo;s EyeQ technology inside. Since 2022, Mobileye has been listed independently from&nbsp;Intel&nbsp;(Nasdaq: INTC), which retains majority ownership. For more information, visit&nbsp;\u003Ca href=\"https://cts.businesswire.com/ct/CT?id=smartlink&amp;url=https%3A%2F%2Fwww.mobileye.com&amp;esheet=54167879&amp;newsitemid=20241217366044&amp;lan=en-US&amp;anchor=https%3A%2F%2Fwww.mobileye.com&amp;index=7&amp;md5=25e995ce7275b1a687baef28e2f033a3\">https://www.mobileye.com\u003C/a>.\u003C/span>\u003C/p>\r\n\u003Cp>\u003Cspan style=\"font-size: 12pt;\">&ldquo;Mobileye,&rdquo; the Mobileye logo and Mobileye product names are registered trademarks of&nbsp;Mobileye Global. All other marks are the property of their respective owners.\u003C/span>\u003C/p>\r\n\u003Cp>\u003Cspan style=\"font-size: 10pt;\">\u003Cstrong>Forward-Looking Statements\u003C/strong>\u003C/span>\u003C/p>\r\n\u003Cp>\u003Cspan style=\"font-size: 10pt;\">Mobileye&rsquo;s business outlook, guidance and other statements in this release that are not statements of historical fact, including statements about our beliefs and expectations, are forward-looking statements and should be evaluated as such. 
Forward-looking statements include expectations and information regarding the development of robotics and AI capabilities, the impact of robotics and AI development on Mobileye&rsquo;s business, the impact of the transaction on Mobileye&rsquo;s operating expenses, and descriptions of our future business plan and strategies. These statements often include words such as &ldquo;anticipate,&rdquo; &ldquo;expect,&rdquo; &ldquo;suggests,&rdquo; &ldquo;plan,&rdquo; &ldquo;believe,&rdquo; &ldquo;intend,&rdquo; &ldquo;estimates,&rdquo; &ldquo;targets,&rdquo; &ldquo;projects,&rdquo; &ldquo;should,&rdquo; &ldquo;could,&rdquo; &ldquo;would,&rdquo; &ldquo;may,&rdquo; &ldquo;will,&rdquo; &ldquo;forecast,&rdquo; or the negative of these terms, and other similar expressions, although not all forward-looking statements contain these words. We base these forward-looking statements or projections, on our current expectations, plans and assumptions that we have made in light of our experience in the industry, as well as our perceptions of historical trends, current conditions, expected future developments and other factors we believe are appropriate under the circumstances and at such time. You should understand that these statements are not guarantees of performance or results. The forward-looking statements and projections are subject to and involve risks, uncertainties and assumptions and you should not place undue reliance on these forward-looking statements or projections. 
Although we believe that these forward-looking statements and projections are based on reasonable assumptions at the time they are made, you should be aware that many factors could affect our actual financial results or results of operations and could cause actual results to differ materially from those expressed in the forward-looking statements and projections.\u003C/span>\u003C/p>\r\n\u003Cp>\u003Cspan style=\"font-size: 10pt;\">Important factors that may materially affect such forward-looking statements and projections include the following: the robotics technology and industry may not develop as expected; further deterioration of macroeconomic conditions due to ongoing global economic and political uncertainty (as our current guidance assumes the estimated production and/or demand impact of current tariff conditions); future business, social and environmental performance, goals and measures; our anticipated growth prospects and trends in markets and industries relevant to our business; business and investment plans; expectations about our ability to maintain or enhance our leadership position in the markets in which we participate; future consumer demand and behavior, including expectations about excess inventory utilization by customers; our ability to effectively compete in the markets in which we operate; future products and technology, and the expected availability and benefits of such products and technology; development of regulatory frameworks for current and future technology; changes in regulation and trade policy, including increased tariffs, in regions in which we operate, including the U.S., Europe and China; projected cost and pricing trends; future production capacity and product supply; potential future benefits and competitive advantages associated with our technologies and architecture and the data we have accumulated; the future purchase, use and availability of products, components and services supplied by third parties, including 
third-party IP and manufacturing services; uncertain events or assumptions, including statements relating to our estimated vehicle production and market opportunity, potential production volumes associated with design wins and other characterizations of future events or circumstances; adverse conditions in Israel, including as a result of war and geopolitical conflict, which may affect our operations and may limit our ability to produce and sell our solutions; any disruption in our operations by the obligations of our personnel to perform military service as a result of current or future military actions involving Israel; availability, uses, sufficiency and cost of capital and capital resources, including expected returns to stockholders such as dividends, and the expected timing of future dividends; and tax- and accounting-related expectations.\u003C/span>\u003C/p>\r\n\u003Cp>\u003Cspan style=\"font-size: 10pt;\">Detailed information regarding these and other factors that could affect Mobileye&rsquo;s business and results is included in Mobileye&rsquo;s SEC filings, including the company&rsquo;s Annual Report on Form 10-K for the year ended December 28, 2024, particularly in the section entitled &ldquo;Item 1A. Risk Factors&rdquo;. 
Copies of these filings may be obtained by visiting our Investor Relations website at ir.mobileye.com or the SEC&rsquo;s website at www.sec.gov.\u003C/span>\u003C/p>","Industry, News, Financial, Mobileye Inside, Autonomous Driving",{"id":118,"type":24,"url":119,"title":120,"description":121,"primary_tag":9,"author_name":10,"is_hidden":11,"lang":12,"meta_description":121,"image":122,"img_alt":123,"content":124,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":125,"tags":126},310,"mobileye-surround-adas-adds-second-top-10-automaker","Mobileye Surround ADAS Adds Second Top 10 Automaker ","Major U.S.-based automaker selects Mobileye’s EQ6H-powered solution as standard across mass market to premium vehicles.","https://static.mobileye.com/website/us/corporate/images/e168f4ef0ca1b83bfec43c22b85e36b8_1767603449292.jpg","Illustration of Mobileye Surround ADAS","\u003Cp>LAS VEGAS, January 5, 2026 &ndash; Mobileye today announced that a US-based automaker has chosen the Mobileye EyeQ&trade;6H to power future advanced driver assistance systems with hands-free driving on select highways across millions of vehicles worldwide. This deal reflects accelerating demand for Mobileye Surround ADAS&trade; systems globally, and Mobileye now estimates future delivery of more than 19 million EyeQ6H-based Surround systems, including 9 million from this new automaker announced today in addition to programs by Volkswagen Group announced in March 2025. &nbsp;\u003C/p>\n\u003Cp>The new customer will offer Surround ADAS as standard equipment across many mainstream and premium models in software-defined vehicle architectures. 
Compared to first-generation hands-free, eyes-on highway ADAS systems, the Mobileye Surround ADAS approach significantly lowers costs and supports ECU consolidation efforts for automakers by vertically integrating software systems and multiple driving functions on one chip and one ECU, a key feature for software-defined vehicles.\u003C/p>\n\u003Cp>With regulators increasing performance requirements for ADAS systems globally, and customer demand for hands-free driving on the rise in developed markets, Mobileye expects Surround ADAS-type systems to represent the next generation of mainstream ADAS and to become standard equipment on many European and U.S. models in the near future.\u003C/p>\n\u003Cp>&ldquo;This selection of Mobileye Surround ADAS by one of the world&rsquo;s great automakers reflects the power of our approach to democratizing safety and technology,&rdquo; said Mobileye President and CEO Prof. Amnon Shashua. &ldquo;Leveraging the EyeQ6H as a powerful central processor for ADAS enables better performance, increased features and greater flexibility to automakers and their customers, all at a lower cost.&rdquo;\u003C/p>\n\u003Cp>Surround ADAS represents a software-defined set of safety and convenience features, intended for deployment in designated areas and conditions, building on Mobileye&rsquo;s two decades of experience in automated safety and driving. By leveraging the latest advancements in AI, a suite of multiple cameras and multiple radars totaling up to 11 sensors can be processed by a single EyeQ6H, integrating computer vision, sensor fusion and REM&trade; crowdsourced driving data.\u003C/p>\n\u003Cp>A typical Mobileye Surround ADAS system uses one forward-looking high-resolution camera, four corner parking cameras, and multiple radars. These systems can enable hands-free, eyes-on driving in designated areas and conditions up to 81 mph or 130 kph, with automated lane change, highway traffic jam assist and cut-in protection. 
The system is also designed to bolster safety with advanced blind spot detection, evasive maneuver assist, increased pedestrian detection and driver monitoring integrated into the EyeQ operation, along with optional automated parking solutions.\u003C/p>\n\u003Cp>A key to enabling hands-free driving comes from REM crowdsourced data, which covers nearly all highway and arterial roads in the United States and Europe, and more than 90 percent of roads in key Asian markets. To date, more than 8 million vehicles across 18 automotive brands and 50 vehicle models harvest anonymized REM data globally. The high processing power of the EyeQ6H is also designed to enable over-the-air updates for future features, along with robust cybersecurity protections.\u003C/p>\n\u003Cp>\u003Cstrong>Media Contact: \u003C/strong>Justin Hyde, justin.hyde@mobileye.com\u003C/p>\n\u003Cp>___________________________________\u003C/p>\n\u003Cp>\u003Cspan style=\"font-size: 10pt;\">Mobileye (Nasdaq: MBLY) leads the mobility revolution with our autonomous driving and driver-assistance technologies, harnessing world-renowned expertise in artificial intelligence, computer vision, mapping and integrated software and hardware. Since our founding in 1999, Mobileye has enabled the wide adoption of advanced driver-assistance systems that bolster driving safety, while pioneering such groundbreaking technologies as REM&trade; crowdsourced mapping, True Redundancy&trade; sensing, and Responsibility Sensitive Safety&trade; (RSS). These technologies drive the ADAS and AV fields towards the future of mobility &ndash; enabling self-driving vehicles and mobility solutions at scale, and powering industry-leading advanced driver-assistance systems. Through 2024, about 200 million vehicles worldwide have been built with Mobileye&rsquo;s EyeQ technology inside. Since 2022, Mobileye has been listed independently from&nbsp;Intel&nbsp;(Nasdaq: INTC), which retains majority ownership. 
For more information, visit&nbsp;\u003Ca href=\"https://cts.businesswire.com/ct/CT?id=smartlink&amp;url=https%3A%2F%2Fwww.mobileye.com&amp;esheet=54167879&amp;newsitemid=20241217366044&amp;lan=en-US&amp;anchor=https%3A%2F%2Fwww.mobileye.com&amp;index=7&amp;md5=25e995ce7275b1a687baef28e2f033a3\">https://www.mobileye.com\u003C/a>.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"font-size: 10pt;\">&ldquo;Mobileye,&rdquo; the Mobileye logo and Mobileye product names are registered trademarks of&nbsp;Mobileye Global. All other marks are the property of their respective owners.\u003C/span>\u003C/p>","2026-01-05T08:00:00.000Z","ADAS, Industry, News, Mapping & REM",{"id":128,"type":69,"url":129,"title":130,"description":131,"primary_tag":73,"author_name":10,"is_hidden":11,"lang":12,"meta_description":131,"image":132,"img_alt":133,"content":134,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":125,"tags":78},311,"mobileye-at-ces-2026","Mobileye at CES 2026","Press materials and resources related to Mobileye at CES 2026 ","https://static.mobileye.com/website/us/corporate/images/3744a10eb4a9024d092956b85b48d4a9_1767530835604.jpg","At CES 2026, Mobileye shared updates across ADAS, autonomous driving, and physical AI.","\u003Cp>\u003Cspan data-contrast=\"none\">At CES 2026, Mobileye announced developments across advanced driver&nbsp;assistance, autonomous driving, and humanoid robotics.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;134233117&quot;:false,&quot;134233118&quot;:false,&quot;335551550&quot;:0,&quot;335551620&quot;:0,&quot;335557856&quot;:16777215,&quot;335559738&quot;:220,&quot;335559739&quot;:220}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"none\">During the show, Mobileye detailed progress in its automotive roadmap, including momentum for Surround ADAS and large-scale&nbsp;autonomous driving programs, and announced its plans to&nbsp;acquire&nbsp;Mentee 
Robotics, expanding the company&rsquo;s scope into humanoid robotics.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;134233117&quot;:false,&quot;134233118&quot;:false,&quot;335551550&quot;:0,&quot;335551620&quot;:0,&quot;335557856&quot;:16777215,&quot;335559738&quot;:220,&quot;335559739&quot;:220}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"none\">Together, these developments reflect Mobileye&rsquo;s focus on building AI systems designed to&nbsp;operate&nbsp;safely,&nbsp;reliably&nbsp;and usefully in real-world environments at scale, spanning both&nbsp;vehicle&nbsp;automation and broader physical AI applications.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;134233117&quot;:false,&quot;134233118&quot;:false,&quot;335551550&quot;:0,&quot;335551620&quot;:0,&quot;335557856&quot;:16777215,&quot;335559738&quot;:220,&quot;335559739&quot;:220}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-ccp-props=\"{&quot;134233117&quot;:false,&quot;134233118&quot;:false,&quot;335551550&quot;:0,&quot;335551620&quot;:0,&quot;335557856&quot;:16777215,&quot;335559738&quot;:220,&quot;335559739&quot;:220}\">\u003Cspan class=\"TextRun SCXW28759288 BCX0\" lang=\"EN-US\" xml:lang=\"EN-US\" data-contrast=\"none\">\u003Cspan class=\"NormalTextRun SCXW28759288 BCX0\">This press kit includes Mobileye&rsquo;s CES 2026 announcements, event materials, and multimedia assets.\u003C/span>\u003C/span>\u003Cspan class=\"EOP SCXW28759288 BCX0\" data-ccp-props=\"{&quot;134233117&quot;:false,&quot;134233118&quot;:false,&quot;335551550&quot;:0,&quot;335551620&quot;:0,&quot;335557856&quot;:16777215,&quot;335559738&quot;:220,&quot;335559739&quot;:220}\">&nbsp;\u003C/span>\u003C/span>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>Mobileye Live at CES 2026 \u003C/strong>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>\u003Cspan class=\"TextRun SCXW229906071 BCX0\" lang=\"EN-US\" xml:lang=\"EN-US\" data-contrast=\"none\">\u003Cspan class=\"NormalTextRun SCXW229906071 BCX0\">Press 
Conference Replay\u003C/span>\u003C/span>\u003C/strong>\u003Cspan class=\"LineBreakBlob BlobObject DragDrop SCXW229906071 BCX0\">\u003Cbr class=\"SCXW229906071 BCX0\" />\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan class=\"TextRun SCXW229906071 BCX0\" lang=\"EN-US\" xml:lang=\"EN-US\" data-contrast=\"none\">\u003Cspan class=\"NormalTextRun SCXW229906071 BCX0\">\u003Cspan class=\"TextRun SCXW57700641 BCX0\" lang=\"EN-US\" xml:lang=\"EN-US\" data-contrast=\"none\">\u003Cspan class=\"NormalTextRun SCXW57700641 BCX0\">Watch the full replay of Mobileye&rsquo;s CES 2026 annual address featuring Presid\u003C/span>\u003Cspan class=\"NormalTextRun SCXW57700641 BCX0\">ent and\u003C/span>\u003Cspan class=\"NormalTextRun SCXW57700641 BCX0\">&nbsp;CEO Prof. Amno\u003C/span>\u003Cspan class=\"NormalTextRun SCXW57700641 BCX0\">n&nbsp;\u003C/span>\u003Cspan class=\"NormalTextRun SpellingErrorV2Themed SCXW57700641 BCX0\">Sh\u003C/span>\u003Cspan class=\"NormalTextRun SpellingErrorV2Themed SCXW57700641 BCX0\">ashua\u003C/span>\u003Cspan class=\"NormalTextRun SCXW57700641 BCX0\">.\u003C/span>\u003C/span>\u003Cspan class=\"LineBreakBlob BlobObject DragDrop SCXW57700641 BCX0\">\u003Cspan class=\"SCXW57700641 BCX0\">&nbsp;\u003C/span>\u003C/span>\u003C/span>\u003C/span>\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.youtube.com/watch?v=VUI85RtI3O0\" target=\"_blank\" rel=\"noopener\">Watch the press conference replay\u003C/a>\u003C/p>\n\u003Cp>\u003Cstrong>\u003Cspan class=\"TextRun SCXW229906071 BCX0\" lang=\"EN-US\" xml:lang=\"EN-US\" data-contrast=\"none\">\u003Cspan class=\"NormalTextRun SCXW229906071 BCX0\">\u003Cspan class=\"TextRun SCXW21110973 BCX0\" lang=\"EN-US\" xml:lang=\"EN-US\" data-contrast=\"none\">\u003Cspan class=\"NormalTextRun SCXW21110973 BCX0\">Recap\u003C/span>\u003C/span>\u003C/span>\u003C/span>\u003C/strong>\u003C/p>\n\u003Cp>\u003Cspan class=\"NormalTextRun SCXW106056914 BCX0\">\u003Cspan class=\"TextRun SCXW152508641 BCX0\" lang=\"EN-US\" xml:lang=\"EN-US\" 
data-contrast=\"none\">\u003Cspan class=\"NormalTextRun SCXW152508641 BCX0\">Read the blog post\u003C/span>\u003Cspan class=\"NormalTextRun SCXW152508641 BCX0\">&nbsp;\u003C/span>\u003C/span>\u003Ca class=\"Hyperlink SCXW152508641 BCX0\" href=\"https://www.mobileye.com/blog/takeaways-from-the-mobileye-press-conference-with-ceo-prof-amnon-shashua-at-ces-2026/\" target=\"_blank\" rel=\"noopener\">Prof. Amnon Shashua at CES 2026: Robotaxi updates, breakthroughs in AI, and robotics\u003C/a>\u003Cspan class=\"EOP SCXW152508641 BCX0\" data-ccp-props=\"{&quot;134233117&quot;:false,&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:1,&quot;335551620&quot;:1,&quot;335559685&quot;:0,&quot;335559737&quot;:0,&quot;335559738&quot;:240,&quot;335559739&quot;:240,&quot;335559740&quot;:279}\">&nbsp;\u003C/span>\u003C/span>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>News\u003C/strong>\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/news/mobileye-announces-ces-2026-press-conference/?utm_source=website&amp;utm_medium=press-kit&amp;utm_campaign=ces-2026\" target=\"_blank\" rel=\"noopener\">Mobileye Announces CES 2026 Press Conference\u003C/a>\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/news/mobileye-surround-adas-adds-second-top-10-automaker/?utm_source=website&amp;utm_medium=press-kit&amp;utm_campaign=ces-2026\" target=\"_blank\" rel=\"noopener\">Mobileye Surround ADAS Adds Second Top 10 Automaker\u003C/a>\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/news/mobileye-to-acquire-mentee-robotics-to-accelerate-physical-ai-leadership/?utm_source=website&amp;utm_medium=press-kit&amp;utm_campaign=ces-2026\" target=\"_blank\" rel=\"noopener\">Mobileye To Acquire Mentee Robotics to Accelerate Physical AI Leadership\u003C/a>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>\u003Cspan class=\"TextRun SCXW193030887 BCX0\" lang=\"EN-US\" xml:lang=\"EN-US\" data-contrast=\"auto\">\u003Cspan class=\"NormalTextRun 
SCXW193030887 BCX0\">Galleries\u003C/span>\u003C/span>\u003Cspan class=\"EOP SCXW193030887 BCX0\" data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/strong>\u003C/p>\n\u003Cp>[**]gallery:mobileye-live-at-ces-2026[**]\u003C/p>\n\u003Cp>[**]gallery:mobileye-technology-and-solutions[**]\u003C/p>\n\u003Cp>[**]gallery:mobileye-ecu-series[**]\u003C/p>\n\u003Cp>[**]gallery:professor-amnon-shashua[**]\u003C/p>\n\u003Cp>[**]gallery:mobileyes-advanced-platforms-in-the-drivers-seat[**]\u003C/p>\n\u003Cp>[**]gallery:driven-by-mobileye[**]\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Ch2>\u003Cspan class=\"TextRun SCXW169187365 BCX0\" lang=\"EN-US\" xml:lang=\"EN-US\" data-contrast=\"auto\">\u003Cspan class=\"NormalTextRun SCXW169187365 BCX0\">Video\u003C/span>\u003C/span>\u003C/h2>\n\u003Cp contenteditable=\"false\">[**]vimeo-press:1151510258[**]\u003C/p>\n\u003Cp contenteditable=\"false\">[**]vimeo-press:1153971674[**]\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>&nbsp;\u003C/p>",{"id":136,"type":5,"url":137,"title":138,"description":139,"primary_tag":140,"author_name":10,"is_hidden":11,"lang":12,"meta_description":139,"image":141,"img_alt":142,"content":143,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":144,"tags":145},309,"the-shift-towards-centralized-intelligence","The shift towards centralized intelligence  ","In this current landscape, the challenge isn’t just updating software, it’s ensuring that safety-critical driving functions remain uncompromised.",2,"https://static.mobileye.com/website/us/corporate/images/42e83d11a5ca50726f0fc0c8d9e1e8a6_1767170943828.jpg","Software embedded in vehicle platforms, which has grown over time to exceed 100 million lines of code.","\u003Cp>\u003Cspan data-contrast=\"auto\">As the industry consolidates compute, and looks to enable a software-defined experience, the challenge isn&rsquo;t just updating software, it&rsquo;s ensuring that safety-critical driving 
functions remain uncompromised. Today, software capabilities increasingly define a vehicle&rsquo;s appeal, enabling new features and revenue opportunities long after it leaves the factory floor.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">This evolution is evident in the steady expansion of software embedded in vehicle platforms, which has grown over time to exceed \u003Ca href=\"https://www.mckinsey.com/industries/automotive-and-assembly/our-insights/winning-the-automotive-software-development-race\">100 million lines of code\u003C/a> \u003C/span>\u003Cspan data-contrast=\"auto\">in many modern vehicles.&nbsp;Supporting and enabling this&nbsp;code-based&nbsp;growth requires a new generation of hardware: powerful, system-on-chip (SoC)&nbsp;domain-level&nbsp;processors capable of advanced computing, AI processes, and cloud connectivity. These&nbsp;computer&nbsp;systems&nbsp;enable&nbsp;vehicles to evolve over time, execute complex safety functions, and power in-cabin applications without compromising performance or reliability.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Such progress demands deep&nbsp;expertise,&nbsp;the ability to deliver&nbsp;the high-compute and safety foundations that&nbsp;can&nbsp;enable automakers to&nbsp;consolidate&nbsp;control, simplify complexity, and scale confidently toward their next generation of intelligent vehicles.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>Why do automakers race toward ECU consolidation?&nbsp;\u003C/strong>\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"auto\">For decades, vehicles relied on a patchwork of electronic control units (ECUs),&nbsp;often more than&nbsp;a hundred&nbsp;per car,&nbsp;each managing separate functions from multiple suppliers. 
This created a system that was costly, complex, and difficult to update.&nbsp;The industry&rsquo;s answer&nbsp;to the growing complexity&nbsp;has been consolidation, and&nbsp;the creation of&nbsp;a system in which&nbsp;fewer, more capable domain controllers&nbsp;become&nbsp;linked through high-bandwidth&nbsp;gateways&nbsp;to enable&nbsp;cross-domain functionality.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">This consolidation re-centers the vehicle around powerful system-on-chips that&nbsp;can host&nbsp;multiple software functions and domain workloads in one place. It simplifies software management across models and the vehicle lifespan. And with flexible hardware and over-the-air updates, automakers can roll out new features, improve performance, and introduce subscription-based services long after production.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">The shift is also&nbsp;redefining competition across the supply chain, as chipmakers and software providers race to build reliable, safety-ready platforms for&nbsp;a rapidly digitizing vehicle fleet. 
In this&nbsp;current&nbsp;landscape,&nbsp;architectures&nbsp;that&nbsp;combine&nbsp;high computing&nbsp;performance with validated safety&nbsp;separation&nbsp;will&nbsp;likely&nbsp;define&nbsp;how effectively automakers can pursue their software-driven&nbsp;goals&nbsp;at&nbsp;scale.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/8e4afe14afa2340ed8fe5e0302aee3a6_1767692149084.jpg\" alt=\"\" width=\"1200\" height=\"574\" />\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>Consolidating&nbsp;E/E&nbsp;(electrical/electronic)&nbsp;architectures&nbsp;in cars has limitations&nbsp;\u003C/strong>\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Compute&nbsp;consolidation comes&nbsp;with new challenges, with the&nbsp;boundary&nbsp;becoming blurred&nbsp;between safety-critical systems and non-safety applications. Competing architectures that merge everything into one platform risk bottlenecks and validation gaps, as well as the&nbsp;possibility that a post-production software update&nbsp;unintentionally&nbsp;impacts&nbsp;braking,&nbsp;perception, or other essential safety functions.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">There is more than one way to architect centralized vehicle&nbsp;compute. 
Mobileye takes a path that combines consolidation with separation, connecting our&nbsp;EyeQ&trade; SoCs through a Multi-Domain Controller (MDC) architecture that&nbsp;is designed to&nbsp;keep safety-critical driving functions distinct from general-purpose workloads.&nbsp;This&nbsp;is designed to&nbsp;ensure that driving decisions&nbsp;remain&nbsp;validated and deterministic, even as new&nbsp;software&nbsp;features,&nbsp;via&nbsp;over-the-air&nbsp;updates,&nbsp;are introduced&nbsp;to the rest of the system.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Built&nbsp;for holistic&nbsp;integration,&nbsp;Mobileye&rsquo;s&nbsp;system&nbsp;is designed to&nbsp;support applications and cloud connectivity,&nbsp;all while preserving&nbsp;safety-critical ADAS.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>How do automakers preserve validated safety as new software layers keep evolving?&nbsp;\u003C/strong>\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Every automaker&nbsp;is responsible for&nbsp;the safety of its vehicles&nbsp;at the moment&nbsp;of release. The challenge begins the moment software starts to change. As new features, updates, and applications are added over the vehicle&rsquo;s&nbsp;lifespan, the original validation that&nbsp;provides&nbsp;safe performance can be unintentionally altered.&nbsp;Ultimately, it&nbsp;comes down to creating a system&nbsp;that enables&nbsp;freedom from interference,&nbsp;in other words, ensuring that safety-critical tasks are not compromised as software evolves.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Validated safety must remain intact, regardless of how many new software layers enter the system. 
It&nbsp;can&rsquo;t&nbsp;depend on shared&nbsp;compute&nbsp;or unpredictable resource competition.&nbsp;That&rsquo;s&nbsp;why Mobileye&rsquo;s architecture enforces strict isolation between safety-critical driving functions and all other vehicle domains. This separation&nbsp;is designed to&nbsp;ensure that essential ADAS&nbsp;or autonomous-driving tasks stay fully&nbsp;validated&nbsp;and behave consistently even as the surrounding software evolves.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Mobileye&rsquo;s approach is open where it matters and isolated where it must be. This is seen in Mobileye&rsquo;s \u003Ca href=\"https://www.mobileye.com/opinion/mobileye-dxp-as-a-novel-approach/\">DXP\u003C/a>\u003C/span>\u003Cspan data-contrast=\"auto\">,&nbsp;where automakers&nbsp;can integrate their own middleware, OTA systems, and user-facing applications while relying on Mobileye&rsquo;s proven foundation to protect the vehicle&rsquo;s ability to see, decide, and act safely in every condition.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>The future is safe and autonomous&nbsp;\u003C/strong>\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"auto\">By combining validated driving intelligence with flexible integration, Mobileye gives automakers the confidence to scale their digital ambitions without ever compromising what matters most on the road. The industry is evolving toward more connected, intelligent, and autonomous vehicles, and Mobileye is ensuring that safety evolves with it. In the future, autonomy&nbsp;won&rsquo;t&nbsp;just be intelligent. 
It will be&nbsp;safety-defined.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>","2025-12-31T08:00:00.000Z","AV Safety, ADAS",{"id":147,"type":24,"url":148,"title":149,"description":150,"primary_tag":32,"author_name":10,"is_hidden":11,"lang":12,"meta_description":150,"image":151,"img_alt":152,"content":153,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":154,"tags":155},308,"mobileye-announces-ces-2026-press-conference","Mobileye announces CES 2026 press conference ","Annual address with President and CEO Prof. Amnon Shashua to be presented January 6 at 1:45 p.m. PT  ","https://static.mobileye.com/website/us/corporate/images/23d4be6e8ef7ac65fd346dbd647d2aa4_1767185208692.jpg","Mobileye returns to Las Vegas for CES 2026","\u003Cp>\u003Cstrong>\u003Cspan data-contrast=\"auto\">JERUSALEM, December&nbsp;18, 2025&nbsp;\u003C/span>\u003C/strong>\u003Cspan data-contrast=\"auto\">&mdash; Mobileye (Nasdaq: MBLY) today announced it will host&nbsp;\u003C/span>\u003Cstrong>\u003Cspan data-contrast=\"auto\">Mobileye Live at CES 2026\u003C/span>\u003C/strong>\u003Cspan data-contrast=\"auto\">, its annual address with President and CEO Prof. Amnon&nbsp;Shashua,&nbsp;on Tuesday, January 6, 2026, at 1:45 p.m. 
PT.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559740&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\r\n\u003Cp>\u003Cspan data-contrast=\"auto\">Prof.&nbsp;Shashua&nbsp;will present Mobileye&rsquo;s vision for the next frontier of physical AI, outlining how advances in AI are shaping the company&rsquo;s technology and product roadmap.&nbsp;He will highlight progress across&nbsp;Mobileye&rsquo;s&nbsp;assisted&nbsp;and autonomous driving product&nbsp;portfolio,&nbsp;the company&rsquo;s&nbsp;strategies for achieving a true mobility revolution,&nbsp;and offer a look ahead at Mobileye&rsquo;s next-generation chip architecture.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559738&quot;:240,&quot;335559739&quot;:240,&quot;335559740&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\r\n\u003Cp>\u003Cspan data-contrast=\"auto\">The program will&nbsp;include&nbsp;a conversation between Prof.&nbsp;Shashua&nbsp;and Christian Senger, CEO of Volkswagen Autonomous Mobility, exploring the companies&rsquo; joint progress toward autonomy at scale.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559738&quot;:240,&quot;335559739&quot;:240,&quot;335559740&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\r\n\u003Cp>\u003Cspan data-contrast=\"auto\">The&nbsp;invite-only&nbsp;event will be presented from Las Vegas and livestreamed globally via Mobileye&rsquo;s&nbsp;\u003C/span>\u003Ca href=\"https://www.youtube.com/mobileye\" target=\"_blank\" rel=\"noopener\">YouTube channel\u003C/a>\u003Cspan data-contrast=\"auto\">.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559740&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\r\n\u003Cp>\u003Cspan data-contrast=\"auto\">From Jan. 6&ndash;9, Mobileye will host customers, media,&nbsp;analysts&nbsp;and partners in a private Mobileye Lounge in the LVCC West Hall. 
The space will feature demos&nbsp;showcasing&nbsp;the Mobileye product spectrum,&nbsp;core technology enablers propelling AI-driven mobility, and the evolution of the&nbsp;EyeQ&nbsp;platform.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335557856&quot;:16777215,&quot;335559739&quot;:422,&quot;335559740&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\r\n\u003Cp>\u003Cstrong>\u003Cspan data-contrast=\"auto\">Mobileye Live at CES 2026\u003C/span>\u003C/strong>&nbsp;\u003Cbr />\u003Cspan data-contrast=\"auto\">Date: Tuesday, January 6, 2026\u003C/span>&nbsp;\u003Cbr />\u003Cspan data-contrast=\"auto\">Time: 1:45 p.m. PT\u003C/span>&nbsp;\u003Cbr />\u003Cspan data-contrast=\"auto\">YouTube&nbsp;livestream:&nbsp;\u003C/span>\u003Ca href=\"https://www.mobileye.com/ces-2026/register/b15158896608a49589967c544827f6f8/\" target=\"_blank\" rel=\"noopener\">Register Here\u003C/a>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559740&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\r\n\u003Cp>\u003Cspan data-contrast=\"auto\">+++\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559740&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\r\n\u003Cp>\u003Cspan data-contrast=\"auto\">Contacts\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559740&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\r\n\u003Cp>\u003Cspan data-contrast=\"auto\">Dan Galves\u003C/span>&nbsp;\u003Cbr />\u003Cspan data-contrast=\"auto\">Investor Relations\u003C/span>&nbsp;\u003Cbr />\u003Ca href=\"mailto:investors@mobileye.com\">investors@mobileye.com\u003C/a>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559740&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\r\n\u003Cp>\u003Cspan data-contrast=\"auto\">Justin Hyde\u003C/span>&nbsp;\u003Cbr />\u003Cspan data-contrast=\"auto\">Media Relations\u003C/span>&nbsp;\u003Cbr />\u003Ca href=\"mailto:justin.hyde@mobileye.com\">justin.hyde@mobileye.com\u003C/a>\u003Cspan 
data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559740&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\r\n\u003Cp>\u003Cstrong>\u003Cspan data-contrast=\"auto\">About Mobileye\u003C/span>\u003C/strong>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559740&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\r\n\u003Cp>\u003Cspan data-contrast=\"auto\">Mobileye (Nasdaq: MBLY) leads the mobility revolution with our autonomous driving and driver-assistance technologies, harnessing world-renowned&nbsp;expertise&nbsp;in artificial intelligence, computer vision, mapping and integrated software and hardware. Since our founding in 1999, Mobileye has enabled the wide adoption of advanced driver-assistance systems that bolster driving safety, while pioneering such groundbreaking technologies as REM&trade; crowdsourced mapping, True Redundancy&trade; sensing, and Responsibility Sensitive Safety&trade; (RSS). These technologies drive the ADAS and AV fields towards the future of mobility &ndash; enabling self-driving vehicles and mobility solutions at scale, and powering industry-leading advanced driver-assistance systems. Through 2024, more than 200 million vehicles worldwide have been built with Mobileye&rsquo;s&nbsp;EyeQ&nbsp;technology inside. Since 2022, Mobileye has been listed independently from Intel (Nasdaq: INTC), which&nbsp;retains&nbsp;majority ownership. For more information, visit&nbsp;\u003C/span>\u003Ca href=\"https://www.mobileye.com/\">https://www.mobileye.com\u003C/a>\u003Cspan data-contrast=\"auto\">.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335557856&quot;:16777215,&quot;335559739&quot;:360,&quot;335559740&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\r\n\u003Cp>\u003Cspan data-contrast=\"auto\">&ldquo;Mobileye,&rdquo; the Mobileye logo and Mobileye product names are registered trademarks of Mobileye Global. 
All other marks are the property of their respective owners.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335557856&quot;:16777215,&quot;335559739&quot;:360,&quot;335559740&quot;:240}\">&nbsp;\u003C/span>\u003C/p>","2025-12-18T08:00:00.000Z","Amnon Shashua, Autonomous Driving, News, Events",{"id":157,"type":24,"url":158,"title":159,"description":160,"primary_tag":28,"author_name":10,"is_hidden":11,"lang":12,"meta_description":160,"image":161,"img_alt":162,"content":163,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":164,"tags":165},307,"ruter-and-holo-choose-moia-with-mobileye-drive-for-next-stage-of-avs","Ruter and Holo choose MOIA with Mobileye Drive for next stage of AVs","Ruter, the public transport agency of Oslo, Norway, and AV operator Holo announce partnership with MOIA for AVs beginning in spring 2026.","https://static.mobileye.com/website/us/corporate/images/1796fb6544453c3291c342755c2d7c2b_1765365401469.jpg","The ID. Buzz AD, with Mobileye Drive™ AV technology, planned to be deployed as part of a larger shared, on-demand autonomous transport service.","\u003Cp>OSLO, December 10, 2025 &ndash; In a significant step for autonomous driving in Europe, Ruter, the public transport agency of Oslo, Norway, and Holo, a company specialized in the operation of autonomous vehicles, \u003Ca href=\"https://www.moia.io/en/news/ruter-and-holo-enter-partnership-with-moia\" target=\"_blank\" rel=\"noopener\">have announced\u003C/a> they are partnering with MOIA to deploy the ID. Buzz AD, to be equipped with the Mobileye Drive&trade; self-driving system, planned for as early as spring 2026. This step follows extensive testing by Ruter and Holo of autonomous development vehicles with Mobileye Drive&trade; over the past two years, and marks the transition to the second generation self-driving system based on the Mobileye EyeQ&trade; 6H that drives the ID. 
Buzz AD.\u003C/p>\n\u003Cp>After the announcements of Hamburger Hochbahn (Hamburg, Germany), BVG (Berlin, Germany) and \u003Ca href=\"https://www.moia.io/en/news/volkswagen-and-uber-long-term-strategic-partnership\">Uber\u003C/a> in North America, Mobileye Drive continues to enable MOIA, the autonomous mobility service arm of the Volkswagen Group, to scale across different operation zones with one of Europe&rsquo;s first series-production ready autonomous vehicles built for SAE Level 4 autonomous driving.\u003C/p>\n\u003Cp>Ruter has envisioned adding up to 30,000 shared autonomous vehicles into the public transport network around Oslo to reduce congestion and emissions. The ID. Buzz AD is expected to be deployed in Groruddalen, a suburb of Norway&rsquo;s capital Oslo. Groruddalen has been a testing site of Mobileye Drive&trade; since 2023, when first trials kicked off together with Ruter and Holo. These trials included testing performance under challenging weather conditions, such as snow-covered roads, and showed the potential of autonomous shared mobility in a complex urban driving environment &ndash; providing practical experience for deployment and operation of autonomous vehicles under real-world conditions.\u003C/p>\n\u003Cp>&ldquo;We are excited to see that mobility innovators like Ruter and Holo continue to rely on Mobileye technology as they work to transform the public transport landscape in Oslo,&rdquo; says Johann Jungwirth, Executive Vice President, Autonomous Vehicles at Mobileye. &ldquo;Our Mobileye Drive self-driving system has been extensively tested by Ruter and Holo in Oslo over the last two years and we are looking forward to seeing their journey continue with the ID. 
Buzz AD in 2026.&rdquo;&nbsp;\u003C/p>\n\u003Cp>Mobileye will provide further updates on developments in AI-powered autonomous driving on Tuesday, January 6, at CES 2026.\u003C/p>\n\u003Cp>Media contact: Justin Hyde, \u003Ca href=\"mailto:justin.hyde@mobileye.com\">justin.hyde@mobileye.com\u003C/a>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cspan style=\"font-size: 10pt;\">Mobileye (Nasdaq: MBLY) leads the mobility revolution with our autonomous driving and driver-assistance technologies, harnessing world-renowned expertise in artificial intelligence, computer vision, mapping and integrated software and hardware. Since our founding in 1999, Mobileye has enabled the wide adoption of advanced driver-assistance systems that bolster driving safety, while pioneering such groundbreaking technologies as REM&trade; crowdsourced mapping, True Redundancy&trade; sensing, and Responsibility Sensitive Safety&trade; (RSS). These technologies drive the ADAS and AV fields towards the future of mobility &ndash; enabling self-driving vehicles and mobility solutions at scale, and powering industry-leading advanced driver-assistance systems. Through 2024, more than 200 million vehicles worldwide have been built with Mobileye&rsquo;s EyeQ technology inside. Since 2022, Mobileye has been listed independently from&nbsp;Intel&nbsp;(Nasdaq: INTC), which retains majority ownership. For more information, visit&nbsp;\u003Ca href=\"https://cts.businesswire.com/ct/CT?id=smartlink&amp;url=https%3A%2F%2Fwww.mobileye.com&amp;esheet=54167879&amp;newsitemid=20241217366044&amp;lan=en-US&amp;anchor=https%3A%2F%2Fwww.mobileye.com&amp;index=7&amp;md5=25e995ce7275b1a687baef28e2f033a3\">https://www.mobileye.com\u003C/a>.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"font-size: 10pt;\">&ldquo;Mobileye,&rdquo; the Mobileye logo and Mobileye product names are registered trademarks of&nbsp;Mobileye Global. 
All other marks are the property of their respective owners.\u003C/span>\u003C/p>","2025-12-10T08:00:00.000Z","News, Autonomous Driving, Driverless MaaS, Industry",{"id":167,"type":5,"url":168,"title":169,"description":170,"primary_tag":51,"author_name":10,"is_hidden":11,"lang":12,"meta_description":170,"image":171,"img_alt":172,"content":173,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":174,"tags":175},306,"how-mobileye-redefines-camera-design-with-dual-hdr-image-capture","How Mobileye reimagines camera design with dual HDR image capture","Mobileye is bringing forth an approach that rethinks how high dynamic range scenes can be captured.  ","https://static.mobileye.com/website/us/corporate/images/8350a99b5ec22630290f9c05c7fea69a_1765273985964.jpg","This approach is designed to address one of automotive imaging’s most persistent challenges. ","\u003Cp>\u003Cspan data-contrast=\"auto\">Automotive hardware and software constantly intersect, and&nbsp;the need for technology that unites them&nbsp;more effectively&nbsp;has never been greater.&nbsp;Camera&nbsp;design is a&nbsp;strong&nbsp;example of this&nbsp;industry-wide need.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">At Mobileye,&nbsp;developing technologies that&nbsp;support&nbsp;this&nbsp;integration&nbsp;is a core focus.&nbsp;By&nbsp;proposing&nbsp;a different type of&nbsp;camera&nbsp;architecture, Mobileye&nbsp;is&nbsp;bringing forth&nbsp;an&nbsp;approach&nbsp;that rethinks&nbsp;how&nbsp;high dynamic range&nbsp;(HDR)&nbsp;scenes can&nbsp;be captured.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Developed by Mobileye for&nbsp;use with&nbsp;select sensor suppliers and&nbsp;intended&nbsp;for vehicle&nbsp;integration,&nbsp;this&nbsp;approach&nbsp;is designed to address&nbsp;one of 
automotive imaging&rsquo;s most persistent challenges: balancing exposure time, motion,&nbsp;light&nbsp;and distance.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>\u003Cspan data-contrast=\"auto\">The exposure dilemma in automotive cameras\u003C/span>\u003C/strong>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"auto\">There are a lot of factors to consider when it comes to&nbsp;automotive imaging.&nbsp;One of them&nbsp;is&nbsp;ensuring that a camera can capture&nbsp;images&nbsp;clearly,&nbsp;with the right amount of exposure,&nbsp;all while&nbsp;on&nbsp;a fast-moving&nbsp;vehicle,&nbsp;and&nbsp;at&nbsp;various levels of lighting.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">At nighttime or&nbsp;in&nbsp;low light&nbsp;conditions, there are a few&nbsp;challenges&nbsp;to navigate.&nbsp;As we know, in&nbsp;limited lighting,&nbsp;longer&nbsp;exposure&nbsp;is needed to&nbsp;brighten&nbsp;distant or poorly lit&nbsp;objects.&nbsp;This, however,&nbsp;increases motion blur.&nbsp;\u003C/span>\u003Cspan data-contrast=\"auto\">On the other hand,&nbsp;shorter&nbsp;exposure keeps moving objects sharp but reduces visibility in darker scenes. The result&nbsp;in both cases is&nbsp;often lost data and lower image quality.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">During daylight, exposure challenges can&nbsp;arise&nbsp;too.&nbsp;This often happens with LED traffic lights.&nbsp;In&nbsp;daylight,&nbsp;a camera&rsquo;s&nbsp;sensor&nbsp;operates&nbsp;at a shorter exposure to reduce motion blur, but that timing&nbsp;doesn&rsquo;t&nbsp;always align with the LED&rsquo;s&nbsp;pulsing&nbsp;cycle. 
Because LEDs pulse rapidly, the human eye sees steady light, but a camera might capture the light when&nbsp;it&rsquo;s&nbsp;off,&nbsp;resulting in&nbsp;the&nbsp;appearance&nbsp;of LED&nbsp;flicker.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>The solution&nbsp;&ndash;&nbsp;dual&nbsp;HDR&nbsp;image&nbsp;capture&nbsp;\u003C/strong>\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"auto\">The key question is how to manage the exposure trade-off without adding&nbsp;unnecessary&nbsp;complexity.&nbsp;Adding a second camera&nbsp;that can&nbsp;capture HDR images at a different exposure to resolve this conflict&nbsp;is possible, but that&nbsp;adds&nbsp;cost and software demands,&nbsp;making it&nbsp;difficult to&nbsp;scale.&nbsp;As an alternative,&nbsp;Mobileye&nbsp;has developed&nbsp;a&nbsp;&ldquo;dual&nbsp;HDR image&nbsp;capture&rdquo;&nbsp;technology.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/944ad29ff3630a5637532ab53cdeb1bc_1765288467997.jpg\" alt=\"\" width=\"1200\" height=\"565\" />\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">By&nbsp;enabling&nbsp;a single camera to produce two HDR images with only a negligible gap between captures,&nbsp;this method&nbsp;is intended to&nbsp;allow&nbsp;the&nbsp;camera&nbsp;to&nbsp;provide&nbsp;two separate HDR&nbsp;images at different exposures that are&nbsp;read&nbsp;out&nbsp;from the&nbsp;camera,&nbsp;within the same frame time.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559738&quot;:240,&quot;335559739&quot;:240,&quot;335559740&quot;:278}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">This provides&nbsp;algorithms&nbsp;with richer visual data for scene 
understanding.&nbsp;As a result, the camera can capture both&nbsp;an image&nbsp;with longer&nbsp;HDR exposure and&nbsp;an image with a shorter HDR exposure&nbsp;of the scene&nbsp;almost simultaneously&nbsp;without needing a second camera.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559738&quot;:240,&quot;335559739&quot;:240,&quot;335559740&quot;:278}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">But&nbsp;while this process&nbsp;increases&nbsp;the number of HDR images, it&nbsp;doesn&rsquo;t&nbsp;inherently require&nbsp;double&nbsp;the processing power.&nbsp;Rather than transmitting two full 8MP frames, one of&nbsp;the frames can be&nbsp;resized&nbsp;to&nbsp;a&nbsp;lower resolution,&nbsp;helping&nbsp;reduce&nbsp;MIPI bandwidth and&nbsp;manage&nbsp;the&nbsp;overall&nbsp;data&nbsp;load.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-ccp-props=\"{}\">\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/1a63d36fa844850e940e3258c8019417_1765288501973.jpg\" alt=\"\" width=\"1200\" height=\"565\" />\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">The solution&nbsp;is&nbsp;intended&nbsp;to:&nbsp;&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cul>\n\u003Cli aria-setsize=\"-1\" data-leveltext=\"\" data-font=\"Symbol\" data-listid=\"3\" data-list-defn-props=\"{&quot;335552541&quot;:1,&quot;335559685&quot;:720,&quot;335559991&quot;:360,&quot;469769226&quot;:&quot;Symbol&quot;,&quot;469769242&quot;:[8226],&quot;469777803&quot;:&quot;left&quot;,&quot;469777804&quot;:&quot;&quot;,&quot;469777815&quot;:&quot;hybridMultilevel&quot;}\" data-aria-posinset=\"1\" data-aria-level=\"1\">\u003Cspan data-contrast=\"auto\">Improve low light&nbsp;images&nbsp;by&nbsp;up to&nbsp;~2x&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/li>\n\u003C/ul>\n\u003Cul>\n\u003Cli 
aria-setsize=\"-1\" data-leveltext=\"\" data-font=\"Symbol\" data-listid=\"3\" data-list-defn-props=\"{&quot;335552541&quot;:1,&quot;335559685&quot;:720,&quot;335559991&quot;:360,&quot;469769226&quot;:&quot;Symbol&quot;,&quot;469769242&quot;:[8226],&quot;469777803&quot;:&quot;left&quot;,&quot;469777804&quot;:&quot;&quot;,&quot;469777815&quot;:&quot;hybridMultilevel&quot;}\" data-aria-posinset=\"2\" data-aria-level=\"1\">\u003Cspan data-contrast=\"auto\">Reduce motion blur at night by&nbsp;up to&nbsp;~5x&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/li>\n\u003C/ul>\n\u003Cul>\n\u003Cli aria-setsize=\"-1\" data-leveltext=\"\" data-font=\"Symbol\" data-listid=\"3\" data-list-defn-props=\"{&quot;335552541&quot;:1,&quot;335559685&quot;:720,&quot;335559991&quot;:360,&quot;469769226&quot;:&quot;Symbol&quot;,&quot;469769242&quot;:[8226],&quot;469777803&quot;:&quot;left&quot;,&quot;469777804&quot;:&quot;&quot;,&quot;469777815&quot;:&quot;hybridMultilevel&quot;}\" data-aria-posinset=\"3\" data-aria-level=\"1\">\u003Cspan data-contrast=\"auto\">Reduce&nbsp;LED flicker&nbsp;in daytime scenes&nbsp;with&nbsp;minimal&nbsp;impact on&nbsp;the detection of fast-moving objects while driving\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/li>\n\u003C/ul>\n\u003Cp>\u003Cem>\u003Cspan data-contrast=\"none\">*Such improvements are&nbsp;achievable&nbsp;under&nbsp;optimal&nbsp;operating and driving conditions.&nbsp;&nbsp;\u003C/span>\u003C/em>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-ccp-props=\"{}\">\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/fe8a1d7de9a3fb0377c1688d937ad512_1765288531346.jpg\" alt=\"\" width=\"1200\" height=\"565\" />\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">For sensor suppliers,&nbsp;this&nbsp;can&nbsp;offer\u003C/span>\u003Cspan data-contrast=\"auto\">&nbsp;a practical path 
to&nbsp;support improved&nbsp;imaging&nbsp;capabilities. For automakers, it&nbsp;can&nbsp;provide&nbsp;more consistent&nbsp;environment information to support&nbsp;ADAS performance without adding system burden.&nbsp;Overall,&nbsp;this solution is intended to&nbsp;help&nbsp;improve&nbsp;imaging&nbsp;quality and efficiency&nbsp;in&nbsp;automotive&nbsp;systems.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Ch4>\u003Cem>Disclaimer: The dual capture approach was developed by Mobileye&nbsp;for&nbsp;collaboration with select image-sensor suppliers and is intended to support potential integration into automotive&nbsp;systems.&nbsp;\u003C/em>\u003C/h4>","2025-12-09T08:00:00.000Z","Industry, ADAS",{"id":177,"type":5,"url":178,"title":179,"description":180,"primary_tag":51,"author_name":10,"is_hidden":11,"lang":12,"meta_description":180,"image":181,"img_alt":182,"content":183,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":32,"publish_date":184,"tags":175},305,"tackling-global-regulations-and-safety-standards","Tackling global regulations and safety standards","A conversation with Mobileye's Sensing Product team","https://static.mobileye.com/website/us/corporate/images/56907f06e9d6113ecdd3bf2cdbca20da_1763631781685.jpg","Perspectives shared here reflect the opinions of members of Mobileye’s Sensing team and do not represent the company’s official position.","\u003Cp>OEMs might be wondering what to prioritize with so many regulatory considerations across multiple geographies. 
We sat down with Mobileye&rsquo;s Nir Hamzani, Regulation Manager for Assisted and Automated Driving, and Shai Hershkovich, Senior Director of Sensing Product Management, whose day-to-day work includes studying and sharing information within Mobileye regarding current and future industry regulations, to share their personal perspectives.\u003C/p>\n\u003Ch3>\u003Cstrong>What regulatory considerations should OEMs prioritize?&nbsp;\u003C/strong>\u003C/h3>\n\u003Cp>To answer that, we should first zoom out and look at the bigger picture. Regulation is a broad term, which can include mandatory rules and assessment programs. For example, Euro NCAP is a car assessment program that provides an evaluation of vehicle safety systems in order for consumers to make comparisons. It&rsquo;s not mandatory regulation as such, but it is something that consumers look for. Many countries have their own car assessment programs, similar to NCAP, each with its own objectives; for example, China has recently added requirements and standards around electric bikes and scooters. There are mandatory regulations that OEMs have to adhere to in the respective country to sell in that market. When it comes to assessment programs, users rely on these to make comparisons, so ratings hold a lot of weight. These considerations can directly impact production planning; for instance, an OEM may need to integrate a front parking camera to improve safety, but must do so in a way that doesn&rsquo;t obstruct the driver&rsquo;s view over the hood.\u003C/p>\n\u003Ch3>\u003Cstrong>As autonomous driving is a new industry, how are safety benchmarks established?&nbsp;\u003C/strong>\u003C/h3>\n\u003Cp>Well, it is important to note that because the field is still so new, common safety standards have been slow to take shape. AVs need to become more widespread before regulations can fully catch up. 
Over time, as the industry grows, clear rules for testing, performance, and safety management will develop and lead to stronger and more consistent safety benchmarks.&nbsp;\u003C/p>\n\u003Cp>Let&rsquo;s look at the ADAS sector. All these assessment programs are continually updated, with regulators tracking and classifying accidents across different scenarios, which together set safety benchmarks. In each program, they try to widen the envelope to prevent further accidents and fatalities. These programs also enable consumers to compare different vehicles with safety scoring. Different actors in the industry and regulators are working together to set these thresholds as ADAS technology expands. It would also be fair to say that the narrowing gap we see between ADAS and AV capabilities brings more regulations related to driver engagement. For example, R79 initially told carmakers what&rsquo;s allowed when a vehicle steers itself, in part by ensuring the driver is fully engaged through warnings.&nbsp;\u003C/p>\n\u003Cp>However, EU regulation is becoming even more stringent with the recent adoption of DCAS regulation (Driver Control Assistance Systems), which requires level 2 vehicles to include a dedicated Driver Monitoring System (DMS) to monitor driver engagement. China is following a similar path with its upcoming CDAS framework (Combined Driver Assistance Systems), which also requires a DMS to be installed.&nbsp;\u003C/p>\n\u003Ch3>\u003Cstrong>Do you think there ever will be a global standard for safety ratings? &nbsp;\u003C/strong>\u003C/h3>\n\u003Cp>That&rsquo;s hard for us to determine, but for now we think that&rsquo;s unlikely. Each region has its own regulatory body and approach. These organizations work independently, and their standards reflect local realities: things like road design, infrastructure, driving culture, climate, and even societal norms. 
That said, many consider the Euro NCAP five-star rating as an established top global benchmark for vehicle safety, and premium automakers routinely design their vehicles to hit Euro NCAP&lsquo;s mark, despite its specificity to Europe.&nbsp;\u003C/p>\n\u003Ch3>\u003Cstrong>In practical terms, what are the key differences in regulations you see between the U.S. and Europe?&nbsp;\u003C/strong>\u003C/h3>\n\u003Cp>That is a good question because we see these markets acting differently. The U.S. currently has no federal regulations around autonomous driving. For example, you cannot drive certain brands in Europe with the same functionality as in the U.S. Some have an autonomous function, but they can only currently be driven in a country where specific permissions were granted from the relevant authorities in order to conduct testing.&nbsp;\u003C/p>\n\u003Cp>Furthermore, it&rsquo;s important that we distinguish between testing and commercial deployment. In the U.S., there&rsquo;s no federal law restricting AV testing, automakers can test autonomous vehicles in most U.S. states under state-specific permits, even if those vehicles aren&rsquo;t yet approved for commercial use. For example, robotaxis are being tested in San Francisco and Los Angeles in California, and in Phoenix, Arizona. Meanwhile, Europe has been introducing regulations to operate autonomous capabilities such as hands-off driving. And it&rsquo;s worth noting that these are L2 partially autonomous capabilities under a safety driver, not fully autonomous driving. So now, if a driverless service were to be launched in London for example, authorization by the local government is needed.\u003C/p>\n\u003Ch3>\u003Cstrong>What are the upcoming regulations for ISA (Intelligent Speed Assist), and how effective is it in reducing accidents?&nbsp;\u003C/strong>\u003C/h3>\n\u003Cp>We&rsquo;ve read plenty of research on speeding-related road accidents and there&rsquo;s a very clear understanding about this as we see it. 
From July 2024, all new cars in Europe have to be fitted with a mandatory Intelligent Speed Assist System. Other regions are currently looking to Europe to understand the impact of implementing ISA and the results will likely be published in the next two or three years to demonstrate its impact. For each region or country, bringing the technology in as a regulation will depend on their specific objectives and how much they want to increase road safety.&nbsp;\u003C/p>\n\u003Ch3>\u003Cstrong>But how can OEMs manage or optimize their production strategies for features such as ISA that have different regional standards?&nbsp;\u003C/strong>\u003C/h3>\n\u003Cp>Well, there are mandatory requirements for different regions that OEMs must adhere to. For example, all vehicles in Europe must include ISA or emergency braking systems. OEMs producing low-cost vehicles for the European market need to integrate a front camera to comply with GSR regulations. These requirements, and the level of specification, must be considered early in the design stage.&nbsp;\u003C/p>\n\u003Ch3>\u003Cstrong>Finally, how does the upcoming FMVSS update impact OEMs planning for automated driving?&nbsp;\u003C/strong>\u003C/h3>\n\u003Cp>Well, first we have to note that the upcoming FMVSS-127 update is still being finalized. However, we think understanding its direction can help OEMs plan ahead. For them, aligning early with the intent of these evolving standards supports smoother validation and readiness once new requirements take effect. And although the U.S. 
framework remains more flexible than Europe&rsquo;s type-approval model, these developments point to a global shift toward stronger safety assurance and data-driven evaluation in ADAS.&nbsp;\u003C/p>","2025-11-20T08:00:00.000Z",{"id":186,"type":5,"url":187,"title":188,"description":189,"primary_tag":190,"author_name":10,"is_hidden":11,"lang":12,"meta_description":189,"image":191,"img_alt":192,"content":193,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":194,"tags":195},302,"level-3-autonomy-explained","Hands-off/eyes-off autonomy and what it means for automakers ","What is level 3 driving and what is required to make it happen? Discover why Level 3 autonomous driving is becoming the strategic focus for automakers",3,"https://static.mobileye.com/website/us/corporate/images/eaf2ceb1613e6963bc54063be51a9ef2_1759148050677.jpg","AI generated image","\u003Ch3>\u003Cstrong>What is Level three autonomy?&nbsp;\u003C/strong>\u003C/h3>\n\u003Cp>Level three, or hands-off/eyes-off autonomous driving is where the big leap into autonomy takes place. Defined by the \u003Ca href=\"https://www.sae.org/standards/content/j3016_202104\" target=\"_blank\" rel=\"noopener\">SAE \u003C/a>(Society of Automotive Engineers), as &ldquo;conditional driving automation&rdquo; hands-off/eyes-off autonomous driving allows a vehicle to drive itself under specific conditions without human input, while still requiring the driver to be ready to take back control when necessary. It offers a genuine autonomous experience, but only within well-mapped, controlled environments such as highways or during low-speed traffic jams.&nbsp;\u003C/p>\n\u003Cp>This level is often seen as the most sought-after step in the consumer AV space, offering hands-off, eyes-off capabilities and premium in-cabin experiences, without the full complexity and cost of a Level four system that is fully autonomous, and requires no human input. 
&nbsp;\u003C/p>\n\u003Ch3>\u003Cstrong>What makes this level so meaningful?&nbsp;\u003C/strong>\u003C/h3>\n\u003Cp>Hands-off/eyes-off autonomy is the point where the driving experience makes a real shift. While the driver is still required to be available for a takeover, should the need arise, the experience is no longer defined solely by continuous control. It opens space for small breaks, or \"mind-off\" moments, such as checking on kids in the back seat or a moment&rsquo;s decompression on a long commute, giving time back to drivers. It is the beginning of a new psychological relationship to driving.&nbsp;\u003C/p>\n\u003Cp>The gap between levels two and four may be just two steps, but the difference is striking. Level two, a standard form of \u003Ca href=\"https://www.mobileye.com/blog/how-surround-adas-delivers-the-new-standard-of-safety-and-tech/\" target=\"_blank\" rel=\"noopener\">low-level autonomy\u003C/a>, requires constant driver supervision. Level four, on the other hand, operates entirely on its own within defined boundaries, removing control, responsibility, and most importantly, intervention from the driver.&nbsp;\u003C/p>\n\u003Ch3>\u003Cstrong>What is needed for level three autonomy?&nbsp;\u003C/strong>\u003C/h3>\n\u003Cp>Bringing level three autonomy to mass-market adoption involves multiple considerations, yet two stand out as fundamental:&nbsp;\u003C/p>\n\u003Cp>Firstly, robust technology. The type that can facilitate the delicate handover between driver and AI. A successful Level three autonomous driving system needs a suite of capabilities to ensure reliable handover and a symbiotic experience. A huge part of that comes down to a chip&rsquo;s processing capabilities, and the speed and efficiency with which the full stack of driving functions is handled, including ADAS, DMS, sensor capabilities, mapping, parking, OTA updates, and more. 
Apart from the need to have trusted technology for driving autonomously, there&rsquo;s the second piece of the puzzle.&nbsp;\u003C/p>\n\u003Cp>The second piece, more external to car production, is the regulatory framework. Achieving successful level three autonomy depends on a framework shaped collectively by regulators, public policy experts, automakers, and insurers, and informed by real-world data: the environment in which hands-off, eyes-off autonomy can thrive. This framework should consider not only the formal rules of the road but also the \u003Ca href=\"https://www.mckinsey.com/industries/automotive-and-assembly/our-insights/navigating-unknowns-auto-insurance-questions-in-a-new-mobility-era\" target=\"_blank\" rel=\"noopener\">changing nature\u003C/a> of driver behavior in response to autonomous technologies. Today, the frameworks around liability, regulation, and insurance are being aligned for mass adoption with the aim of gaining public acceptance.&nbsp;\u003C/p>\n\u003Cp>The hope is that, as more data is collected, clearer frameworks for various precedents will emerge, whether in the form of a mixed responsibility model or insurance products that fairly account for all stakeholders involved.&nbsp;\u003C/p>\n\u003Ch6>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/6dc3c49165df922bfbcb7509adac8357_1759149226755.png\" alt=\"\" width=\"600\" height=\"233\" />\u003C/h6>\n\u003Ch6>\"Bringing Level three autonomy to mass-market adoption involves multiple considerations\"\u003C/h6>\n\u003Ch3>\u003Cstrong>Why would automakers invest in producing level three systems?&nbsp;\u003C/strong>\u003C/h3>\n\u003Cp>A design that enables scalability across driver-assist capabilities gives automakers a direct path along the autonomy spectrum. 
With the right foundation, a single system can deliver a range of features determined by the number of SoCs integrated.&nbsp;\u003C/p>\n\u003Cp>Level one and Level two driver-assistance features such as ACC (adaptive cruise control) and AEB (automatic emergency braking) are already standard in most new vehicles. In a market where consumers expect both safety and convenience, and automakers look for meaningful differentiation, the natural next step is Level 3. This enables OEMs to offer eyes-off driving in defined conditions, scale flexibly between L1, L2, and L3, and pave the way toward higher levels of automation. With advanced technologies becoming production-ready and regulatory frameworks taking shape, it&rsquo;s no surprise that more OEMs are preparing to bring Level 3 systems to market, fueling the surge of interest in conditional autonomy.&nbsp;\u003C/p>\n\u003Ch3>\u003Cstrong>What is the Level 3 solution that automakers need?&nbsp;\u003C/strong>\u003C/h3>\n\u003Cp>Mobileye's L3 solution, Mobileye Chauffeur&trade;, enables eyes-off operation of standard driving functions on all regular road types at speeds of up to 80 mph (130 km/h).&nbsp;\u003C/p>\n\u003Cp>This comprehensive technology stack* combines a sensor suite of 11 cameras, surround radars, a front lidar, three EyeQ&trade;6H Systems-on-Chip (SoCs), and a continuously updated crowdsourced map powered by REM&trade; data. Some OEMs begin by deploying a core driver-assist system through a dedicated ADAS or technology provider that possesses the know-how to implement these systems and enable customizable levels of autonomy, then upgrade that system over time to reach higher levels of autonomy. 
This is where scalable architecture in today&rsquo;s automotive industry really comes into play.&nbsp;\u003C/p>\n\u003Ch3>\u003Cstrong>The path forward\u003C/strong>\u003C/h3>\n\u003Cp>Eyes-off/hands-off autonomy represents more than just a technological milestone. It is a strategic inflection point for the automotive industry. It creates room for innovation without overextending into the complexity of full autonomy, giving automakers a viable path to deliver premium, differentiated experiences today while building the foundation for tomorrow. With solutions like Mobileye Chauffeur&trade;, the journey toward hands-off and eyes-off driving becomes not only possible but scalable, adaptable, and ready to evolve with regulations, technology, and consumer expectations.&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>Learn more about the Mobileye Chauffeur&trade; technology \u003Ca href=\"https://www.mobileye.com/solutions/chauffeur/\" target=\"_blank\" rel=\"noopener\">here\u003C/a>\u003C/strong>\u003C/p>\n\u003Cp>*Availability and performance are subject to a number of factors, including the product&rsquo;s and vehicle&rsquo;s specifications, manual, operational design domain (ODD), and applicable law. 
When needed, driver to be engaged and ready for takeover.\u003C/p>","2025-10-29T07:00:00.000Z","Autonomous Driving, Industry, AV Safety",{"id":197,"type":69,"url":198,"title":199,"description":200,"primary_tag":73,"author_name":10,"is_hidden":11,"lang":201,"meta_description":200,"image":202,"img_alt":203,"content":204,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":205,"tags":206},304,"mobileye-at-jms-2025","Mobileye ジャパンモビリティショー2025 メディアキット","ジャパンモビリティショー メディアキットをチェック \nMobileye のビジュアル素材、背景資料、そしてイベントのハイライトをご覧いただけます。 ","jp","https://static.mobileye.com/website/us/corporate/images/971120c09954c29d96e9c99761c0fe13_1761659633670.jpg","Mobileye | ジャパンモビリティショー2025","\u003Cp>\u003Cstrong>ジャパンモビリティショー2025 | Mobileye プレス資料\u003C/strong>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"none\">次世代モビリティを支える革新技術を体感\u003C/span>\u003Cspan data-ccp-props=\"{&quot;134233117&quot;:false,&quot;134233118&quot;:false,&quot;335557856&quot;:16777215,&quot;335559738&quot;:0,&quot;335559739&quot;:0}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"none\">2025年ジャパンモビリティショーにて、\u003C/span>\u003Cstrong>\u003Cspan data-contrast=\"none\">Mobileye\u003C/span>\u003C/strong>\u003Cspan data-contrast=\"none\">&nbsp;は先進運転支援システム（ADAS）から完全自動運転までを実現する、スケーラブルなソリューションを展示します。世界の自動車メーカーやモビリティ事業者に向け、未来の移動を形づくる最先端技術をご紹介します。\u003C/span>\u003Cspan data-ccp-props=\"{&quot;134233117&quot;:false,&quot;134233118&quot;:false,&quot;335557856&quot;:16777215,&quot;335559738&quot;:0,&quot;335559739&quot;:0}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp aria-level=\"3\">\u003Cstrong>\u003Cspan data-contrast=\"none\">展示製品\u003C/span>\u003C/strong>\u003Cspan data-ccp-props=\"{&quot;134233117&quot;:false,&quot;134233118&quot;:false,&quot;134245418&quot;:true,&quot;134245529&quot;:true,&quot;335557856&quot;:16777215,&quot;335559738&quot;:60,&quot;335559739&quot;:60}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cul>\n\u003Cli aria-setsize=\"-1\" 
data-leveltext=\"\" data-font=\"Symbol\" data-listid=\"2\" data-list-defn-props=\"{&quot;335552541&quot;:1,&quot;335559685&quot;:720,&quot;335559991&quot;:360,&quot;469769226&quot;:&quot;Symbol&quot;,&quot;469769242&quot;:[8226],&quot;469777803&quot;:&quot;left&quot;,&quot;469777804&quot;:&quot;&quot;,&quot;469777815&quot;:&quot;hybridMultilevel&quot;}\" data-aria-posinset=\"1\" data-aria-level=\"1\">\u003Cstrong>\u003Cspan data-contrast=\"none\">Surround ADAS&trade;\u003C/span>\u003C/strong>\u003Cspan data-contrast=\"none\">&nbsp;&ndash; EyeQ&trade;6Hとマルチカメラ構成による360&deg;環境認識。高速道路でのハンズオフ走行を可能にします。\u003C/span>\u003Cspan data-ccp-props=\"{&quot;134233117&quot;:false,&quot;134233118&quot;:false,&quot;335557856&quot;:16777215,&quot;335559738&quot;:0,&quot;335559739&quot;:0}\">&nbsp;\u003C/span>\u003C/li>\n\u003C/ul>\n\u003Cul>\n\u003Cli aria-setsize=\"-1\" data-leveltext=\"\" data-font=\"Symbol\" data-listid=\"2\" data-list-defn-props=\"{&quot;335552541&quot;:1,&quot;335559685&quot;:720,&quot;335559991&quot;:360,&quot;469769226&quot;:&quot;Symbol&quot;,&quot;469769242&quot;:[8226],&quot;469777803&quot;:&quot;left&quot;,&quot;469777804&quot;:&quot;&quot;,&quot;469777815&quot;:&quot;hybridMultilevel&quot;}\" data-aria-posinset=\"2\" data-aria-level=\"1\">\u003Cstrong>\u003Cspan data-contrast=\"none\">Mobileye&nbsp;SuperVision&trade;\u003C/span>\u003C/strong>\u003Cspan data-contrast=\"none\">&nbsp;&ndash; 最先端ADASプラットフォーム（SAE L2+）。ナビ連動支援や自動車線変更に対応し、ハンズオフ・アイズオン走行を実現。\u003C/span>\u003Cspan data-ccp-props=\"{&quot;134233117&quot;:false,&quot;134233118&quot;:false,&quot;335557856&quot;:16777215,&quot;335559738&quot;:0,&quot;335559739&quot;:0}\">&nbsp;\u003C/span>\u003C/li>\n\u003C/ul>\n\u003Cul>\n\u003Cli aria-setsize=\"-1\" data-leveltext=\"\" data-font=\"Symbol\" data-listid=\"2\" 
data-list-defn-props=\"{&quot;335552541&quot;:1,&quot;335559685&quot;:720,&quot;335559991&quot;:360,&quot;469769226&quot;:&quot;Symbol&quot;,&quot;469769242&quot;:[8226],&quot;469777803&quot;:&quot;left&quot;,&quot;469777804&quot;:&quot;&quot;,&quot;469777815&quot;:&quot;hybridMultilevel&quot;}\" data-aria-posinset=\"3\" data-aria-level=\"1\">\u003Cstrong>\u003Cspan data-contrast=\"none\">Mobileye Chauffeur&trade;\u003C/span>\u003C/strong>\u003Cspan data-contrast=\"none\">&nbsp;&ndash; ライダーとレーダー冗長構成による条件付き自動運転（SAE L3）。特定条件下でのアイズオフ走行を可能にします。\u003C/span>\u003Cspan data-ccp-props=\"{&quot;134233117&quot;:false,&quot;134233118&quot;:false,&quot;335557856&quot;:16777215,&quot;335559738&quot;:0,&quot;335559739&quot;:0}\">&nbsp;\u003C/span>\u003C/li>\n\u003C/ul>\n\u003Cul>\n\u003Cli aria-setsize=\"-1\" data-leveltext=\"\" data-font=\"Symbol\" data-listid=\"2\" data-list-defn-props=\"{&quot;335552541&quot;:1,&quot;335559685&quot;:720,&quot;335559991&quot;:360,&quot;469769226&quot;:&quot;Symbol&quot;,&quot;469769242&quot;:[8226],&quot;469777803&quot;:&quot;left&quot;,&quot;469777804&quot;:&quot;&quot;,&quot;469777815&quot;:&quot;hybridMultilevel&quot;}\" data-aria-posinset=\"4\" data-aria-level=\"1\">\u003Cstrong>\u003Cspan data-contrast=\"none\">Mobileye Drive&trade;\u003C/span>\u003C/strong>\u003Cspan data-contrast=\"none\">&nbsp;&ndash; 完全自動運転（SAE L4）プラットフォーム。ロボタクシー、公共交通、配送車両向けに設計。\u003C/span>\u003Cspan data-ccp-props=\"{&quot;134233117&quot;:false,&quot;134233118&quot;:false,&quot;335557856&quot;:16777215,&quot;335559738&quot;:0,&quot;335559739&quot;:0}\">&nbsp;\u003C/span>\u003C/li>\n\u003C/ul>\n\u003Cp aria-level=\"3\">\u003Cstrong>\u003Cspan data-contrast=\"none\">展示予定のコア技術\u003C/span>\u003C/strong>\u003Cspan data-ccp-props=\"{&quot;134233117&quot;:false,&quot;134233118&quot;:false,&quot;134245418&quot;:true,&quot;134245529&quot;:true,&quot;335557856&quot;:16777215,&quot;335559738&quot;:60,&quot;335559739&quot;:60}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan 
data-contrast=\"none\">EyeQ&trade;6Hシステム・オン・チップ、Road Experience Management&trade;（REM&trade;）マップインテリジェンス、Responsibility-Sensitive&nbsp;Safety&trade;（RSS）、Compound&nbsp;AI、Vision-First&nbsp;ADAS、Mobileye&nbsp;Imaging Radar&trade;、Mobileye&nbsp;DMS（Driver&nbsp;Monitoring System）。\u003C/span>\u003Cspan data-ccp-props=\"{&quot;134233117&quot;:false,&quot;134233118&quot;:false,&quot;335557856&quot;:16777215,&quot;335559738&quot;:0,&quot;335559739&quot;:0}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp aria-level=\"3\">\u003Cstrong>\u003Cspan data-contrast=\"none\">メディアキットのご案内\u003C/span>\u003C/strong>\u003Cspan data-ccp-props=\"{&quot;134233117&quot;:false,&quot;134233118&quot;:false,&quot;134245418&quot;:true,&quot;134245529&quot;:true,&quot;335557856&quot;:16777215,&quot;335559738&quot;:60,&quot;335559739&quot;:60}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"none\">画像、動画、製品資料など、\u003C/span>\u003Cstrong>\u003Cspan data-contrast=\"none\">Mobileye\u003C/span>\u003C/strong>\u003Cspan data-contrast=\"none\">&nbsp;の革新的技術を紹介するメディアキットをご用意しています。Vision‑First ADASから完全自動運転システムまで、未来のモビリティを切り拓くソリューションをご覧ください。\u003C/span>\u003Cspan data-ccp-props=\"{&quot;134233117&quot;:false,&quot;134233118&quot;:false,&quot;335557856&quot;:16777215,&quot;335559738&quot;:0,&quot;335559739&quot;:0}\">&nbsp;\u003C/span>\u003C/p>\n\u003Ch2>\u003Cspan data-ccp-props=\"{&quot;134233117&quot;:false,&quot;134233118&quot;:false,&quot;335557856&quot;:16777215,&quot;335559738&quot;:0,&quot;335559739&quot;:0}\">\u003Cstrong>報道資料・インフォグラフィック\u003C/strong>\u003C/span>\u003C/h2>\n\u003Cp>\u003Cstrong>Mobileye概要\u003C/strong>\u003C/p>\n\u003Cp>\u003Ca href=\"https://static.mobileye.com/website/common/files/General%20ME%20One-pager-2025-JAP-W.pdf\" target=\"_blank\" rel=\"noopener\">Mobileye, driving AI\u003C/a>\u003C/p>\n\u003Cp>\u003Ca href=\"https://static.mobileye.com/website/common/files/ME-Safety%20Leadership-JAP.pdf\" target=\"_blank\" rel=\"noopener\">Mobileye Safety 
Leadership\u003C/a>\u003C/p>\n\u003Cp>\u003Cstrong>製品情報\u003C/strong>\u003C/p>\n\u003Cp>\u003Ca href=\"https://static.mobileye.com/website/common/files/Mobileye%20Surround%20ADAS-25-JAP-W.pdf\" target=\"_blank\" rel=\"noopener\">Mobileye Surround ADAS&trade;\u003C/a>&nbsp;\u003C/p>\n\u003Cp>\u003Ca href=\"https://static.mobileye.com/website/common/files/Mobileye%20SuperVision-JAP-W.pdf\" target=\"_blank\" rel=\"noopener\">Mobileye SuperVision&trade;\u003C/a>&nbsp;\u003C/p>\n\u003Cp>\u003Ca href=\"https://static.mobileye.com/website/common/files/Mobileye%20Drive-onepage-JAP-W.pdf\" target=\"_blank\" rel=\"noopener\">Mobileye Drive&trade;\u003C/a>\u003C/p>\n\u003Cp>\u003Cstrong>システムセンサー構成\u003C/strong>\u003C/p>\n\u003Cp>[**]gallery:infographics[**]\u003C/p>\n\u003Ch2>\u003Cstrong>ビジュアル\u003C/strong>\u003C/h2>\n\u003Cp>\u003Cstrong>Mobileye EyeQ6 SoC\u003C/strong>\u003C/p>\n\u003Cp>[**]gallery:mobileye-eyeq-system-on-chip[**]\u003C/p>\n\u003Cp>\u003Cstrong>Mobileye ECU Series\u003C/strong>\u003C/p>\n\u003Cp>[**]gallery:mobileye-ecu-series[**]&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>Mobileye Japan Management\u003C/strong>\u003C/p>\n\u003Cp>[**]gallery:mobileye-japan-management[**]\u003C/p>\n\u003Cp>\u003Cstrong>Mobileye ADAS and AV Technology\u003C/strong>\u003C/p>\n\u003Cp>[**]gallery:mobileye-technology-and-solutions[**]\u003C/p>\n\u003Cp>\u003Cstrong>Mobileye SuperVision&trade;\u003C/strong>\u003C/p>\n\u003Cp>[**]gallery:mobileyes-advanced-platforms-in-the-drivers-seat[**]\u003C/p>\n\u003Cp>\u003Cstrong>Mobileye Drive&trade;\u003C/strong>\u003C/p>\n\u003Cp>[**]gallery:driven-by-mobileye[**]\u003C/p>\n\u003Cp>\u003Cstrong>Logo\u003C/strong>\u003C/p>\n\u003Cp>[**]gallery:mobileye's-logo[**]&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>Video\u003C/strong>\u003C/p>\n\u003Cp>[**]vimeo-press:1031870114[**]\u003C/p>\n\u003Cp>&nbsp;\u003C/p>","2025-10-28T07:00:00.000Z","Press Kit, ADAS, Autonomous 
Driving",{"id":208,"type":5,"url":209,"title":210,"description":211,"primary_tag":9,"author_name":10,"is_hidden":11,"lang":12,"meta_description":211,"image":212,"img_alt":213,"content":214,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":215,"tags":175},303,"what-is-fmvss-127-and-how-can-mobileye-help-automakers-comply","What is FMVSS 127 and how can Mobileye help automakers comply?","Increasing safety for road users remains a priority for regulators worldwide. Mobileye provides automakers with a simpler and more cost-effective solution for these changes.  ","https://static.mobileye.com/website/us/corporate/images/652ed8b507e14a1a56e5ac3897ba31be_1759222293495.jpg","AI generated image  ","\u003Cp>The long-standing goal of making our roads safer is still very much in motion. The updated regulation for AEB systems in passenger cars from the U.S. National Highway Traffic Safety Administration (NHTSA) is a clear sign that regulatory bodies are working hard to reduce accidents and fatalities, making future vehicles better at detecting and avoiding collisions with vehicles and pedestrians alike.\u003C/p>\n\u003Cp>But implementing the necessary safety features commercially and at scale is a process that often involves high costs, major oversight, complex technologies, and rigorous testing. 
Mobileye's EyeQ&trade;-powered, camera-only solution supports automakers in addressing the new standards efficiently, without additional hardware, while reducing costs and simplifying integration.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Ch6>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/f79a08807463cac5fbce40966eabe445_1759223356388.jpg\" alt=\"\" width=\"1200\" height=\"565\" />\u003C/h6>\n\u003Ch6>Mobileye AEB in action: detecting a pedestrian emerging from behind an obstruction and initiating braking at night\u003C/h6>\n\u003Ch3>What is FMVSS 127 and why is it needed?&nbsp;\u003C/h3>\n\u003Cp>FMVSS No. 127 (Federal Motor Vehicle Safety Standard) is a new regulation from NHTSA mandating that all passenger cars and light trucks be equipped with automatic emergency braking (AEB), including pedestrian AEB (PAEB).&nbsp;\u003C/p>\n\u003Cp>The new standard requires that all cars be able to stop and avoid contact with a vehicle in front of them at speeds of up to 62 miles per hour, and that their systems detect pedestrians in both daylight and night-time conditions. 
In addition, the standard requires that the system apply the brakes automatically at up to 90 mph (145 km/h) when a collision with a lead vehicle is imminent, and at up to 45 mph (72 km/h) when a pedestrian is detected. The standard is expected to significantly improve road safety for both drivers and pedestrians. \u003Ca href=\"https://www.nhtsa.gov/press-releases/nhtsa-fmvss-127-automatic-emergency-braking-reduce-crashes\" target=\"_blank\" rel=\"noopener\">NHTSA projects\u003C/a> that it will save at least 360 lives a year and prevent at least 24,000 injuries annually.&nbsp;\u003C/p>\n\u003Ch3>\u003Cstrong>A simple vision-only solution&nbsp;\u003C/strong>\u003C/h3>\n\u003Cp>Under the current FMVSS 127 language, AEB systems must prevent the vehicle from colliding with lead vehicles and pedestrian test devices when evaluated in the standard&rsquo;s test procedures.&nbsp;\u003C/p>\n\u003Cp>The car should avoid collisions in a range of defined test scenarios, across a wide set of conditions and lighting environments, with timely braking response and consistent performance at highway speeds. Mobileye offers a cost-effective vision-only solution that is fully camera-based, with no need for radar or lidar. With fewer hardware components, automakers can lower system cost and engineering complexity, while the system is still designed to meet the regulation&rsquo;s high performance standards using Mobileye&rsquo;s single System-on-Chip (SoC) solution.\u003C/p>\n\u003Cp>This is achieved with the support of the Mobileye EyeQ&trade;6L SoC, equipped with our proprietary AI models that have been trained on large volumes of real-world driving data. 
This allows automakers to pair Mobileye&rsquo;s SoC with a front-facing camera, gaining a core safety feature designed to support alignment with regulatory needs without requiring major re-homologation or additional validation processes.&nbsp;\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Ch6>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/b46f5817579c6dde02ff7137879fab7a_1759223475578.png\" alt=\"\" width=\"1200\" height=\"565\" />\u003C/h6>\n\u003Ch6>Mobileye AEB in action: detecting a pedestrian and helping avoid a collision\u003C/h6>\n\u003Ch3>\u003Cstrong>Real-world data for regulatory testing&nbsp;\u003C/strong>\u003C/h3>\n\u003Cp>Mobileye driver-assist has been integrated into hundreds of vehicle models on the road today, building an extensive data set that extends across roads, continents, and driving conditions.\u003C/p>\n\u003Cp>Using real-life data amounting to \u003Ca href=\"https://www.mobileye.com/blog/policy-meets-innovation-how-aeb-is-driving-safer-roads/\" target=\"_blank\" rel=\"noopener\">roughly 200,000 hours of driving and covering over 11 million kilometers\u003C/a>, AEB responses were rigorously analyzed across a wide range of scenarios in cities and towns across the United States, Europe, and Asia, as well as under various lighting conditions including daylight, dawn, dusk, twilight, and 
nighttime. The validation process yielded positive results overall, with very low rates of both false-negative and false-positive activations.&nbsp;\u003C/p>\n\u003Ch3>\u003Cstrong>Leaner production means easier compliance&nbsp;\u003C/strong>\u003C/h3>\n\u003Cp>The Mobileye EyeQ is an SoC designed as a cost-effective, mass-market ADAS solution, with an architecture optimized for front-facing camera sensors equipped with core safety features. Automakers can integrate Mobileye&rsquo;s SoC to support compliance efforts effectively, while keeping options open for future driver-assist additions down the line, thanks to its modular, future-ready design. This allows flexibility for sensor upgrades and advanced features, all centralized in a single chip, powered by a robust AI system.&nbsp;\u003C/p>\n\u003Cp>Mobileye consistently creates systems designed to comply with regulatory demands and standards. For example, Mobileye pioneered a vision-only Intelligent Speed Assist (ISA) solution, helping automakers meet new European Union (EU) General Safety Regulation (GSR) mandates cost-effectively. It was developed for the EyeQ platform and certified for use in all 27 EU countries. The solution allows OEMs whose vehicles already include the Mobileye EyeQ4 and EyeQ6 chips to comply with the regulation simply by updating the existing software, with no need for additional hardware.\u003C/p>\n\u003Cp>A combination of advanced technology, real-world validation, and close collaboration with many automakers is what makes Mobileye the right choice for FMVSS 127. With a vision-only system, a powerful and scalable SoC, and years of experience bringing ADAS to the road, Mobileye offers a straightforward and future-ready way to support automakers in their efforts to align with FMVSS 127. 
By lowering system complexity, cutting hardware costs, and enabling updates through software, Mobileye provides automakers with technologies that can help meet current requirements and adapt for what comes next.&nbsp;\u003C/p>\n\u003Cp>\u003Cspan data-teams=\"true\">\u003Cem>*Performance and availability are subject to multiple factors, including product specifications, operational design domain (ODD), and environmental conditions.\u003C/em>\u003C/span>\u003C/p>","2025-10-09T07:00:00.000Z",{"id":217,"type":5,"url":218,"title":219,"description":220,"primary_tag":40,"author_name":10,"is_hidden":11,"lang":12,"meta_description":221,"image":222,"img_alt":223,"content":224,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":225,"tags":226},300,"from-pilot-testing-to-auto-pilot-driven-by-mobileye","From pilot testing to auto-pilot – Driven by Mobileye™ ","How first-generation Mobileye Drive™ paves the path towards the next generation of fully scalable autonomous deployment.","How first-generation Mobileye Drive™ paves the path towards the next generation of fully scalable autonomous deployment.  ","https://static.mobileye.com/website/us/corporate/images/f2dee641e3e42976750ce90d18b3ee40_1756291478845.jpg","VW ID Buzz.","\u003Cp>Autonomous mobility is no longer a futuristic vision; it's an evolving reality, steadily gaining traction in cities across the globe. At Mobileye, we believe in solving the hard parts first so we can hit the ground running when it comes to deployment at scale. Thanks to Mobileye&rsquo;s strategic ecosystem approach, which calls for collaboration with vehicle makers, fleet operators, and on-demand-service providers, Driven by Mobileye&trade; vehicles are already traveling in certain locations in Europe and North America, running pilots for the benefit of the complete ecosystem. 
These vehicles are designed to enable autonomous Mobility as a Service (MaaS) offerings such as robotaxi, ride-sharing, and ride-pooling projects, while paving the way for the more advanced second-generation solutions to take over.&nbsp;\u003Cbr />\u003Cbr />\u003C/p>\n\u003Cdiv style=\"padding: 56.25% 0 0 0; position: relative;\">\u003Ciframe style=\"position: absolute; top: 0; left: 0; width: 100%; height: 100%;\" title=\"Mobileye MaaS commercial video\" src=\"https://player.vimeo.com/video/1114803762?h=a2cd9d910c&amp;badge=0&amp;autopause=0&amp;player_id=0&amp;app_id=58479\" frameborder=\"0\" data-mce-fragment=\"1\">\u003C/iframe>\u003C/div>\n\u003Cp>&nbsp;\u003Cbr />\u003Cspan lang=\"EN-US\" xml:lang=\"EN-US\" data-contrast=\"auto\">First-generation Mobileye Drive&trade; technology is being used for pilots in Norway, Germany, and the United States. These pilots run limited services and features, letting ecosystem participants dip their toes into MaaS. For example, six first-generation-powered NIO vehicles with a safety operator run in the Grorud Valley outside of Oslo, and six more operate outside of Frankfurt, Germany. These pilots give ecosystem participants a chance to learn the new processes of using an on-demand shuttle and to adjust their platforms and user experience. These development vehicles hold the first version of our tech stack, showcasing a redundant sensor suite consisting of a camera system, a radar-and-lidar system, and Mobileye EyeQ&trade;5 processors.\u003C/span>&nbsp;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/b38b6a3cfa1c9ddd8cc1ff6a13de4751_1756291595167.jpg\" alt=\"\" width=\"600\" height=\"338\" />\u003C/p>\n\u003Cp>\u003Cspan lang=\"EN-US\" xml:lang=\"EN-US\" data-contrast=\"auto\">Building on these testing pilots, the next generation of Mobileye Drive technology is specifically optimized for use in mobility services and production at automotive scale. For example, the Mobileye Drive-powered VW ID.Buzz 
relies on a newer Mobileye Drive ECU, leveraging four EyeQ&trade;6H chips, our most powerful chip to date. With improved accuracy thanks to a more robust sensing suite including the Mobileye Imaging Radar&trade;, optimized algorithms, and a transparent Compound AI approach, it&rsquo;s designed to deliver a high-performance, scalable, and cost-effective product. Beyond the more precise and robust sensing, the production Mobileye Drive tech stack unlocks several new approaches to autonomy, such as the Primary Guardian Fallback AI system our CEO Prof. Amnon Shashua and CTO Prof. Shai Shalev-Shwartz discussed during \u003C/span>\u003Ca href=\"https://www.youtube.com/watch?v=92e5zD_-xDw\">\u003Cspan lang=\"EN-US\" xml:lang=\"EN-US\" data-contrast=\"none\">the Mobileye Driving AI event\u003C/span>\u003Cspan lang=\"EN-US\" xml:lang=\"EN-US\" data-contrast=\"none\">.\u003C/span>&nbsp;\u003C/a>\u003C/p>\n\u003Cp>\u003Cspan lang=\"EN-US\" xml:lang=\"EN-US\" data-contrast=\"auto\">Extensive deployment of autonomous MaaS services means mastering both the road and the service in every sense; it hinges on more than just innovation. It requires precision, safety, and efficiency, at scale and by design: from adapting to local driving culture and accounting for various edge cases on the driving side, to anticipating customer demands, building a user-friendly platform, and handling general vehicle maintenance. These are all aspects of multiple layers of operation that must be considered. That is why the first generation of Mobileye Drive technology is crucial for refining the next-generation products with which vehicle designers, fleet operators, and service providers can soon deliver on the full promise of autonomous driving.&nbsp;\u003C/span>\u003C/p>","2025-09-01T07:00:00.000Z","Driverless MaaS, Autonomous Driving, AV Safety",{"id":228,"type":24,"url":229,"title":230,"description":231,"primary_tag":32,"author_name":10,"is_hidden":11,"lang":12,"meta_description":231,"image":232,"img_alt":233,"content":234,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":235,"tags":236},301,"prof-amnon-shashua-named-to-time100-ai-list","Prof. Amnon Shashua Named to TIME100 AI List","TIME100 AI recognizes the 100 most influential people in artificial intelligence worldwide.","https://static.mobileye.com/website/us/corporate/images/74fa93cba8273c6b5416108a4d4bc2f2_1756382949215.jpg","Mobileye Founder Prof. Amnon Shashua is honored on the TIME100 AI 2025 list. (Credit: Yanai Yechiel)","\u003Cp>\u003Cspan data-contrast=\"auto\">Mobileye President and CEO Prof. 
Amnon Shashua has been named to the \u003C/span>\u003Ca href=\"http://time.com/time100ai\">\u003Cspan data-contrast=\"none\">2025 TIME100 AI\u003C/span>\u003C/a>\u003Cspan data-contrast=\"auto\">, a list recognizing the 100 most influential people in artificial intelligence worldwide. Shashua was recognized for his broad contributions across real-world applications of AI, spanning autonomous driving, robotics, language models, and advanced reasoning.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">His career at the forefront of AI research has driven the commercial success and growth of Mobileye&rsquo;s advanced driver-assistance and autonomous driving solutions. Since its inception 25 years ago, Mobileye has been among the earliest adopters and innovators of AI breakthroughs, transforming cutting-edge research into lifesaving automotive technologies deployed in millions of vehicles around the world.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Autonomous vehicles represent one of the most promising applications of Physical AI &mdash; where intelligence must interact with the real world, in real time, with real consequences. Its potential to revolutionize transportation and save millions of lives depends on achieving both safety and scalability, a challenge that has motivated Shashua&rsquo;s work for more than two decades. Under his leadership, Mobileye has defined a clear path to autonomy and accessible automated driving that balances advanced AI with the engineering precision needed to make it safe, scalable, and transformative.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Shashua&rsquo;s commitment to safety and transparency has been central to his influence on the industry. 
In 2017, Mobileye introduced \u003C/span>\u003Ca href=\"https://arxiv.org/pdf/1708.06374\">\u003Cspan data-contrast=\"none\">Responsibility-Sensitive Safety\u003C/span>\u003C/a>\u003Cspan data-contrast=\"auto\"> (RSS), a framework for defining and ensuring safety in AI-driven autonomy. In 2024, he and Mobileye CTO Prof. Shai Shalev-Shwartz expanded this foundation by publishing \u003C/span>\u003Ca href=\"https://static.mobileye.com/website/us/corporate/files/SDS_Safety_Architecture.pdf\">\u003Cspan data-contrast=\"none\">A Safety Architecture for Self-Driving Systems\u003C/span>\u003C/a>\u003Cspan data-contrast=\"auto\">, proposing core principles for designing AV technology, measuring performance and eliminating unreasonable risk. These contributions to the industry create a clearer roadmap for regulators, automakers, and the public to evaluate the safety of autonomous vehicles.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">As co-founder of \u003C/span>\u003Ca href=\"https://www.menteebot.com/\">\u003Cspan data-contrast=\"none\">Mentee Robotics\u003C/span>\u003C/a>\u003Cspan data-contrast=\"auto\">, Shashua is leading AI-first development of humanoid robots designed to operate in real-world environments, capable of executing advanced tasks for household and warehouse settings. The reveal of Mentee&rsquo;s production-intent prototype in 2025, with advanced agility, strength, awareness and continuous operation, demonstrated the potential for Physical AI in industry and daily life.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">With \u003C/span>\u003Ca href=\"https://doubleai.com/\">\u003Cspan data-contrast=\"none\">AAI Technologies\u003C/span>\u003C/a>\u003Cspan data-contrast=\"auto\">, Shashua is tackling the next frontier in AI: cracking the code of superintelligence. 
Under Shashua&rsquo;s leadership, AAI is developing new paradigms in deep reasoning and learning that emulate how scientists iterate toward discovery, charting a path for reaching superintelligent AI that can match and exceed human-level STEM expertise.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Together, these efforts reflect Shashua&rsquo;s vision for AI across three pivotal domains affecting society: safe autonomy on the road, intelligent machines in the physical world, and expert-level reasoning to unlock breakthroughs in science and technology.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">To assemble the list, TIME&rsquo;s editors and reporters examined the key stories in AI over the past year and consulted with expert sources and industry leaders for recommendations.  The result is a list of 100 leaders, innovators, shapers, and thinkers who have a stake in the future of AI. \u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">See the full list here: \u003C/span>\u003Ca href=\"http://time.com/time100ai\">\u003Cspan data-contrast=\"none\">time.com/time100ai\u003C/span>\u003C/a>\u003Cspan data-contrast=\"none\">.\u003C/span>\u003C/p>","2025-08-28T07:00:00.000Z","Amnon Shashua, Awards, News",{"id":238,"type":5,"url":239,"title":240,"description":241,"primary_tag":140,"author_name":10,"is_hidden":11,"lang":12,"meta_description":241,"image":242,"img_alt":243,"content":244,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":245,"tags":145},299,"compound-ai-the-framework-powering-scalable-autonomy","Compound AI: The framework powering scalable autonomy ","We’re in the midst of an AI revolution, and its impact on driving is profound. 
As these systems grow more capable, the technology that underpins them becomes the focus of intense competition.","https://static.mobileye.com/website/us/corporate/images/b5767eb911fc0460e9110b1c81ec69ef_1753943068689.jpg","Breaking autonomy into clearly defined components such as sensing, planning, and acting, each corresponding with a dedicated model (or models).  ","\u003Cp>\u003Cspan data-contrast=\"auto\">When it comes to AI, Mobileye takes a different approach, grounded in system design, real-world validation, and true scalability. This is embodied in our Compound AI architecture: a blend of flexible end-to-end learning and purpose-built algorithms, designed to support safe and scalable deployment. It&rsquo;s more than an architectural system; it is a guiding framework that we believe powers our advanced ADAS and AV solutions today.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">In this blog, we will break down what Compound AI at Mobileye means and how it serves as the foundation for scalable autonomy.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>What is Compound AI for autonomous driving?\u003C/strong>\u003C/h3>\n\u003Cp>Compound AI refers to the integration of multiple specialized, purpose-built AI models working together to solve complex tasks that may be too challenging or inefficient for a single model to handle alone. Instead of relying on a monolithic, general-purpose AI, Compound AI takes a modular and layered approach. 
Each component is optimized for a specific sub-task, and the system coordinates their outputs to produce a cohesive, intelligent result.\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Many autonomous driving systems today still rely on a single, end-to-end model, treating autonomy as one massive learning task. In this approach, the system attempts to learn everything from visual inputs to build a driving policy, or &ldquo;photons to control.&rdquo;\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">However, the most advanced AI results increasingly come from compound systems. These architectures are made up of multiple specialized components, each optimized for a specific function. As \u003C/span>\u003Ca href=\"https://bair.berkeley.edu/blog/2024/02/18/compound-ai-systems/\">\u003Cspan data-contrast=\"none\">Berkeley AI Research (BAIR)\u003C/span>\u003C/a>\u003Cspan data-contrast=\"auto\"> notes, &ldquo;state-of-the-art AI results are increasingly obtained by compound systems with multiple components.&rdquo;\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>In the case of Mobileye, Compound AI follows the same principle. It breaks autonomy into clearly defined components such as sensing, planning, and acting, each corresponding with a dedicated AI model (or models).\u003C/p>\n\u003Ch3>\u003Cstrong>How does Mobileye do compound AI?\u003C/strong>\u003C/h3>\n\u003Cp>As outlined in Mobileye&rsquo;s \u003Ca href=\"https://www.mobileye.com/blog/the-mobileye-safety-methodology-for-fully-autonomous-driving/\">A Safety Architecture for Self-Driving Systems\u003C/a>, achieving the safety and reliability needed for scalable autonomy requires a strong technical foundation. 
At Mobileye, that foundation is built on modular design, multiple independent sensing modalities, and layered redundancy.\u003C/p>\n\u003Cp>This is Mobileye&rsquo;s strategic approach to developing real-world AI systems using a blend of technologies and expertise. Together, they form the path to scalable, real-world deployment. Each plays a critical role in how Compound AI is applied to decision-making:\u003C/p>\n\u003Cul>\n\u003Cli aria-setsize=\"-1\" data-leveltext=\"\" data-font=\"Wingdings\" data-listid=\"3\" data-list-defn-props=\"{&quot;335552541&quot;:1,&quot;335559685&quot;:720,&quot;335559991&quot;:360,&quot;469769226&quot;:&quot;Wingdings&quot;,&quot;469769242&quot;:[9642],&quot;469777803&quot;:&quot;left&quot;,&quot;469777804&quot;:&quot;&quot;,&quot;469777815&quot;:&quot;hybridMultilevel&quot;}\" data-aria-posinset=\"1\" data-aria-level=\"1\">\u003Cstrong>\u003Cspan data-contrast=\"auto\">Modularity: \u003C/span>\u003C/strong>\u003Cspan data-contrast=\"auto\">Each layer of the autonomy stack (perception, planning, and actuation) is developed and refined independently. 
This allows engineers to focus on specific driving functions, enabling flexibility and specialization.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003C/li>\n\u003C/ul>\n\u003Cul>\n\u003Cli aria-setsize=\"-1\" data-leveltext=\"\" data-font=\"Wingdings\" data-listid=\"3\" data-list-defn-props=\"{&quot;335552541&quot;:1,&quot;335559685&quot;:720,&quot;335559991&quot;:360,&quot;469769226&quot;:&quot;Wingdings&quot;,&quot;469769242&quot;:[9642],&quot;469777803&quot;:&quot;left&quot;,&quot;469777804&quot;:&quot;&quot;,&quot;469777815&quot;:&quot;hybridMultilevel&quot;}\" data-aria-posinset=\"2\" data-aria-level=\"1\">\u003Cstrong>\u003Cspan data-contrast=\"auto\">Redundancy: \u003C/span>\u003C/strong>\u003Cspan data-contrast=\"auto\">To ensure performance in unpredictable environments, Mobileye integrates multiple sensing modalities (camera, radar, lidar), REM crowd-sourced driving intelligence inputs, diverse AI methods, and overlapping algorithmic layers. These independent paths reinforce one another and provide resilience, not only in standard driving conditions, but also in edge cases and complex scenarios.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003C/li>\n\u003C/ul>\n\u003Cul>\n\u003Cli aria-setsize=\"-1\" data-leveltext=\"\" data-font=\"Wingdings\" data-listid=\"3\" data-list-defn-props=\"{&quot;335552541&quot;:1,&quot;335559685&quot;:720,&quot;335559991&quot;:360,&quot;469769226&quot;:&quot;Wingdings&quot;,&quot;469769242&quot;:[9642],&quot;469777803&quot;:&quot;left&quot;,&quot;469777804&quot;:&quot;&quot;,&quot;469777815&quot;:&quot;hybridMultilevel&quot;}\" data-aria-posinset=\"3\" data-aria-level=\"1\">\u003Cstrong>\u003Cspan data-contrast=\"auto\">&nbsp;Abstraction: \u003C/span>\u003C/strong>\u003Cspan data-contrast=\"auto\">Mobileye incorporates structured logic where it matters most. 
The RSS&trade; model encodes core safety principles that don&rsquo;t need to be learned. REM&trade; adds a crowdsourced map and driving intelligence layer that enhances real-time perception. These abstractions reduce variance while keeping systems stable and interpretable.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003C/li>\n\u003C/ul>\n\u003Ch3>\u003Cstrong>\u003Cspan data-contrast=\"auto\">The software-hardware integration\u003C/span>\u003C/strong>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"auto\">The internal design of Mobileye&rsquo;s Compound AI approach reflects a careful balance between flexibility and efficiency. Mobileye&rsquo;s EyeQ&trade;6 High chip, designed for advanced automation and autonomous driving, is built with five specialized components, each striking a different balance between flexibility and specialization. Two CPUs (MPC and MIPS) provide flexible processing power, while the XNN accelerator delivers highly efficient, targeted performance. Two additional accelerators (VMP and PMA) bridge the gap between flexibility and specialization, allowing the system to adapt dynamically to different operational needs.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">The EyeQ6\u003C/span>\u003Cspan data-contrast=\"auto\"> \u003C/span>\u003Cspan data-contrast=\"auto\">High is rated at 34 TOPS (INT8)&mdash;but TOPS alone aren&rsquo;t enough; context and efficiency matter more. 
In real-world driving workloads, the chip delivers over 1,000 frames per second on pixel-labeling neural networks, demonstrating what smart architecture can achieve in practice.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/9ab71dda0ddbbda9ba39001e815b5bf2_1753942887727.png\" alt=\"\" width=\"1200\" height=\"673\" />\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>\u003Cspan data-contrast=\"auto\">Built for the road\u003C/span>\u003C/strong>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Compound AI, as an architectural system and a reflection of our ongoing mission to build solutions that prioritize safety, efficiency, and scalability, serves as the foundation for a spectrum of autonomous systems. These range from advanced driver-assistance to fully autonomous platforms like Mobileye Chauffeur&trade; and Mobileye Drive&trade;, designed for eyes-off and fully driverless mobility.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">As AI continues to reshape mobility, the question is no longer whether autonomy is possible. The focus is on how to make it scalable, safe, and real. 
Compound AI represents Mobileye&rsquo;s answer&mdash;a layered, modular, and disciplined approach grounded in real-world performance\u003C/span>\u003Cspan data-contrast=\"auto\">.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003C/p>","2025-07-31T07:00:00.000Z",{"id":247,"type":5,"url":248,"title":249,"description":250,"primary_tag":9,"author_name":10,"is_hidden":11,"lang":12,"meta_description":250,"image":251,"img_alt":252,"content":253,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":32,"publish_date":254,"tags":255},298,"presenting-the-mobileye-driver-monitoring-system-fusing-road-safety-inside-the-cabin","Presenting the Mobileye Driver Monitoring System™, fusing road safety inside the cabin ","Mobileye introduces DMS technology that lets the driver and vehicle work together","https://static.mobileye.com/website/us/corporate/images/90a2f202a7f27c08323d12e2c030cda5_1752662994017.jpg","The right side of the image contains AI-generated content","\u003Cp>\u003Cspan data-contrast=\"auto\">For years, driver monitoring systems have addressed one of the biggest safety risks on the road: the human factor. Fatigue, distraction, and impaired driving continue to cause millions of accidents each year. As vehicles take on more driving tasks, safety at scale depends on not only what the vehicle sees, but on how well the driver and system understand each other in real time.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">To help automakers meet these challenges, Mobileye has developed its own driver monitoring system, applying its deep expertise in AI-powered computer vision and ADAS. 
By fusing Mobileye DMS&trade; directly with Mobileye ADAS on a single chip, automakers are offered a highly integrated platform that simplifies development, lowers system costs, and scales easily across global vehicle programs. The result: an intuitive connection between drivers and vehicles, enabling a more natural, cooperative driving experience.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">&ldquo;We&rsquo;ve become a global leader in ADAS by constantly developing new technologies to solve challenges facing our customers,&rdquo; said Nimrod Nehushtan, executive vice president of business development for Mobileye. &ldquo;Our new DMS solution not only answers the business hurdles for integrating DMS and ADAS but improves the synergies between the two systems to make them even better for drivers.&rdquo;&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Unlike standalone solutions, Mobileye DMS works in concert with the vehicle&rsquo;s external sensing systems. By cross-referencing the driver's gaze with real-time road conditions captured by the external ADAS cameras, the system can assess whether the driver has noticed critical objects or vulnerable road users like pedestrians and cyclists, and moderate its response accordingly, reducing abrupt interventions. If the driver did not notice the risk, the system can trigger timely alerts, helping prevent accidents before they occur. 
This level of seamless handover between humans and vehicles can allow drivers to direct their attention to the right place at the right time.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">\u003Ciframe title=\"vimeo-player\" src=\"https://player.vimeo.com/video/1101190093?h=3f6c32b4fe\" width=\"640\" height=\"360\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/span>\u003C/p>\n\u003Ch3>Monitoring inside and outside the car&nbsp;\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Mobileye DMS uses an infrared camera installed inside the cabin, capturing high-frequency images of the driver&rsquo;s eyes at 60 frames per second. Leveraging our AI-powered neural networks, Mobileye DMS then analyzes the driver&rsquo;s eye movement and blinking speed to track gaze and engagement levels with high accuracy. The system has been trained to detect signs of drowsiness, like yawning; distractions, like phone use while driving; and other signals of impaired driving.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">What sets Mobileye apart is fusing in-cabin monitoring with real-time road information from ADAS cameras. By correlating driver gaze patterns with actual road conditions, the system is designed to detect distraction that traditional DMS may miss, like when a driver's eyes are open but not scanning critical hazard areas. This contextual awareness enables enhanced detection of impaired driving states, from basic indicators like yawning to complex distraction patterns that cabin-only systems cannot identify.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">With global regulations moving quickly, DMS has become a growing requirement for safety compliance. 
For example, Euro NCAP 2026 scoring includes both driver engagement monitoring and occupant monitoring requirements, while other regulations require DMS for compliance. But existing DMS solutions add cost and complexity to a vehicle stack.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">For automakers, the Mobileye DMS eliminates the need for integrating a separate processor on their technology stack, reducing costs and supporting SoC / ECU consolidation goals. The system can leverage either the Mobileye EyeQ&trade;6 Lite or EyeQ 6 High, with various camera and sensing options customized for specific needs. On advanced platforms such as Mobileye Surround ADAS&trade;, Mobileye SuperVision&trade; and Mobileye Chauffeur&trade;, which offer hands-off and/or eyes-off driving in specified areas (where available, and subject to product specifications, manual, ODD and law) the DMS-ADAS fusion can support drivers with more accurate system takeover requests, enabling smarter decision-making by combining data about the surroundings with the driver&rsquo;s state when requesting driver intervention. \u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">The Mobileye DMS system can also be used for control functions; it enables dynamic vehicle behavior adjustments based on real-time driver attentiveness levels. If reduced driver attention is detected, the system can automatically increase following distance, adjust cruise control sensitivity, or limit automated lane changes. If driver attention is identified as insufficient, the system could require confirmation to switch lanes &ndash; for example, by prompting the driver to briefly glance at the sideview mirrors. 
These adaptive responses, fine-tuned by the automaker, help maintain safer driving until full driver engagement is restored.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-ccp-props=\"{}\">\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/7d0209186824bdd3c62bef5d0433e7f0_1752663400244.jpg\" alt=\"\" width=\"600\" height=\"318\" />\u003C/span>\u003C/p>\n\u003Ch3>A look beyond&nbsp;\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Looking ahead, it is likely that these technologies will take on even greater importance as hands-off and autonomous driving solutions continue to expand. On supervised systems like Mobileye Surround ADAS and Mobileye SuperVision, as well as our hands off/eyes off platform Mobileye Chauffeur, the fusion of driver state with environmental perception enables more intelligent takeover requests, ensuring the driver is ready to resume control when necessary.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Driver monitoring systems will be essential for the next generation of safety and autonomy. 
By combining in-cabin awareness with advanced perception from outside the vehicle, Mobileye&rsquo;s integrated approach creates a simplified delivery process for the automaker and a symbiotic understanding between driver and driving environment, building the foundation for safer, smarter mobility at scale.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>","2025-07-17T07:00:00.000Z","ADAS, News, Industry",{"id":257,"type":5,"url":258,"title":259,"description":260,"primary_tag":51,"author_name":10,"is_hidden":11,"lang":12,"meta_description":260,"image":261,"img_alt":262,"content":263,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":32,"publish_date":264,"tags":265},297,"where-mobility-meets-the-game-mobileye-partners-with-vfl-wolfsburg","Where mobility meets the game: Mobileye partners with VfL Wolfsburg","Mobileye is becoming a top-tier partner of VfL Wolfsburg, a football team in Germany’s Bundesliga (Federal League). ","https://static.mobileye.com/website/us/corporate/images/3190dec398225b0d7744d1f29bcae4e6_1751278008967.png","Wolfsburg is a unique football team with a strong focus on innovation, vision and teamwork. ","\u003Cp>Delivering excellence on the roads has always been at the heart of what Mobileye does. Now, for the first time, we&rsquo;re bringing that spirit to the football field. In a unique collaboration, Mobileye is becoming a top-tier partner of VfL Wolfsburg, a football team in Germany&rsquo;s Bundesliga (Federal League). Here&rsquo;s what it means.\u003C/p>\n\u003Ch3>Where mobility and football meet\u003C/h3>\n\u003Cp>This\u003Ca href=\"https://www.vfl-wolfsburg.de/en/newsdetails/news-detail/detail/news/automotive-technology-meets-team-spirit-a-vision-for-the-future\"> strategic collaboration\u003C/a> means a lot for Mobileye, not just for the brand but for what we stand for. 
For an automotive technology company, this partnership opens up new opportunities to take the Mobileye brand to the next level in Germany, across Europe and beyond.\u003C/p>\n\u003Cp>Wolfsburg is a unique football team with a strong focus on innovation, vision and teamwork, which makes this partnership feel like a natural fit. It is about two worlds coming together with a shared drive to shape what comes next, in a country and city that has long been a symbol of smart mobility and progress.&nbsp;\u003C/p>\n\u003Ch3>About VfL Wolfsburg and the Bundesliga\u003C/h3>\n\u003Cp>VfL Wolfsburg plays in the Bundesliga, Germany&rsquo;s top football league and one of the most followed competitions worldwide. The league is home to many internationally known teams and has produced world-class players past and present, drawing \u003Ca href=\"https://primetimesportstalk.com/the-bundesligas-growing-influence-outside-germany/\">tens of millions\u003C/a> of fans worldwide to Bundesliga games every weekend.\u003C/p>\n\u003Cp>Wolfsburg claimed the Bundesliga title in 2009 and followed up with a German Cup title in 2015, making it one of the most successful teams since the year 2000. The club&rsquo;s home city is also the headquarters of Volkswagen, making Wolfsburg a place where automotive excellence and sporting ambition naturally come together.\u003C/p>\n\u003Ch3>Next level exposure\u003C/h3>\n\u003Cp>Taking the Mobileye brand to the football field means a lot of exposure. Jersey branding, field banners and social media campaigns will put the Mobileye name front and center. Here&rsquo;s a bit more on where fans will see Mobileye during the 2025/2026 season:\u003C/p>\n\u003Cp>The Mobileye logo will appear on the left sleeve of VfL Wolfsburg&rsquo;s kits in all matches, from Bundesliga and DFB-Pokal to international and friendly games. 
Beyond the jersey, the Mobileye brand will have a strong presence in and around the stadium.\u003C/p>\n\u003Ch3>Football and more\u003C/h3>\n\u003Cp>This is an important step in strengthening relationships and another way for Mobileye to champion excellence everywhere. The core of our partnership is about connection, bringing our story to new audiences and showing how innovation on the road can stand alongside innovation on the pitch. Here&rsquo;s to the next season.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>","2025-06-30T07:00:00.000Z","Events, Industry",{"id":267,"type":5,"url":268,"title":269,"description":270,"primary_tag":51,"author_name":10,"is_hidden":11,"lang":12,"meta_description":270,"image":271,"img_alt":272,"content":273,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":274,"tags":175},296,"mobileyes-adas-poised-to-power-indias-safety-shift","Mobileye’s ADAS: Poised to power India's safety shift  ","Globally designed and locally tailored, Mobileye’s technology is uniquely positioned to help reduce road fatalities across the subcontinent.","https://static.mobileye.com/website/us/corporate/images/55f02e98cfb7c2063646f6d1945b57fa_1750245475904.jpg","The sub-continent has more than 1.4 billion residents who drive about 375 million vehicles across millions of kilometers ","\u003Cp>\u003Cspan data-contrast=\"auto\">For all its breathtaking views, renowned sites, and colorful communities, India is not an easy place to drive. The sub-continent has more than 1.4 billion residents who drive about 375 million vehicles across millions of kilometers nationwide, and while the government and auto industry are taking action to manage mobility at such a large scale, the reality remains stark. In 2022, India counted at least 168,000 fatalities in more than 400,000 road accidents, more than any other country. 
Due to the staggering toll, emerging governmental and business initiatives are paving the way for technology to counter these numbers.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335557856&quot;:16777215,&quot;335559739&quot;:75}\">&nbsp;\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>\u003Cspan data-contrast=\"auto\">Growing safety in India\u003C/span>\u003C/strong>\u003Cspan data-ccp-props=\"{&quot;335557856&quot;:16777215,&quot;335559739&quot;:75}\">&nbsp;\u003C/span>\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"auto\">The Indian auto industry is growing, with some projections saying it could reach $300B by 2026. The country has one of the largest road networks in the world, rapidly expanding inside urban areas and between major cities. Alongside the growth, local policymakers and industry leaders have been working to introduce more safety measures, with ADAS (advanced driver-assistance systems) becoming more commonplace in vehicles and laying the groundwork for the next generation of autonomous driving solutions.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:270,&quot;335559739&quot;:270}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Recently, the Indian Ministry of Road Transport and Highways has drafted legislation that could make ADAS safety features, including blind spot monitoring and collision warning, standard across various segments in the industry, while Bharat NCAP is expected to expand its requirements and set new bars for vehicle safety ratings in the country.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:270,&quot;335559739&quot;:270}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">The integration of ADAS and other advanced safety measures in India is growing, with the aim that it will help counter accidents, save lives and even help direct the most populous country in the world towards the autonomous future. 
However, to get on that road, current ADAS technologies have to account for India&rsquo;s unique attributes such as the local driving culture, infrastructure and different driving environments.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:270,&quot;335559739&quot;:270}\">&nbsp;\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>\u003Cspan data-contrast=\"auto\">Localizing ADAS for India \u003C/span>\u003C/strong>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:270,&quot;335559739&quot;:270}\">&nbsp;\u003C/span>\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"auto\">When training, testing and validating safety systems&mdash;from basic driver assist to a hands-off driving platform&mdash;several key local factors must be considered. This is true around the globe but even more so in India, as it differs greatly from other geographies in driving behavior, road infrastructure, and hazards.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:270,&quot;335559739&quot;:270}\">&nbsp;\u003C/span>\u003C/p>\n\u003Ch6>\u003Cspan data-contrast=\"auto\">\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/386c77c2ed4ee16b8535f3dbc9dc34c0_1750245608966.jpg\" alt=\"\" width=\"1200\" height=\"565\" />\u003C/span>\u003C/h6>\n\u003Cp>\u003Cspan data-contrast=\"auto\">India presents uniquely complex challenges for the design of driver-assistance solutions and platforms. Roads are shared by a diverse mix of users: pedestrians, cars, trucks, rickshaws, bikes, animals, and other obstacles, often moving in different directions, even on highways. Some roads have minimal to no lane markings, and driving norms differ from those of other countries. Safety systems must be equipped to handle real-world local scenarios where drivers navigate around rickshaws, pedestrians carrying sacks of produce, roaming cattle, and densely packed traffic, all while ensuring smooth and safe driving. 
\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:270,&quot;335559739&quot;:270}\">&nbsp;\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>\u003Cspan data-contrast=\"auto\">The Mobileye-India connection\u003C/span>\u003C/strong>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:270,&quot;335559739&quot;:270}\">&nbsp;\u003C/span>\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"auto\">The move in India to incorporate a growing number of ADAS features and beyond, combined with Mobileye&rsquo;s expertise and capabilities in the country, creates an alignment that pushes car safety to new heights.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:270,&quot;335559739&quot;:270}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Indian automakers rightly expect that any technological addition to their vehicles must be tailored to the unique local driving conditions in order to be useful.&nbsp; In response, Mobileye refined and expanded its feature set, ensuring its ADAS could answer these specific needs.&nbsp; From base ADAS to Cloud-Enhanced ADAS&trade; to SuperVision&trade;, each solution is developed with a focus on localization and real-world applicability, designed to support safe navigation in India&rsquo;s demanding traffic environment.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:270,&quot;335559739&quot;:270}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">All of these solutions are powered by our dedicated automotive-grade EyeQ&trade; system-on-chip, which enables ADAS applications such as Automatic Emergency Braking and Automatic Cruise Control, and contributes to safer navigation in complex traffic conditions.&nbsp;\u003C/span>\u003C/p>\n\u003Ch6>\u003Cspan data-contrast=\"auto\">\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/1bf0dd3b182c20a75400c8f2c83355db_1750245798879.jpg\" alt=\"\" width=\"1200\" height=\"565\" 
/>\u003C/span>\u003C/h6>\n\u003Cp>\u003Cspan data-contrast=\"auto\">One standout feature is Mobileye's animal detection tool. In India, cows roam freely and share the roadways with vehicles, and the ability to recognize them and take appropriate action is critical. Mobileye&rsquo;s algorithm is specifically trained to identify cows, whether moving or standing, and from different angles, and respond as needed.&nbsp; Appropriately identifying cows is a local need that led Mobileye to invest in dedicated training and validation methods to ensure optimized performance &ndash; beyond standard animal detection.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:270,&quot;335559739&quot;:270}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:270,&quot;335559739&quot;:270}\">\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/7b6946e2f443d6a377a950ab67712ece_1750247311030.jpg\" alt=\"\" width=\"1200\" height=\"565\" />\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>\u003Cspan data-contrast=\"auto\">India&rsquo;s future path\u003C/span>\u003C/strong>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:270,&quot;335559739&quot;:270}\">&nbsp;\u003C/span>\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Mobileye&rsquo;s strong foundation in India, along with the local industry&rsquo;s drive towards more advanced technology, opens an opportunity for real change. 
India&rsquo;s clear goal to reduce vehicle casualties, combined with the market&rsquo;s unique culture and tech-first environment, provides an opportunity for Mobileye to deliver world-leading ADAS technology that is custom-made for Indian roads and drivers.\u003C/span>&nbsp;\u003Cbr />&nbsp;\u003Cbr />\u003Cem>\u003Cspan data-contrast=\"auto\">\"Mobileye\" in this article means Mobileye Vision Technologies Ltd.\u003C/span>\u003C/em>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:270,&quot;335559739&quot;:270}\">&nbsp;\u003C/span>\u003C/p>","2025-06-18T07:00:00.000Z",{"id":276,"type":24,"url":277,"title":278,"description":279,"primary_tag":28,"author_name":10,"is_hidden":11,"lang":12,"meta_description":279,"image":280,"img_alt":281,"content":282,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":32,"publish_date":283,"tags":284},295,"mobileye-imaging-radar-chosen-by-global-automaker-for-eyes-off-driving","Mobileye Imaging Radar chosen by global automaker for eyes-off driving","Tech breakthroughs lead to first nomination with new customer ahead of 2028 production","https://static.mobileye.com/website/us/corporate/images/220438c14cca71db62021f1e474931b8_1748429636893.png","Colored dots simulate the dense and precise data generated by Mobileye’s unique imaging radar.","\u003Cp>JERUSALEM, May 28, 2025 &mdash; For the first time, a leading global automaker has chosen Mobileye Imaging Radar&trade; as a key component of its upcoming eyes-off, hands-off automated driving system in personal vehicles, following an extensive years-long evaluation of Mobileye&rsquo;s technology and competing systems. 
Starting in 2028, this new customer for Mobileye plans to use the imaging radar to deliver SAE Level 3 automated driving at highway speeds, designed to provide exceptional detection of vehicles, people and objects in conditions such as fog or rain, and at long distances that challenge existing sensors.\u003C/p>\n\u003Cp>In development since 2018, Mobileye&rsquo;s 4D imaging radar was designed to provide sensor redundancy, through superior perception in challenging lighting, environmental and traffic scenarios, to complement camera-based perception at affordable costs. This is intended to enable safe and scalable autonomous driving systems, from robotaxis to consumer AVs.\u003C/p>\n\u003Cp>&ldquo;The selection of our imaging radar by this new customer validates the groundbreaking work we undertook to develop our imaging radar,&rdquo; said Mobileye President and CEO Prof. Amnon Shashua. &ldquo;After recognizing how important this sensing modality would be to autonomous driving, we built what we believe is the industry&rsquo;s standard for imaging radar that can deliver the safety and accuracy self-driving systems require.&rdquo;\u003C/p>\n\u003Cp>Existing automotive radars provide estimates of an object&rsquo;s distance, its rough direction on a horizontal plane, and its relative velocity. Imaging radars add an additional dimension &ndash; height.\u003C/p>\n\u003Cp>Built on a breakthrough architecture, the next-generation Mobileye Imaging Radar processes the entire signal digitally &mdash; end-to-end &mdash; through a Mobileye-designed radar processor. This is intended to enable unprecedented levels of accuracy, detail, and reliability in environmental perception, strengthening the positioning of Mobileye at the forefront of autonomous driving technology.\u003C/p>\n\u003Cp>At the core of the Mobileye Imaging Radar are Mobileye-designed radar radio-frequency integrated circuits (RFICs). 
These advanced components allow exceptional flexibility in signal transmission and the ability to receive and sample the entire radar signal in a wide bandwidth while keeping noise at a low level, designed to support object detection with high confidence.\u003C/p>\n\u003Cp>Those RFICs are embedded in a unique architecture where the entire radar signal is sampled and digitally processed by a dedicated proprietary processor, with exceptionally powerful computing capability of 11 TOPS. This processor can handle more than 1,500 virtual channels at a high frame rate of 20 frames per second. The massive antenna array also delivers exceptional angular resolution below 0.5 degrees, ultra-low side-lobe levels of -40 dBc, and a market-leading dynamic range of 100 dB, versus 60 dB in other automotive radars.\u003C/p>\n\u003Cp>This is designed to enable the Mobileye Imaging Radar to detect small, distant objects, even in complex scenarios with large nearby vehicles such as trucks or buses, along with precise detection of small hazards like a tire near a guardrail at a far distance, critical for safe autonomous highway driving at speeds above 75 mph (130 km/h). The system works to detect road users &mdash; pedestrians, motorcycles, and cyclists &mdash; at up to 315 meters and identify potential hazards up to 230 meters away. 
Crucially, where traditional radar systems often fail, such as in tunnels, construction zones, and other complex, cluttered environments, Mobileye Imaging Radar excels.&nbsp;\u003C/p>\n\u003Cp>The forward-facing BSR version of the radar uses its full sensing capabilities, while a smaller BSRC version for corner-mounted use has more than 300 channels.\u003C/p>\n\u003Cp>\u003Cstrong>Media Contact: \u003C/strong>Justin Hyde, justin.hyde@mobileye.com\u003C/p>","2025-05-28T07:00:00.000Z","News, Industry, Mobileye Inside, Autonomous Driving",{"id":286,"type":5,"url":287,"title":288,"description":289,"primary_tag":51,"author_name":10,"is_hidden":11,"lang":12,"meta_description":289,"image":290,"img_alt":291,"content":292,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":293,"tags":294},294,"a-new-balancing-act-for-automakers","The new balancing act for automakers  ","As vehicles become increasingly more software defined, automakers are rethinking everything from product roadmaps to their core identity. ","https://static.mobileye.com/website/us/corporate/images/da823c81d6e4b65ed464043cb532d556_1748340481466.jpg"," The goal is to pursue pragmatic strategies without compromising critical elements of the driving experience.  
","\u003Cp>\u003Cspan data-contrast=\"auto\">The automotive industry is experiencing a rapid introduction of more complex mobility systems.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:276}\">&nbsp;\u003C/span>\u003Cspan data-contrast=\"auto\">OEM operations have traditionally combined production, deep mechanical knowledge, skilled integration of components from various suppliers, and the management of complex supply chain networks.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:276}\">&nbsp;\u003C/span>\u003Cspan data-contrast=\"auto\">But over the past 5 to 7 years, the industry has started shifting from mechanical to digital and legacy automakers have been navigating this change, expanding their role as software companies.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:276}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">At the core of this shift are several emerging challenges, or opportunities, depending on how you look at them. These include the OEM's journey in navigating a new identity as a tech-enabled automaker, striking a balance between mechanical systems and digital capabilities, and working towards large-scale adoption of AVs.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:276}\">&nbsp;\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>The evolving identity of the automaker&nbsp;&nbsp;\u003C/strong>\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Consumers are increasingly basing their buying decisions on \u003Ca href=\"https://www.spglobal.com/automotive-insights/en/blogs/2024/12/risks-and-rewards-of-automakers-on-software-defined-vehicles?utm_source=chatgpt.com\">software \u003C/a>\u003C/span>\u003Cspan data-contrast=\"auto\">and self-driving features. 
Major players in the traditional automotive sector are increasingly required to engage with advanced technologies such as artificial intelligence and big data, either through internal development or strategic collaborations.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:276}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Those who didn&rsquo;t establish themselves as software players from the start are now navigating new types of product roadmaps. Today, building a ten-year roadmap looks very different, filled with questions like: Should we develop in-house? What will future partnerships look like? What are the key components, and how complex will they be to execute? \u003C/span>\u003Cspan data-contrast=\"auto\">Ultimately, legacy automakers will have to make this shift, and their greatest challenge will be learning which features to prioritize for consumers and how to make overall strategic decisions for their brands.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:276}\">&nbsp;\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>Balancing legacy manufacturing with software&nbsp;&nbsp;\u003C/strong>\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Within the software industry, changes can occur as rapidly as every year or two. That, however, is not the nature of the automotive industry. 
While the car has now become the latest frontier for new technologies (and while the pace of change has increased), it&rsquo;s still critical to understand that the industry is not moving at the same pace, and for good reason.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:276}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">There are major considerations such as safety, drivability, and reliability&mdash;not technology for the sake of technology, but the creation of real-world systems that get people from A to B efficiently, safely, and pleasantly. The goal is to pursue balanced, gradual, and pragmatic strategies that bring new technologies to market without compromising safety, reliability, or other critical elements of the driving experience.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:276}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:276}\">\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/0d59842888bd8ff01dbde2bb996da919_1748341765006.jpg\" alt=\"\" width=\"1200\" height=\"565\" />\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>How to mainstream autonomous driving &nbsp;&nbsp;\u003C/strong>\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"auto\">If we look at the current landscape, autonomous vehicles still account for only a tiny fraction of total miles driven globally. 
But the technology, business models, and regulatory progress are already in place&mdash;so large-scale adoption is no longer a distant future.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:276}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">In recent years, there&rsquo;s been a much broader realization that true, safe, and scalable autonomy will not be solved by buzzwords, but by a steady commitment to engineering and software innovation. Only then will autonomous driving, at a mass-market scale, bring real value to both consumers and OEMs.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:276}\">&nbsp;\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>Getting consumers onboard&nbsp;&nbsp;\u003C/strong>\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"auto\">While the forces behind the scenes are coming together to bring autonomous driving into the mainstream, including technologists, suppliers, manufacturers, and regulators, the transformation will only truly be complete when a large number of consumers are comfortable with the idea of humans not actively driving.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:276}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">With the growing presence of Level 2+ (Eyes-on/Hands-off) and Level 3 (Eyes-off/Hands-off) assisted driving on highways, a hybrid mindset of automated and human driving is beginning to take shape. 
This shift is also notable in companies and robotaxi \u003C/span>\u003Ca href=\"https://www.mobileye.com/news/lyft-and-mobileye-team-up-to-enable-autonomous-mobility-at-scale/\">\u003Cspan data-contrast=\"none\">fleets\u003C/span>\u003C/a>\u003Cspan data-contrast=\"auto\"> that already provide commercial autonomous services in specific areas.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:276}\">&nbsp;\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>Products, not prototypes&nbsp;&nbsp;\u003C/strong>\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Mobileye&rsquo;s deep expertise in developing high-performance software for next-gen ADAS and AV is rooted in years of experience. By working closely with leading OEMs, we&rsquo;ve learned what it takes to deliver advanced technology that&rsquo;s designed to perform right.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:276}\">&nbsp;\u003C/span>\u003C/p>","2025-05-27T07:00:00.000Z","AV Safety, ADAS, Industry",{"id":296,"type":5,"url":297,"title":298,"description":299,"primary_tag":9,"author_name":10,"is_hidden":11,"lang":12,"meta_description":299,"image":300,"img_alt":301,"content":302,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":303,"tags":304},293,"the-fast-lane-to-higher-levels-of-autonomy-with-the-eyeq6-soc","The fast lane to higher levels of autonomy with the EyeQ™6 SoC ","The Mobileye EyeQ ™6, the latest family of System-on-Chip solutions lets automakers plan their product roadmap more effectively and accelerate time to market. 
","https://static.mobileye.com/website/us/corporate/images/4cf0a8a6c1d2c1369d282109b1a4975e_1744873267353.jpg","Designed to support automakers' wide range of products, from everyday standard-range vehicles to premium models.","\u003Cp>\u003Cspan data-contrast=\"none\">The new Mobileye EyeQ&trade;6 System-on-Chip (SoC) does more than just raise the bar on performance&mdash;it provides automakers with the foundational architecture to better customize their ADAS and autonomous technology while meeting the needs of the growing consumer AV market, from the tech-savvy premium brand enthusiasts to drivers of everyday cars.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"none\">In this blog, we&rsquo;ll delve into what those new additions are, and how the EyeQ&trade;6 tech stack has been designed for automakers seeking to move seamlessly across the ADAS and AV spectrum.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>\u003Cspan data-contrast=\"none\">An overview of the EyeQ&trade;6\u003C/span>\u003C/strong>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"none\">The EyeQ&trade;6 SoC family is the core of Mobileye&rsquo;s latest tech suit, powering the full spectrum of ADAS needs&mdash;from single front cameras to advanced multi-sensor systems.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"none\">The family features the EyeQ&trade;6 Light, a lower-powered chip that's supports an expanded range of entry-level ADAS. The EyeQ&trade;6H, advances on this and powers premium ADAS, semi and autonomous driving capabilities (such as Mobileye SuperVision&trade; and Mobileye Chauffeur&trade;/Mobileye Drive&trade;). 
Through this unique system, automakers can take a modular approach and customize functions according to their customer base.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>\u003Cspan data-contrast=\"auto\">A glimpse into the tech stack&nbsp;\u003C/span>\u003C/strong>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"auto\">The true strength of the SoC architecture lies in its unique hardware and software design. With multiple processors, accelerators, and a compound AI approach, the chip operates with remarkable efficiency. This is driven by the unique AI model named XNN, which allows the system to handle more tasks&mdash;both strenuous and simple&mdash;simultaneously.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">The result is better workload distribution, enhanced deep learning, and optimized data flow, leading to improved performance. In fact, the system delivers a 10x boost in deep learning and a 1.1x to 1.9x increase in CPU performance compared to the two previous-generation SoCs.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/91487f605322ce4c99830c4d950995e0_1744873738525.png\" alt=\"\" width=\"1200\" height=\"668\" />\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>\u003Cspan data-contrast=\"none\">Forward-thinking design, expansive product line\u003C/span>\u003C/strong>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"none\">The EyeQ&trade;6 family of SoCs is designed to support automakers' wide range of products across different vehicle trims, from everyday standard-range vehicles to premium models. 
More importantly, automakers can potentially move towards higher levels of autonomy or functionality while maintaining core safety ADAS without the need for re-homologation and validation processes. This creates a pathway for manufacturers or providers to build on existing infrastructure to achieve greater levels of autonomy, ensure long-term viability, and benefit from cost-effectiveness.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>\u003Cspan data-contrast=\"none\">Expand your base ADAS with the EyeQ\u003C/span>\u003C/strong>\u003Cspan data-contrast=\"auto\">&trade;\u003C/span>\u003Cstrong>\u003Cspan data-contrast=\"none\">6L&nbsp;\u003C/span>\u003C/strong>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Designed as a cost-effective mass-market ADAS solution, the EyeQ6L features an optimized architecture tailored for front camera sensors, with radar support if needed. Certain additions can be implemented separately by automakers, allowing for a balanced approach to cost and performance.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Using the same SoC but leveling up slightly, OEMs can integrate features like Occupant Monitoring Systems (OMS) and Driver Monitoring Systems (DMS) that are compliant with global standards. 
Additionally, when it comes to cameras, there&rsquo;s flexibility&mdash;OEMs can choose, for example, between a cost-effective 2-megapixel camera and an 8-megapixel option for high-level driving functions, which offers color-enhanced detection for lanes, signs, road markings, construction, and traffic lights.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">In short, automakers can make adjustments within the basic ADAS range and upgrade to more advanced functions through the capabilities built into the same SoC.&nbsp;&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/c2479642c97081cf05d556a0562fcfa9_1744873835527.jpg\" alt=\"\" width=\"1200\" height=\"670\" />\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>Go from driver 'assist' to 'driverless'&nbsp;&nbsp;\u003C/strong>\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"none\">Moving up the autonomous spectrum, we have the EyeQ6H SoC. This is a major leap on two fronts: a seamless path to full autonomy, and a design that enables greater synergy between added functionalities. It&rsquo;s a design that consolidates safety-critical functions and additional ECUs&mdash;like parking applications and visualization&mdash;into a single system-on-chip, essentially producing a centralized interconnected system.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"none\">This gives automakers a wide range of choices in the components they wish to add, as they gravitate toward designs that offer more flexibility, centralization, faster time to market, and new revenue streams. 
It&rsquo;s the EyeQ6H that forms the foundation of this shift.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"none\">When it comes to autonomy, this setup is what takes the industry further, to a more software defined focus. Automakers build on core ADAS safety features while transitioning to more advanced driver assistance. They can start with Mobileye Surround ADAS, powered by a single EyeQ&trade; 6H, move up to Mobileye SuperVision&trade; with two EyeQ&trade; 6H SoCs&mdash;enabling point-to-point hands-free driving on most roads&mdash;and scale to Mobileye Chauffeur&trade; or Mobileye Drive&trade;, unlocking hands-off, eyes-off driving for consumer or commercial offerings respectively. A fully scalable path to autonomy, built to evolve.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/672c2922e4c5e0a8e25091f84d4bdf42_1745409779795.jpg\" alt=\"\" width=\"1200\" height=\"670\" />\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>\u003Cspan data-contrast=\"auto\">OTA and \u003C/span>\u003C/strong>\u003Cstrong>\u003Cspan data-contrast=\"auto\">extended \u003C/span>\u003C/strong>\u003Cstrong>\u003Cspan data-contrast=\"auto\">lifecycle performance\u003C/span>\u003C/strong>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"auto\">A key aspect of the EyeQ6 is its long-term viability, achieved through its intelligent design. However, it&rsquo;s the over-the-air (OTA) capability that takes performance to the next level. OTA updates are essential for modern vehicles, especially with the rise of software-defined vehicles. 
These updates allow vehicle software to continuously improve and keep systems up to date, all delivered remotely for seamless enhancements without manual intervention.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Mobileye integrates this into the system, ensuring the vehicle remains adaptable and ready for future demands without requiring hardware upgrades.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>\u003Cspan data-contrast=\"auto\">Accelerating autonomy, preserving comfort\u003C/span>\u003C/strong>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"auto\">As we move closer to a new era of mobility, and as vehicles become increasingly software-defined, it is critical that the technology supporting this shift is scalable, flexible, and high-performing, and that it enhances capabilities.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">At Mobileye, we aim to bring a custom-built SoC to support automakers&rsquo; software and hardware needs because we understand the importance of flexibility and scalability. Ensuring this at an architectural level is fundamental for automakers who wish to expand their market and do so faster. 
\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">With the \u003C/span>\u003Cspan data-contrast=\"auto\">6\u003C/span>\u003Cspan data-contrast=\"auto\">th\u003C/span>\u003Cspan data-contrast=\"auto\"> generation \u003C/span>\u003Cspan data-contrast=\"auto\">SoC, Mobileye can cater to a wide range of \u003C/span>\u003Cspan data-contrast=\"auto\">automakers and their customers\u003C/span>\u003Cspan data-contrast=\"auto\">, creating solutions that make it possible for everyone to enjoy a new kind of driving\u003C/span>\u003Cspan data-contrast=\"auto\">,\u003C/span>\u003Cspan data-contrast=\"auto\"> and embrace all shades of autonomy.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>","2025-04-17T07:00:00.000Z","Autonomous Driving, ADAS",{"id":306,"type":5,"url":307,"title":308,"description":309,"primary_tag":51,"author_name":10,"is_hidden":11,"lang":12,"meta_description":309,"image":310,"img_alt":311,"content":312,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":32,"publish_date":313,"tags":175},290,"how-surround-adas-delivers-the-new-standard-of-safety-and-tech","Unveiling Surround ADAS: A new standard of safety and tech ","Mobileye Surround ADAS is a new segment that tackles long-standing challenges in advancing driver assistance technology","https://static.mobileye.com/website/us/corporate/images/568d780921661f2b09c78783f39caeda_1741266123244.jpg","Surround ADAS gives automakers a way to put these features in millions of mass-market vehicles on the road.","\u003Cp>To select a driving assistance system today, automakers typically face a binary choice: either basic ADAS with standard features while being affordable for mass-market vehicles, or more advanced systems with multiple cameras, radars, processors, and ECUs that could deliver features 
like limited hands-free driving, but at a system cost that makes them suitable only for premium models.\u003C/p>\n\u003Cp>That changes now, with the arrival of Mobileye Surround ADAS&trade;. It delivers a vertically integrated solution for software-defined hands-off eyes-on driving using Mobileye&rsquo;s advanced software stack from autonomous vehicle development, running on a single Mobileye EyeQ&trade;6 High System-on-Chip (SoC).\u003C/p>\n\u003Cp>\u003Cvideo autoplay=\"autoplay\" loop=\"loop\" muted=\"\" width=\"100%\" height=\"100%\">\u003Csource src=\"https://static.mobileye.com/website/us/corporate/videos/me_surround_video3.mp4\" type=\"video/mp4\" />\u003C/video>\u003C/p>\n\u003Cp>By leveraging the latest advancements in AI, a single ECU can handle a suite of cameras and radars totaling up to 11 sensors, integrating computer vision, sensor fusion, REM&trade; crowdsourced mapping, driving policy and over-the-air updates.\u003C/p>\n\u003Cp>The result? A breakthrough ADAS system that is designed to provide not only essential safety features, but also premium driving experiences within its operational design domain &ndash; like hands-free driving on various road types and smart parking &ndash; that are intended to meet both rising consumer expectations for highway automation and upcoming tougher safety regulations. 
And by lowering system costs, Surround ADAS gives automakers a way to put these features in millions of mass-market vehicles &ndash; with the first examples already slated to hit the road within a few years.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Ch6>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/d6b2ddb9d8518cfe8d1bda9b4d6e6c92_1742729414757.jpg\" alt=\"An end-to-end AI network on EyeQ6H integrates all camera inputs to create 360-degree perception, automatically producing a top-view model of the environment.\" width=\"1200\" height=\"1200\" />An end-to-end AI network on EyeQ6H integrates all camera inputs to create 360-degree perception, automatically producing a top-view model of the environment.&nbsp;\u003C/h6>\n\u003Cp>At the heart of the system lies software innovations by Mobileye, brought to life using advanced artificial intelligence techniques, including our compound AI system that blends end-to-end perception software with other key breakthroughs, designed to transform raw sensor data into precise, actionable insights.\u003C/p>\n\u003Cp>By leveraging Mobileye&rsquo;s experience in developing autonomous driving solutions and its SuperVision hands-free point-to-point driving system, Mobileye can raise the intelligence of regular ADAS features by integrating the core software stack with REM data and enabling over-the-air updates of new features over time. 
Some key solutions enabled in this software-defined approach include:\u003C/p>\n\u003Cul>\n\u003Cli>Eyes-on, hands-off driving on highway and selected road types (up to 130 km/h)\u003C/li>\n\u003Cli>Surround collision avoidance\u003C/li>\n\u003Cli>Lane support system\u003C/li>\n\u003Cli>Low-speed acceleration control\u003C/li>\n\u003Cli>Automatic lane changes\u003C/li>\n\u003C/ul>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/ccbb2d14a6d7b64c554168b80ef658ba_1742729977672.jpg\" alt=\"\" width=\"1200\" height=\"565\" />\u003C/p>\n\u003Cp>The software-first functionality also allows automakers to use Mobileye&rsquo;s tools in new ways. For example, the hands-free driving can be tuned within safety parameters by our Driving Experience Platform, or DXP. Other software such as parking functions and driver monitoring systems can be consolidated on a single ECU in close collaboration with automakers to support functionality requirements. Additionally, in-vehicle displays can leverage the rich data flow generated by the system for new and smartly integrated ways of sharing information with the driver.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/6213e68aa2bf8f06997c1719eafad4a2_1742729546827.jpg\" alt=\"\" width=\"1200\" height=\"521\" />\u003C/p>\n\u003Cp>The system typically relies on a long-range front-facing camera, four short-range parking cameras, and a set of up to five radars, with the potential for additional sensors such as a long-range rear-facing camera. 
The first Surround ADAS design win with VW on volume models was \u003Ca href=\"https://www.mobileye.com/news/volkswagen-group-cooperates-with-valeo-and-mobileye-to-enhance-driver-assistance-in-future-mqb-vehicles/\">announced in March 2025\u003C/a>.&nbsp;\u003C/p>\n\u003Cp>While Surround ADAS offers hands-free driving on the highway and selected road types and advanced collision avoidance, it is distinct from Mobileye SuperVision&trade;, which adds true point-to-point navigation, including urban pilot capabilities (in each case, within its operational design domain). SuperVision remains Mobileye&rsquo;s premium eyes-on, hands-off driving solution, designed for automakers seeking a scalable path to eyes-off autonomy. With Surround ADAS, automakers now have a full spectrum of EyeQ6H-based solutions to make higher levels of active safety and automation available to both premium and volume segment vehicles.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/e5edcfae3344c2bb3add59cd52327adf_1742731580947.jpg\" alt=\"\" />\u003C/p>\n\u003Ch3>\u003Cstrong>Why the market needs a new category\u003C/strong>\u003C/h3>\n\u003Cp>Beyond technological advancements, this new ADAS category has emerged in response to several challenges faced by the industry, particularly high system costs resulting from an overly complex and fragmented supply chain ecosystem. As automakers push towards simplified, software-defined architectures, lowering the hardware complexity of a vehicle is a key goal. Mobileye&rsquo;s Surround ADAS meets that goal through a single ECU structure, ease of integration and, most importantly, best-in-class system performance.\u003C/p>\n\u003Cp>These changes have also been driven by a demand for even greater vehicle safety. 
Consumers expect comprehensive safety features as standard, and regulatory bodies are enforcing stricter guidelines to improve road safety and further reduce accidents amid increasing congestion and driver distractions. For example, in Europe, automakers targeting a 5-star EuroNCAP rating in the near future will need to upgrade to surround sensing systems, as regulators recognize how side-view and wide-angle cameras expand protection for vehicles, pedestrians, cyclists, and other road users in critical situations.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/008f8de3497d2621bf3d48acf110bcbb_1742729931754.jpg\" alt=\"\" />\u003C/p>\n\u003Cp>Ultimately, consumers are at the heart of this evolution. They want the premium, hands-off driving experience popularized by high-end models to be democratized across a broader market. Limited hands-off driving on specific roads and conditions is the fastest-growing level of autonomy, with projections indicating that one out of four new vehicles will be equipped with these capabilities by the year \u003Ca href=\"https://cdn.euroncap.com/media/74468/euro-ncap-roadmap-vision-2030.pdf\">2030\u003C/a>.\u003C/p>\n\u003Cp>For automakers, this new category represents multiple shifts across the automotive landscape, all converging into a new chapter of ADAS capabilities. It&rsquo;s an opportunity to bring the comfort and convenience of today&rsquo;s premium driving experience to the roads at an unprecedented scale. 
With Mobileye Surround ADAS, automakers can deliver the advanced safety and convenience that modern drivers demand.\u003C/p>","2025-03-25T07:00:00.000Z",{"id":315,"type":24,"url":316,"title":317,"description":318,"primary_tag":28,"author_name":10,"is_hidden":11,"lang":12,"meta_description":318,"image":319,"img_alt":320,"content":321,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":32,"publish_date":313,"tags":322},291,"volkswagen-group-cooperates-with-valeo-and-mobileye-to-enhance-driver-assistance-in-future-mqb-vehicles","Volkswagen Group cooperates with Valeo and Mobileye to enhance driver assistance in future MQB vehicles","The collaboration aims to enhance safety and driving comfort in Volkswagen’s upcoming\nMQB-based vehicle portfolio. ","https://static.mobileye.com/website/us/corporate/images/ff712ffe72a631b1f803a960b66169c5_1742828482708.jpg","Volkswagen Group, Valeo and Mobileye collaborate on Surround ADAS.","\u003Cp>Wolfsburg/Jerusalem/Paris, March 25, 2025 -- Volkswagen Group is working with Valeo and Mobileye to upgrade the advanced driver assistance systems up to Level 2+ (\"enhanced partially automated driving\") in its upcoming vehicle portfolio based on the MQB platform. Launching in the next few years, this cooperation will improve safety and driving comfort in high-volume vehicles, addressing both customer expectations and regulatory requirements.\u003C/p>\n\u003Cp>&ldquo;This cooperation supports us on our road to transformation: by sourcing hardware and software together, we streamline procurement, reduce complexity, and improve efficiency. 
It also empowers our performance program by enhancing technology while keeping costs competitive, ensuring high-quality solutions for our customers,&rdquo; says Dirk Gro&szlig;e-Loheide, Member of the Board of Management of the Volkswagen Brand responsible for Procurement and Member of Volkswagen AG&rsquo;s Extended Executive Committee.\u003C/p>\n\u003Cp>Beyond hands-free driving in specific conditions on approved highway sections, the system will offer features like traffic jam assist, hazard detection, parking assist, driver monitoring, and 360-degree emergency assist, with future-ready capabilities such as augmented reality displays.\u003C/p>\n\u003Cp>With this collaboration and streamlined procurement across multiple brands, Volkswagen Group is advancing vehicle safety and automation while ensuring efficient development and cost-effective solutions for its customers.\u003C/p>\n\u003Cp>\u003Cstrong>Improved assistance systems \u003C/strong>\u003C/p>\n\u003Cp>The new system features a 360-degree ring of multiple cameras and radars, along with software-defined capabilities, enabling hands-free driving on approved roads, smart parking, and improved occupant and pedestrian safety.\u003C/p>\n\u003Cp>Valeo provides high-performance ECUs, sensors, and parking solutions, while Mobileye contributes its \u003Ca href=\"https://www.mobileye.com/blog/how-surround-adas-delivers-the-new-standard-of-safety-and-tech/\" target=\"_blank\" rel=\"noopener\">Surround ADAS&trade; platform\u003C/a>, including the EyeQ&trade;6 High processor and mapping technologies. For the first time, these elements are integrated into a single system, replacing multiple ECUs with a centralized unit. This improves efficiency, system performance, and allows for over-the-air updates to meet evolving safety standards.\u003C/p>\n\u003Cp>&ldquo;At Valeo, we are committed to advancing innovation in driver assistance technology. 
We are excited to embark on a new journey and to offer to Volkswagen, together with Mobileye, this complete solution of affordable, state-of-the-art, advanced driving features for their end-users,&rdquo; explains Marc Vrecko, CEO of Valeo Brain Division.\u003C/p>\n\u003Cp>&ldquo;Working with Valeo and Volkswagen Group, this software and hardware integrated approach puts AI innovations to work in the real world.&rdquo; said Prof. Amnon Shashua, president and CEO of Mobileye. &ldquo;By improving efficiency and costs while upgrading capabilities for safety and comfort in driver assist, this system points the way to a new class of driving technology.&rdquo;\u003C/p>","ADAS, News, Industry, Mobileye Inside, Mapping & REM",{"id":324,"type":5,"url":325,"title":326,"description":327,"primary_tag":140,"author_name":10,"is_hidden":11,"lang":12,"meta_description":327,"image":328,"img_alt":329,"content":330,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":331,"tags":332},288,"modularity-from-adas-to-av","Mobileye’s new ECU Series: Modularity from ADAS to AV   ","Featuring our most powerful chip yet, this line of AI-powered solutions offers modularity throughout the automated to autonomous driving spectrum ","https://static.mobileye.com/website/us/corporate/images/97ee57a9843cd0c7ddfc4330047605f9_1740469437784.jpg","The heart of our new ECU series is a shared primary board featuring two EyeQ6H chips and an integrated MCU.","\u003Cp>Since its creation, the Mobileye EyeQ&trade; chip has made roads safer by powering Mobileye&rsquo;s advanced driver assistance systems (ADAS). Built for our vision-first approach, it powers lifesaving features such as automatic emergency braking and lane keeping. With over 200 million chips installed, our vision for an autonomous future is closer, as we introduce greater modularity between our automated and autonomous platforms. 
&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe title=\"Mobileye ECU\" src=\"https://player.vimeo.com/video/1060382068?h=599b761860&amp;badge=0&amp;autopause=0&amp;player_id=0&amp;app_id=58479\" width=\"600\" height=\"338\" frameborder=\"0\">\u003C/iframe>\u003C/p>\n\u003Cp>Now, we are taking that innovation further. Building upon Mobileye&rsquo;s pedigree of ingenuity, safety, design, and scalability, we are turning that vision into an on-the-road reality. Designed for future scalability, Mobileye&rsquo;s new Electronic Control Unit (ECU) series offers three product configurations to support various levels of driving automation. Powered by the latest-generation Mobileye EyeQ&trade;6H SoC, these three platforms allow car manufacturers to efficiently progress up the autonomous driving spectrum, providing drivers with safety and comfort, while reducing development and validation risks.&nbsp;\u003C/p>\n\u003Ch3>Inside the ECU: The EyeQ advantage&nbsp;\u003C/h3>\n\u003Cp>A major challenge in creating a series of scalable and modular advanced driving platforms is balancing flexibility and efficiency at the core. The distinct architecture of the Mobileye EyeQ6H, our most advanced SoC, was designed to minimize that tradeoff. 
By bringing fixed-functionality and general-purpose processing capabilities together, the EyeQ6H can accelerate a range of parallel computing functions, such as executing demanding AI deep learning and processing over 1,000 frames per second captured by surround cameras.&nbsp;\u003C/p>\n\u003Cp>The EyeQ6H is then placed on a board designed and built by Mobileye, which allows us to address automaker needs, create a smoother integration process, and offer greater adaptability to their product roadmap.&nbsp;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/d4a8c7bdccdc1a58e536dc266b1556e9_1740469761906.jpg\" alt=\"\" width=\"600\" height=\"389\" />\u003C/p>\n\u003Ch3>One board to power the product roadmap&nbsp;\u003C/h3>\n\u003Cp>That board is at the heart of our new ECU series, a shared primary board featuring two EyeQ6H chips and an integrated MCU. This compact yet powerful architecture provides automakers with the foundation for autonomous driving solutions, requiring only some adjustments to the future hardware and software design of the vehicle.&nbsp;\u003C/p>\n\u003Cp>By itself, the base configuration is Mobileye SuperVision&trade;, our hands-off/eyes-on platform that is connected to 11 cameras and an optional radar. SuperVision enhances the driving experience by maneuvering and performing certain driving actions, including navigate-on-pilot up to 130 km/h with continuous driver oversight (where available, and subject to specifications, manual, ODD and law). 
&nbsp;At just over five pounds and about the size of a cereal box, the AI-powered system includes Advanced Parking functionalities inside the ECU and is durable by design.&nbsp;\u003C/p>\n\u003Ch3>Eyes-off adds on &nbsp;\u003C/h3>\n\u003Cp>By adding a secondary computing board to the SuperVision ECU, while keeping most of its design, we introduce Mobileye Chauffeur&trade;, a hands-off/eyes-off platform (where available, and subject to specifications, manual, ODD and law) for consumer vehicles. The secondary board, which incorporates an additional EyeQ6H chip for added functionality, also creates hardware and software redundancy.&nbsp;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/23c7d34b85eb407e222a5796f4c8fe75_1740469856076.jpg\" alt=\"\" width=\"600\" height=\"389\" />\u003C/p>\n\u003Cp>Mobileye Chauffeur introduces an additional layer of perception to its cameras &ndash; a network of radar and lidar sensors that visualize the surroundings and enable eyes-off driving within an operational design domain (ODD), allowing the vehicle to navigate its environment independently as the driver remains alert and ready to intervene.&nbsp;\u003C/p>\n\u003Cp>The two-board platform offers an additional layer of safety, with designed redundancy that enables the system to perform minimum-risk maneuvers if needed and to safely stop on the side of the road in case of a failure. &nbsp;\u003C/p>\n\u003Ch3>Going driverless&nbsp;\u003C/h3>\n\u003Cp>Mobileye Drive&trade; adds even more around the primary board, while removing one major component &ndash; the driver. 
Designed for Mobility-as-a-Service (MaaS), the platform features a total of four Mobileye EyeQ6H chips on the primary and secondary boards, connected to up to 13 cameras, imaging radars and lidars, enabling it to maneuver with no human driver.&nbsp;\u003C/p>\n\u003Cp>While Mobileye Drive is meant for service providers and SuperVision and Chauffeur were designed primarily for consumer vehicles, placing the three ECUs side by side makes it easy to see the shared hardware and software backbone powering all three. Built with modularity and scalability in mind, the three systems share a similar shape, interface, connectivity, and, of course, software core, giving carmakers the opportunity to map their product roadmap in advance and move between the platforms more efficiently.&nbsp;\u003C/p>\n\u003Cp>The new ECU series brings Mobileye's vision closer to reality, offering scalable platforms tailored to the many stages of the autonomous journey. Our solutions help OEMs deliver safer, smarter, and more adaptable vehicles, balancing performance and time to market from hands-off driving to eyes-off driving to completely driverless. 
&nbsp;\u003C/p>","2025-02-27T08:00:00.000Z","AV Safety, Autonomous Driving, Driverless MaaS, Video",{"id":334,"type":69,"url":335,"title":336,"description":337,"primary_tag":73,"author_name":10,"is_hidden":11,"lang":338,"meta_description":337,"image":339,"img_alt":340,"content":341,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":342,"tags":343},287,"mobileye-de","Mobileye Press Kit Deutschland","Bild- und Videomaterial","de","https://static.mobileye.com/website/us/corporate/images/2e5889acf39f53ce66e29c574be885a8_1739099877362.jpg","Weltweiter Unternehmenshauptsitz von Mobileye","\u003Cp>\u003Cstrong>Mobileye Technologie &amp; L&ouml;sungen\u003C/strong>\u003C/p>\n\u003Cp>[**]gallery:mobileye-technology-and-solutions-de[**]\u003C/p>\n\u003Cp>\u003Cstrong>Weltweiter Unternehmenshauptsitz von Mobileye\u003C/strong>\u003C/p>\n\u003Cp>[**]gallery:mobileye-global-headquarters-de[**]\u003C/p>\n\u003Cp>\u003Cstrong>Mobileye SuperVision auf der Stra&szlig;e\u003C/strong>\u003C/p>\n\u003Cp>[**]gallery:mobileye-supervision-on-the-road-de[**]\u003C/p>\n\u003Cp>\u003Cstrong>Driven by Mobileye\u003C/strong>\u003C/p>\n\u003Cp>[**]gallery:driven-by-mobileye-de[**]\u003C/p>\n\u003Cp>\u003Cstrong>Professor Amnon Shashua\u003C/strong>\u003C/p>\n\u003Cp>[**]gallery:professor-amnon-shashua-de[**]\u003C/p>\n\u003Cp>\u003Cstrong>Infografiken\u003C/strong>\u003C/p>\n\u003Cp>[**]gallery:infographics-de[**]\u003C/p>\n\u003Cp>\u003Cstrong>Video\u003C/strong>\u003C/p>\n\u003Cp>[**]vimeo-press:1031870114[**]\u003C/p>\n\u003Cp>[**]vimeo-press:1041715342[**]\u003C/p>\n\u003Cp>&nbsp;\u003C/p>","2025-02-09T08:00:00.000Z","Press Kit, News, Autonomous 
Driving",{"id":345,"type":69,"url":346,"title":347,"description":348,"primary_tag":73,"author_name":10,"is_hidden":11,"lang":12,"meta_description":348,"image":349,"img_alt":350,"content":351,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":352,"tags":343},270,"mobileye","Mobileye Press Kit","Media Assets","https://static.mobileye.com/website/us/corporate/images/eeaeeb5f8d0193b0f397ff9c19b1b100_1724856705999.jpg","Mobileye's Global Corporate Headquarters","\u003Cp>\u003Cstrong>Mobileye Technology &amp; Solutions\u003C/strong>\u003C/p>\r\n\u003Cp>[**]gallery:mobileye-technology-and-solutions[**]\u003C/p>\r\n\u003Cp>\u003Cstrong>Mobileye Global Headquarters\u003C/strong>\u003C/p>\r\n\u003Cp>[**]gallery:mobileye-global-headquarters[**]\u003C/p>\r\n\u003Cp>\u003Cstrong>Mobileye SuperVision on the Road\u003C/strong>\u003C/p>\r\n\u003Cp>[**]gallery:mobileyes-advanced-platforms-in-the-drivers-seat[**]\u003C/p>\r\n\u003Cp>\u003Cstrong>Driven by Mobileye\u003C/strong>\u003C/p>\r\n\u003Cp>[**]gallery:driven-by-mobileye[**]\u003C/p>\r\n\u003Cp>\u003Cstrong>Professor Amnon Shashua\u003C/strong>\u003C/p>\r\n\u003Cp>[**]gallery:professor-amnon-shashua[**]\u003C/p>\r\n\u003Cp>\u003Cstrong>Infographics\u003C/strong>\u003C/p>\r\n\u003Cp>[**]gallery:infographics[**]\u003C/p>\r\n\u003Cp>\u003Cstrong>Video\u003C/strong>\u003C/p>\r\n\u003Cp>[**]vimeo-press:1031870114[**]\u003C/p>\r\n\u003Cp>[**]vimeo-press:1041715342[**]\u003C/p>\r\n\u003Cp>&nbsp;\u003C/p>\r\n\u003Cp>&nbsp;\u003C/p>","2025-02-01T08:00:00.000Z",{"id":354,"type":5,"url":355,"title":356,"description":357,"primary_tag":28,"author_name":10,"is_hidden":11,"lang":12,"meta_description":357,"image":358,"img_alt":359,"content":360,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":361,"tags":362},286,"mobileyes-ces-2025-showing","Highlights from Mobileye’s CES 2025 
showing ","Mobileye’s presence was felt in Las Vegas, with our latest technology on display ","https://static.mobileye.com/website/us/corporate/images/e8f3cdb7e19232218e35664f2bec3a8c_1737633952001.jpg","A quick recap of everything Mobileye from CES 2025! ","\u003Cp>&nbsp;\u003Cbr />CES 2025 might be over but there is so much more to say about our Las Vegas presence. During the 4-day event, the Mobileye approach to mobility was front and center thanks to insightful stage talks, eye-catching demonstrations and fascinating discussions about the future of mobility and autonomous vehicles.&nbsp;\u003C/p>\n\u003Cp>Conference goers were welcomed at the Mobileye booth by two Mobileye Drive&trade;-equipped vehicles, the VW ID Buzz and the Holon Mover, next to a 3D interactive product catalogue made of layers of plexiglass and in a shape of a car. Throughout the week, Mobileye leadership took part in many thought leader panels sharing Mobileye insight on key topics in the automotive industry. &nbsp;\u003C/p>\n\u003Cp>Here is a quick recap of everything Mobileye from CES 2025!&nbsp;\u003Cbr />\u003Ciframe title=\"Mobileye at CES 2025: Highlights\" src=\"https://player.vimeo.com/video/1047443316?h=904d935a52&amp;badge=0&amp;autopause=0&amp;player_id=0&amp;app_id=58479\" width=\"600\" height=\"338\" frameborder=\"0\">\u003C/iframe>\u003C/p>\n\u003Ch4>\u003Cstrong>Now. Next. Beyond. &nbsp;\u003C/strong>\u003C/h4>\n\u003Cp>In his annual CES keynote, Mobileye president and CEO Prof. Amnon Shashua shared his vision for the future of the autonomous driving industry &ndash; sharing how self-driving vehicles are moving quickly from experiment to everyday reality for consumers.\u003Cbr />\u003Ciframe title=\"Mobileye: Now. Next. Beyond CES 2025 Press Conference with Prof. 
Amnon Shashua\" src=\"https://player.vimeo.com/video/1047296511?h=a4dba090ba&amp;badge=0&amp;autopause=0&amp;player_id=0&amp;app_id=58479\" width=\"600\" height=\"338\" frameborder=\"0\">\u003C/iframe>&nbsp;\u003C/p>\n\u003Ch4>\u003Cstrong>Tech in hand&nbsp;\u003C/strong>\u003C/h4>\n\u003Cp>On the booth floor, tall city-like geometric structures hosted our newest product tables, where hundreds of visitors entered to learn about our technology. Inside, our newest ECU suite was featured, alongside Mobileye&rsquo;s impressive imaging radar and other tech.\u003Cbr />\u003Ciframe title=\"Mobileye at CES 2025 Booth Walkthrough\" src=\"https://player.vimeo.com/video/1047295510?h=b1bfab4cd2&amp;badge=0&amp;autopause=0&amp;player_id=0&amp;app_id=58479\" width=\"600\" height=\"338\" frameborder=\"0\">\u003C/iframe>&nbsp;\u003C/p>\n\u003Ch4>\u003Cstrong>Mobileye main stage&nbsp;\u003C/strong>\u003C/h4>\n\u003Cp>Safety was the key word during Mobileye&rsquo;s presentations inside its booth. Discussions on redundancy, AI powered platforms, and collaborations all touched on the importance of increasing the MTBF for safer mobility. &nbsp;\u003C/p>\n\u003Cp>We presented Mobileye&rsquo;s story: from driver-assist systems to robotaxis, discussed our AI-driven approach to autonomy, and explained how redundancy in our systems enhances reliability. And, in a unique opportunity, we hosted an in-booth panel discussion titled Unlocking Autonomous Mobility Globally featuring Mobileye EVP Johann (JJ) Jungwirth, Volkswagen ADMT SVP Sebastian Lasek and MOIA CEO Sascha Meyer. The three executives spoke about the Mobileye and VW group collaboration on driverless services and the VW ID. 
Buzz AD, the software and hardware enabling the project in different geographies, and the future of mobility.\u003Cbr />\u003Ciframe title=\"Unlocking Autonomous Mobility Globally with Volkswagen, MOIA and Mobileye\" src=\"https://player.vimeo.com/video/1047297083?h=60431b5f7a&amp;badge=0&amp;autopause=0&amp;player_id=0&amp;app_id=58479\" width=\"600\" height=\"338\" frameborder=\"0\">\u003C/iframe>\u003C/p>\n\u003Ch4>\u003Cstrong>Leadership participation&nbsp;\u003C/strong>\u003C/h4>\n\u003Cp>When talking about the future of mobility, Mobileye had a strong presence in the discussion at CES. This year was another opportunity for our leadership to lend their voices to key issues in the automotive industry.\u003C/p>\n\u003Cp>Mobileye CTO Prof. Shai Shalev-Shwartz participated in a panel on the state of autonomous vehicle technology. Looking at the present and future of autonomous driving, Prof. Shalev-Shwartz said he is optimistic about the possibility that eyes-off driving systems for consumer vehicles, along with lower costs of robotaxi services, could, in the next few years, transform transportation as we think of it today: &ldquo;This is a point of revolution, this is a point where products come to transform the way we think about commuting.&rdquo;&nbsp;\u003Cbr />\u003Ciframe title=\"CTA &amp; PAVE Autonomous Vehicle Roundtable at CES 2025\" src=\"https://player.vimeo.com/video/1047299158?h=32a75cd4b6&amp;badge=0&amp;autopause=0&amp;player_id=0&amp;app_id=58479\" width=\"600\" height=\"338\" frameborder=\"0\">\u003C/iframe>\u003Cbr />Mobileye Executive Vice President, Autonomous Vehicles Johann (JJ) Jungwirth joined top industry leaders to discuss integrating autonomous vehicles into communities, building consumer trust, and leveraging AI to enhance AV technology.&nbsp;\u003Cbr />\u003Ciframe title=\"JJ at Autonomous Vehicles - The Future is Finally Here Panel (CES 2025)\" 
src=\"https://player.vimeo.com/video/1049234229?h=ed2b30df34&amp;badge=0&amp;autopause=0&amp;player_id=0&amp;app_id=58479\" width=\"600\" height=\"338\" frameborder=\"0\">\u003C/iframe>\u003C/p>\n\u003Ch4>\u003Cstrong>Around the hall&nbsp;\u003C/strong>\u003C/h4>\n\u003Cp>Our team had an exciting time at CES, not only presenting Mobileye technology, but also connecting with customers, experts, &nbsp;and colleagues in person. We love meeting those who share our enthusiasm about the future of mobility and we look forward to staying connected to collaborate in the future.\u003Cbr />\u003Ciframe title=\"Man On The Street at CES 2025\" src=\"https://player.vimeo.com/video/1051080523?h=a875a35f16&amp;badge=0&amp;autopause=0&amp;player_id=0&amp;app_id=58479\" width=\"600\" height=\"338\" frameborder=\"0\">\u003C/iframe>\u003C/p>","2025-01-23T08:00:00.000Z","Industry, News, Amnon Shashua, Autonomous Driving, Video",{"id":364,"type":5,"url":365,"title":366,"description":367,"primary_tag":9,"author_name":10,"is_hidden":11,"lang":12,"meta_description":367,"image":368,"img_alt":369,"content":370,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":371,"tags":21},285,"policy-meets-innovation-how-aeb-is-driving-safer-roads","Automatic Emergency Braking: Where policy meets innovation","Explore how Automatic Emergency Braking and evolving policies are reshaping road safety, and advancing the future of mobility for drivers and pedestrians alike.","https://static.mobileye.com/website/us/corporate/images/a3e2aef29983db88c76f4b12b31a733f_1736423779190.jpg","The National Highway Traffic Safety Administration (NHTSA) finalized a new Federal Motor Vehicle Safety Standard.","\u003Cp>\u003Cspan data-contrast=\"auto\">The era of accepting traffic accidents as inevitable is ending, with road safety becoming a growing priority. 
Leading this transformation is Automatic Emergency Braking (AEB), a technology set to become required in the United States on all new passenger cars and light trucks starting in September 2029.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:278}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">But as we move closer to a future with fewer accidents, it&rsquo;s advancements in technology, not just policy, that will ultimately take us there. In this blog, we&rsquo;ll explore why AEB is redefining road safety and what it takes to deliver this life-saving innovation.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:278}\">&nbsp;\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>\u003Cspan data-contrast=\"auto\">Why AEB is vital now\u003C/span>\u003C/strong>\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"auto\">In 2022, the U.S. saw its\u003Ca href=\"https://www.ghsa.org/resources/news-releases/GHSA/Ped-Spotlight-Full-Report22\"> highest pedestrian fatality rate in 40 years\u003C/a>, a stark reminder of the \u003C/span>\u003Cspan data-contrast=\"auto\"> persistent dangers on the road. Several factors contribute to this issue, including increased traffic, \u003Ca href=\"https://www.cbsnews.com/news/taller-vehicles-suvs-pickups-pedestrian-injury-risks-crash-accident-iihs/\">larger vehicle sizes\u003C/a> (with more SUVs and pickup trucks), and smartphone distractions. 
In fact, a \u003Ca href=\"https://www.cbsnews.com/newyork/news/distracted-driving-survey/\">survey\u003C/a> of 3 million motorists found that almost 9 out of 10 drivers admitted to using their smartphones while behind the wheel.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">To combat the rise in pedestrian-related accidents, the National Highway Traffic Safety Administration (NHTSA) finalized a new Federal Motor Vehicle Safety Standard (FMVSS No. 127). This regulation mandates that Automatic Emergency Braking (AEB), including pedestrian-specific AEB, be a standard feature in all passenger cars and light trucks registered after September 2029. While other regulatory frameworks, such as the General Safety Regulation in Europe, typically allow for a grace period during transitions, NHTSA set a clear compliance deadline, encouraging automakers to plan accordingly.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">According to NHTSA Deputy Administrator Sophie Shulman, they are &ldquo;requiring these systems to be even more effective at higher speeds and to detect pedestrians.&rdquo; Under the updated regulations for Automatic Emergency Braking (AEB) systems, all new cars and light trucks must include AEB technology capable of preventing collisions at speeds of up to 62 mph and detecting pedestrians in both daylight and nighttime conditions.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">The policy&rsquo;s goal is to significantly reduce accidents and enhance vehicle safety by ensuring that cars are equipped with technology capable of detecting and preventing collisions with both pedestrians and vehicles, 
ultimately contributing to safer streets for everyone. This technology has already proven its impact on the road. According to a report from the Partnership for Analytics Research in Traffic Safety (PARTS), Advanced Driver Assistance Systems (ADAS)\u003C/span>\u003Cspan data-contrast=\"auto\"> are improving and are responsible for a sharp drop in rear-end vehicle-to-vehicle crashes; the research found &ldquo;\u003Ca href=\"https://www.forbes.com/sites/edgarsten/2025/01/27/study-shows-automatic-emergency-braking-improving-as-trump-looks-to-kill-mandate/\">a 49% reduction in\u003C/a> front-to-rear crashes for vehicles equipped with one ADAS feature in particular, automatic emergency braking, or AEB, across all vehicle segments and model years.&rdquo;&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">AEB is uniquely positioned to tackle these challenges. By intervening when drivers are distracted, slow to react, or unable to respond in time, AEB can significantly reduce the risk of fatal collisions. According to NHTSA, integrating AEB with pedestrian detection could potentially &ldquo;prevent at least \u003Ca href=\"https://www.nhtsa.gov/press-releases/nhtsa-fmvss-127-automatic-emergency-braking-reduce-crashes\">24,000 injuries annually\u003C/a>&rdquo;.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>\u003Cspan data-contrast=\"none\">Raising the bar with Mobileye \u003C/span>\u003C/strong>\u003Cspan data-ccp-props=\"{&quot;335559739&quot;:0}\">&nbsp;\u003C/span>\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"none\">While regulators and various industry bodies push forward laws and safety initiatives, it&rsquo;s the underlying technology that has the power to truly deliver the results. 
\u003C/span>\u003Cspan data-contrast=\"auto\">Fortunately, Mobileye&rsquo;s decade-long deployment of AEB systems on the road has provided an extensive dataset. With Mobileye systems integrated into hundreds of vehicle models on the road today, we were able to draw on roughly 200,000 hours of real-world\u003C/span> \u003Cspan data-contrast=\"auto\">driving (covering more than 11 million km) \u003C/span>\u003Cspan data-contrast=\"none\">to fine-tune\u003C/span>\u003Cspan data-contrast=\"auto\"> our technology.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">We analyzed AEB responses across a wide range of scenarios, in cities and towns in the United States, Europe, and Asia, as well as under varied lighting conditions: daylight, dawn, dusk, twilight, and nighttime. This rigorous validation process yielded positive results overall, with very low rates of both false-negative and false-positive activations.\u003C/span>\u003Cspan data-contrast=\"none\">&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"none\">The success of this technology is fundamental to the implementation of the updated AEB mandate, allowing the system to be more effective under more challenging driving conditions.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>\u003Cspan data-contrast=\"none\">Achieving accuracy in AEB\u003C/span>\u003C/strong>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003C/h3>\n\u003Cp>One of the key challenges in developing Automatic Emergency Braking (AEB) systems is balancing safety with natural driving behavior. While these systems must prioritize accident prevention, they shouldn't feel too cautious, abrupt, or uncomfortable in their operation. The technology continues to evolve to better match reasonable judgment while adapting to different driving conditions.\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">A recent real-world example is Mobileye&rsquo;s AEB system in India, where driving requires constant adaptability. Drivers skillfully navigate dense traffic, constantly adjusting to the flow of vehicles, pedestrians, animals, and unexpected obstacles such as barricades. It might look chaotic, but it operates as a highly efficient system, keeping things moving in busy, crowded streets. \u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">In this setting, the activation rates of safety systems showed a more &ldquo;localized&rdquo; response, with AEB activations accounting for close vehicle proximity and even demonstrating heightened detection of cows, a common obstacle on Indian streets. 
This example demonstrates that current AEB systems perform well in specific environments and highlights the importance of testing data from diverse regions to ensure robust performance worldwide.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>\u003Cspan data-contrast=\"auto\">Enhanced ADAS for a safer future&nbsp;\u003C/span>\u003C/strong>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003C/h3>\n\u003Cp>\u003Cspan class=\"TextRun SCXW251036562 BCX0\" lang=\"EN-US\" xml:lang=\"EN-US\" data-contrast=\"auto\">\u003Cspan class=\"NormalTextRun SCXW251036562 BCX0\">The entire industry is making significant strides toward a future where mobility is safer and more efficient. With technology serving as the backbone of true advancement, Mobileye&rsquo;s AEB systems are well-positioned to lead the way in driving assist features and meet evolving requirements. These ongoing efforts focus on enhancing the effectiveness of AEB systems to protect both drivers and pedestrians, advancing the future of road safety.\u003C/span>\u003C/span>\u003Cspan class=\"EOP SCXW251036562 BCX0\" data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003C/p>","2025-01-09T08:00:00.000Z",{"id":373,"type":5,"url":374,"title":375,"description":376,"primary_tag":28,"author_name":10,"is_hidden":11,"lang":12,"meta_description":376,"image":377,"img_alt":378,"content":379,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":380,"tags":381},284,"prof-amnon-shashua-at-ces-2025","CES 2025: Prof. Amnon Shashua on revolutionizing mobility","In his CES 2025 keynote, Now. Next. Beyond., Mobileye President and CEO Prof. 
Amnon Shashua asked the audience a single key question – what does it take to revolutionize transportation?","https://static.mobileye.com/website/us/corporate/images/7d850b79bb7c52216b36173a9f6bcfb3_1736363040927.jpg","By blending cutting-edge technology, mapping expertise, and cost-efficiency, Mobileye is shaping the future of mobility.","\u003Cp>For the 10\u003Csup>th\u003C/sup> year, Mobileye founder and CEO Prof. Amnon Shashua shared his vision for the future of the autonomous driving industry &ndash; describing how self-driving vehicles are moving quickly from experiment to everyday reality, thanks to breakthroughs in AI. While robotaxis have begun to roll out in North America and Europe, Mobileye sees autonomy sparking a revolution in consumer transportation.\u003C/p>\n\u003Cp>Watch the keynote here.&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe title=\"YouTube video player\" src=\"https://www.youtube.com/embed/MQnkqXoMEOc?si=VayQlJneUY5nIrwU\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Ch3>Precision vs. Recall&nbsp;\u003C/h3>\n\u003Cp>Prof. Shashua referred to the two critical dimensions needed to achieve scalable, fully autonomous driving: recall (availability) and precision (safety). Recall ensures that a system can handle diverse scenarios, geographies, and environments, while precision focuses on safety by minimizing errors, such as false detections or missed hazards.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/3c584de8c954aee7d8eadcd75d1953b5_1736363079618.jpg\" alt=\"Precision vs. Recall\" width=\"600\" height=\"389\" />\u003C/p>\n\u003Ch3>Creating safety architecture&nbsp;\u003C/h3>\n\u003Cp>Touching on Mobileye&rsquo;s safety architecture and how it is built to address unreasonable risks, Prof. Shashua explained that the approach targets four error types. 
He also explained how the Mobileye system is designed to mitigate these errors through a range of software and hardware redundancies. He went on to describe Mobileye's novel fusion method, \u003Cstrong>Primary-Guardian-Fallback (PGF)\u003C/strong>, which is a layered decision-making model. The model generalizes the majority rule to non-binary decisions.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/ced2c26cc06d0aeed8104fdf72d06dd3_1736363132926.jpg\" alt=\"\" width=\"600\" height=\"389\" />\u003C/p>\n\u003Ch3>\u003Cstrong>Engineering new ways to see the world\u003C/strong>\u003C/h3>\n\u003Cp>Prof. Shashua also introduced the unique sensor technology for layered visuals, which generates 3D perception to build a redundant and reliable understanding of the environment. This advanced processing is powered by Mobileye&rsquo;s latest system on a chip, the EyeQ6&trade;.\u003C/p>\n\u003Cp>Mobileye continues to climb the precision axis with imaging radar technology, which has evolved significantly. This technology, set to enter production in 2026, offers high resolution and delivers sensing capabilities that address camera weak spots. This technology plays a key role for both Mobileye Drive&trade; and Mobileye Chauffeur&trade; and has garnered strong interest from customers.\u003C/p>\n\u003Ch3>\u003Cstrong>Looking ahead \u003C/strong>\u003C/h3>\n\u003Cp>While describing what he believes is needed for revolutionizing mobility, Prof. Shashua presented some key factors and showcased the exponential growth of Mobileye REM&trade;, which harvested 29.6 billion miles of data in 2024 alone. &ldquo;This data is essential for creating a memory,&rdquo; he noted, emphasizing the importance of detailed maps for hands-off driving systems.\u003C/p>\n\u003Cp>He also addressed the challenge of scaling from demos to real-world products, highlighting affordability as key to making advanced systems accessible. 
By blending cutting-edge technology, mapping expertise, and cost-efficiency, Mobileye is shaping the future of mobility.\u003C/p>","2025-01-08T08:00:00.000Z","Amnon Shashua, Autonomous Driving, Events, Industry, News",{"id":383,"type":5,"url":384,"title":385,"description":386,"primary_tag":51,"author_name":10,"is_hidden":11,"lang":12,"meta_description":386,"image":387,"img_alt":388,"content":389,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":390,"tags":391},282,"from-driver-assisting-to-self-driving","From driver assisting to self-driving: Mobileye’s most FAQs ","We asked ourselves 10 questions – so you can learn more about Mobileye","https://static.mobileye.com/website/us/corporate/images/73c2d7e97e69bedac2fd9fd86521902d_1735469376518.png","\"We design all of our solutions with safety in mind, from our base ADAS system, through our automated driving features, to our driverless technology, road-user safety is our highest priority.\"","\u003Cp>\u003Cstrong>What does Mobileye do?&nbsp;\u003C/strong>\u003C/p>\n\u003Cp>Mobileye develops and provides some of the world's leading solutions for autonomous-driving and driver-assist technologies, harnessing world-renowned expertise in computer vision, machine learning, and mapping.&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>Ok, but what does that really mean?&nbsp;\u003C/strong>\u003C/p>\n\u003Cp>In short, we want to make roads safer by making cars safer and smarter&mdash;whether driven by a human driver or without. We have the know-how in how to program a microchip and unlock its full potential and sync it with sensors such as cameras. &nbsp;&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>So, what kind of company are you? A car company? Tech? AI?&nbsp;\u003C/strong>\u003C/p>\n\u003Cp>Funny you ask. It is kind of all three. 
While we don&rsquo;t make cars, we provide car manufacturers with safety solutions. In fact, if your car has capabilities like lane keep assist, automatic emergency braking, or intelligent speed assistance, there is a good chance it has Mobileye tech inside, like our chip, ADAS sensors, or cameras.&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>Ah, and you have been doing this for how long?&nbsp;\u003C/strong>\u003C/p>\n\u003Cp>Well, the company was founded in 1999 by Prof. Amnon Shashua, whose simple but radical idea&mdash;to use a camera sensor as the basis of life-saving technology&mdash;set the foundation for its mission. Since then, Mobileye has pioneered groundbreaking technologies and introduced the sensing approach of True Redundancy&trade; and the RSS&trade; safety model, which support our ADAS (Advanced Driver Assistance Systems) and AV solutions. Our industry-leading platforms enhance the driving experience through safety and are integrated into the vehicle&rsquo;s design, ranging from a single camera to AI for autonomous vehicles.&nbsp;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/27efb38a1b51449ba186725bb1fb9b3d_1733904095025.png\" alt=\"\" width=\"600\" height=\"389\" />\u003C/p>\n\u003Cp>\u003Cstrong>Ok, so what do you make?&nbsp;\u003C/strong>\u003C/p>\n\u003Cp>Quite a few things. When we talk about ADAS, we are talking about a range of technologies and features that enhance safety, improve driving, and automate certain driving functions. Within the ADAS spectrum there are multiple levels. First, there is basic ADAS, which includes a front camera for enhanced safety. ADAS includes features like Automatic Emergency Braking (AEB), Forward Collision Warning (FCW), Lane Keep Assist, Adaptive Cruise Control (ACC), and Traffic Sign Recognition, among others. This technology is already embedded in millions of vehicles on the road today, helping prevent collisions and reduce injuries. 
&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>Ok, is that it?&nbsp;\u003C/strong>\u003C/p>\n\u003Cp>Certainly not! Then there&rsquo;s the Mobileye Cloud-Enhanced Driver-Assist&trade; system, which provides a safer, smoother, and more natural driving experience &ndash; marking a leap in ADAS performance with no need for additional hardware. It leverages crowdsourced data from millions of Mobileye-equipped vehicles around the globe, providing centimeter-level localization through continuously updated information about the driving scene in near real time.&nbsp;\u003C/p>\n\u003Cp>Building on those two platforms, there is Mobileye Surround ADAS&trade;, a highway hands-off/eyes-on solution powered by our EyeQ&trade;6H system-on-chip, our most powerful to date. It leverages up to six cameras and five radars, providing the vehicle with the data it needs to smoothly maneuver in its environment. &nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>What about a self-driving car?&nbsp;\u003C/strong>\u003C/p>\n\u003Cp>Before we dive into our autonomous driving platform, let&rsquo;s take a moment to highlight Mobileye SuperVision&trade;, the bridge between our ADAS products and our autonomous vehicle solutions. With its 360&deg; vision and sensing system built with up to 11 cameras and two EyeQ&trade;6H SoCs, SuperVision&trade; allows drivers to take their hands off the wheel while keeping their eyes on the road during standard driving functions across various road types.&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>So, what&rsquo;s the step after hands-off/eyes-on?&nbsp;\u003C/strong>\u003C/p>\n\u003Cp>Hands-off/eyes-off, of course! Based on our methodology for safety architecture, advanced computer vision features, and hardware that includes radar, lidar, and 11 cameras, our hands-off/eyes-off solution for consumer vehicles, Mobileye Chauffeur&trade;, can reach speeds of up to 130 km/h and provides a point-to-point navigate-on-pilot experience within its defined ODD. 
&nbsp;\u003C/p>\n\u003Cp>Then, there is also Mobileye Drive&trade;, our end-to-end self-driving system that takes our technology even further, enabling automakers and transportation operators to offer a no-driver solution for different types of autonomous vehicles, such as robotaxis, ride-pooling, public transport, and goods delivery. &nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>Ok, but what about safety? Are self-driving cars safe? Are they safer than humans?&nbsp;\u003C/strong>\u003C/p>\n\u003Cp>We design all of our solutions with safety in mind: from our base ADAS system, through our automated driving features, to our driverless technology, road-user safety is our highest priority. Our latest EyeQ6 SoC family is our most advanced yet, enabling vehicles to navigate the road with greater precision and reliability. Our Responsibility-Sensitive Safety (RSS&trade;) model, a mathematics-based framework that adheres to five core safety rules, underpins our driving policy, while our HD mapping technology, REM&trade;, creates a continuously updated HD map to enable safer driving across our ADAS and autonomous platforms. &nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>Ok then, so how does Mobileye see the future? &nbsp;\u003C/strong>\u003C/p>\n\u003Cp>At Mobileye, we have two key ideas in our vision for the future: a world where autonomous capabilities drastically reduce traffic accidents, and one where consumers benefit from self-driving vehicles in new ways, such as access to autonomous mobility for transit and reduced traffic congestion and pollution in urban areas. 
&nbsp;This is our goal, and while we have a lot of our advanced technology already on the road, every day we&rsquo;re working to solve all kinds of complex scenarios and edge cases to make fully autonomous driving a reality.&nbsp;\u003C/p>","2024-12-29T08:00:00.000Z","News, Industry, Autonomous Driving",{"id":393,"type":24,"url":394,"title":395,"description":396,"primary_tag":397,"author_name":10,"is_hidden":11,"lang":12,"meta_description":396,"image":398,"img_alt":399,"content":400,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":401,"tags":402},283,"mobileye-announces-ces-2025-press-conference","Mobileye announces CES 2025 press conference","Mobileye: Now. Next. Beyond. with Prof. Amnon Shashua Presented on January 7 at 11:00 a.m. PT ",11,"https://static.mobileye.com/website/us/corporate/images/1fbc54899406499ef56105747d30bfbf_1734381310020.jpg","Prof. Amnon Shashua delivers his annual CES address, Mobileye: Now. Next. Beyond. ","\u003Cp>JERUSALEM, December 17, 2024 -- At CES 2025, Mobileye (Nasdaq: MBLY) will showcase the technologies and solutions driving its scalable approach to safer roads and autonomous mobility. Kicking off the show with its annual press conference, &ldquo;Mobileye: Now. Next. Beyond.&rdquo; presented by President and CEO Prof. Amnon Shashua, Mobileye will highlight innovation across a range of platforms and applications, from advanced driver assistance to fully autonomous solutions.\u003C/p>\n\u003Cp>Mobileye: Now. Next. Beyond. will be held on Tuesday, January 7, 2025, at 11:00 a.m. PT in LVCC West Hall W326 and livestreamed. Prof. Shashua will share insights into Mobileye&rsquo;s progress toward delivering safe and scalable autonomous driving solutions and outline the company&rsquo;s vision for 2025 and beyond. 
\u003Ca href=\"https://www.mobileye.com/ces-2025/\">Register here\u003C/a> for in-person or virtual attendance.\u003C/p>\n\u003Cp>Throughout CES, attendees are invited to visit the Mobileye booth at LVCC West Hall, Level 1, Booth 4700, where a series of demonstrations and talks will highlight Mobileye&rsquo;s latest advancements in AI, sensor fusion, imaging radar and other core technologies purpose-built for safer roads. These innovations underpin Mobileye&rsquo;s modular product portfolio, which will be on display and includes:\u003C/p>\n\u003Cul>\n\u003Cli>Advanced Driver Assistance Systems, from base to cloud-enhanced by REM&trade; mapping for greater accuracy.\u003C/li>\n\u003Cli>Surround ADAS: A surround view system for comprehensive safety.\u003C/li>\n\u003Cli>Mobileye SuperVision&trade;: An eyes-on, hands-off assisted driving platform.\u003C/li>\n\u003Cli>Mobileye Chauffeur&trade;: Eyes-off autonomy for consumer-owned vehicles.\u003C/li>\n\u003Cli>Mobileye Drive&trade;: A full self-driving system designed for Mobility-as-a-Service.\u003C/li>\n\u003C/ul>\n\u003Cp>Attendees will have the opportunity to view vehicles equipped with Mobileye Drive at the Mobileye booth.\u003C/p>\n\u003Cp>Additionally, Mobileye will participate in the following CES events:\u003C/p>\n\u003Cp>\u003Ca href=\"https://web.cvent.com/event/160622a8-4160-4674-9c3d-776f6912c99d/summary\">CTA &amp; PAVE Autonomous Vehicle (AV) Roundtable\u003C/a>\u003C/p>\n\u003Cp>On Wednesday, January 8, Mobileye CTO Prof. Shai Shalev-Shwartz will join executives from Aurora and Waabi for a panel on the State of AV Tech as part of a series of sessions hosted by CTA and Partners for Automated Vehicle Education (PAVE).\u003C/p>\n\u003Cul>\n\u003Cli>\u003Cstrong>Date: \u003C/strong>January 8, 2025\u003C/li>\n\u003Cli>\u003Cstrong>Time:\u003C/strong> 10:30 a.m. 
PT\u003C/li>\n\u003Cli>\u003Cstrong>Location: \u003C/strong>The Venetian Expo, Room 403\u003C/li>\n\u003Cli>\u003Cstrong>Registration: \u003C/strong>Visit \u003Ca href=\"https://web.cvent.com/event/160622a8-4160-4674-9c3d-776f6912c99d/summary\">here\u003C/a> for event information and registration details.\u003C/li>\n\u003C/ul>\n\u003Cp>\u003Ca href=\"https://www.ces.tech/schedule/autonomous-vehicles-the-future-is-finally-here/\">CES Conference Session | Autonomous Vehicles: The Future is Finally Here\u003C/a>\u003C/p>\n\u003Cp>On Thursday, January 9, Mobileye Executive Vice President of AV Johann &ldquo;JJ&rdquo; Jungwirth will join leading industry voices for a discussion on integrating AVs into communities, building trust with consumers, and using new AI strategies to fine-tune AV technology.&nbsp; The session will be moderated by Pete Bigelow, Director of Technology and Innovation Coverage at Automotive News.\u003C/p>\n\u003Cul>\n\u003Cli>\u003Cstrong>Date: \u003C/strong>January 9, 2025\u003C/li>\n\u003Cli>\u003Cstrong>Time:\u003C/strong> 3:00 p.m. PT\u003C/li>\n\u003Cli>\u003Cstrong>Location: \u003C/strong>LVCC West Level 2 W219\u003C/li>\n\u003Cli>\u003Cstrong>Registration:\u003C/strong> CES \u003Ca href=\"https://www.ces.tech/topics/vehicle-tech-and-advanced-mobility/\">Vehicle Tech and Advanced Mobility\u003C/a> conference track pass is required.\u003C/li>\n\u003C/ul>\n\u003Cp>For more information on Mobileye at CES 2025, including booth activities, events and news, visit \u003Ca href=\"https://www.mobileye.com/ces-2025/\">https://www.mobileye.com/ces-2025/\u003C/a>. 
&nbsp;\u003C/p>\n\u003Cp>+++\u003C/p>\n\u003Cp>Contacts:\u003C/p>\n\u003Cp>Dan Galves&nbsp;\u003C/p>\n\u003Cp>Investor Relations&nbsp;\u003C/p>\n\u003Cp>investors@mobileye.com&nbsp;\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>Justin Hyde&nbsp;\u003C/p>\n\u003Cp>Media Relations&nbsp;\u003C/p>\n\u003Cp>justin.hyde@mobileye.com\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>Mobileye (Nasdaq: MBLY) leads the mobility revolution with its autonomous driving and driver-assistance technologies, harnessing world-renowned expertise in computer vision, artificial intelligence, mapping, and data analysis. Since its founding in 1999, Mobileye has pioneered such groundbreaking technologies as REM&trade; crowdsourced mapping, True Redundancy&trade; sensing, and Responsibility Sensitive Safety (RSS). These technologies are driving the ADAS and AV fields towards the future of mobility &ndash; enabling self-driving vehicles and mobility solutions, powering industry-leading advanced driver-assistance systems and delivering valuable intelligence to optimize mobility infrastructure. To date, about 190 million vehicles worldwide have been built with Mobileye technology inside. In 2022 Mobileye listed as an independent company separate from Intel (Nasdaq: INTC), which retains majority ownership. For more information, visit https://www.mobileye.com.\u003C/p>\n\u003Cp>&ldquo;Mobileye,&rdquo; the Mobileye logo and Mobileye product names are registered trademarks of Mobileye Global. 
All other marks are the property of their respective owners.\u003C/p>","2024-12-17T08:00:00.000Z","Events, News",{"id":404,"type":69,"url":405,"title":406,"description":407,"primary_tag":32,"author_name":10,"is_hidden":11,"lang":12,"meta_description":407,"image":408,"img_alt":409,"content":410,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":411,"tags":412},281,"mobileye-at-ces-2025","Mobileye at CES 2025","Visit our press kit throughout the show for the latest events, news, multimedia and more.","https://static.mobileye.com/website/us/corporate/images/1833572c15ccfb76fcd471355eedb899_1734044782739.png","Mobileye Kicked off CES 2025 with our Annual Press Conference Presented by Prof. Amnon Shashua","\u003Cp>\u003Cstrong>Mobileye at CES 2025\u003C/strong>\u003Cbr />Mobileye returns to Las Vegas for CES 2025, showcasing the technologies and solutions driving our scalable approach to safer roads and autonomous mobility. From advanced driver-assistance systems (ADAS) to fully autonomous vehicles, explore how we&rsquo;re transforming the future of intelligent driving.\u003C/p>\n\u003Cp>Visit \u003Ca href=\"https://www.mobileye.com/ces-2025/\">mobileye.com/ces-2025\u003C/a> for full details on our press conference, booth demonstrations, and key events.\u003C/p>\n\u003Cp>\u003Cstrong>News\u003C/strong>\u003C/p>\n\u003Cp>\u003Cstrong>\u003Ca href=\"https://www.mobileye.com/news/mobileye-announces-ces-2025-press-conference/\" target=\"_blank\" rel=\"noopener\">Mobileye Announces CES 2025 Press Conference\u003C/a>\u003C/strong>\u003C/p>\n\u003Cp>\u003Cstrong>Presentations\u003C/strong>\u003C/p>\n\u003Cp>\u003Cstrong>\u003Ca title=\"Mobileye's Annual CES Press Conference\" href=\"https://www.dropbox.com/scl/fi/xtwkl103ja9pgrvx2r667/CES-2025-Now.-Next.-Beyond.pdf?rlkey=n2yar93u4mbzizcou2hhs8c6n&amp;e=1&amp;st=kn6xypin&amp;dl=0\" target=\"_blank\" rel=\"noopener\">Mobileye: Now. Next. Beyond. 
with Prof. Amnon Shashua\u003C/a>\u003C/strong>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>CES 2025 Booth Experience\u003C/strong>\u003C/p>\n\u003Cp>[**]gallery:mobileye-ces-2025-booth-experience[**]\u003C/p>\n\u003Cp>\u003Cstrong>Mobileye: Now. Next. Beyond. Annual CES Press Conference&nbsp;\u003C/strong>\u003C/p>\n\u003Cp>[**]gallery:now-next-beyond-2025[**]\u003C/p>\n\u003Cp>\u003Cstrong>On Stage at CES 2025\u003C/strong>\u003C/p>\n\u003Cp>[**]gallery:mobileye-at-ces-2025[**]\u003C/p>\n\u003Cp>\u003Cstrong>Mobileye Technology &amp; Solutions\u003C/strong>\u003C/p>\n\u003Cp>[**]gallery:mobileye-technology-and-solutions[**]\u003C/p>\n\u003Cp>\u003Cstrong>Mobileye SuperVision on the Road\u003C/strong>\u003C/p>\n\u003Cp>[**]gallery:mobileyes-advanced-platforms-in-the-drivers-seat[**]\u003C/p>\n\u003Cp>\u003Cstrong>Driven by Mobileye\u003C/strong>\u003C/p>\n\u003Cp>[**]gallery:driven-by-mobileye[**]\u003C/p>\n\u003Cp>\u003Cstrong>Professor Amnon Shashua\u003C/strong>\u003C/p>\n\u003Cp>[**]gallery:professor-amnon-shashua[**]\u003C/p>\n\u003Cp>\u003Cstrong>Infographics\u003C/strong>\u003C/p>\n\u003Cp>[**]gallery:infographics[**]\u003C/p>\n\u003Cp>\u003Cstrong>Video\u003C/strong>\u003C/p>\n\u003Cp>[**]vimeo-press:1031870114[**]\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>&nbsp;\u003C/p>","2024-12-08T08:00:00.000Z","Amnon Shashua, Autonomous Driving, ADAS, AV Safety",{"id":414,"type":5,"url":415,"title":416,"description":417,"primary_tag":32,"author_name":10,"is_hidden":11,"lang":12,"meta_description":417,"image":418,"img_alt":419,"content":420,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":421,"tags":422},280,"driving-ai","5 Takeaways from Mobileye's AI day ","CEO Prof. Amnon Shashua and CTO Prof. Shai Shalev-Shwartz explored key AI advancements in autonomous mobility. 
","https://static.mobileye.com/website/us/corporate/images/9d5275f3635f6d26bd00f9edf597d298_1732784251661.jpg","Prof. Shai Shalev-Shwartz and Prof. Amnon Shashua\nPresident and CEO at Driving AI","\u003Cp>\u003Cspan data-contrast=\"auto\">Mobileye has been one of the leaders in navigating the path towards full self-driving systems. Achieving a fully autonomous \"eyes-off\" system is the goal, but it demands exceptionally high safety standards. Developing such a complex system requires substantial long-term investment.\u003C/span>\u003Cspan data-contrast=\"auto\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">To reach this milestone, we focus on maintaining a sustainable business by generating revenue today through our leadership in advanced driver assistance technologies all the while keeping our eyes on the goal of full autonomy. But how do we get there? \u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003Cspan data-contrast=\"auto\">Last month, our CEO Prof. Amnon Shashua and CTO, Prof. 
Shai Shalev-Shwartz took to the stage at Mobileye&rsquo;s Driving AI 2024 to share the company&rsquo;s innovative AI methods for reaching that milestone.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Here are the five main takeaways from the lecture on how Mobileye, by intelligently leveraging AI, is solving autonomy one system at a time.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-ccp-props=\"{}\">Watch the full video below.\u003C/span>\u003C/p>\n\u003Cp>\u003Ciframe title=\"YouTube video player\" src=\"https://www.youtube.com/embed/92e5zD_-xDw?si=AFtw2AQWwgfm0Eoz\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Ch3>\u003Cstrong>There's no one way to solve autonomy&nbsp;\u003C/strong>\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"auto\">In his opening remarks, Mobileye CEO Prof. Amnon Shashua outlined Mobileye's perspective on various approaches to solving autonomy. &nbsp;Examples include Waymo's lidar-centric strategy with a compound AI system (CAIS) approach, Tesla's camera-only, end-to-end AI approach, and Mobileye's own camera-centric method with a CAIS AI model. &nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Prof. Shashua emphasized the importance of examining each approach against four key pillars for successful autonomy: cost, modularity, geographic scalability, and Mean Time Between Failures (MTBF). While each approach has unique strengths and limitations, none has fully solved autonomy by meeting all four pillars. \u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">With each approach, there are trade-offs. One approach may offer high accuracy but be completely inefficient to produce, while another, by itself, may be limited or unreliable. 
For example, a lidar-centric approach provides high accuracy, resulting in a very high (good) MTBF, but its cost hinders scalability, making it less suitable for the wider market.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/53a8091b10a0f6baa10a8cac1aec0a67_1732785971870.png\" alt=\"\" />\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Conversely, a camera-only or camera-centric approach is more affordable but typically faces more challenges in reaching a high MTBF, and a pure end-to-end approach raises the alignment problem in automotive AV (the difficulty of ensuring that the goals of an AI system or machine-learning model align with human objectives). &nbsp;A balanced approach can bridge these gaps, offering both safety and scalability for the evolving AV market.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>A pure end-to-end approach has its limitations&nbsp;&nbsp;\u003C/strong>\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"auto\">The End-to-End approach is built on the premise that the more data you feed into the system, the better it gets at mimicking human driving behavior, eventually reaching or even surpassing human-level performance. This method eliminates the need for \"glue code,\" or manual coding. Instead, it's all about data&mdash;unsupervised data specifically. &nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">The transformer-based neural network continuously learns from millions of cars sending driving data, eliminating the need for the manual process of humans labelling or \u003C/span>\u003Cem>\u003Cspan data-contrast=\"auto\">interpreting \u003C/span>\u003C/em>\u003Cspan data-contrast=\"auto\">that data. 
However, this approach faces three significant challenges: the lack of abstractions, the shortcut learning problem, and the long-tail problem, each of which highlights the limitations of current systems in effectively handling the complexities of real-world driving scenarios.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335559740&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">In his talk, Prof. Shashua illustrated the lack of abstractions through the \"calculator problem\": the difficulty ChatGPT has reliably handling complex, multi-step calculations due to the limitations of its language-based architecture. \u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">The solution was almost too easy: the model must defer to a calculator tool, so a Python environment was integrated to enhance computational accuracy.&nbsp; Prof. Shashua argued that relying solely on unsupervised data in the End-to-End approach to address all the complexities of a safety-critical system like autonomous vehicles is both questionable and risky.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335559740&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">This approach also risks embedding undesired or even dangerous driving behaviors in the learning phase. 
Here, the \"alignment problem\" in AV becomes evident as the model may prioritize common but incorrect behaviors over rare but correct ones.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/c090cff3e255d37cf9c145fd17435972_1732784761464.jpg\" alt=\"\" width=\"1272\" height=\"825\" />\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">For example: human drivers often perform rolling stops at stop signs or engage in rude driving behaviors, this is where the system might learn as common actions despite being incorrect. \u003C/span>\u003Cspan data-contrast=\"auto\">So, distinguishing between correct and incorrect actions&mdash;especially in rare but correct scenarios&mdash;remains a complex challenge.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335559740&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">A practical illustration of these issues can be seen in the data collected from the Tesla Full Self-Driving (FSD) tracker. The long-tail problem emerges here when you see that the FSD tracker data indicates that even with extensive data input, the model struggles to adequately address these rare events, ultimately affecting the system's overall safety and reliability and hindering its ability to improve mean time between failures (MTBF) with each new generation. 
\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">This problem highlights the limitations of relying on large datasets alone, as rare but critical driving scenarios are often underrepresented.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{&quot;134233117&quot;:true,&quot;134233118&quot;:true,&quot;201341983&quot;:0,&quot;335559740&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>\u003Cspan data-contrast=\"auto\">Primary, Guardian, and Fallback fusion is critical for a safe system\u003C/span>\u003C/strong>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"auto\">There are countless decisions we encounter throughout a driving journey. They range from very simple, binary ones&mdash;such as turning left or turning right, braking or not braking&mdash;to more complex and nuanced choices, such as: if I brake, should I do so gently or harshly?&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">A typical approach involves multiple systems offering their answer and taking the majority rule &ndash; two out of three systems say merge left, so merge left. But what if the three systems offer three different suggestions, such as turn left, turn right, or keep straight? The majority rule is not available here as a solution.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">The answer is the Primary, Guardian, and Fallback (PGF) approach, which functions as a layered decision-making model. 
Here&rsquo;s how it works:\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335551550&quot;:1,&quot;335551620&quot;:1}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cul>\n\u003Cli data-leveltext=\"\" data-font=\"Symbol\" data-listid=\"13\" data-list-defn-props=\"{&quot;335552541&quot;:1,&quot;335559685&quot;:720,&quot;335559991&quot;:360,&quot;469769226&quot;:&quot;Symbol&quot;,&quot;469769242&quot;:[8226],&quot;469777803&quot;:&quot;left&quot;,&quot;469777804&quot;:&quot;&quot;,&quot;469777815&quot;:&quot;hybridMultilevel&quot;}\" aria-setsize=\"-1\" data-aria-posinset=\"1\" data-aria-level=\"1\">\u003Cstrong>\u003Cspan data-contrast=\"auto\">Primary: \u003C/span>\u003C/strong>\u003Cspan data-contrast=\"auto\">This system is a standard self-driving system (SDS) that generates a trajectory&mdash;a planned route or course of action for the vehicle. It&rsquo;s essentially the main decision-maker in most situations, outputting the initial suggested path or movement.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/li>\n\u003C/ul>\n\u003Cul>\n\u003Cli data-leveltext=\"\" data-font=\"Symbol\" data-listid=\"13\" data-list-defn-props=\"{&quot;335552541&quot;:1,&quot;335559685&quot;:720,&quot;335559991&quot;:360,&quot;469769226&quot;:&quot;Symbol&quot;,&quot;469769242&quot;:[8226],&quot;469777803&quot;:&quot;left&quot;,&quot;469777804&quot;:&quot;&quot;,&quot;469777815&quot;:&quot;hybridMultilevel&quot;}\" aria-setsize=\"-1\" data-aria-posinset=\"2\" data-aria-level=\"1\">\u003Cstrong>\u003Cspan data-contrast=\"auto\">Fallback: \u003C/span>\u003C/strong>\u003Cspan data-contrast=\"auto\">Like the Primary, the Fallback is another SDS that can generate its own trajectory or alternative route. 
It serves as a backup in case the Primary system encounters an issue or if the Guardian system detects a potential problem.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/li>\n\u003C/ul>\n\u003Cul>\n\u003Cli data-leveltext=\"\" data-font=\"Symbol\" data-listid=\"13\" data-list-defn-props=\"{&quot;335552541&quot;:1,&quot;335559685&quot;:720,&quot;335559991&quot;:360,&quot;469769226&quot;:&quot;Symbol&quot;,&quot;469769242&quot;:[8226],&quot;469777803&quot;:&quot;left&quot;,&quot;469777804&quot;:&quot;&quot;,&quot;469777815&quot;:&quot;hybridMultilevel&quot;}\" aria-setsize=\"-1\" data-aria-posinset=\"3\" data-aria-level=\"1\">\u003Cstrong>\u003Cspan data-contrast=\"auto\">Guardian: \u003C/span>\u003C/strong>\u003Cspan data-contrast=\"auto\">The Guardian acts as a monitoring layer. Instead of producing a route, it evaluates the Primary system&rsquo;s trajectory to ensure it meets certain safety and feasibility standards. If the Guardian detects an issue, it can prompt a switch to the Fallback system to ensure safe navigation.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335551550&quot;:1,&quot;335551620&quot;:1}\">&nbsp;\u003C/span>\u003C/li>\n\u003C/ul>\n\u003Cp>\u003Cspan data-contrast=\"auto\">In a binary scenario of deciding whether to apply the brakes, consider the triple-sensor example: the camera (Primary), the radar (Guardian), and the lidar (Fallback). We rely on a majority vote: if the camera and radar agree with each other, we follow suit; if they disagree, we defer to the lidar, which will inevitably align with one of the other two, ensuring we follow the majority &ndash; thus PGF is equivalent to the 2/3 majority vote.&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">The notion of &ldquo;majority over three sub-systems&rdquo; is well-defined only for binary 
decisions. However, many of the decisions we need to make while driving are not binary. \u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Most notably, the geometry of the lane we are driving in is not a binary decision, and this geometry has profound implications for RSS decisions. We therefore propose a generalization of the majority vote, which we call the Primary-Guardian-Fallback (PGF) fusion system. This fused system follows the Primary or the Fallback system depending on the output of the Guardian system. \u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Ch3>\u003Cspan data-ccp-props=\"{}\">\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/b241684e79737202ff86b19c995c3fb1_1732784889378.jpg\" alt=\"\" width=\"1272\" height=\"825\" />\u003C/span>\u003Cstrong>\u003Cspan data-contrast=\"none\">Transformer efficiency can be increased up to 100 times\u003C/span>\u003C/strong>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"auto\">In the realm of object detection, communication is paramount. The process of inputting images accurately into the tokenization process&mdash;transforming them into understandable and processable data&mdash;needs to be executed as precisely as possible. But taking this to the next level of efficiency, beyond intelligent tokenization, requires a system where tokens can communicate with each other en masse, in an effective and organized way. \u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Bear in mind, the communication process needs to be highly efficient due to the heavy computational demands on a single chip. 
For the chip to perform seamlessly, data flow must be optimized to prevent performance lags that could ultimately slow down communication.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">In his talk, Mobileye CTO Prof. Shai Shalev-Shwartz explains how we tackle this process and boost transformer efficiency by 100 times with the STAT (Sparse Typed Attention) method.\u003C/span> \u003Cspan data-contrast=\"auto\">The STAT method optimizes communication between tokens by organizing them into structured groups. Think of thousands of people trying to talk to each other\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335551550&quot;:3,&quot;335551620&quot;:3,&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003Cspan data-contrast=\"auto\">in a giant stadium.&nbsp; This would lead to chaos. The same concept applies to thousands of tokens that, in the same scenario, would struggle to communicate effectively with each other. \u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Unless, of course, they&rsquo;re divided into specific roles&mdash;such as 'regular tokens' and 'manager tokens'&mdash;to create a more organized communication structure, or in this case, relevant connectivity. This is essentially the idea behind STAT, where introducing structure and parameters improves model efficiency through better typing and organization.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">So how do we achieve this? We establish order by dividing and grouping the tokens with &ldquo;managers&rdquo; or link tokens, allowing regular tokens to communicate independently with the link token (&ldquo;manager&rdquo;). 
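As a back-of-the-envelope sketch of why this grouping shrinks the communication cost: full attention lets every token exchange messages with every other token, while the link-token scheme routes messages through a small pool of managers. The token counts and function names below are illustrative assumptions, not Mobileye's actual STAT configuration.

```python
def full_attention_pairs(n_regular: int, n_link: int) -> int:
    """All-pairs attention: every token attends to every other token."""
    n = n_regular + n_link
    return n * n

def grouped_attention_pairs(n_regular: int, n_link: int) -> int:
    """Link-token scheme: regular tokens exchange messages only with the
    link ("manager") tokens, and link tokens attend among themselves."""
    return 2 * n_regular * n_link + n_link * n_link

# Illustrative counts: many regular tokens sharing a small pool of link tokens.
full = full_attention_pairs(9600, 32)
grouped = grouped_attention_pairs(9600, 32)
print(f"full: {full:,}  grouped: {grouped:,}  reduction: {full / grouped:.0f}x")
```

With these illustrative counts the all-pairs cost shrinks by roughly two orders of magnitude, which is the scale of the 100-fold figure quoted above; the exact factor depends on the chosen group sizes.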
For example, by grouping 300 regular tokens with 32 link tokens, the regular tokens can communicate with the link tokens, and the link tokens can communicate with each other. This structured approach significantly reduces complexity and leads to a 100-fold increase in efficiency.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>\u003Cspan data-contrast=\"auto\">There&rsquo;s a sweet spot in the efficiency-to-flexibility spectrum&nbsp;\u003C/span>\u003C/strong>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Prof. Shai Shalev-Shwartz outlined the balance between efficiency and flexibility when it comes to chip operation. Simply put, if we were to create a super-efficient chip with only one built-in purpose, it would be extremely efficient, but also very limited. \u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">On the other hand, a chip designed to handle many tasks will be flexible, but its performance won't be nearly as efficient. That&rsquo;s the essential trade-off between efficiency and flexibility &ndash; especially when talking about a chip on-board a vehicle. However, the EyeQ&trade;6 High chip hits the sweet spot for automated driving, offering the right mix of both flexibility and efficiency.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">To achieve this, there are a variety of components within the chip, each with varying degrees of flexibility and efficiency. Prof. Shai Shalev-Shwartz mentions five components&mdash;with five distinct architectures&mdash;that range from highly specific and efficient to highly flexible. 
They run from two CPUs&mdash;MPC and MIPS, which are highly flexible&mdash;through two additional accelerators, to the XNN, which is highly efficient and specific; the chip adapts depending on the operation, moving across the spectrum.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">The efficiency of the Mobileye EyeQ6 High in executing demanding AI deep learning tasks is impressive. With a capability of 34 TOPS (tera operations per second), it significantly outperforms its predecessor, the EyeQ5, which offers fewer TOPS. However, simply comparing TOPS figures doesn&rsquo;t tell the whole story.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">The true measure of a chip's effectiveness for automated driving applications lies in its ability to process frames per second across various neural network tasks. For example, the EyeQ6 High can handle over 1,000 frames per second for a pixel labeling NN, compared to just 91 frames per second on the EyeQ5&mdash;showing a more than tenfold increase in efficiency. This improvement comes not just from higher clock speeds but from a specialized architecture, chiefly the XNN, designed for high utilization in specific applications.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-ccp-props=\"{}\">\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/53a8091b10a0f6baa10a8cac1aec0a68_1732785971875.png\" alt=\"\" />\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">In comparison to Nvidia's Orin chip, for example, which can reach 275 TOPS, the raw numbers may suggest it&rsquo;s superior. Yet, when running a standard ResNet-50 network, the difference in frame processing is only a factor of two. 
\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">This highlights that TOPS alone isn&rsquo;t a sufficient measure of effectiveness; context and efficiency are key.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003Cspan data-contrast=\"auto\">Overall, the EyeQ6 High&rsquo;s design focuses on tailored functionality and efficiency, supported by a solid software stack that allocates tasks optimally across its various accelerators. \u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003Cspan data-contrast=\"auto\">In essence, the EyeQ6 High&rsquo;s real strength is its smart design, tailored to handle specific tasks efficiently, proving that raw TOPS numbers alone don&rsquo;t capture true performance.\u003C/span>&nbsp;\u003C/p>\n\u003Cp>\u003Cspan class=\"TextRun SCXW51215864 BCX0\" lang=\"EN-US\" xml:lang=\"EN-US\" data-contrast=\"auto\">\u003Cspan class=\"NormalTextRun SCXW51215864 BCX0\">To sum up, Mobileye&rsquo;s expertise in AI combined with purpose-built hardware optimized for efficiency reflects a clear path toward scalable autonomy. 
As we continue progressing step-by-step, these breakthroughs bring us closer to the vision of a fully autonomous future\u003C/span>\u003Cspan class=\"NormalTextRun SCXW51215864 BCX0\">.\u003C/span>\u003C/span>\u003Cspan class=\"EOP SCXW51215864 BCX0\" data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>","2024-11-27T08:00:00.000Z","Amnon Shashua, Autonomous Driving, AV Safety",{"id":424,"type":5,"url":425,"title":426,"description":427,"primary_tag":428,"author_name":429,"is_hidden":11,"lang":12,"meta_description":427,"image":430,"img_alt":431,"content":432,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":433,"tags":434},279,"the-mobileye-safety-methodology-for-fully-autonomous-driving","The Mobileye safety methodology for fully autonomous driving","Our methodology for safety architecture in autonomous driving builds on foundational principles and industry standards to mitigate different types of risks.",18,"Prof. Shai Shalev-Shwartz and Prof. Amnon Shashua","https://static.mobileye.com/website/us/corporate/images/114f82911350ead2e8d23f09ea3935a9_1732626715488.jpg","Photo credit: VW","\u003Cp>When it comes to autonomous driving, how safe is safe enough?\u003C/p>\n\u003Cp>To date, even with thousands of self-driving vehicles on the road worldwide, and with AV commercialization on the near horizon, this question remains unsettled. One key benchmark so far has been mean time between failures (MTBF) -- basically, can a self-driving system drive better than an average human in terms of frequency of accidents or harm? This metric is easy to grasp and measure, but under close examination its limits become clear. Human statistics are heavily impacted by illegal and careless behavior; a self-driving system can&rsquo;t get drunk or text someone. 
More importantly, humans aren&rsquo;t just measured by accidents, but by avoiding reckless behavior &ndash; taking a duty of care toward themselves and other road users to avoid unreasonable risks. This is why we argue that while a high MTBF is critical, it is not sufficient to demonstrate a safe SDS.&nbsp;\u003C/p>\n\u003Cp>After years of meaningful progress, both Mobileye and the industry at large now have a clearer picture for safety requirements around fully autonomous driving. Today in a new paper, we unveil a framework designed to deploy safe, self-driving systems at scale building upon two key principles:\u003C/p>\n\u003Cul>\n\u003Cli>Overall mean time between failure of the system should be at least as good as human statistics.&nbsp;\u003Cbr />\u003Cbr />\u003C/li>\n\u003Cli>The system should eliminate unreasonable risk, where the self-driving system provider is transparent about the boundary between reasonable and unreasonable risk.&nbsp;\u003C/li>\n\u003C/ul>\n\u003Cp>The first requirement addresses the greater good baseline &ndash; adding self-driving vehicles to the road must not cause more harm than the status-quo of roads with human-driven vehicles, as measured by MTBF. However, this statistical measurement is not sufficient and therefore we complement it with a requirement of eliminating unreasonable risk &ndash; a sort of adjusted MTBF that incorporates the principles of transparency, accountability, and adherence to robust safety standards.\u003C/p>\n\u003Cp>\u003Cspan data-teams=\"true\">The difficult parts are to define the boundary between &ldquo;reasonable&rdquo; and &ldquo;unreasonable&rdquo; risk in a rigorous manner as well as to set up the methodology for eliminating unreasonable risk. 
For the technical details on how we do it please read our \u003Ca href=\"https://static.mobileye.com/website/us/corporate/files/SDS_Safety_Architecture.pdf\" target=\"_blank\" rel=\"noopener\">paper\u003C/a>.\u003C/span>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>","2024-11-26T08:00:00.000Z","From our CEO, Amnon Shashua, Autonomous Driving",{"id":436,"type":24,"url":437,"title":438,"description":439,"primary_tag":28,"author_name":10,"is_hidden":11,"lang":12,"meta_description":439,"image":440,"img_alt":441,"content":442,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":443,"tags":444},278,"mobileye-releases-2023-sustainability-report","Mobileye releases 2023 Sustainability Report","Inaugural report highlights ESG progress and commitment to enhancing safety, efficiency and accessibility in transportation","https://static.mobileye.com/website/us/corporate/images/39b8d459bdbcac10c125983c238ff285_1731587065234.jpg","Mobileye's headquarters in Jerusalem","\u003Cp>JERUSALEM, NOVEMBER 14, 2024 &mdash;Mobileye (Nasdaq: MBLY) has published its inaugural Sustainability Report for 2023, a comprehensive overview of its efforts and commitment to environmental, social and governance (ESG) initiatives. The foundational report marks the beginning of Mobileye&rsquo;s ESG journey, detailing company-wide business practices aligned with the Company&rsquo;s vision of enhancing safety, efficiency and accessibility in transportation. See the report \u003Ca href=\"https://www.mobileye.com/about/esg/\">here\u003C/a>.\u003C/p>\n\u003Cp>&rdquo;I&rsquo;m pleased to share our inaugural Sustainability Report,&rdquo; said Kobi Ohayon, Chief Operating Officer of Mobileye. &ldquo;2023 served as a pivotal period in which Mobileye defined key areas of focus and established initial processes and baseline measures to assess future progress. 
We eagerly anticipate progressing along the path we&rsquo;ve established as an independent company for the benefit of all stakeholders.&rdquo;\u003C/p>\n\u003Cp>Since its founding in 1999, Mobileye&rsquo;s mission has been to save lives and prevent road accidents through technology innovations. Mobileye&rsquo;s growing global workforce, spanning six countries, is unified in its ambition to bring the life-changing benefits of intelligent driving to everyone, everywhere.\u003C/p>\n\u003Cp>Guided by that mission, Mobileye offers a comprehensive portfolio of Advanced Driver Assistance Systems (ADAS) and autonomous vehicle (AV) solutions designed to address road safety challenges while improving driver experiences. Leveraging expertise in computer vision, machine learning, and data analysis, Mobileye aims to improve automotive safety and autonomous mobility, positively impacting everyday lives.\u003C/p>\n\u003Cp>The report provides an in-depth look at Mobileye&rsquo;s focus on quality and product stewardship, privacy and data protection, and the well-being of its people and communities, all while striving towards a more sustainable future. Some highlights include:\u003C/p>\n\u003Cul>\n\u003Cli>\u003Cstrong>LEED Platinum Certification: \u003C/strong>Mobileye&rsquo;s new headquarters in Jerusalem received a LEED Platinum rating from the U.S. Green Building Council (USGBC). 
The campus excelled in Water Efficiency, Innovation and Regional Priorities, underscoring Mobileye&rsquo;s dedication to ambitious sustainability goals and efficient operations.\u003C/li>\n\u003Cli>\u003Cstrong>Democratizing Safety:\u003C/strong> As of December 30, 2023, Mobileye solutions had been installed in hundreds of vehicle models worldwide, with the EyeQ&trade; system-on-chip (SoC) deployed in about 170 million vehicles to date, demonstrating the Company&rsquo;s commitment to democratizing safety and realizing the societal benefits of driver assistance.\u003C/li>\n\u003Cli>\u003Cstrong>Community Contributions:\u003C/strong> In 2023, Mobileye allocated $1.2 million to numerous non-profit organizations and social causes, supporting a diverse range of initiatives, including across education, healthcare, and various welfare causes.\u003C/li>\n\u003Cli>\u003Cstrong>Waste Diversion: \u003C/strong>Mobileye&rsquo;s offices collectively recycled 28 tons of material, including cardboard, paper, electronics, plastic, and batteries, with electronic waste comprising 63% of total materials diverted.\u003C/li>\n\u003C/ul>\n\u003Cp>The company aims to provide an annual comprehensive disclosure of the ongoing initiatives that make Mobileye a progressive corporate citizen.\u003C/p>\n\u003Cp>\u003Cstrong>Media Contact: \u003C/strong>Dan Galves, esg@mobileye.com\u003C/p>\n\u003Cp>&nbsp;\u003C/p>","2024-11-14T08:00:00.000Z","News",{"id":446,"type":24,"url":447,"title":448,"description":449,"primary_tag":190,"author_name":10,"is_hidden":11,"lang":12,"meta_description":449,"image":450,"img_alt":451,"content":452,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":453,"tags":454},277,"lyft-and-mobileye-team-up-to-enable-autonomous-mobility-at-scale","Lyft and Mobileye team up to enable autonomous mobility at scale","Lyft and Mobileye announced plans to facilitate the widespread commercialization of 
autonomous vehicles' services by industry-leading fleet operators.","https://static.mobileye.com/website/us/corporate/images/3b2365766871f66fc014d639f6d51eac_1730848606995.jpg","Lyft and Mobileye ","\u003Cp>JERUSALEM, NOVEMBER 6 &mdash; Lyft, Inc. (Nasdaq: LYFT), one of the largest transportation networks in North America, and Mobileye (Nasdaq: MBLY), a leading provider of self-driving technology and advanced driver assistance systems (ADAS), announced today plans for an alliance that aims to facilitate the widespread commercialization of autonomous vehicles' services by industry-leading fleet operators.\u003C/p>\n\u003Cp>Using the latest advancements in AI for autonomy, Mobileye provides AV technology to an ecosystem of purpose-built vehicle manufacturers, who integrate Mobileye&rsquo;s AV technology in AV-ready vehicles for purchase by vehicle fleet operators and transportation service providers. The alliance of Lyft and Mobileye intends to leverage Lyft&rsquo;s network of 40 million annual riders and exceptional reputation as a leading transportation network in North America to provide a seamless demand platform for Mobileye Drive&trade;-based AV fleets.\u003C/p>\n\u003Cp>The objective is for future AV operators who want to deploy and manage large-scale fleets in various metropolitan areas in North America to purchase Mobileye Drive equipped, &ldquo;Lyft-ready&rdquo; vehicles from vehicle builders, access Lyft&rsquo;s rider demand and optimize utilization and profitability of their fleets.\u003C/p>\n\u003Cp>The companies also plan to utilize Mobileye&rsquo;s new cloud-based AV demand technology, which connects Mobileye Drive-equipped vehicles with AV fleet operators. 
Through Mobileye&rsquo;s turnkey AV ecosystem and Lyft&rsquo;s suite of AV Partner APIs, fleets of AVs are expected to be monetized, giving Lyft users faster and broader availability of AVs.\u003C/p>\n\u003Cp>&ldquo;Mobileye&rsquo;s full-stack technology is an important part of getting autonomous fleets Lyft-ready,&rdquo; said David Risher, CEO at Lyft. &ldquo;As we make more AVs available to our 40 million annual riders, we&rsquo;re laser-focused on building a platform where fleet owners will be proud to put their assets to work. We welcome Mobileye as an important strategic partner on the road to an autonomous future.&rdquo;\u003C/p>\n\u003Cp>&ldquo;Cooperating with leading mobility providers and operators are essential steps to bring autonomous mobility services to reality,&rdquo; said Prof. Amnon Shashua, President and CEO of Mobileye. &ldquo;Enabling Mobileye Drive with Lyft&rsquo;s network of 40 million annual riders in North America would allow our AV customers to reach new markets and geographies with autonomous services and provide the benefits of the technology through a sustainable business.&rdquo;\u003C/p>\n\u003Cp>The planned joint effort marks another milestone in Mobileye&rsquo;s ambition to bring autonomous mobility to passengers across the globe. Working together with transportation network providers, vehicle operators and vehicle manufacturers, Mobileye has provided its AV technology to various mobility-as-a-service projects in Europe and North America. 
Mobileye Drive-equipped vehicles are currently piloted with several mobility operators in Germany, Norway, Croatia and the United States.\u003C/p>","2024-11-06T08:00:00.000Z","Autonomous Driving, Industry, News",{"id":456,"type":5,"url":457,"title":458,"description":459,"primary_tag":140,"author_name":10,"is_hidden":11,"lang":12,"meta_description":459,"image":460,"img_alt":461,"content":462,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":463,"tags":464},275,"mobileyes-camera-vision-beyond-what-you-see","Mobileye’s camera vision: Beyond what you see  ","A look into our ADAS solutions and how cameras help us reach safer roads ","https://static.mobileye.com/website/us/corporate/images/860936e4c2dac83b3ff7e0f6ef066af7_1729509729689.png","Mobileye’s different product platforms can consist of anything between one to 13 cameras. ","\u003Cp>\u003Cspan data-contrast=\"auto\">It has been said that the eyes are the window to the soul, and if that is true, then the camera is the gateway to Mobileye&rsquo;s innovative technology. The basis for everything we do at Mobileye, from our Advanced Driver Assistance System (ADAS) to autonomous vehicles, is rooted in our approach to leverage cost-effective and scalable cameras, sensors and computer vision technology.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003Cspan data-contrast=\"auto\">In this blog we explore how Mobileye leverages that technology for safer roads. 
\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong>\u003Cspan data-contrast=\"auto\">Vision and an autonomous future\u003C/span>\u003C/strong>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">The way cameras are configured around the car, their range for identifying objects, and their integration into Mobileye's platforms are crucial for ushering in a safe and autonomous future. Keeping in mind visual constraints such as lighting or distance, Mobileye's cameras are designed to handle things like lens distortion in order to\u003C/span> \u003Cspan data-contrast=\"auto\">help drivers react effectively to objects on the road.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Mobileye&rsquo;s different product platforms can consist of anything between one and 13 cameras. This range includes various formations, but all share a common goal &ndash; to identify and adjust the drive according to changing conditions on the road.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Powered by our sophisticated algorithms, high-performance hardware, and the powerful brain of the system &ndash; the EyeQ&trade; SoC &ndash; our solutions are designed to compute more data as efficiently as possible. In other words, Mobileye&rsquo;s ADAS platforms are designed to extract and label elements from the constant stream of input coming from the cameras, analyze the vehicle's surroundings and suggest the correct action, based on deep learning techniques to understand the driving environment around the vehicle. 
\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-ccp-props=\"{}\">\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/863a173db5dee3a32376c1a9c77847d5_1729510125253.png\" alt=\"\" />\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong>\u003Cspan data-contrast=\"auto\">Safety built in\u003C/span>\u003C/strong>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">With ADAS safety standardization becoming an increasingly integral part of vehicle production, the future of our roads is moving toward safer driving experiences. These advancements have not gone unnoticed. Numerous reports highlight the significant impact driver-assistance technologies have on reducing road incidents. For instance, a 2023 report by the \u003C/span>\u003Ca href=\"https://newsroom.aaa.com/2023/08/your-autos-safety-net-the-lifesaving-potential-of-driving-assistance-tech/\">\u003Cspan data-contrast=\"none\">AAA Foundation for Traffic Safety\u003C/span>\u003C/a>\u003Cspan data-contrast=\"auto\"> suggests that these technologies could potentially prevent up to 37 million crashes in the United States over the next 30 years&mdash;if they become standard in all vehicles.\u003C/span>\u003Cspan data-contrast=\"auto\">1\u003C/span>\u003Cspan data-contrast=\"auto\"> This underscores the powerful potential of scaled-up ADAS systems and the future of road safety.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong>\u003Cspan data-contrast=\"auto\">A look at the features\u003C/span>\u003C/strong>\u003Cspan data-ccp-props=\"{&quot;335559738&quot;:240,&quot;335559739&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Mobileye&rsquo;s ADAS features utilize a 120-degree 8MP front camera, ensuring robust detection and response to a wide range of objects. 
With it, drivers may benefit from a set of tools that enhance their drive, from Advanced Warning System (AWS), Adaptive Cruise Control (ACC)\u003C/span>\u003Cspan data-contrast=\"auto\">,\u003C/span>\u003Cspan data-contrast=\"auto\"> Automatic Emergency Braking (AEB) and Lane Keep Assist, to Lane Departure Warning, cloud-supported lane centering (where applicable), red traffic light warning and more. The multi-camera configuration allows for a highway hands-off/eyes-on driving experience and features the EyeQ&trade;6H chip connected to six cameras, including two long-range (front and rear) and four short-range, in a manner that anticipates future safety regulations.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Additionally, Mobileye&rsquo;s Intelligent Speed Assist (ISA) solution for automakers was recognized as the first &lsquo;vision-only&rsquo; solution to meet the new European Union (EU) \u003C/span>\u003Ca href=\"https://www.mobileye.com/news/mobileye-launches-the-first-camera-only-intelligent-speed-assist-to-meet-new-eu-standards/\">\u003Cspan data-contrast=\"none\">General Safety Regulation (GSR) standards\u003C/span>\u003Cspan data-contrast=\"none\">.\u003C/span>\u003C/a>\u003Cspan data-contrast=\"auto\">&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/a6287c9dcde9b68184e9d662880cf40e_1729510161477.png\" alt=\"\" width=\"1518\" height=\"984\" />\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong>\u003Cspan data-contrast=\"auto\">Safety at scale\u003C/span>\u003C/strong>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">To sum up, it&rsquo;s evident that vehicles equipped with ADAS features and advanced camera technology have significant potential to enhance road safety, particularly through 
Mobileye&rsquo;s efforts in providing cost-effective and scalable solutions to drivers worldwide. By leveraging cutting-edge camera vision and AI-driven systems, Mobileye is leading the way toward a future where road incidents are drastically reduced, and autonomous driving becomes a reality.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>","2024-10-21T07:00:00.000Z","ADAS, AV Safety",{"id":466,"type":5,"url":467,"title":468,"description":469,"primary_tag":51,"author_name":10,"is_hidden":11,"lang":12,"meta_description":469,"image":470,"img_alt":471,"content":472,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":473,"tags":474},274,"mobileyes-award-winning-approach-to-collaboration","Mobileye’s award-winning approach to collaboration ","Working with OEMs on safety and autonomy is what we do – and the industry is taking notice","https://static.mobileye.com/website/us/corporate/images/c4c36e446896ef331d1b126a6bfb3762_1726741799204.jpg","\"Mobileye can meet carmakers wherever they are on the buy vs. build continuum\"","\u003Cp>Buy or build? \u003Ca href=\"https://www.mobileye.com/blog/buy-vs-build-the-mobileye-way/\" target=\"_blank\" rel=\"noopener\">That is the question!\u003C/a> Automakers contemplate this very question as they consider their path towards equipping advanced driver assistance systems (ADAS) and automated safety applications in their vehicles. Should their company try its hand at developing these features in-house with their brand or group in mind? Or should it bring in others from the industry who focus on specific solutions for automakers?&nbsp;\u003C/p>\n\u003Cp>With over 25 years of experience in the automotive industry, developing hardware and software, Mobileye can meet carmakers wherever they are on the buy vs. build continuum and support their efforts. 
We work with OEMs, providing them with safety platforms and solutions aimed at balancing costs, performance, and time to market without compromising their brand.&nbsp;\u003C/p>\n\u003Cp>We saw proof of Mobileye&rsquo;s deep, established relations with OEMs this past July when \u003Ca href=\"https://www.mobileye.com/news/volkswagen-group-names-mobileye-among-top-suppliers-for-2024/\" target=\"_blank\" rel=\"noopener\">Volkswagen AG honored CEO Prof. Amnon Shashua\u003C/a> on stage in Germany with the prestigious Volkswagen Group Award for 2024. The award is given in ten categories as VW recognizes top suppliers for outstanding partnership-based cooperation, long-standing practiced values, and shared success, with Mobileye winning the award for digitalization.&nbsp;\u003C/p>\n\u003Cp>&ldquo;We are deeply honored to receive this acknowledgement from Volkswagen AG for our work together toward delivering the future of transportation,&rdquo; said Prof. Shashua. &ldquo;It&rsquo;s a testament to the dedication of our teams and the spirit of innovation in road safety, intelligent driving and autonomous vehicles.&rdquo;&nbsp;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/d99dc88bfdccd6e6e8a527873426dd25_1726742055181.jpg\" alt=\"Mobileye accepts Volkswagen Group's 2024 award for supplier excellence in Digitalization category (Photo: Business Wire).\" width=\"600\" height=\"389\" />\u003C/p>\n\u003Ch6>Mobileye accepts Volkswagen Group's 2024 award for supplier excellence in Digitalization category (Photo: Business Wire).\u003C/h6>\n\u003Cp>The Mobileye-VW relationship was also noted by ABI Research. 
&ldquo;Through the adoption of Mobileye&rsquo;s SuperVision and Chauffeur technologies,&rdquo; the market research firm wrote, &ldquo;Volkswagen AG will be able to equip its premium brands with capabilities such as eyes-off operation, autonomous overtaking, and automated stopping at red light and stop lines, catching up with and, in some ways, exceeding the capabilities of its two primary German competitors.&rdquo; &nbsp;\u003C/p>\n\u003Cp>Aside from its collaboration with VW, Mobileye was also named an Enabling Technology Leader in the Global Passenger Vehicle ADAS Industry by U.S. consulting firm Frost &amp; Sullivan. &nbsp;\u003C/p>\n\u003Cp>On the matter of carmakers buying solutions or developing them in-house,\u003Ca href=\"https://www.frost.com/wp-content/uploads/2024/06/Mobileye-Award-Write-Up.pdf\" target=\"_blank\" rel=\"noopener\"> the Frost &amp; Sullivan report reads\u003C/a>, &ldquo;The company [Mobileye] strongly emphasizes product personalization, offering off-the-shelf, fully co-developed, and hybrid solutions.&rdquo; The report adds that &ldquo;Mobileye&rsquo;s commitment to innovation and creativity enables it to develop technologies that address the full spectrum of advanced mobility for passenger and commercial vehicles, including basic ADAS applications (e.g., automatic braking and blind spot monitoring) and cloud-enhanced L2, L2+, and L3 functionalities.&rdquo; &nbsp;\u003C/p>\n\u003Cp>These recent accolades add to an already respectable list of accomplishments Mobileye has accumulated over a quarter of a century. They show Mobileye&rsquo;s commitment to remaining a leader in the industry and serve as an example of its spirit of collaboration with its customers. 
&nbsp;&nbsp;\u003C/p>\n\u003Cp>&nbsp;\u003C/p>","2024-09-19T07:00:00.000Z","Industry, ADAS, Autonomous Driving, Awards",{"id":476,"type":24,"url":477,"title":478,"description":479,"primary_tag":28,"author_name":10,"is_hidden":11,"lang":12,"meta_description":479,"image":480,"img_alt":481,"content":482,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":483,"tags":444},273,"mobileye-to-end-internal-lidar-development","Mobileye to end internal Lidar development","Technology progress makes next-gen FMCW Lidar technology less essential to future products.","https://static.mobileye.com/website/us/corporate/images/a3d1e6fe94fe2a45aa8346519e766001_1725880227796.jpg","Mobileye","\u003Cp>JERUSALEM, SEPTEMBER 9 &mdash;Mobileye (Nasdaq: MBLY) has chosen to end the internal development of next-generation frequency modulated continuous wave (FMCW) Lidars for use in autonomous and highly automated driving systems.\u003C/p>\n\u003Cp>As part of our regular review of the long-term technology roadmap, we now believe that the availability of next-generation FMCW Lidar is less essential to our roadmap for eyes-off systems. This decision was based on a variety of factors, including substantial progress on our EyeQ6-based computer vision perception, increased clarity on the performance of our internally developed imaging radar, and continued better-than-expected cost reductions in third-party time-of-flight Lidar units. &nbsp;&nbsp;\u003C/p>\n\u003Cp>This action does not impact any of our customer product programs or product development in general. It also has no bearing on Mobileye&rsquo;s commitment to development of our in-house imaging radar, which is meeting performance specifications based on B-samples and is expected to enter production next year, on schedule. In terms of Mobileye&rsquo;s internal sensor development, imaging radar is a strategic priority. 
This is a core building-block technology that we expect to drive competitive advantage for Mobileye-based eyes-off systems in cost/performance optimization and scalability.\u003C/p>\n\u003Cp>The Lidar R&amp;D unit will be wound down by the end of 2024, affecting about 100 employees. Operating expenses for the Lidar R&amp;D unit are expected to total approximately $60 million in 2024 (including approximately $5 million related to share-based compensation expenses). While this action is not expected to have a material impact on Mobileye&rsquo;s results in 2024, it will result in the avoidance of Lidar development spending in the future.\u003C/p>\n\u003Cp>\u003Cstrong>Media Contact: \u003C/strong>PR@mobileye.com\u003C/p>","2024-09-09T07:00:00.000Z",{"id":485,"type":5,"url":486,"title":487,"description":488,"primary_tag":190,"author_name":10,"is_hidden":11,"lang":12,"meta_description":488,"image":489,"img_alt":490,"content":491,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":492,"tags":493},272,"buy-vs-build-the-mobileye-way","Buy vs. build: The Mobileye way ","Mobileye provides carmakers with a spectrum of market-ready automated driving solutions","https://static.mobileye.com/website/us/corporate/images/4701e784c8cf57d1cc2661587a5a9832_1723535595334.jpg","\"Mobileye can help chart a course balancing cost, performance, and time to market\"","\u003Cp>Charting the path to a safer future with advanced driving platforms requires experience built on years of industry knowledge and expertise. It's about understanding the dynamics of the mobility market and anticipating the needs of automakers to deliver the right solutions.\u003C/p>\n\u003Cp>The car industry is at an inflection point. 
A recent \u003Ca href=\"https://www.mckinsey.com/industries/automotive-and-assembly/our-insights/navigating-unknowns-auto-insurance-questions-in-a-new-mobility-era\">McKinsey report\u003C/a> noted, regarding new vehicles, that &ldquo;market participants optimistic about the pace of innovation expect that by 2030...some (maybe one in six) will have Level 3+ autonomous-driving capabilities&mdash;such as self-driving without constant human supervision.&rdquo; In another report, discussing the consumer side, \u003Ca href=\"https://www.mckinsey.com/industries/automotive-and-assembly/our-insights/spotlight-on-mobility-trends\">McKinsey also found\u003C/a> that about a third of consumers are interested in automated driver assistance features in their next vehicle.\u003C/p>\n\u003Cp>These trends indicate a push towards autonomous technologies, leaving carmakers to decide whether to develop these platforms in-house or partner with specialized companies. While some automakers are planning small steps forward, others will make giant leaps to deliver better experiences for their customers. Across this spectrum, Mobileye can help chart a course balancing cost, performance, and time to market, while enabling customers to protect and elevate their brand, regardless of where they are on their journey.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cdiv style=\"padding: 56.25% 0 0 0; position: relative;\">\u003Ciframe style=\"position: absolute; top: 0; left: 0; width: 100%; height: 100%;\" title=\"Buy or Build? | Mobileye\" src=\"https://player.vimeo.com/video/998092604?h=3b6de818d1&amp;badge=0&amp;autopause=0&amp;player_id=0&amp;app_id=58479\" frameborder=\"0\" data-mce-fragment=\"1\">\u003C/iframe>\u003C/div>\n\u003Ch6>Mobileye Director of Technical Business Development Naama Symonds discusses \"buy vs. build\"\u003C/h6>\n\u003Cp>For over a quarter of a century, Mobileye has been at the forefront of enhancing road safety through automated driving technologies. 
We invest in some of the industry's most critical technological areas, drawing on years of experience to help us anticipate the needs of automakers and their consumers. Our scalable solutions are designed to balance costs, enhance capabilities, and accelerate time to market, supporting automakers at every step of their journey. As a leader in ADAS (Advanced Driver Assistance Systems) and autonomous mobility, we understand the importance of preserving a carmaker&rsquo;s brand. &nbsp;\u003C/p>\n\u003Cp>For automated driving to become widespread, automakers will need to tailor the driving experience to meet their customers' expectations. One solution Mobileye offers that enables brand customization is the Mobileye Driving Experience Platform (DXP), which gives carmakers the ability to personalize their automated vehicles. As Mobileye CTO Prof. Shai Shalev-Shwartz and Mobileye CEO Prof. Amnon Shashua \u003Ca href=\"https://www.mobileye.com/opinion/mobileye-dxp-as-a-novel-approach/\">explain\u003C/a>, DXP enables automakers to &ldquo;create code that selects the appropriate package during online driving, based on application parameters like locality, road type, regulation, driving mode, and weather conditions.&rdquo; &nbsp;\u003C/p>\n\u003Cp>DXP embodies Mobileye&rsquo;s commitment to safety and scalability. It allows automakers to showcase their brand by introducing the latest safety and comfort features while defining their own custom settings. 
The DXP solution is intended to help automakers keep expenses in check, reduce development effort, and meet deadlines, while giving them the flexibility to introduce the unique aspects that set them apart in the eyes of the consumer.&nbsp;\u003C/p>\n\u003Cp>Regardless of whether an automaker is making a major leap into building new technologies or advancing with smaller steps, Mobileye has the experience and know-how to support them through this transformative decision.&nbsp;\u003C/p>","2024-08-13T07:00:00.000Z","ADAS, Autonomous Driving, AV Safety",{"id":495,"type":24,"url":496,"title":497,"description":498,"primary_tag":28,"author_name":10,"is_hidden":11,"lang":12,"meta_description":498,"image":499,"img_alt":500,"content":501,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":502,"tags":503},271,"zeekr-and-mobileye-to-accelerate-technology-collaboration-in-china","Zeekr and Mobileye to accelerate technology collaboration in China","Companies will leverage successful technology collaboration and expand Mobileye SuperVision™ platform to future Zeekr models globally","https://static.mobileye.com/website/us/corporate/images/a2e8181d7023078c69e57c4e1bf7322d_1722457942434.jpg","Zeekr and Mobileye to expand collaboration in China and globally. Photo: Zeekr","\u003Cp>Hangzhou and Jerusalem, August 1, 2024 &mdash; Building on the success of their collaboration over the last several years, Zeekr and Mobileye have announced their plan to accelerate technology localization in China, integrate Mobileye technologies into the next-generation Zeekr models, and further their state-of-the-art driving safety and automation there and in the global market.\u003C/p>\n\u003Cp>Since the end of 2021, Zeekr has delivered more than 240,000 Zeekr 001 and Zeekr 009 vehicles, powered by the Mobileye SuperVision&trade; platform, to customers in China and globally. 
To be more responsive to the growing customer demands in China, the companies intend to accelerate the scaling and delivery of core underlying technologies for the Mobileye SuperVision platform.\u003C/p>\n\u003Cp>Zeekr will be able to use Mobileye&rsquo;s powerful road intelligence technologies in any of its vehicles. The expanded collaboration will enable Zeekr engineers to better utilize Mobileye&rsquo;s technologies and development tools to validate data and more efficiently deliver software upgrades to customers. Additionally, the collaboration will accelerate the roll-out of comprehensive automated driving solutions for other Mobileye customers in China.\u003C/p>\n\u003Cp>The joint effort will also locally tailor other key Mobileye technologies such as the Driving Experience Platform (DXP), a collaborative tool that enables automakers to customize automated driving styles and customer experience. Moreover, the companies will leverage Zeekr&rsquo;s state-of-the-art vehicle technologies and Mobileye&rsquo;s autonomous driving technology to launch next-generation products for ADAS, automated and driverless vehicles (Level 2+ through Level 4), based on the EyeQ6H system-on-chip, for Zeekr and its related brands in global markets.\u003C/p>\n\u003Cp>Zeekr plans to expand the installation of the SuperVision platform on additional vehicles, including next-generation platforms, and further increase the highway and urban coverage of its existing Navigation Zeekr Pilot (NZP) systems. To date, highway NZP based on SuperVision is active in more than 150 cities across China.\u003C/p>\n\u003Cp>&ldquo;The partnership with Mobileye has provided industry-leading intelligent mobility solutions to Zeekr users in the past few years,&rdquo; said Andy An, CEO of Zeekr. 
&ldquo;We will hold a more opening-up cooperation and strengthen the collaboration with Mobileye in the future, to reach new technological milestones and provide a better driving experience for our users globally.&rdquo;\u003C/p>\n\u003Cp>NZP has been a critical success for Zeekr. Customer feedback has been positive, demonstrating the value of advanced navigate-on-pilot systems to consumers.&nbsp;\u003C/p>\n\u003Cp>&ldquo;This new chapter in the Mobileye and Zeekr relationship will bolster Mobileye&rsquo;s efforts toward the localization of its SuperVision technologies in China while making such localized infrastructure, especially of road intelligence technologies, available for Mobileye&rsquo;s customers in China,&rdquo; said Prof. Amnon Shashua, Mobileye&rsquo;s president and chief executive officer. &ldquo;It also broadens the cooperation between the two companies to include next-generation products on a wide spectrum of product portfolios from Level 2+ through Level 4.&rdquo;\u003C/p>","2024-08-01T07:00:00.000Z","News, Financial, Autonomous Driving, ADAS, Mapping & REM, Industry",{"id":505,"type":5,"url":506,"title":507,"description":508,"primary_tag":190,"author_name":10,"is_hidden":11,"lang":12,"meta_description":508,"image":509,"img_alt":510,"content":511,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":512,"tags":513},268,"mobileyes-imaging-radar-takes-the-wheel","Mobileye’s imaging radar takes the wheel ","Radar technology has been around for years, but Mobileye’s Imaging Radar introduces our own new category.  ","https://static.mobileye.com/website/us/corporate/images/af7696dab443a2b5d1483cef50715beb_1720082783552.jpg","\"Unlike traditional radar systems, Mobileye’s imaging radar has more detailed object detection capabilities\"","\u003Cp>Today, hands-free (but eyes-on) autonomous driving solutions are possible with cameras-only. 
However, as the mobility industry advances towards more and more autonomous capabilities, key challenges such as reliability and accuracy within intricate driving environments remain. Overcoming them pushes the industry to seek appropriate solutions that prioritize safety for passengers and pedestrians, while also ensuring vehicles can assist drivers and eventually operate independently.\u003C/p>\n\u003Cp>To achieve safe autonomous driving capabilities, some in the industry believe that additional sensing modalities beyond cameras are necessary. Mobileye&rsquo;s approach, known as True Redundancy&trade;, relies on Lidars and Mobileye&rsquo;s new category of Imaging Radars to create a sensing state that operates independently from the camera-based sensing state. This allows for different sensor configurations that enhance autonomous capabilities and road safety. In this approach, our radar technology is emerging as a key tool in enabling autonomous features.\u003C/p>\n\u003Cp>On the road, traditional ADAS (Advanced Driver Assistance Systems) radars are effective in simple, less dense environments as they provide sparse information on dynamic objects, but they face challenges in detecting static objects. To overcome these challenges, Mobileye developed its new category of radars that delivers reliable output independently of cameras or lidars. 
This innovation supports the core principles of the \"True Redundancy\" architecture.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Ch6>\u003Ciframe title=\"YouTube video player\" src=\"https://www.youtube.com/embed/b3WSAYguMaY?si=gnhNwvXMLpvj3M-g\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>&nbsp;\u003Cbr />&nbsp;Radar Technology: From Basic ADAS to Reliable Autonomous Driving, VP Radar Yaniv Avital\u003C/h6>\n\u003Ch3>Mobileye&rsquo;s sense&nbsp;\u003C/h3>\n\u003Cp>From its inception, the Mobileye imaging radar was purposefully designed for autonomous driving capabilities to achieve an AV system that outperforms human perception and decision-making on the road. This advanced imaging radar has a higher dynamic range than traditional radars in challenging scenarios, such as the ability to detect a child 150 meters away on a road when a bus is only 10 meters from the vehicle. &nbsp;\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Ch6>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/fb0dae917e686f8e58cf1218bfa02486_1720083684181.jpg\" alt=\"\" width=\"600\" height=\"389\" />\u003Cbr />\"Mobileye developed its new category of radars that delivers reliable output independently of cameras or lidars\"&nbsp;\u003C/h6>\n\u003Cp>Unlike traditional radar systems with limited levels of object detection, Mobileye&rsquo;s imaging radar has more detailed object detection capabilities with heightened elevational resolution. This precise detection enables the system to discern objects in unique scenarios, such as stationary vehicles under a bridge. It generates a rich point cloud that facilitates AV driving capabilities such as exact lane assignment and the ability to react quickly at high speeds. The system can detect road users (pedestrians, motorcycles, and cyclists) at a distance of 315 meters and identify potential hazards up to 230 meters away. 
&nbsp;\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Ch6>\u003Ciframe title=\"YouTube video player\" src=\"https://www.youtube.com/embed/3onQ-ZLadOU?si=pbzXFW6ENHC6v3lh\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>&nbsp;\u003Cbr />Mobileye's Imaging Radar fits perfectly with the VW ID Buzz, VP of Autonomous Vehicles Johann Jungwirth&nbsp;&nbsp;\u003C/h6>\n\u003Ch3>A radar that hits the mark&nbsp;\u003C/h3>\n\u003Cp>The automotive industry is experiencing a profound shift in radar aimed at advancing and enhancing current capabilities. The Mobileye Radar portfolio features high angular resolution, expansive dynamic range, and substantial sidelobe suppression, enabling detection in challenging situations. This ensures reliable detection of small and low-reflective objects on the highway while driving at up to 130 km/h. The development of the Mobileye imaging radar reflects Mobileye&rsquo;s commitment to shaping the future of the automotive industry by consistently innovating and introducing new categories of expertise.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>","2024-07-10T07:00:00.000Z","AV Safety, Autonomous Driving, Video",{"id":515,"type":24,"url":516,"title":517,"description":518,"primary_tag":28,"author_name":10,"is_hidden":11,"lang":12,"meta_description":518,"image":519,"img_alt":520,"content":521,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":522,"tags":523},269,"volkswagen-group-names-mobileye-among-top-suppliers-for-2024","Volkswagen Group names Mobileye among top suppliers for 2024","Mobileye presented Digitalization award by Volkswagen Group at special event in Wolfsburg on July 2. 
","https://static.mobileye.com/website/us/corporate/images/79541d286aa8749c024428a757c98333_1720426754034.jpg","Mobileye accepts Volkswagen Group's 2024 award for supplier excellence in Digitalization category.","\u003Cp>JERUSALEM, July 8 &mdash; For the 20th year, Volkswagen AG has recognized its best suppliers globally, awarding a select group of 10 firms the prestigious Volkswagen Group Award for 2024. Mobileye (Nasdaq: MBLY) was honored to receive the award for Digitalization at a special event in Wolfsburg on July 2.\u003C/p>\n\u003Cp>The awards in 10 categories with 40 nominees in total recognized suppliers for outstanding partnership-based cooperation, long-standing practiced values and shared success. The award for Digitalization was presented by Hauke Stars, Member of the Board of Management of Volkswagen AG responsible for IT, and Stefan K&uuml;hne, Head of Group Procurement, Interior.\u003C/p>\n\u003Cp>With Mobileye, &ldquo;we can offer our customers innovative solutions and comfort features,&rdquo; said K&uuml;hne. &ldquo;Thanks to its extensive product portfolio, we can create great synergies throughout the VW Group, across different brands, with a real benefit for our products and customers.&rdquo;\u003C/p>\n\u003Cp>&ldquo;The past, present and future success of the Volkswagen Group is only possible with strong partners,&rdquo; said Dirk Gro&szlig;e-Loheide, Member of the Extended Executive Committee and Member of the Volkswagen Brand Board of Management responsible for Procurement. &ldquo;The anniversary of the Volkswagen Group Award highlights the importance we place on close, partnership-based cooperation with our suppliers and underscores the esteem that goes with this. The 20th Volkswagen Group Award stands for 20 years of partnership and 20 years of shared success &ndash; a perfect combination. 
As we move forward, we will shape the future with our partners based on this conviction.&rdquo;\u003C/p>\n\u003Cp>&ldquo;We are deeply honored to receive this acknowledgement from Volkswagen AG for our work together toward delivering the future of transportation,&rdquo; said Prof. Amnon Shashua, Mobileye President and CEO. &ldquo;It&rsquo;s a testament to the dedication of our teams and the spirit of innovation in road safety, intelligent driving and autonomous vehicles.&rdquo;\u003C/p>\n\u003Cp>Mobileye&rsquo;s work across the Volkswagen Group ranges from advanced driving assist technologies to developing hands-free and eyes-off driving solutions and autonomous technology across multiple continents.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>","2024-07-08T07:00:00.000Z","News, Awards, Industry",{"id":525,"type":24,"url":526,"title":527,"description":528,"primary_tag":40,"author_name":10,"is_hidden":11,"lang":12,"meta_description":528,"image":529,"img_alt":530,"content":531,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":32,"publish_date":532,"tags":533},267,"verne-unveils-urban-mobility-service-driven-by-mobileye","Verne unveils urban mobility service driven by Mobileye","Event held at Rimac Campus reveals purpose-built car design, autonomous platform powered by Mobileye Drive™, service concept, and functionalities.","https://static.mobileye.com/website/us/corporate/images/b190119bb4ff513af98a9e27a3d9ea6d_1719345826270.jpg","Introducing Verne, a new urban autonomous mobility ecosystem using the Mobileye Drive™ platform.","\u003Cp>\u003Cstrong>\u003Cspan data-contrast=\"none\">Zagreb, Croatia, June 26, 2024\u003C/span>\u003C/strong>\u003Cspan data-contrast=\"none\"> - Verne, a redefined approach to urban autonomous mobility in cities, has been introduced.&nbsp;\u003C/span>\u003Cspan 
data-ccp-props=\"{&quot;134233117&quot;:false,&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335559738&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:276}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"none\">Founded by Mate Rimac and two of his closest colleagues and friends from Rimac Group - Marko Pejković, now CEO of Verne, and Adriano Mudri, the designer of Nevera and Chief Design Officer at Verne.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;134233117&quot;:false,&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335559738&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:276}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"none\">A collaboration with Mobileye, a world leader in autonomy, enables Verne&rsquo;s autonomous capabilities. The vehicle will be fully autonomous, with a system capable of driving in dynamic urban traffic. \u003C/span>\u003Cspan data-ccp-props=\"{&quot;134233117&quot;:false,&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335559738&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:276}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"none\">For the past several years, the company has been cooperating with Mobileye. Mobileye's advanced AD platform Mobileye Drive&trade;, will be integrated into the purpose-built Verne vehicle and together with a sophisticated sensor set of cameras, radar and lidar enable the automated driving capabilities. The platform is designed to be highly flexible and scalable, to meet the demands of autonomous driving in a variety of locations, on different road types, under varying weather conditions and even taking local driving styles into account, within its operational design domains. 
All of this is crucial for Verne&rsquo;s future plans.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;134233117&quot;:false,&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335559738&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:276}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"none\">&ldquo;Around the world, the promise and benefits of autonomous vehicles to road safety and accessibility have started to come into focus,&rdquo; said Johann &ldquo;JJ&rdquo; Jungwirth, Executive Vice President, Autonomous Vehicles at Mobileye.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;134233117&quot;:false,&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335559738&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:276}\">&nbsp;\u003C/span>\u003Cspan data-contrast=\"none\">&ldquo;Verne&rsquo;s innovative new vehicle platform and ecosystem reflect the potential autonomous mobility has to change our expectations of personal transportation. Only with a system like Mobileye Drive&trade;, that&rsquo;s built for scalability would something like this be possible, and we&rsquo;re proud to work with Verne on this program.&rdquo;\u003C/span>\u003Cspan data-ccp-props=\"{&quot;134233117&quot;:false,&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335559738&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:276}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"none\">Named after Jules Verne, the author who imagined humanity's potential through amazing journeys, Verne was formerly known as P3 Mobility. 
The collaboration with Mobileye was first \u003C/span>\u003Ca href=\"https://www.mobileye.com/news/mobileye-and-project-3-mobility-announce-collaboration-for-scalable-urban-autonomous-mobility-project/\">\u003Cspan data-contrast=\"none\">announced\u003C/span>\u003C/a>\u003Cspan data-contrast=\"none\"> in February 2024.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"none\">\u003Cspan class=\"TextRun SCXW233002470 BCX8\" lang=\"EN-GB\" xml:lang=\"EN-GB\" data-contrast=\"none\">\u003Cspan class=\"NormalTextRun SCXW233002470 BCX8\">Visit \u003Ca href=\"https://www.letsverne.com/media/verne-journey-to-the-future-of-mobility\">letsverne.com\u003C/a> to learn more.&nbsp;\u003C/span>\u003C/span>\u003C/span>\u003C/p>","2024-06-26T07:00:00.000Z","Autonomous Driving, Driverless MaaS, News",{"id":535,"type":5,"url":536,"title":537,"description":538,"primary_tag":28,"author_name":10,"is_hidden":11,"lang":12,"meta_description":538,"image":539,"img_alt":540,"content":541,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":542,"tags":543},265,"mobileye-drive-hits-the-road-with-deutsche-bahn","Mobileye Drive™ hits the road with Deutsche Bahn ","Mobileye’s AV platform clears rigorous regulation for public transport street testing in Germany ","https://static.mobileye.com/website/us/corporate/images/aa49509be5d446c6917a5cf97eb4bb85_1719229227061.jpg","Photo: RMV/DB/Landwehr","\u003Cp>In a big push towards an autonomous future, Mobileye's fully autonomous no-driver system (also known as L4) has been \u003Ca href=\"https://www.deutschebahn.com/de/presse/pressestart_zentrales_uebersicht/Pionierprojekt-KIRA-startet-mit-autonomen-Fahrzeugen-fuer-den-OePNV--12926526\" target=\"_blank\" rel=\"noopener\">announced\u003C/a> for in-traffic testing in Germany. 
The Mobileye Drive&trade; platform was introduced by Germany&rsquo;s national rail operator, Deutsche Bahn (DB), aboard six on-demand shuttles (with no passengers at this point) in Darmstadt and Offenbach in the Rhine-Main area. The KIRA project, as it's called, plans to utilize Mobileye Drive to expand on-demand shuttle service and to bring rural areas into the Rhein-Main-Verkehrsverbund (RMV) service area.&nbsp;\u003C/p>\n\u003Cp>This accomplishment speaks volumes about Mobileye&rsquo;s software and hardware capabilities, since the platform had to clear Germany&rsquo;s \u003Ca href=\"https://www.kba.de/EN/Themen_en/Typgenehmigung_en/Autonomes_automatisiertes_Fahren_en/nationale_Betriebserlaubnis_en/nationale_betriebserlaubnis_node_en.html\" target=\"_blank\" rel=\"noopener\">&ldquo;national type approval for motor vehicles with a fully automated driving function,&rdquo;\u003C/a> or AFGBV, a regulatory process that standardizes AV testing and could serve as a blueprint for regulators and institutions beyond Germany. &nbsp;\u003C/p>\n\u003Cp>It is worth noting that Mobileye Drive is designed as a &ldquo;one-size-fits-most&rdquo; platform, meaning its agnostic nature allows it to work in different operational design domains (ODDs) under certain circumstances. Combined with its regulatory success so far, the platform could be attractive for various projects, as demonstrated by this recent progress as well as by the March announcement of \u003Ca href=\"https://www.mobileye.com/news/volkswagen-admt-announces-agreement-with-mobileye-for-autonomous-driving/\" target=\"_blank\" rel=\"noopener\">Mobileye and VW&rsquo;s collaboration on the ID Buzz\u003C/a>, and by the number of European local governments and providers who are eyeing the platform as a way of reshaping mobility in their cities. Parking congestion, pollution, and ill-suited public transportation services are just a few reasons for the growing interest in AV solutions such as Mobileye Drive. 
&nbsp;\u003C/p>\n\u003Ch3>A track record shaping the future&nbsp;\u003C/h3>\n\u003Cp>The DB collaboration is far from the first time Mobileye's solutions have cleared rigorous regulatory requirements, and in some cases the company has helped shape the industry&rsquo;s expectations. Based on extensive research and experience, Mobileye delivers more than its hardware and software know-how; it also offers a record of fulfilling OEMs&rsquo;, regulators&rsquo;, policy makers&rsquo; and end users&rsquo; demands and expectations, doing so in a manner that is safe, cost-effective and scalable.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/033ee1c99dc649bb62512b0a54c2970e_1719214588031.jpg\" alt=\"Photo: &ldquo;RMV/DB/Landwehr&rdquo;\" width=\"600\" height=\"389\" />\u003C/p>\n\u003Ch6>Photo: RMV/DB/Landwehr\u003C/h6>\n\u003Cp>In early 2024 it was announced that Mobileye Drive would begin \u003Ca href=\"https://www.mobileye.com/news/mobileye-and-project-3-mobility-announce-collaboration-for-scalable-urban-autonomous-mobility-project/\" target=\"_blank\" rel=\"noopener\">testing and validation in Zagreb, Croatia\u003C/a> as part of a collaboration with local company Project 3 Mobility. It has also been tested on the streets of \u003Ca href=\"https://www.mobileye.com/news/volkswagen-commercial-vehicles-begins-av-testing-with-mobileye-drive/\" target=\"_blank\" rel=\"noopener\">Munich and Austin\u003C/a> as part of Mobileye's work with the VW group. The company also reinforced its commitment to the highest level of safety regulation at \u003Ca href=\"https://www.mobileye.com/blog/mobileye-at-ncap24-centering-on-safety/\" target=\"_blank\" rel=\"noopener\">the first Global NCAP Forum\u003C/a>. 
So, as the demand from consumers, automakers and regulators for safe mobility solutions constantly grows, Mobileye delivers essential capabilities that drive the automotive industry forward.&nbsp;\u003C/p>\n\u003Ch3>Street level autonomous MaaS &nbsp;\u003C/h3>\n\u003Cp>Autonomous mobility-as-a-service (MaaS) solutions operating on the street, such as on-demand shuttles, have an important role in the future of mobility. Public transport operators and authorities aspire to offer greater access to mobility to more people through cost-efficient services, despite a growing scarcity of drivers. These are pains autonomous MaaS could help alleviate.&nbsp;\u003C/p>\n\u003Cp>Communities often look for innovative ideas that could reinvigorate public mobility services within their borders while also satisfying regulators&rsquo; demands. Mobileye&rsquo;s progress in Germany reflects an ambitious vision, reimagining how people travel, which is turning into a reality. &nbsp;\u003C/p>","2024-06-25T07:00:00.000Z","Driverless MaaS, Autonomous Driving, News, Events",{"id":545,"type":5,"url":546,"title":547,"description":548,"primary_tag":51,"author_name":10,"is_hidden":11,"lang":12,"meta_description":548,"image":549,"img_alt":550,"content":551,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":552,"tags":553},266,"marking-iwied-2024","Marking IWIED 2024:  A glimpse into how Mobileye is enhancing the future","On International Women in Engineering Day we investigate how solving problems and designing advanced tech drive these Mobileye engineers ","https://static.mobileye.com/website/us/corporate/images/4f4fd9a496345c64a585e51de14ff92f_1719230164824.jpg","Left: VP of R&D Production Programs Noa Fleishman, Algorithm Engineer Neta Zimerman-Katz, and Algorithm Developer Merav Shechter-Diamant","\u003Cp>For 25 years, Mobileye has been enhancing the technology that drives the automated future. 
Our mobility solutions are making roads safer and are laying the groundwork for consumer Autonomous Vehicles (AVs) and driverless MaaS (Mobility as a Service) projects. However, with the rapid pace of futuristic advances in our lives, it is easy to overlook the fact that inside the labs and behind the keyboards are real people, working day and night to solve problems and develop the precise rules needed to propel our tech forward.&nbsp;\u003C/p>\n\u003Cp>A closer look at these unsung heroes of our industry reveals variations to the common profile we might think of when we say &ldquo;engineer.&rdquo; \u003Ca href=\"https://swe.org/research/2024/employment/\" target=\"_blank\" rel=\"noopener\">According to the Society of Women Engineers\u003C/a>, the percentage of women employed in engineering in the U.S. has increased, although slowly. In the 1990s, women represented about nine percent of the workforce in architecture and engineering. By 2023, that number had grown to 16.7 percent. There has also been an increase between 2011 and 2021 in the number of women working in \u003Ca href=\"https://ncses.nsf.gov/pubs/nsf23315/report/the-stem-workforce\" target=\"_blank\" rel=\"noopener\">STEM fields from 9.4 million to 12.3 million\u003C/a>; however, they represent only about a third of the overall American STEM workforce. In the UK, as of 2021, \u003Ca href=\"https://www.inwed.org.uk/wp-content/uploads/2023/08/INWED-Impact-Report-2023-Final.pdf\" target=\"_blank\" rel=\"noopener\">women make up only 16.5% of engineers\u003C/a>, and less than \u003Ca href=\"https://swe.org/wp-content/uploads/2022/08/SWE-Fast-Facts-2023_Final.pdf\" target=\"_blank\" rel=\"noopener\">a third of engineers worldwide\u003C/a>. 
&nbsp;\u003C/p>\n\u003Cp>Leading change is never easy, but on this International Women in Engineering Day we can see more women &ndash; like the talented individuals working at Mobileye, defying technological and societal expectations &ndash; enhancing our world through technology.&nbsp;\u003C/p>\n\u003Ch3>&ldquo;I want to be an algorithm engineer&rdquo;&nbsp;\u003C/h3>\n\u003Cp>Algorithm developer Merav Shechter-Diamant, who has been working at Mobileye since 2022, is an example of a new breed of rising talent among engineers in recent years. &ldquo;At a young age I told my dad I wanted to be an algorithm engineer, among other dreams of course,&rdquo; the 25-year-old joked. She started at Mobileye as a student and transitioned to full-time after graduating from Hebrew University. &nbsp;\u003C/p>\n\u003Cp>Merav comes from a religious household where faith and knowledge are deeply valued, with a father who works as a programmer and a mother who teaches physics. &ldquo;From an early age, my dad would give me the problems he gave his team,&rdquo; Merav recalled. &ldquo;I imagined him going to work and solving riddles.&rdquo; When asked if that is what she feels she does at Mobileye, Merav had her own way of explaining. &ldquo;Generally speaking, you could say we solve riddles, but there is much more to it. It is a riddle you work on for weeks &ndash; it is much more challenging.&rdquo;&nbsp;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/1dcea285a866ec37fd06161ef7c1248b_1719229678324.jpg\" alt=\"Algorithm Developer Merav Shechter-Diamant\" width=\"600\" height=\"389\" />\u003C/p>\n\u003Ch6>Algorithm Developer Merav Shechter-Diamant\u003C/h6>\n\u003Cp>Merav is part of the &ldquo;Ground Truth Algorithms 3D - Lidar Based&rdquo; team. 
The code she writes analyzes real-world data and is used for training and validating our platforms. &ldquo;This code goes through mass amounts of data from the road and generates high quality and accurate information that assists in training the vehicle,&rdquo; she said. &ldquo;I work on lidar processing, meaning measuring and modeling the world around you in 3D, with lasers.&rdquo; Her work is crucial to the ability of vehicles equipped with the Mobileye Drive&trade; and Mobileye Chauffeur&trade; platforms to maneuver.&nbsp;\u003C/p>\n\u003Cp>&ldquo;Working at Mobileye, I was challenged from day one; I was expected to figure things out,&rdquo; Merav said of her first impressions. &ldquo;But I was also given the confidence and support to try, and the sense that there are people behind me,&rdquo; she added, stressing her dream to leave a mark on our world through technology. &nbsp;\u003C/p>\n\u003Ch3>&ldquo;I want to be at the forefront of technology&rdquo;&nbsp;\u003C/h3>\n\u003Cp>&ldquo;I saw engineering as taking a complicated problem, and somehow making it work. And if you can make an impact along the way, even better.&rdquo; That is how Neta Zimerman-Katz described what attracted her to follow in her father&rsquo;s footsteps and become an engineer. &nbsp;\u003C/p>\n\u003Cp>Since graduating with a master's in electrical engineering from Ben Gurion University three years ago, Neta has been working at Mobileye as an algorithm engineer. She admitted it took her some time to see results, but now she feels she is making a difference. 
&ldquo;At first they give you a small problem, which is part of a bigger one, and you learn from it and from others and what they do,&rdquo; she explained, &ldquo;but now, seeing my work included in a future design is very exciting, and feels like I have arrived.&rdquo;&nbsp;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/4e20983fa0b265832b355c967ef12536_1719225909729.jpg\" alt=\"Algorithm Engineer Neta Zimerman-Katz\" width=\"600\" height=\"389\" />\u003C/p>\n\u003Ch6>Algorithm Engineer Neta Zimerman-Katz\u003C/h6>\n\u003Cp>Like Merav, Neta also works on the Mobileye lidar system, focusing on signal processing. Her role involves developing algorithms embedded in the sensor to transform collected data from around the vehicle into tangible information on speed and distance.&nbsp;\u003C/p>\n\u003Cp>&ldquo;When I think of what I do and where I work, it is important for me to feel connected to the overall mission. And I see autonomous driving as something that could improve the world; being part of it all provides me with a lot of satisfaction.&rdquo; &nbsp;\u003C/p>\n\u003Cp>On her Mobileye experience, Neta noted she is one of only three women on her team, and while she cannot say she feels marginalized or looked at differently at work, she does recognize the challenges all women face in tech. &ldquo;Before coming here, I was a little worried about working in a predominantly male environment, and I can say that in my team I don&rsquo;t feel out of place. However, I would like to see more women in our industry overall.&rdquo;&nbsp;\u003C/p>\n\u003Cp>When asked about her future goals, the 32-year-old declared she still feels at the initial stages of her journey. &ldquo;I feel like there is still so much to learn; I want to progress and don&rsquo;t want to feel bored,&rdquo; she said. 
&ldquo;I want to be at the forefront of technology that has the potential to take us even further.&rdquo; &nbsp;&nbsp;\u003C/p>\n\u003Ch3>&ldquo;Gender paradigms are meant to be broken&rdquo;&nbsp;\u003C/h3>\n\u003Cp>If Merav and Neta represent engineers at the beginning of their careers, Noa Fleishman, Mobileye&rsquo;s Vice President of R&amp;D Production Programs, serves as an example of the continued influence these careers could have. &nbsp;&nbsp;\u003C/p>\n\u003Cp>She has been with the company since 2007, starting as an algorithm engineer directly solving issues raised by customers. She quickly moved on to become a technical project manager, combining her technological and client-facing capabilities to push Mobileye&rsquo;s products forward, before settling in her current role three years ago, leading a department of about 60 people.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/a8be41b1cecc0cb3c198b66222304893_1719226005156.jpg\" alt=\"Vice President of R&amp;D Production Programs Noa Fleishman\" width=\"600\" height=\"389\" />\u003C/p>\n\u003Ch6>Vice President of R&amp;D Production Programs Noa Fleishman\u003C/h6>\n\u003Cp>&ldquo;Coming up through the company, having the technological background, helped me immensely during my career. I can be a part of the technical discussions on all levels and contribute to reaching the right solution,&rdquo; she said. &ldquo;I know I have some influence on things here, but throughout the years I had this conflict about moving to project management because I do miss writing code and solving analytical challenges,&rdquo; Noa admitted. During her almost 17 years at Mobileye, Noa witnessed the company&rsquo;s growth and transformation, with more offices around the world and the diverse background of its workforce. 
&ldquo;I can say that today Mobileye is less male dominated than it used to be.&rdquo; However, when speaking about her experiences as one of the few women &ndash; or even the only woman &ndash; in the room, Noa admitted she had to learn to trust the skills she already possessed. &ldquo;Before, when I met with clients and entered a room with 20 men, I thought I needed to act like they do. But in recent years, I finally understood that my feminine attributes &ndash; my communication skills, multitasking and patience &ndash; serve me much better.&rdquo; &nbsp;\u003C/p>\n\u003Cp>Between Noa and the directors on her team, 75% of the leadership in her department are women. Noa noted she tries to give more women a chance to interview and would love to see more of them show up. Unfortunately, she sees that very few pursue a career in the field she carved her path in. &ldquo;I recently gave a lecture to young girls at a &lsquo;Cracking the Glass Ceiling&rsquo; event, and I shared with them how I was one of only 20 women in my computer science class of over 200 students. I told that story because I wanted to challenge them to be person number 21 to take that class; I wanted them to know that gender paradigms are meant to be broken.&rdquo;&nbsp;\u003C/p>\n\u003Cp>Like their colleagues, Merav, Neta and Noa thrive on solving problems and seeing matters from all angles. 
Their success drives our ability to make the autonomous future a reality, sooner rather than later, and their ambition and curiosity are the fuel on which this company runs.&nbsp;\u003C/p>","2024-06-24T07:00:00.000Z","Events, Industry, Autonomous Driving",{"id":555,"type":5,"url":556,"title":557,"description":558,"primary_tag":9,"author_name":10,"is_hidden":11,"lang":12,"meta_description":558,"image":559,"img_alt":560,"content":561,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":562,"tags":563},264,"mobileyes-base-adas-a-comprehensive-customizable-approach-to-safety","Mobileye's Base ADAS: A comprehensive, customizable approach to safety","Our feature-rich Base ADAS encompasses a wide range of computer vision technologies and is built on our proven track record as a leader in driver assist.","https://static.mobileye.com/website/us/corporate/images/7186b74b66c5d002d4347124c3e2fe97_1718711790512.jpg","Vehicle detection, a basic part of all our ADAS products, in action.","\u003Cp>Have you noticed that driving is becoming a team effort between driver and machine? The average car has in recent years become smarter, equipped with various features to help keep its occupants safe. 
From lane-keeping assist to detecting vehicles in blind spots, these features, which together form the car&rsquo;s Advanced Driver-Assistance System (ADAS), are quietly enhancing the driving experience and making it more intelligent.\u003C/p>\n\u003Cp>In this third installment of our \u003Ca href=\"https://www.mobileye.com/blog/what-is-advanced-driver-assistance-system-adas/\" target=\"_blank\" rel=\"noopener\">Mobileye 101 series\u003C/a>, we&rsquo;ll focus on our foundational ADAS solution&mdash;Mobileye Base ADAS.\u003C/p>\n\u003Ch3>\u003Cstrong>Mobileye: ADAS pioneer\u003C/strong>\u003C/h3>\n\u003Cp>Mobileye has, for the past quarter of a century, been developing and supplying the computer vision technology that enables many ADAS features. The cost-effective single-camera approach that we pioneered (in contrast to some early driver assistance systems that used multiple cameras or sensors like radar and lidar), combined with our expertise in computer vision, has helped make our systems more affordable. This has significantly contributed to the rise of high-performing ADAS in a wider range of cars on the road today.\u003C/p>\n\u003Ch3>\u003Cstrong>How ADAS saves lives\u003C/strong>\u003C/h3>\n\u003Cp>Automotive safety advocates and regulators around the world have recognized the power of ADAS to save lives and reduce crashes. In a 2023 report, the \u003Ca href=\"https://newsroom.aaa.com/2023/08/your-autos-safety-net-the-lifesaving-potential-of-driving-assistance-tech/\" target=\"_blank\" rel=\"noopener\">AAA Foundation for Traffic Safety\u003C/a> calculated that driver-assistance technologies available today could prevent 37 million crashes, 14 million injuries, and nearly 250,000 deaths in the United States over the next 30 years &ndash; if they are made standard on all vehicles. 
Other research by the \u003Ca href=\"https://www.iihs.org/media/290e24fd-a8ab-4f07-9d92-737b909a4b5e/HvQHjw/Topics/ADVANCED%20DRIVER%20ASSISTANCE/IIHS-HLDI-CA-benefits.pdf\" target=\"_blank\" rel=\"noopener\">Insurance Institute for Highway Safety\u003C/a> found that automatic emergency braking (AEB) reduces front-to-rear crashes by 50%, that lane departure warning reduces injury crashes by 21%, and that blind spot detection reduces lane change crashes involving injuries by 23%. &nbsp;\u003C/p>\n\u003Ch3>\u003Cstrong>Features of Mobileye Base ADAS\u003C/strong>\u003C/h3>\n\u003Cp>Consumers now expect their vehicles to have various levels of ADAS features for safety and comfort, and Mobileye Base ADAS provides a broad spectrum of computer vision technologies for OEMs to integrate a wide variety of features into their Base-ADAS driving platforms.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/1a9f83429ad1036fa2c2afec6456b17d_1718711848390.png\" alt=\"\" width=\"650\" height=\"422\" />\u003C/p>\n\u003Cp>\u003Cspan style=\"font-size: 10pt;\">Display of a Mobileye ADAS-equipped vehicle, showing a 3D view of the road with vehicles, pedestrians, and road markings.\u003C/span>\u003C/p>\n\u003Cp>Features of Mobileye Base ADAS include \u003Cstrong>Forward Collision Warning\u003C/strong>, one of the first driver assistance features widely adopted by carmakers, which monitors the area in front of the vehicle and warns the driver of an imminent collision, and \u003Cstrong>Blind Spot Detection\u003C/strong>, which alerts the driver to any obstacles in the hardest-to-see sections of the driver's field of view.&nbsp;\u003C/p>\n\u003Cp>But Mobileye's Base ADAS goes beyond mere warnings and, depending on its implementation into a given vehicle, can also intervene when necessary, like automatically applying the brakes (\u003Cstrong>Automatic Emergency Braking\u003C/strong>) or nudging the vehicle back into its lane to help avoid a collision 
(\u003Cstrong>Lane Keeping Assist\u003C/strong>).\u003C/p>\n\u003Cp>Other active features include \u003Cstrong>Adaptive Cruise Control\u003C/strong>, which maintains a safe distance between the vehicle and the vehicle in front by adjusting speed, even on busy highways, and \u003Cstrong>Highway Assist\u003C/strong>, which combines such features as Adaptive Cruise Control and Lane Centering to partially automate cruising on interurban highways. \u003Cstrong>\u003Ca href=\"https://www.mobileye.com/blog/intelligent-speed-assist-isa-computer-vision-adas-solution/\" target=\"_blank\" rel=\"noopener\">Intelligent Speed Assist\u003C/a> \u003C/strong>helps drivers stay within the legal speed limit by either passively alerting the driver or actively intervening to reduce the vehicle's speed.\u003C/p>\n\u003Ch3>\u003Cstrong>Built on a wide range of product technologies\u003C/strong>\u003C/h3>\n\u003Cp>Mobileye&rsquo;s Base ADAS is built on a foundation of over a dozen computer vision product families. It merges decades of experience as a leader in computer vision and automotive-grade processor design with our expertise in custom-integrated software that optimizes performance and energy efficiency.\u003C/p>\n\u003Cp>Product families cover the domains that have made Mobileye a leader in ADAS, including our pioneering approaches to vehicle, pedestrian, and general object detection, as well as advanced traffic sign recognition (TSR) and traffic light recognition (TFL), among many other advanced perception strategies. 
They provide critical activation instructions during vehicle operation, optimizing safety.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/c16a659fce25aacea504349581e5e21d_1718711873770.png\" alt=\"\" width=\"650\" height=\"422\" />\u003C/p>\n\u003Cp>\u003Cspan style=\"font-size: 10pt;\">Object detection of pedestrians as visualized by Mobileye&rsquo;s computer vision software.\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>A spectrum of driving solutions\u003C/strong>\u003C/h3>\n\u003Cp>Mobileye isn't just about one-size-fits-all solutions. We offer a \u003Ca href=\"https://www.youtube.com/watch?v=ViGL0z1BULs\" target=\"_blank\" rel=\"noopener\">comprehensive spectrum of driver-assistance features\u003C/a>, with Base ADAS as the foundation. This means that OEMs get a custom-tailored solution to their needs, and ultimately paves the way for their seamless expansion to implement &nbsp;autonomous solutions.\u003C/p>\n\u003Cp>The next time you experience the smooth convenience of driver-assistance features in your car, remember &ndash; it may be Mobileye's Base ADAS quietly working away behind the scenes.\u003C/p>","2024-06-17T07:00:00.000Z","ADAS",{"id":565,"type":5,"url":566,"title":567,"description":568,"primary_tag":190,"author_name":10,"is_hidden":11,"lang":12,"meta_description":568,"image":569,"img_alt":570,"content":571,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":572,"tags":573},263,"maas-for-av-day","Delving into MaaS for AV day  ","On National Autonomous Vehicle Day, we focus on how Mobileye shapes the way we move ","https://static.mobileye.com/website/us/corporate/images/7ba4fe7b30605759e4dd694b95d8b6b3_1717068889627.jpg","\"It is perhaps time to let go of the steering wheel and put our trust in the autonomous mobility vision\"","\u003Cp>\u003Cspan class=\"NormalTextRun SCXW143663402 BCX0\">People love to travel, to explore, to 
\u003C/span>discover. We often see ourselves navigating towards the future, setting course, and arriving at our destination. But as technology advances and changes our lives, it is perhaps time to let go of the steering wheel and put our trust in the autonomous mobility vision.\u003C/p>\n\u003Ch3>\u003Cstrong>It&rsquo;s MaaS and autonomous\u003C/strong>\u003C/h3>\n\u003Cp>Autonomous systems are transforming our commute, our access to mobility, and our experiences in public spaces. However, when addressing the future of mobility around our cities and in our streets, we must distinguish it from the services we know today that are referred to as MaaS &ndash; from public transport services to on-demand ride or delivery apps &ndash; which arrive with a person at the wheel.\u003C/p>\n\u003Cp>Mobileye is not recreating mass transit or taxis; rather, it is working on making them autonomous. We bring innovative know-how and a proven track record to make those existing services driverless &ndash; and also safer, more cost-effective, and scalable within the existing infrastructure.&nbsp;\u003C/p>\n\u003Cp>Mobileye helps solve mobility problems not by reinventing the (steering) wheel, but by relying on tested and proven mobility concepts, such as on-demand shuttles, and by working together with mobility experts of all stripes to drive its vision forward. We have established a central position that allows us to work with OEMs, transport operators, and public officials, responding to their needs,\u003Cspan class=\"TextRun SCXW31538426 BCX0\" lang=\"EN-US\" xml:lang=\"EN-US\" data-contrast=\"auto\">\u003Cspan class=\"NormalTextRun SCXW31538426 BCX0\"> while \u003C/span>\u003Cspan class=\"NormalTextRun SCXW31538426 BCX0\">never forgetting\u003C/span> \u003Cspan class=\"NormalTextRun SCXW31538426 BCX0\">the end users who will \u003C/span>\u003Cspan
class=\"NormalTextRun SCXW31538426 BCX0\">benefit\u003C/span>\u003Cspan class=\"NormalTextRun SCXW31538426 BCX0\"> from it al\u003C/span>\u003Cspan class=\"NormalTextRun SCXW31538426 BCX0\">l\u003C/span>\u003Cspan class=\"NormalTextRun SCXW31538426 BCX0\">.\u003C/span>\u003C/span>&nbsp;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/cce12f287c437a3e728459b62909cc59_1717069139023.png\" alt=\"\" width=\"600\" height=\"389\" />\u003C/p>\n\u003Ch3>\u003Cstrong>\u003Cspan class=\"TextRun SCXW174261089 BCX0\" lang=\"EN-US\" xml:lang=\"EN-US\" data-contrast=\"auto\">\u003Cspan class=\"NormalTextRun SCXW174261089 BCX0\">Driving solutions\u003C/span>\u003C/span>\u003Cspan class=\"EOP SCXW174261089 BCX0\" data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/strong>\u003C/h3>\n\u003Cp>\u003Cspan class=\"TextRun SCXW206062514 BCX0\" lang=\"EN-US\" xml:lang=\"EN-US\" data-contrast=\"auto\">\u003Cspan class=\"NormalTextRun SCXW206062514 BCX0\">The introduction\u003C/span>\u003C/span>\u003Cspan class=\"TrackChangeTextInsertion TrackedChange SCXW206062514 BCX0\">\u003Cspan class=\"TextRun SCXW206062514 BCX0\" lang=\"EN-US\" xml:lang=\"EN-US\" data-contrast=\"auto\">\u003Cspan class=\"NormalTextRun SCXW206062514 BCX0\">,\u003C/span>\u003C/span>\u003C/span>\u003Cspan class=\"TextRun SCXW206062514 BCX0\" lang=\"EN-US\" xml:lang=\"EN-US\" data-contrast=\"auto\">\u003Cspan class=\"NormalTextRun SCXW206062514 BCX0\"> and growing success\u003C/span>\u003C/span>\u003Cspan class=\"TrackChangeTextInsertion TrackedChange SCXW206062514 BCX0\">\u003Cspan class=\"TextRun SCXW206062514 BCX0\" lang=\"EN-US\" xml:lang=\"EN-US\" data-contrast=\"auto\">\u003Cspan class=\"NormalTextRun SCXW206062514 BCX0\">,\u003C/span>\u003C/span>\u003C/span>\u003Cspan class=\"TextRun SCXW206062514 BCX0\" lang=\"EN-US\" xml:lang=\"EN-US\" data-contrast=\"auto\">\u003Cspan class=\"NormalTextRun SCXW206062514 BCX0\"> of Mobileye Drive&trade;, \u003C/span>\u003Cspan 
class=\"NormalTextRun CommentStart SCXW206062514 BCX0\">Mobileye&rsquo;s fully autonomous system\u003C/span>\u003Cspan class=\"NormalTextRun SCXW206062514 BCX0\"> for mass \u003C/span>\u003Cspan class=\"NormalTextRun SCXW206062514 BCX0\">transit\u003C/span>\u003Cspan class=\"NormalTextRun SCXW206062514 BCX0\">, is a clear example of \u003C/span>\u003Cspan class=\"NormalTextRun SCXW206062514 BCX0\">Mobileye&rsquo;s \u003C/span>\u003Cspan class=\"NormalTextRun SCXW206062514 BCX0\">potential to reshape mobility. \u003C/span>\u003Cspan class=\"NormalTextRun SCXW206062514 BCX0\">Like\u003C/span>\u003Cspan class=\"NormalTextRun SCXW206062514 BCX0\"> our\u003C/span> \u003Cspan class=\"NormalTextRun SCXW206062514 BCX0\">other \u003C/span>\u003Cspan class=\"NormalTextRun SCXW206062514 BCX0\">platforms and technologies\u003C/span>\u003Cspan class=\"NormalTextRun SCXW206062514 BCX0\">, Drive has \u003C/span>\u003Cspan class=\"NormalTextRun SCXW206062514 BCX0\">satisfied \u003C/span>\u003Cspan class=\"NormalTextRun SCXW206062514 BCX0\">rigorous regulations and testing, and\u003C/span>\u003Cspan class=\"NormalTextRun SCXW206062514 BCX0\">,\u003C/span>\u003Cspan class=\"NormalTextRun SCXW206062514 BCX0\"> in some cases, \u003C/span>\u003Cspan class=\"NormalTextRun SCXW206062514 BCX0\">has\u003C/span>\u003Cspan class=\"NormalTextRun SCXW206062514 BCX0\"> even helped raise the bar \u003C/span>\u003Cspan class=\"NormalTextRun SCXW206062514 BCX0\">of\u003C/span> \u003Cspan class=\"NormalTextRun SCXW206062514 BCX0\">industry expectations, \u003C/span>\u003Cspan class=\"NormalTextRun SCXW206062514 BCX0\">thanks to \u003C/span>\u003Cspan class=\"NormalTextRun SCXW206062514 BCX0\">our\u003C/span>\u003Cspan class=\"NormalTextRun SCXW206062514 BCX0\"> extensive knowledge.\u003C/span>\u003C/span>&nbsp;&nbsp;\u003C/p>\n\u003Cp>The platform is agnostic and its one-size-fits-most approach empowers its capabilities to operate in different locations, with different vehicles, and help solve 
distinct issues many cities and communities face. In early 2024 it was announced that Mobileye Drive would begin \u003Ca href=\"https://www.mobileye.com/news/mobileye-and-project-3-mobility-announce-collaboration-for-scalable-urban-autonomous-mobility-project/\" target=\"_blank\" rel=\"noopener\">testing and validation in Zagreb, Croatia\u003C/a> as part of our collaboration with local company Project 3 Mobility. It is also being tested on the streets of \u003Ca href=\"https://www.mobileye.com/news/volkswagen-commercial-vehicles-begins-av-testing-with-mobileye-drive/\">Munich and Austin, Texas\u003C/a> as part of Mobileye's work with the VW group, and in \u003Ca href=\"https://www.linkedin.com/posts/mobileye_norway-oslo-autonomousdriving-activity-7016464919113940992-XGT5?utm_source=share&amp;utm_medium=member_desktop\" target=\"_blank\" rel=\"noopener\">Oslo in a pilot with Ruter and Holo\u003C/a>.&nbsp;\u003C/p>\n\u003Ch3>\u003Cstrong>\u003Cspan class=\"TextRun SCXW183597323 BCX0\" lang=\"EN-US\" xml:lang=\"EN-US\" data-contrast=\"auto\">\u003Cspan class=\"NormalTextRun SCXW183597323 BCX0\">The AV impact&nbsp;\u003C/span>\u003C/span>\u003Cspan class=\"EOP SCXW183597323 BCX0\" data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/strong>\u003C/h3>\n\u003Cp>\u003Cspan class=\"TextRun SCXW35378623 BCX0\" lang=\"EN-US\" xml:lang=\"EN-US\" data-contrast=\"auto\">\u003Cspan class=\"NormalTextRun CommentStart SCXW35378623 BCX0\">Cities all around the world are similar in many ways\u003C/span>\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\">;\u003C/span>\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\"> they house a big population, have large-scale infrastructure system\u003C/span>\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\">s\u003C/span>\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\">, \u003C/span>\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\">a \u003C/span>\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\">public\u003C/span>\u003Cspan 
class=\"NormalTextRun SCXW35378623 BCX0\"> education\u003C/span>\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\"> system\u003C/span>\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\">, and space\u003C/span>\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\">s\u003C/span> \u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\">for\u003C/span>\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\"> culture\u003C/span> \u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\">to\u003C/span>\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\"> prosper\u003C/span>\u003C/span>\u003Cspan class=\"TrackChangeTextInsertion TrackedChange SCXW35378623 BCX0\">\u003Cspan class=\"TextRun SCXW35378623 BCX0\" lang=\"EN-US\" xml:lang=\"EN-US\" data-contrast=\"auto\">\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\">.\u003C/span>\u003C/span>\u003C/span>\u003Cspan class=\"TextRun SCXW35378623 BCX0\" lang=\"EN-US\" xml:lang=\"EN-US\" data-contrast=\"auto\">\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\"> However, \u003C/span>\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\">as\u003C/span>\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\"> the cities \u003C/span>\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\">grew in the past century, \u003C/span>\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\">their residents\u003C/span>\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\">&rsquo; \u003C/span>\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\">reliance on \u003C/span>\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\">cars\u003C/span> \u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\">grew as well,\u003C/span> \u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\">leading\u003C/span>\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\"> to increased congestion, pollution, and reduced safety. 
\u003C/span>\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\">But every city is also unique, \u003C/span>\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\">and every metropolitan, \u003C/span>\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\">town\u003C/span>\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\"> or village has\u003C/span>\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\"> specific \u003C/span>\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\">characteristics and needs that \u003C/span>\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\">distinguish \u003C/span>\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\">it\u003C/span>\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\">.\u003C/span> \u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\">I\u003C/span>\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\">n dealing with those \u003C/span>\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\">challenges\u003C/span>\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\">, \u003C/span>\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\">mobility \u003C/span>\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\">must \u003C/span>\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\">evolve,\u003C/span>\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\"> or\u003C/span>\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\"> it\u003C/span>\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\"> will\u003C/span> \u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\">become\u003C/span>\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\"> an \u003C/span>\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\">obstacle.\u003C/span> \u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\">By\u003C/span> \u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\">offering autonomous solutions to \u003C/span>\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\">MaaS\u003C/span>\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\"> infrastructure, \u003C/span>\u003Cspan 
class=\"NormalTextRun SCXW35378623 BCX0\">we can reinvent the way\u003C/span>\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\">s\u003C/span>\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\"> we \u003C/span>\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\">go anywhere and \u003C/span>\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\">everywhere\u003C/span>\u003Cspan class=\"NormalTextRun SCXW35378623 BCX0\">.\u003C/span>\u003C/span>\u003Cspan class=\"EOP SCXW35378623 BCX0\" data-ccp-props=\"{&quot;134233117&quot;:false,&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:1,&quot;335551620&quot;:1,&quot;335559685&quot;:0,&quot;335559737&quot;:0,&quot;335559738&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:279}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>","2024-05-31T07:00:00.000Z","Events, AV Safety, Industry, Autonomous Driving",{"id":575,"type":5,"url":576,"title":577,"description":578,"primary_tag":190,"author_name":10,"is_hidden":11,"lang":12,"meta_description":578,"image":579,"img_alt":580,"content":581,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":582,"tags":583},262,"when-we-say-consumer-av","What do we mean when we say consumer AV?","Consumer autonomous vehicles are more than just cars that drive themselves; they offer a set of capabilities you might not even realize are on the road.","https://static.mobileye.com/website/us/corporate/images/0b9fbe78808b5e1596844c568893f989_1716204022667.png","Mobileye Chauffeur™ represents an even further step towards full autonomous driving as a hands-off/eyes-off system.","\u003Cp>When talking about autonomous cars many of us envision something we might see on the big screen or on TV. In these scenes, our hero is sitting in a car preoccupied and not paying attention to the road or napping, while the car drives itself. 
The car itself would be driving, often at high speed and sometimes suspended in the air; it might be missing its steering wheel and could even be arguing with its passenger.&nbsp;\u003C/p>\n\u003Cp>The sci-fi representation of AVs offers our collective imagination a fantastic future and serves as an effective storytelling device, but it is also a rather simplistic approach to a complex and fascinating technology. These on-screen vehicles are missing a core value of autonomous driving solutions: the car does not work alone or counter to the person in it; it is a joint effort by both human and machine.&nbsp;\u003C/p>\n\u003Ch3>Human vs. machine: who&rsquo;s the boss?&nbsp;\u003C/h3>\n\u003Cp>Autonomous driving is a spectrum with various levels. The Society of Automotive Engineers (SAE) numbered them from 0 (no autonomy) to 5 (full autonomy, under all conditions), but since Prof. Amnon Shashua introduced a new taxonomy, it is easier for many to understand this concept with a simple observation&mdash;who supervises whom?&nbsp;\u003C/p>\n\u003Cp>It is important to note that automated driving capabilities already exist, making our roads safer. Our cars prevent us from drifting between lanes, provide braking in case of an emergency, and remind us to slow down. Our hands are on the steering wheel and our eyes are on the road, but in a way, the car supervises us while we drive and acts when needed. Mobileye ADAS and Cloud-Enhanced Driver-Assist&trade; are such solutions, allowing the car to affect our driving passively or actively.&nbsp;\u003C/p>\n\u003Cp>The final feature in the ADAS category and the next level of automated driving capabilities reverses the roles between the driver and the car. It is called Mobileye SuperVision&trade; and it&rsquo;s a hands-off/eyes-on platform, meaning the driver is watching the road and supervising the car as it drives and maneuvers within a specified Operational Design Domain (ODD). 
Utilizing 11 cameras and proprietary Mobileye technology, the platform allows the driver to sit behind the wheel and engage only when needed, watching the road and monitoring the drive within the ODD. Some car makers already offer services built on this platform for highway hands-off/eyes-on driving, with urban driving soon to follow. Most recently, the \u003Ca href=\"https://www.mobileye.com/news/automated-driving-volkswagen-group-intensifies-collaboration-with-mobileye/\" target=\"_blank\" rel=\"noopener\">Volkswagen Group announced\u003C/a> it is intensifying its collaboration with Mobileye, as the two companies are set to bring new automated driving functions built on SuperVision to series production.&nbsp;\u003C/p>\n\u003Ch3>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/0b9fbe78808b5e1596844c568893f989_1716204022668.png\" alt=\"\" width=\"1000\" height=\"388\" />\u003C/h3>\n\u003Ch3>Synergy that brings back time\u003C/h3>\n\u003Cp>In addition to Mobileye SuperVision, Mobileye Chauffeur&trade; has also been named by VW Group for series production. Chauffeur represents a further step towards fully autonomous driving as a hands-off/eyes-off system. 
The Chauffeur platform allows the driver and the car to reach a much higher level of synergy and trust. On certain roads within the ODD (like highways and on/off ramps), drivers can take their eyes off the road and direct their attention elsewhere, gaining back time, while remaining ready to step in if the car signals that it needs their attention.&nbsp;\u003Cbr />\u003Ciframe src=\"https://player.vimeo.com/video/935584586?h=960d25a662\" width=\"640\" height=\"360\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\" data-mce-fragment=\"1\">\u003C/iframe>\u003C/p>\n\u003Ch6>\u003Cbr />\"We wanted to come up with something that consumers could understand... so the first question is who is driving?\" Prof. Shai Shalev-Shwartz explains human-machine AV synergy at Bloomberg AI\u003C/h6>\n\u003Cp>Mobileye Chauffeur builds on the different technologies mentioned above. It is a consumer AV system that is cost-effective, operational in various environments and areas, suitable for different car models, and ready for market. It offers automakers the option to preserve their unique brand and reputation by customizing certain aspects of the autonomous driving experience and maintaining their style; and thanks to Mobileye&rsquo;s mapping technology, REM&trade;, these features are not restricted by geography. Additionally, Mobileye&rsquo;s True Redundancy&trade; approach is powered by two independent, standalone systems that complement one another: a camera system and a radar-lidar system. Combining these elements creates a higher level of safety for an autonomous driving experience that mimics the driving style of other road users.&nbsp;\u003C/p>\n\u003Cp>This new level of synergy gives autonomy to drivers (over their time) and cars (over the driving). 
It shows autonomous driving is not just a futuristic vehicle that can drive by itself on the silver screen, but rather a spectrum of capabilities that work best when the car and the driver assist each other and are in sync while driving, overseeing each other and keeping the journey safe.&nbsp;\u003C/p>\n\u003Cp>&nbsp;\u003C/p>","2024-05-20T07:00:00.000Z","Autonomous Driving, AV Safety",{"id":585,"type":5,"url":586,"title":587,"description":588,"primary_tag":32,"author_name":589,"is_hidden":11,"lang":12,"meta_description":588,"image":590,"img_alt":588,"content":591,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":592,"tags":593},261,"autonomous-decisions-the-bias-variance-tradeoff-in-self-driving-technology","Autonomous decisions: The bias-variance tradeoff in self-driving technology","Monolithic versus compound AI systems in LLMs and autonomous driving.","Prof. Amnon Shashua and Prof. Shai Shalev-Shwartz","https://static.mobileye.com/website/us/corporate/images/f8167f28d4d800bd54b3e04c9fda5a5d_1715717822589.jpg","\u003Cp>Back in November 2022, the release of ChatGPT garnered widespread attention, not only for its versatility but also for its end-to-end design. This design involved a single foundational component, the GPT 3.5 large language model, which was enhanced through both supervised learning and reinforcement learning from human feedback to support conversational tasks. This holistic approach to AI was highlighted again with the launch of Tesla&rsquo;s latest FSD system, described as an end-to-end neural network that processes visual data directly &ldquo;from photons to driving control decisions,\" without intermediary steps or &ldquo;glue code.\"&nbsp;\u003C/p>\n\u003Cp>While AI models continue to evolve, the latest generation of ChatGPT has moved away from the monolithic E2E approach. 
Consider the following example of a conversation with ChatGPT:\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/chatgpt_blog2.png\" alt=\"\" width=\"746\" height=\"448\" />\u003C/p>\n\u003Cp>When asked to compute \"what is 3456 * 3678?,\" the system first translates the question into a short Python script to perform the calculation, and then formats the output of the script into coherent natural language text. This demonstrates that ChatGPT does not rely on a single, unified process. Instead, it integrates multiple subsystems&mdash;including a robust deep learning model (GPT LLM) and separately coded modules. Each subsystem has its defined role, interfaces, and development strategies, all engineered by humans. Additionally, 'glue code' is employed to facilitate communication between these subsystems. This architecture is referred to as &ldquo;\u003Ca href=\"https://bair.berkeley.edu/blog/2024/02/18/compound-ai-systems/\" target=\"_blank\" rel=\"noopener\">Compound AI Systems\u003C/a>\u003Ca href=\"../../blog/autonomous-decisions-the-bias-variance-tradeoff-in-self-driving-technology/#_edn1\">\u003Csup>1\u003C/sup>\u003C/a>&rdquo; (CAIS).\u003C/p>\n\u003Cp>Before we proceed, it is crucial to dispel misconceptions about which system architecture is \"new\" or \"traditional\". Despite the hype, the E2E approach in autonomous driving is not a novel concept; it dates back to \u003Ca href=\"https://proceedings.neurips.cc/paper/1988/file/812b4ba287f5ee0bc9d43bbf5bbe87fb-Paper.pdf\" target=\"_blank\" rel=\"noopener\">the ALVINN project (Pomerleau, 1989)\u003C/a>\u003Ca href=\"../../blog/autonomous-decisions-the-bias-variance-tradeoff-in-self-driving-technology/#_edn2\">\u003Csup>2\u003C/sup>\u003C/a>. 
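The division of labor in the ChatGPT example above (prompt in, a short generated script, a formatted answer out) can be sketched in a few lines of Python. Everything here is an illustrative stand-in, not ChatGPT's actual machinery: the regex "router" plays the part of the LLM deciding to use a tool, `exec` plays the part of the code runner, and the final f-string is the glue code.

```python
# Minimal sketch of the compound (CAIS) pattern: one subsystem decides whether
# a prompt needs exact arithmetic, a second runs generated code, and glue code
# formats the numeric result back into natural language. Names are hypothetical.
import re

def generate_tool_code(prompt):
    """Stand-in for the LLM step: turn an arithmetic question into code."""
    m = re.search(r"(\d+)\s*\*\s*(\d+)", prompt)
    if m:
        return f"result = {m.group(1)} * {m.group(2)}"
    return None  # no tool needed; the model would answer directly

def run_compound(prompt):
    code = generate_tool_code(prompt)
    if code is None:
        return "(answered directly by the language model)"
    scope = {}
    exec(code, scope)  # separately engineered subsystem: the code runner
    # glue code: format the tool output back into natural language
    return f"The product is {scope['result']:,}."

print(run_compound("what is 3456 * 3678?"))  # -> The product is 12,711,168.
```

The point of the sketch is structural: the exact answer comes from a deterministic subsystem, not from the language model's weights, which is precisely what distinguishes a CAIS from a monolithic E2E model.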
The CAIS approach is also not new, but as we have shown above, it has been adopted by the most recent versions of ChatGPT.\u003C/p>\n\u003Cp>\u003Cem>This blog aims to explore the nuances between E2E systems and CAIS, by drawing a deep connection to the \u003C/em>\u003Ca href=\"https://en.wikipedia.org/wiki/Bias&ndash;variance_tradeoff\" target=\"_blank\" rel=\"noopener\">\u003Cem>bias-variance tradeoff\u003C/em>\u003C/a>\u003Ca href=\"../../blog/autonomous-decisions-the-bias-variance-tradeoff-in-self-driving-technology/#_edn3\">\u003Cem>\u003Csup>3\u003C/sup>\u003C/em>\u003C/a>\u003Cem> in machine learning and statistics. For concreteness, we focus the discussion on self-driving systems, but the connection is applicable more generally to the design of any AI-based system.\u003C/em>\u003C/p>\n\u003Ch3>\u003Cstrong>The bias-variance tradeoff\u003C/strong>\u003C/h3>\n\u003Cp>The grand question, driving any school of thought for building a data-driven system, is the \"bias/variance\" tradeoff. Bias, also known as &ldquo;approximation error,\" means that our learning system cannot reflect the full richness of reality. Variance, also known as &ldquo;generalization error,\" means that our learning system overfits to the observed data, and fails to generalize to unseen examples.\u003C/p>\n\u003Cp>The total error of the learned model is the sum of the approximation and generalization errors, so in order to reach a sufficiently small error, we need to delicately control both. There is a tradeoff between the two terms since we can decrease the generalization error by restricting the learned model to come from a specific family of models, but this might introduce a bias if the chosen family of models cannot reflect the full richness of reality.\u003C/p>\n\u003Cp>Based on this background, we can formalize the two approaches for building a self-driving system. 
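The tradeoff described above can be made concrete with a small, self-contained experiment. The target function, noise level, sample size, and polynomial degrees below are arbitrary illustrative choices, not anything from the post: restricting the model family (low degree) adds bias, while an overly rich family (high degree) fits the noisy training samples at the cost of generalization.

```python
# Toy illustration of the bias-variance tradeoff: fit polynomials of
# increasing degree to noisy samples of sin(2*pi*x). Degree 1 underfits
# (high approximation error); higher degrees drive the training error
# down while risking overfit to the 30 noisy samples.
import numpy as np

rng = np.random.default_rng(0)
true_f = lambda x: np.sin(2 * np.pi * x)

x_train = rng.uniform(0, 1, 30)
y_train = true_f(x_train) + rng.normal(0, 0.25, size=30)
x_test = np.linspace(0, 1, 200)  # noise-free grid for the generalization error

errs = {}
for degree in (1, 4, 10):
    coeffs = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - true_f(x_test)) ** 2)
    errs[degree] = (train_mse, test_mse)
    print(f"degree {degree:2d}: train MSE {train_mse:.3f}, test MSE {test_mse:.3f}")
```

The total error the post describes is the sum of the two effects: the degree-1 model is biased no matter how much data it sees, while the degree-10 model's error is dominated by variance and shrinks only with more samples.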
A CAIS, or an &ldquo;engineered system,\" deliberately puts architectural restrictions on the self-driving system for the sake of reducing the generalization error. This introduces some bias. For example, going from \"photons\" to a &ldquo;sensing state,\" which is a model of reality surrounding the host vehicle&mdash;location and measurements of road users, roadway structures, drivable paths, obstacles and so forth&mdash;and from there to control decisions introduces bias. The reason for this bias is that the sensing state might not be rich enough to reflect reality to its fullest and therefore the capacity of the system is constrained. In contrast, an E2E network skipping the sensing state step and instead mapping incoming videos directly to vehicle control decisions would not suffer from this bias of the &ldquo;sensing state abstraction.\" On the other hand, the E2E approach will have a higher generalization error, and the approach advocated by the E2E proponents is to compensate for this error with huge amounts of data, which in turn necessitates a huge investment in compute, storage, and data engines.\u003C/p>\n\u003Cp>To recap, the E2E approach &ndash; in the simplistic form being communicated to the public &ndash; is to define a system with zero bias while incrementally reducing variance through volumes of data for training the system with the purpose of gradually eliminating all &ldquo;corner cases\". In an engineered approach, on the other hand, the system starts with a built-in bias due to the abstraction of the sensing state and driving policy while (further) reducing variance through data fed into separate subsystems with a high-level fusion glue-code. The amount of data required for each subsystem is exponentially smaller than the amount of data required for a single monolithic system.\u003C/p>\n\u003Ch3>\u003Cstrong>The devil is in the AI details\u003C/strong>\u003C/h3>\n\u003Cp>The story, however, is more delicate than the above dichotomy. 
Starting from the bias element, it&rsquo;s not that E2E systems will have zero bias&mdash;since the neural network resides on an on-board computer in the car, its size is constrained by the available compute and memory. Therefore, the limited compute available in a car introduces bias. In addition, the approximation error of a well-engineered approach is not necessarily excessively large for several reasons. For one, Waymo is clear evidence that the bias of an engineered system is sufficiently small for building a safe autonomous car. Moreover, in a well-engineered system we can add a subsystem that skips some of the abstractions (e.g., the sensing state abstraction) and thus further reduce the bias in the overall system.\u003C/p>\n\u003Cp>The variance element is also more nuanced. In an \"engineered\" system, the variance is reduced through abstractions (such as sensing state) as well as through the high-level fusion of multiple subsystems. In a pure E2E system, the variance should be reduced only through more and more data. But this process of reducing variance through a data pipeline deserves more scrutiny. Take the notion of mean time between failures (MTBF) and let's assume that failures are measured by critical interventions. Let's take MTBF as a measure of readiness of a self-driving vehicle to operate in an \"eyes-off\" manner. In Mobileye's engagement with car makers the MTBF target is 10\u003Csup>7\u003C/sup> hours of driving. Just for reference, \u003Ca href=\"https://www.teslafsdtracker.com/home\" target=\"_blank\" rel=\"noopener\">public data\u003C/a>\u003Ca href=\"../../blog/autonomous-decisions-the-bias-variance-tradeoff-in-self-driving-technology/#_edn4\">\u003Csup>4\u003C/sup>\u003C/a> on Tesla's recent V12.3.6 version of FSD stands around 300 miles per critical intervention which amounts to an MTBF of roughly 10 hours &ndash; which is 6 orders of magnitude away from the target MTBF. 
Let's assume that somehow the MTBF has reached 10\u003Csup>6\u003C/sup> hours and we wish merely to improve it by one order of magnitude in order to reach 10\u003Csup>7\u003C/sup> hours. How much data would we need to collect? This is a question about the nature of the long tail of a distribution. To make things concrete, assume we have 1 million vehicles on the road driving one hour per day and the data in question is event-driven&mdash;i.e., when an intervention of the human driver occurs, a recording of some time around the event is made and sent to the car maker for further training. An intervention event represents a \"corner case\" and the question of the long tail is how those corner cases are distributed. Consider the following scenario where B is the set of &ldquo;bad&rdquo; corner cases. An example of a heavy tail distribution is when B = {b_1,...,b_{1000}} and the probability of b_i to occur is 10\u003Csup>-9\u003C/sup> for every i. Even if we assume that when a corner case is discovered we can somehow retrain the network and fix that corner case, without creating any new corner cases, we must knock off around 900 corner cases so that P(B) = 10\u003Csup>-7\u003C/sup>. Because the MTBF is 10\u003Csup>6\u003C/sup> hours, we encounter one corner case per day. It follows that we will need around 3 years to get this done. The point here is that no one knows how the long tail is structured - we gave one possible long tail scenario but in reality it could be worse or it could be better. As mentioned previously, while there is a precedent that the bias of an engineered system is sufficient for building a safe autonomous car (e.g. Waymo), there is still no precedent that reducing variance solely by a recurring data engine is sufficient for building a safe autonomous car.\u003C/p>\n\u003Cp>It follows that doubling down solely on a data pipeline might be too risky. What else can be done in the E2E approach for reducing variance? 
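The back-of-the-envelope numbers in the scenario above can be checked in a few lines. The long-tail shape is, as the text stresses, an assumption; the figures below simply reproduce that one hypothetical scenario.

```python
# Reproducing the long-tail arithmetic from the scenario above: 1,000 corner
# cases, each with probability 1e-9 per driven hour, and a fleet of 1 million
# vehicles logging one hour per day.
fleet_hours_per_day = 1_000_000
n_cases = 1000
p_case = 1e-9                       # per-hour probability of each corner case

p_bad = n_cases * p_case            # P(B) = 1e-6, i.e. MTBF of 1e6 hours
mtbf_hours = 1 / p_bad
events_per_day = fleet_hours_per_day * p_bad    # ~1 corner case surfaced daily

target_p = 1e-7                     # MTBF goal of 1e7 hours
cases_allowed = round(target_p / p_case)        # 100 corner cases may remain
cases_to_fix = n_cases - cases_allowed          # 900 must be knocked off

# Lower bound: discoveries arrive at most ~1/day, and the rate only falls as
# cases get fixed, pushing the total toward the ~3 years cited in the text.
days_needed = cases_to_fix / events_per_day
print(f"MTBF now: {mtbf_hours:.0e} h; discoveries per day: {events_per_day:.0f}")
print(f"corner cases to fix: {cases_to_fix}; at least ~{days_needed / 365:.1f} years")
```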
To lead into it, let's ask ourselves:\u003C/p>\n\u003Cp>\u003Cbr />&nbsp; (i) Why does Tesla FSD have a sensing state in its display? The idea of an E2E system is that you go from \"photons to control\" while skipping the need to build a sensing state. Is this done solely for the purpose of notifying the driver what the system &ldquo;sees&rdquo;? Or does it have a more tacit purpose?\u003C/p>\n\u003Cp>\u003Cbr />&nbsp; (ii) Why has Tesla \u003Ca href=\"https://www.theverge.com/2024/5/7/24151497/tesla-lidar-bought-luminar-elon-musk-sensor-autonomous\" target=\"_blank\" rel=\"noopener\">purchased 2,000 lidars\u003C/a>\u003Ca href=\"../../blog/autonomous-decisions-the-bias-variance-tradeoff-in-self-driving-technology/#_edn5\">\u003Csup>5\u003C/sup>\u003C/a> from Luminar? Presumably for creating ground truth (GT) data for supervised training. But why?\u003C/p>\n\u003Cp>The two riddles are of course related.\u003C/p>\n\u003Cp>Imagine an E2E network comprising a backbone and two heads &ndash; one for outputting vehicle control and the other for outputting the sensing state. Such a network is still technically E2E (from photons to control) but also has a branch for the sensing state. The sensing state branch needs to be trained in a supervised manner from GT data, hence the need for 2,000 lidars. The real question is whether the GT data can be created automatically, without manual labeling. The answer is definitely yes, because Mobileye does exactly that: we have an \"auto-GT\" pipeline for training the sensing state. And the reason you would want a branch outputting the sensing state is not merely to display the sensing state to the driver. The real purpose of the sensing state branch is to reduce the variance of the system (and, more importantly, its sample complexity &ndash; the amount of data needed for training) through the \"multi-tasking\" principle. 
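To make the two-head idea concrete, here is a minimal sketch of such an architecture (the layer shapes, the 20-slot sensing state, and the random weights are purely illustrative assumptions, not any production design):

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

# Shared backbone: flattened camera frame -> 64-d feature vector
W_backbone = rng.normal(scale=0.01, size=(64, 32 * 32 * 3))
# Head 1: vehicle control (e.g., steering angle, acceleration)
W_control = rng.normal(scale=0.01, size=(2, 64))
# Head 2: sensing state (e.g., logits for 20 object slots around the car)
W_sensing = rng.normal(scale=0.01, size=(20, 64))

def forward(frame):
    z = relu(W_backbone @ frame.ravel())  # shared representation
    control = W_control @ z               # trained by imitating drivers
    sensing = W_sensing @ z               # trained supervised against auto-GT
    return control, sensing

frame = rng.random((32, 32, 3))  # stand-in for a camera frame
control, sensing = forward(frame)
print(control.shape, sensing.shape)  # (2,) (20,)
```

In training, the control head would receive an imitation loss while the sensing head receives a supervised loss against automatically generated GT; gradients from both flow into the shared backbone, which is what reduces the sample complexity of the control task.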
We addressed this \u003Ca href=\"https://arxiv.org/pdf/1604.06915\" target=\"_blank\" rel=\"noopener\">back in 2016\u003C/a>\u003Ca href=\"../../blog/autonomous-decisions-the-bias-variance-tradeoff-in-self-driving-technology/#_edn6\">\u003Csup>6\u003C/sup>\u003C/a> and gave as an example an agricultural vehicle on the side of the road. The probability of observing a rare vehicle type somewhere in the image is much higher than the probability of observing such a vehicle immediately in front of us. Therefore, without a sensing state head (which detects the rare vehicle type on the shoulder even if it is not affecting the control of the host vehicle) one would need much more data in order to see such a vehicle immediately in front of us, where it does affect the control of the host vehicle. This shows that the sensing state abstraction is important: it hints to the neural network that producing correct control commands may require understanding the concept of vehicles and detecting all of the vehicles around us. Importantly, this abstraction is learned through supervised learning, using GT data which is created automatically (by a well-engineered offline system).\u003C/p>\n\u003Cp>Another component of the popular E2E narrative is \"no glue-code\" in the system. No glue-code, the argument goes, means no bugs introduced by careless engineers. But this too is a misconception. There is glue-code &ndash; not in the neural network but in the process of preparing the data for the E2E training. One clear example is the automatic creation of GT data. Elon Musk gave such an example &ndash; that of human drivers not respecting stop signs as they should and instead performing a \u003Ca href=\"https://www.teslarati.com/tesla-rolling-stop-markey-blumenthal-letter-elon-musk/\" target=\"_blank\" rel=\"noopener\">rolling stop\u003C/a>\u003Ca href=\"../../blog/autonomous-decisions-the-bias-variance-tradeoff-in-self-driving-technology/#_edn7\">\u003Csup>7\u003C/sup>\u003C/a>. 
The rolling-stop events had to be taken out of the training data so that the E2E system would not adopt bad behavior (as it is supposed to imitate humans). In other words, the glue-code (and its bugs) shifts from the system code to the data curation code. If anything, it may be easier to detect bugs in system code (at least there are existing methodologies for that) than to detect bugs in data curation.\u003C/p>\n\u003Ch3>\u003Cstrong>Don&rsquo;t get in the way of analytic solutions\u003C/strong>\u003C/h3>\n\u003Cp>Finally, we would like to point out that when we have an analytical solution for a problem, a purely E2E machine learning method is certainly not better. For example, the long multiplication exercise mentioned at the start of this blog highlights the fact that ChatGPT+, rightly, uses a good old calculator for the task and does not attempt to \"learn\" how to do long multiplication from a massive amount of data. In the driving policy stack (determining the actions the host vehicle should take and outputting the vehicle control commands) there are numerous analytical calculations. Knowing when to replace \"learning\" with analytical calculations is crucial not only for variance reduction but also for transparency, explainability and \u003Ca href=\"https://www.mobileye.com/opinion/mobileye-dxp-as-a-novel-approach/\">tuning\u003C/a>\u003Ca href=\"../../blog/autonomous-decisions-the-bias-variance-tradeoff-in-self-driving-technology/#_edn8\">\u003Csup>8\u003C/sup>\u003C/a> (imitating humans is somewhat problematic because many of them are not good drivers).\u003C/p>\n\u003Ch3>\u003Cstrong>Final words\u003C/strong>\u003C/h3>\n\u003Cp>AI is progressing at a remarkable pace. 
The revolution of foundation models such as ChatGPT began as a monolithic E2E model (back in 2022), whereas today it has evolved into \u003Ca href=\"https://openai.com/chatgpt/pricing\" target=\"_blank\" rel=\"noopener\">ChatGPT+\u003C/a>\u003Ca href=\"../../blog/autonomous-decisions-the-bias-variance-tradeoff-in-self-driving-technology/#_edn9\">\u003Csup>9\u003C/sup>\u003C/a>, which represents an engineered solution built on top of AI components, including the base LLM, plugins for retrieval, a code interpreter and image generation tools. There is much to be said about the claims that the future of AI is \u003Ca href=\"https://bair.berkeley.edu/blog/2024/02/18/compound-ai-systems/\" target=\"_blank\" rel=\"noopener\">shifting\u003C/a>\u003Ca href=\"../../blog/autonomous-decisions-the-bias-variance-tradeoff-in-self-driving-technology/#_edn10\">\u003Csup>10\u003C/sup>\u003C/a> from monolithic models to compound AI systems. We believe that this trend is even more important given the extremely high accuracy requirement for, and safety-critical aspect of, autonomous driving.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cu>References:\u003C/u>\u003C/p>\n\u003Cp style=\"font-size: 1.3rem; margin-top: 1.4rem;\">\u003Ca id=\"_edn1\" href=\"#_ednref1\" name=\"_edn1\">\u003C/a>\u003Csup class=\"ref\">1\u003C/sup> Compound AI Systems\u003Cbr />\u003Ca href=\"https://bair.berkeley.edu/blog/2024/02/18/compound-ai-systems/\">https://bair.berkeley.edu/blog/2024/02/18/compound-ai-systems/\u003C/a>\u003C/p>\n\u003Cp style=\"font-size: 1.3rem; margin-top: 1.4rem;\">\u003Ca id=\"_edn2\" href=\"#_ednref2\" name=\"_edn2\">\u003C/a>\u003Csup class=\"ref\">2\u003C/sup> The ALVINN project (Pomerleau, 1989)\u003Cbr />\u003Ca style=\"word-wrap: break-word;\" href=\"https://proceedings.neurips.cc/paper/1988/file/812b4ba287f5ee0bc9d43bbf5bbe87fb-Paper.pdf\">https://proceedings.neurips.cc/paper/1988/file/812b4ba287f5ee0bc9d43bbf5bbe87fb-Paper.pdf\u003C/a>\u003C/p>\n\u003Cp style=\"font-size: 
1.3rem; margin-top: 1.4rem;\">\u003Ca id=\"_edn3\" href=\"#_ednref3\" name=\"_edn3\">\u003C/a>\u003Csup class=\"ref\">3\u003C/sup> Bias-variance tradeoff\u003Cbr />\u003Ca href=\"https://en.wikipedia.org/wiki/Bias&ndash;variance_tradeoff\">https://en.wikipedia.org/wiki/Bias&ndash;variance_tradeoff\u003C/a>\u003C/p>\n\u003Cp style=\"font-size: 1.3rem; margin-top: 1.4rem;\">\u003Ca id=\"_edn4\" href=\"#_ednref4\" name=\"_edn4\">\u003C/a>\u003Csup class=\"ref\">4\u003C/sup> Public data on Tesla's recent V12.3.6 version of FSD\u003Cbr />\u003Ca href=\"https://www.teslafsdtracker.com/home\">https://www.teslafsdtracker.com/home\u003C/a>\u003C/p>\n\u003Cp style=\"font-size: 1.3rem; margin-top: 1.4rem;\">\u003Ca id=\"_edn5\" href=\"#_ednref5\" name=\"_edn5\">\u003C/a>\u003Csup class=\"ref\">5\u003C/sup> Tesla purchased 2,000 lidars from Luminar\u003Cbr />\u003Ca href=\"https://www.theverge.com/2024/5/7/24151497/tesla-lidar-bought-luminar-elon-musk-sensor-autonomous\">https://www.theverge.com/2024/5/7/24151497/tesla-lidar-bought-luminar-elon-musk-sensor-autonomous\u003C/a>\u003C/p>\n\u003Cp style=\"font-size: 1.3rem; margin-top: 1.4rem;\">\u003Ca id=\"_edn6\" href=\"#_ednref6\" name=\"_edn6\">\u003C/a>\u003Csup class=\"ref\">6\u003C/sup> On the Sample Complexity of End-to-end Training vs. 
Semantic Abstraction Training \u003Cbr />\u003Ca href=\"https://arxiv.org/pdf/1604.06915\">https://arxiv.org/pdf/1604.06915\u003C/a>\u003C/p>\n\u003Cp style=\"font-size: 1.3rem; margin-top: 1.4rem;\">\u003Ca id=\"_edn7\" href=\"#_ednref7\" name=\"_edn7\">\u003C/a>\u003Csup class=\"ref\">7\u003C/sup> Human drivers not respecting stop signs as they should and instead perform a&nbsp;rolling stop\u003Cbr />\u003Ca href=\"https://www.teslarati.com/tesla-rolling-stop-markey-blumenthal-letter-elon-musk/\">https://www.teslarati.com/tesla-rolling-stop-markey-blumenthal-letter-elon-musk/\u003C/a>\u003C/p>\n\u003Cp style=\"font-size: 1.3rem; margin-top: 1.4rem;\">\u003Ca id=\"_edn8\" href=\"#_ednref8\" name=\"_edn8\">\u003C/a>\u003Csup class=\"ref\">8\u003C/sup> Mobileye DXP as a novel approach \u003Cbr />\u003Ca href=\"https://www.mobileye.com/opinion/mobileye-dxp-as-a-novel-approach/\">https://www.mobileye.com/opinion/mobileye-dxp-as-a-novel-approach/\u003C/a>\u003C/p>\n\u003Cp style=\"font-size: 1.3rem; margin-top: 1.4rem;\">\u003Ca id=\"_edn9\" href=\"#_ednref9\" name=\"_edn9\">\u003C/a>\u003Csup class=\"ref\">9\u003C/sup> ChatGPT+\u003Cbr />\u003Ca href=\"https://openai.com/chatgpt/pricing\">https://openai.com/chatgpt/pricing\u003C/a>\u003C/p>\n\u003Cp style=\"font-size: 1.3rem; margin-top: 1.4rem;\">\u003Ca id=\"_edn10\" href=\"#_ednref10\" name=\"_edn10\">\u003C/a>\u003Csup class=\"ref\">10\u003C/sup> Future of AI is shifting from monolithic models to compound AI systems\u003Cbr />\u003Ca href=\"https://bair.berkeley.edu/blog/2024/02/18/compound-ai-systems/\">https://bair.berkeley.edu/blog/2024/02/18/compound-ai-systems/\u003C/a>\u003C/p>","2024-05-15T07:00:00.000Z","Amnon Shashua, Autonomous 
Driving",{"id":595,"type":5,"url":596,"title":597,"description":598,"primary_tag":397,"author_name":10,"is_hidden":11,"lang":12,"meta_description":598,"image":599,"img_alt":600,"content":601,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":602,"tags":603},260,"mobileye-at-ncap24-centering-on-safety","Mobileye at NCAP24: Centering on safety ","Global NCAP’s inaugural world congress offers a forum for the safety rating agency’s and Mobileye’s shared values ","https://static.mobileye.com/website/us/corporate/images/cb6d41308d3f2f740ef7f3870309c414_1713723171460.jpg","Mobileye at NCAP24","\u003Cp>Since its inception over two decades ago, Mobileye has set its eyes on a clear goal &ndash; making our roads safer through advanced technology. It began as a small team led by Prof. Amnon Shashua that aimed to revolutionize the driver-assist field using a single camera and a system-on-a-chip. Now, approximately 170 million EyeQ&trade; chips worldwide later, as the company reaches new technological heights with the \u003Ca href=\"https://www.mobileye.com/news/mobileye-eyeq6-lite-launches-to-speed-adas-upgrades-worldwide/\" target=\"_blank\" rel=\"noopener\">launch of the EyeQ6 Lite SOP\u003C/a> &ndash; its latest innovative computer vision and machine learning technology &ndash; Mobileye continues to invest in the core product that built its foundation: Mobileye Base ADAS&trade;.&nbsp;\u003C/p>\n\u003Cp>Automotive safety advocates and regulators around the world have recognized the power of Advanced Driver-Assistance Systems (ADAS), with research showing they help to prevent car accidents and save lives. Mobileye ADAS includes such features as Adaptive Cruise Control (ACC), lane centering abilities, and lane departure warning. 
It also meets GSR ISA requirements in Europe and offers NCAP-tested functions including automatic emergency braking (AEB), forward collision warning, traffic sign recognition, and lane keep assist. &nbsp;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/268fb7695a78def10782054ce8a5cacd_1713723444872.jpg\" alt=\"NCAP Functions\" width=\"600\" height=\"389\" />\u003C/p>\n\u003Cp>Mobileye&rsquo;s driver-assist solutions are designed to help create safer roads and can pass rigorous regulatory testing. Achieving the coveted 5-star rating from Euro NCAP demands increasingly sophisticated features, prompting the development of new capabilities &ndash; a challenge Mobileye has met with its innovative solutions.&nbsp;\u003C/p>\n\u003Cp>As the scope of compliance and safety functions continues to broaden, and in keeping with new and sophisticated technologies that help save lives around the world, Mobileye maintains a proven track record of delivering these essential capabilities and tools to the automotive industry. 
We pair our highest-performing, cost-effective hardware/software combination with our expertise in machine learning for ADAS to remain at the forefront of advancing automotive technology.&nbsp;\u003C/p>\n\u003Cp>This year, Mobileye is proudly sponsoring \u003Ca href=\"https://www.globalncap.org/ncap24\" target=\"_blank\" rel=\"noopener\">NCAP24\u003C/a> &ndash; a new event shaping &ldquo;the future of safer vehicles and sustainable mobility,&rdquo; bringing together various stakeholders with the shared goal of making roads safer.&nbsp;\u003C/p>\n\u003Cp>NCAP24 is yet another testament to Mobileye&rsquo;s &ldquo;driven by vision&rdquo; spirit, as the company joins and leads multiple stakeholders in creating real change with life-saving technology.&nbsp;\u003C/p>","2024-04-21T07:00:00.000Z","ADAS, Events, Industry, News",{"id":605,"type":5,"url":606,"title":607,"description":608,"primary_tag":9,"author_name":10,"is_hidden":11,"lang":12,"meta_description":608,"image":609,"img_alt":610,"content":611,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":612,"tags":255},259,"mobileye-isa-system-picked-by-top-oem","Mobileye’s vision-only ISA system is picked up by top OEM","The cost-saving solution, which has already been shown to exceed rigorous EU GSR requirements, is introduced to the OEM’s operation in Turkey","https://static.mobileye.com/website/us/corporate/images/b86a31dd7d3831dff5fe447ffed653f5_1713421057360.jpg","ISA systems are designed to help drivers stay within the legal speed limit","\u003Cp>\u003Cspan data-contrast=\"auto\">A major EU-based automaker is incorporating Mobileye&rsquo;s vision-only intelligent speed assist (ISA) solution into its vehicles for sale in Turkey, after the system showed great success both in identifying signs in complicated environments and in recognizing and adjusting to toll booths. 
This new progress follows testing by six independent labs in five different European countries, which confirmed that Mobileye&rsquo;s ISA software meets, and at times even exceeds, the European Union General Vehicle Safety Regulation&rsquo;s (GSR) requirements.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">The GSR sets July 2024 as the month by which all new passenger vehicles sold in the EU must meet specific ISA requirements. Several automakers have tested Mobileye&rsquo;s EyeQ&trade;4-based ISA solution and found that it successfully meets all such requirements, with the software now certified for use in all 27 EU countries, Norway, Switzerland, and Turkey.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cdiv style=\"padding: 56.25% 0 0 0; position: relative;\">\u003Ciframe style=\"position: absolute; top: 0; left: 0; width: 100%; height: 100%;\" title=\"GSR ISA\" src=\"https://player.vimeo.com/video/936229338?h=2c3d0eff9c&amp;badge=0&amp;autopause=0&amp;player_id=0&amp;app_id=58479\" width=\"425\" height=\"350\" frameborder=\"0\" data-mce-fragment=\"1\">\u003C/iframe>\u003C/div>\n\u003Cp>\u003Cspan data-ccp-props=\"{}\">\u003Cstrong>\u003Cspan class=\"TextRun SCXW47544679 BCX0\" lang=\"EN-US\" xml:lang=\"EN-US\" data-contrast=\"auto\">\u003Cspan class=\"NormalTextRun CommentStart SCXW47544679 BCX0\">Mapless\u003C/span> \u003Cspan class=\"NormalTextRun SCXW47544679 BCX0\">&ldquo;futureproof&rdquo; \u003C/span>\u003Cspan class=\"NormalTextRun SCXW47544679 BCX0\">savings\u003C/span>\u003Cspan class=\"NormalTextRun SCXW47544679 BCX0\">&nbsp;\u003C/span>\u003C/span>\u003Cspan class=\"EOP SCXW47544679 BCX0\" data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/strong>\u003Cbr />\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-ccp-props=\"{}\">\u003Cspan class=\"EOP SCXW47544679 BCX0\" data-ccp-props=\"{}\">The Mobileye ISA solution is 
unique both in its technology and in reducing manufacturer costs. It is the first vision-only ISA system, which means no third-party maps are used, thus reducing costs. In addition, our solution in some cases surpasses the GSR standards, which require all new vehicles to be able to automatically identify speed limits. &nbsp;&nbsp;\u003Cbr />\u003Cbr />\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/9e05a0a421b3b65084a37d54aa9931b9_1713421857616.jpg\" alt=\"EU Flag\" width=\"600\" height=\"389\" />\u003Cbr />\u003C/span>\u003C/span>\u003C/p>\n\u003Cp>Automakers already incorporating the Mobileye EyeQ4 and later platforms containing the ISA solution into their vehicles can meet the new standards merely by updating the EyeQ&rsquo;s existing software, without any new hardware requirements. Additionally, the Mobileye ISA system can be trained to detect new signs that may be added to the GSR traffic signs catalog. These capabilities should serve to &ldquo;futureproof&rdquo; the technology for up to 14 years after a vehicle model is launched, as well as enable those vehicles to detect new signs with no need to change software versions or perform any re-validation or regression testing. &nbsp;&nbsp;\u003C/p>\n\u003Cp>Mobileye ISA also offers driver-assistance features beyond the letter of the GSR. For example, it detects conditional or supplementary signs and recognizes approaching toll booths.&nbsp;\u003C/p>\n\u003Ch3>\u003Cstrong>Reducing cost, not safety level&nbsp;\u003C/strong>\u003C/h3>\n\u003Cp>ISA systems are designed to help drivers stay within the legal speed limit either by passively alerting them or by actively intervening to reduce vehicle speed. According to the regulation, ISA systems have to work anywhere in European Union territory; this makes driving safer but, unfortunately, can also drive up manufacturing costs that may eventually be passed on to the consumer. 
&nbsp;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/092f9d1742886c5cc226f5996d21a773_1713422144538.jpg\" alt=\"\" width=\"600\" height=\"389\" />\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Most ISA solutions currently on the market rely on a mixture of technologies that includes traffic sign recognition (TSR) abilities, a mapping database, and GPS, among others. This combination often requires use of third-party services (e.g. the mapping database) that not only come with their own price tag but could also be less reliable as they lack real-time adjustment (for example, they are unaware of ad-hoc speed limits due to road construction). This is where Mobileye&rsquo;s vision-only ISA system makes a particular financial impact: it doesn&rsquo;t rely on any third-party providers, while still offering proven features that meet and, in some cases, exceed GSR requirements \u003C/span>\u003Cem>\u003Cspan data-contrast=\"auto\">and\u003C/span>\u003C/em>\u003Cspan data-contrast=\"auto\"> react to changing events on the road.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Mobileye&rsquo;s solution is the type of technological innovation that helps OEMs protect their vehicles&rsquo; users. 
Mobileye&rsquo;s &ldquo;built for safety - built for scale&rdquo; mantra serves as the right approach at the right time to help carmakers and policymakers create a safer future.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>","2024-04-18T07:00:00.000Z",{"id":614,"type":24,"url":615,"title":616,"description":617,"primary_tag":28,"author_name":10,"is_hidden":11,"lang":12,"meta_description":617,"image":618,"img_alt":619,"content":620,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":32,"publish_date":621,"tags":622},258,"mobileye-eyeq6-lite-launches-to-speed-adas-upgrades-worldwide","Mobileye EyeQ™6 Lite launches to speed ADAS upgrades globally","The latest generation of Mobileye's pioneering system-on-chip and software already set to enable safety and convenience in 46 million vehicles. ","https://static.mobileye.com/website/us/corporate/images/fe5c0db6e4aa39bad155323a8a6a3e4c_1713310858467.jpg","The EyeQ6 Lite and an 8-megapixel camera that enables broader data for ADAS applications","\u003Cp>Mobileye announced today it has delivered the first production-candidate hardware and software of its new EyeQ&trade;6 Lite system-on-chip to its customers, which will power advanced driver-assistance systems in multiple models launching this year. This milestone marks the beginning of the EyeQ6 family, with the EyeQ6L already set to be installed in 46 million vehicles over the next few years &ndash; becoming the global auto industry&rsquo;s ADAS solution of choice from the start. 
It will be followed by the EyeQ6 High advanced system-on-chip, on track to launch in early 2025.\u003C/p>\n\u003Cp>The EyeQ6L builds on Mobileye&rsquo;s 25 years of pioneering work in automotive safety, computer vision, chip design and machine learning, which enabled the widespread adoption of automatic forward collision warning and emergency braking across the automotive industry. To date, more than 170 million vehicles worldwide have been built with Mobileye technology inside.\u003C/p>\n\u003Cp>\"With the EyeQ6L, the team has once again delivered a system-on-chip that enables sizable gains in performance, safety, and comfort features to our customers, without a material increase in price, as has been our pledge for many years,\" said Prof. Amnon Shashua, founder and CEO of Mobileye. &ldquo;We know the power of ADAS to save lives and reduce traffic accidents globally, and with the EyeQ6L, automakers can meet regulatory requirements while delivering meaningful technology improvements to end users.&rdquo;\u003C/p>\n\u003Cp>Many studies have found notable safety benefits from key ADAS technologies. The \u003Ca href=\"https://newsroom.aaa.com/2023/08/your-autos-safety-net-the-lifesaving-potential-of-driving-assistance-tech/\" target=\"_blank\" rel=\"noopener\">AAA Foundation for Traffic Safety has said\u003C/a> technologies available today could prevent 37 million crashes, 14 million injuries and nearly 250,000 deaths in the United States over the next 30 years if made standard on all vehicles. 
\u003Ca href=\"https://www.iihs.org/media/290e24fd-a8ab-4f07-9d92-737b909a4b5e/HvQHjw/Topics/ADVANCED%20DRIVER%20ASSISTANCE/IIHS-HLDI-CA-benefits.pdf\" target=\"_blank\" rel=\"noopener\">Other research by the Insurance Institute for Highway Safety\u003C/a> has found that automatic emergency braking reduces front to rear crashes by 50 percent, and injury crashes by 57 percent, while pedestrian AEB cuts crashes by 27 percent and lane-departure warning reduces injury crashes by 21 percent. While about 7 out of 10 new vehicles worldwide are sold with some level of ADAS technology today, that share is lower in many high-volume markets, including China and India, where the full benefit of ADAS remains untapped.\u003C/p>\n\u003Cp>Automotive safety advocates and regulators around the world have recognized the power of ADAS to save lives and reduce crashes, and the EyeQ6L was designed to meet not only current standards but future ones as well, from the European Union&rsquo;s General Safety Regulation and new car assessment programs (NCAPs) in dozens of countries, to U.S. regulations and insurance industry assessments, as well as ASIL-B level safety.\u003C/p>\n\u003Cp>\u003Cstrong>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/a9fd33835525070d6bfd807e5effc0fc_1713311058227.png\" alt=\"\" />\u003C/strong>\u003C/p>\n\u003Cp>\u003Cstrong>Benefits and Features for Drivers Worldwide\u003C/strong>\u003C/p>\n\u003Cp>The EyeQ6L combines Mobileye&rsquo;s experience in designing automotive-grade processors and artificial intelligence/machine learning algorithms with its expertise in custom integrated software that optimizes performance and energy efficiency. Designed with two CPU cores and five high-compute density accelerators, the EyeQ6L provides 4.5 times more computing power than the EyeQ4M, at roughly half the physical space, with similar levels of power consumption &ndash; key elements for automotive use. 
The chip also improves pixel segmentation capabilities through a dynamic neural network with more than double the point density of EyeQ4M.\u003C/p>\n\u003Cp>The EyeQ6L enables systems that can capture much more detailed data of the world around them with an 8-megapixel camera and 120-degree lateral field of vision, a 20-degree increase over the camera available with the EyeQ4M. The increased vision data also powers new environmental sensing and range capabilities; with EyeQ6L, vehicles can sense when roads are dry, wet or snowy and adjust emergency stopping distances accordingly, as well as detecting many types of objects at greater distances. The camera and processor updates enable several advancements to automatic emergency braking systems, such as an increased ability to monitor and react to other vehicles, pedestrians, or random road objects in complex situations &ndash; like a piece of furniture falling off a truck in an adjacent lane, or a cow sitting in the vehicle&rsquo;s path.\u003C/p>\n\u003Cp>Mobileye has been an industry leader in pedestrian and cyclist detection, and the EyeQ6 builds on that record through new sensing and software capabilities, improving vehicle response. The hardware and software power of EyeQ6L also enables upgrades to functions such as lane-keeping assist and automated lane change systems that can find not only the center of the current lane, but the next two lanes on either side of travel. Automated cruise control systems using EyeQ6 can now sense an upcoming curve and slow the vehicle as needed for passenger comfort.\u003C/p>\n\u003Cp>Last year, Mobileye rolled out Intelligent Speed Assist upgrades for its EyeQ4M that could meet European standards for automatically reading and understanding speed limit signs &ndash; whether permanent or temporary signs. 
The EyeQ6L goes a step further by reading key text phrases on signage, like a speed limit that&rsquo;s only active on weekday mornings, or a city entrance sign that implies a lower speed limit. All this uses only computer vision and on-board software &ndash; not GPS or other external data sources.\u003C/p>\n\u003Cp>&ldquo;This is only the start of our journey with the EyeQ6L,&rdquo; said Nimrod Nehushtan, Executive Vice President of Business Development and Strategy for Mobileye. &ldquo;Through continuous development, tools like EyeQ Kit&trade; and the ability to use over-the-air software updates, the EyeQ6 family has the ability to serve the industry for many years to come, making driving safer and more convenient for millions.&rdquo;\u003C/p>\n\u003Cp>The other member of the EyeQ6 family, the EyeQ6 High, is set to enter series production early next year and power all of Mobileye&rsquo;s advanced automated driving technologies &ndash; from Mobileye SuperVision&trade; for hands-off, eyes-on driving, through Mobileye Chauffeur&trade; for hands-off, eyes-off driving, and Mobileye Drive&trade; autonomous driving in specified domains.\u003C/p>","2024-04-17T07:00:00.000Z","News, ADAS, Mobileye Inside, Industry",{"id":624,"type":24,"url":625,"title":626,"description":627,"primary_tag":28,"author_name":10,"is_hidden":11,"lang":12,"meta_description":627,"image":628,"img_alt":629,"content":630,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":32,"publish_date":631,"tags":632},256,"automated-driving-volkswagen-group-intensifies-collaboration-with-mobileye","Volkswagen Group intensifies collaboration with Mobileye","Volkswagen and Mobileye will bring new automated driving functions to series production based on the Mobileye SuperVision™ and Mobileye Chauffeur™ platforms. 
","https://static.mobileye.com/website/us/corporate/images/bfa4e22ac1493ebbc99323bb415a2f7e_1710795224355.jpg","Logos of the Volkswagen Group and Mobileye.","\u003Cp>\u003Cstrong>Wolfsburg/Jerusalem, March 20, 2024 &ndash; \u003C/strong>The Volkswagen Group is welcoming further strategic collaboration and significantly accelerating its development efforts in the field of automated and autonomous driving. Now, Volkswagen is intensifying its partnership with Mobileye in the domain of automated driving. Together, the two companies will bring new automated driving functions to series production. Mobileye is to provide technologies for partially and highly automated driving based on its Mobileye SuperVision and Mobileye Chauffeur platforms.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/a12beac8db6dc870c38f86797bcdc466_1710847421403.jpg\" alt=\"\" width=\"1504\" height=\"420\" />\u003C/p>\n\u003Cp>\u003Cem>Volkswagen Group and Mobileye will work on three levels of technology: hands-free, eyes-on driving; hands-free, eyes-off systems; and fully autonomous vehicles.\u003C/em>\u003C/p>\n\u003Cp>In the future, the Volkswagen Group&rsquo;s Audi, Bentley, Lamborghini and Porsche brands will use these technologies to rapidly introduce new premium-oriented driving functions to their model portfolios across powertrain types. These include advanced assistance systems for highway and urban driving, such as automated overtaking on multilane highways in permitted areas and conditions, as well as automatic stopping at red lights and stop signs, and support in intersections and roundabouts. In addition, Mobileye is set to supply further technology components for automated driving to Volkswagen Commercial Vehicles. In the long term, the Volkswagen Group aims to rely on its own complete in-house system: Partnerships with Bosch and Qualcomm, as well as with Horizon Robotics in China, will be continued in a focused manner. 
All driver assistance systems are to be based on the software architectures developed by Volkswagen&rsquo;s Cariad company.\u003C/p>\n\u003Cp>&ldquo;Our goal is to offer our customers throughout the world outstanding products with cutting-edge technology,&rdquo; says Oliver Blume, CEO of the Volkswagen Group and Porsche AG. &ldquo;New automated driving functions will significantly boost convenience and safety. These functions, which will be tailored to our brands and products, will make every trip a personal, individual experience. In Mobileye, we have an additional first-class partner to shape this automotive future together.&rdquo;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/a12beac8db6dc870c38f86797bcdc466_1710847421407.jpg\" alt=\"\" width=\"1200\" height=\"675\" />\u003C/p>\n\u003Cp>The Volkswagen Group and Mobileye have been collaborating on advanced driver assistance systems for some time. In the future, Mobileye is set to also provide technologies for driving functions with enhanced Level 2 capabilities (\"partially automated driving\") within the Volkswagen Group. When such functions become available, and subject to their operational design domain, drivers will be allowed to take their hands off the steering wheel but must remain attentive to the traffic and ready to intervene at any time. In addition, Volkswagen is working with Mobileye on Level 3 functions (\"highly automated driving\"). At this level, the vehicle will be able to temporarily take over driving tasks in specified areas; drivers are not required to monitor the system continuously. Volkswagen and Mobileye are jointly developing these technologies into cross-brand systems.\u003C/p>\n\u003Cp>In addition, Mobileye will also offer certain production-ready functions for the new E3 1.2 premium-oriented software architecture. 
This new architecture is managed by Cariad and will be gradually implemented within the group by Audi, Bentley, Lamborghini, and Porsche. As part of their product strategy, the brands decide on the specific deployment of the systems and tailor them to a brand-specific driving experience.\u003C/p>\n\u003Cp>Furthermore, the Volkswagen Commercial Vehicles brand is set to be supplied by Mobileye with software and hardware to achieve Level 4 (\"fully automated driving\"). Volkswagen ADMT, a Volkswagen Group subsidiary, will implement these components in fully electric development platforms based on the Volkswagen ID. Buzz. The goal of Volkswagen ADMT is to bring self-driving ID. Buzz vehicles to series production for mobility and transportation services.\u003C/p>\n\u003Cp>&ldquo;We are proud to work closely with Volkswagen Group to make the future of driving safer, more automated and more rewarding,&ldquo; says Prof. Amnon Shashua, President and CEO of Mobileye. &ldquo;Through these programs, we see Volkswagen Group leading the industry in putting AI-powered advanced driver assistance technology in the hands of consumers globally and developing new services with autonomous vehicles.&ldquo;\u003C/p>\n\u003Cp>\u003Cstrong>Volkswagen Group sharpens its development strategy for driver assistance systems\u003C/strong>\u003C/p>\n\u003Cp>Volkswagen is further clarifying the division of development tasks for driver assistance systems between in-house innovation and collaboration for its new E\u003Csup>3\u003C/sup> 1.2 and E\u003Csup>3\u003C/sup> 2.0 software architectures. This will streamline processes and reduce complexity. 
With strategic partners such as Mobileye, Volkswagen Group is accelerating the delivery of the premium-oriented E\u003Csup>3\u003C/sup> 1.2 architecture, while, in the long term, the Group will rely on its own complete in-house system (&lsquo;stack&rsquo;) for automated driving across all brands.\u003C/p>\n\u003Cp>For the future E\u003Csup>3\u003C/sup> 2.0 architecture, the Volkswagen Group plans to forge ahead and consolidate its resources and development responsibilities within Cariad, its software company. Together with Bosch, Cariad aims to develop the Group's proprietary complete system. This system will be integrated into the future all-electric, fully digital, and highly scalable mechatronics Group platform, the Scalable Systems Platform (SSP).\u003C/p>\n\u003Cp>&ldquo;We are sharpening our development strategy for driver assistance systems at the same time as heightening our customer focus. We are concentrating on fast, reliable delivery,&rdquo; says Michael Steiner, Head of Research and Development of the Volkswagen Group and Member of the Executive Board, Research and Development of Porsche AG.\u003C/p>\n\u003Cp>\u003Cspan style=\"font-size: 10pt;\">\u003Cstrong>About the Volkswagen Group\u003C/strong>\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"font-size: 10pt;\">The Volkswagen Group is one of the world's leading car makers, headquartered in Wolfsburg, Germany. It operates globally, with 114 production facilities in 19 European countries and 10 countries in the Americas, Asia and Africa. It employs around 684,000 people worldwide, and the Group&rsquo;s vehicles are sold in over 150 countries. With an unrivalled portfolio of strong global brands, leading technologies at scale, innovative ideas to tap into future profit pools and an entrepreneurial leadership team, the Volkswagen Group is committed to shaping the future of mobility through investments in electric and autonomous driving vehicles, digitalization and sustainability. 
In 2023, the total number of vehicles delivered to customers by the Group globally was 9.2 million (2022: 8.3 million). Group sales revenue in 2023 totaled EUR 322.3 billion (2022: EUR 279.1 billion). The operating result before special items in 2023 amounted to EUR 22.6 billion (2022: EUR 22.5 billion).\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"font-size: 10pt;\">\u003Cstrong>About Mobileye\u003C/strong>\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"font-size: 10pt;\">Mobileye (Nasdaq: MBLY) leads the evolution of mobility with its autonomous driving and driver-assistance technologies, based on world-renowned expertise in artificial intelligence, computer vision, mapping, and integrated hardware and software. Since its founding in 1999, Mobileye has enabled the wide adoption of advanced driver-assistance systems while pioneering groundbreaking technologies such as REM&trade; crowdsourced mapping, True Redundancy&trade; sensing, Responsibility-Sensitive Safety&trade; (RSS&trade;) driving policy and Driving Experience Platform (DXP). These technologies support a product portfolio structured for scale and designed to unlock the full potential of mobility, offering a range of solutions from premium ADAS to autonomous vehicles. By the end of 2023, about 170 million vehicles worldwide had been equipped with Mobileye technology. In 2022, Mobileye listed as an independent company separate from Intel (Nasdaq: INTC), which retains majority ownership. 
For more information, visit \u003Ca href=\"https://www.mobileye.com\">https://www.mobileye.com\u003C/a>.\u003C/span>\u003C/p>","2024-03-20T07:00:00.000Z","News, Industry, Autonomous Driving, ADAS, Driverless MaaS, Mapping & REM",{"id":634,"type":24,"url":635,"title":636,"description":637,"primary_tag":190,"author_name":10,"is_hidden":11,"lang":12,"meta_description":637,"image":638,"img_alt":639,"content":640,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":631,"tags":641},257,"volkswagen-admt-announces-agreement-with-mobileye-for-autonomous-driving","Volkswagen ADMT announces agreement with Mobileye for autonomous driving","Volkswagen ADMT will work with Mobileye on a fleet of autonomous vehicles at scale—a first for a global automaker.","https://static.mobileye.com/website/us/corporate/images/c245a61f928c539a7fe4569114de035b_1710931574541.jpg","The Volkswagen ID. Buzz AV built on the Mobileye Drive™ autonomous platform. ","\u003Cp>&minus; Volkswagen is the first vehicle manufacturer to develop an autonomous Level 4 service vehicle for large-scale production\u003C/p>\n\u003Cp>&minus; The aim is to use it in the commercial sector, for example for mobility and transport services in Europe and the USA\u003C/p>\n\u003Cp>&minus; High synergies with systems for automated driving within&nbsp;Volkswagen Group: shared module use from Level 2+ to Level 4\u003C/p>\n\u003Cp>Hannover, 20 March 2024 &ndash; Autonomous driving is coming. Following an extensive pilot phase with road testing in Germany and the USA, Volkswagen ADMT GmbH, part of Volkswagen AG, is announcing a cooperation agreement with Mobileye Global Inc. Mobileye shall develop and supply software, hardware components and digital maps for the self-driving ID. 
Buzz AD.\u003C/p>\n\u003Cp>The main part of the agreement covers delivery and use of a self-driving&nbsp;system (SDS) based on the Mobileye Drive&trade; platform for a special version of the ID. Buzz, which has been under development for autonomous driving since 2021. It corresponds to the so-called Level 4 definition of the Society of Automotive Engineers (SAE), in which the vehicle drives itself within a defined area such as a city. The basis for this is a set of software and hardware components, including two independent high-performance computers as well as 13 cameras, nine lidar and five radar units, which together provide 360-degree coverage of the vehicle&rsquo;s surroundings. A constant online connection to the cloud provides the autonomous vehicles with swarm data from other road users about the traffic situation as well as updates to the three-dimensional maps.\u003C/p>\n\u003Cp>&ldquo;Bringing autonomous shuttles on the road in large quantities requires cooperation from strong partners,&rdquo; says Christian Senger, member of the Board of Management at Volkswagen Commercial Vehicles, responsible for Autonomous Driving: &ldquo;We are developing the first fully autonomous large-scale production vehicle, and Mobileye brings its digital driver on board.&rdquo;\u003C/p>\n\u003Cp>An advantage of the cooperation is the synergy with systems for automated driving in the Volkswagen Group; depending on the expansion level, modules can be shared across SAE levels from 2+ to 4. The aim of Volkswagen ADMT GmbH is to develop the fully electric autonomous ID. Buzz AD for use in mobility and transport services from 2026.\u003C/p>\n\u003Cp>This also includes intelligent fleet control: The Volkswagen Group company MOIA has been operating Europe's largest private ride pooling service in&nbsp;Hamburg since 2019 and has transported more than ten million passengers to date. 
MOIA brings its practical know-how into the development: Unlike (partially) autonomous cars, which are used individually by their owners, mobility services are dedicated to transporting passengers to their desired destinations within the defined urban area and dropping them off safely.\u003C/p>\n\u003Cp>Another use case for self-driving vehicles is the transport of packages. The logistics market has grown significantly in recent years due to the increasing share of e-commerce. Delivery capacity is already one of the biggest challenges the industry is facing due to the driver shortage. Autonomous transport will therefore be a possible solution to ensure long-term delivery capability and participate in market growth. Volkswagen ADMT GmbH is working intensively on autonomous freight transport for various industries as a second important pillar alongside autonomous passenger transport. In the future, autonomous vehicles shall be able to drive to certain loading and unloading stations or to customer addresses independently.\u003C/p>\n\u003Cp>Autonomous vehicles for mobility and transportation services address the driver shortage that has persisted for many years. Robo-shuttles promote both the quality of life and the economic development of cities.\u003C/p>\n\u003Cp>\u003Cbr />\u003Cspan style=\"font-size: 10pt;\">Volkswagen Commercial Vehicles: We Transport Success, Freedom and Future\u003C/span>\u003Cbr />\u003Cspan style=\"font-size: 10pt;\">As a leading manufacturer of light commercial vehicles, the Volkswagen Commercial Vehicles brand (VWCV) is reshaping the transportation of goods, services and people in a fundamental and lasting way. Our vehicles transport construction workers, families and adventurers, bread rolls, parcels and surfboards. 
Every day they help countless people all over the world to do a good job, they operate as mobile workshops and they bring \u003C/span>\u003Cspan style=\"font-size: 10pt;\">paramedics and police personnel to wherever they are needed. At our sites in Hanover (D), Poznań (PL), Września (PL) and Pacheco (ARG), around 24,000 employees produce the Transporter, the new Multivan, Caddy, Crafter and Amarok model lines, and since May 2022 the ID. Buzz &ndash; the fully electric version of our iconic Bulli. Within the Volkswagen Group, VWCV is also the lead brand for autonomous driving and mobility offerings such as Mobility-as-a-Service and Transport-as-a-Service - areas in which we are shaping the future of mobility. In this way, the brand is transporting the society of tomorrow with all its requirements for clean, intelligent and sustainable mobility. It is this that Volkswagen Commercial Vehicles stands for with its brand promise: We transport success, freedom and future.\u003C/span>\u003C/p>","Autonomous Driving, Driverless MaaS, Mapping & REM, Industry, Mobileye Inside, News",{"id":643,"type":5,"url":644,"title":645,"description":646,"primary_tag":28,"author_name":10,"is_hidden":11,"lang":12,"meta_description":646,"image":647,"img_alt":648,"content":649,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":650,"tags":651},255,"international-womens-day-mobileye","On International Women's Day, Mobileye leaders reflect on innovation ","Moran, Talia, Ayelet and Uvi talk about their role in core technology and products.","https://static.mobileye.com/website/us/corporate/images/a8e93d4ccc8a75a84b57c8197eecc4a8_1709814450428.jpg","Talia Berkowitz (left), Dr. Ayelet Akselrod-Ballin, Moran Molnar, Uvi (Ahuva) Kroizer at Mobileye's new Jerusalem campus.","\u003Cp>\u003Cspan data-contrast=\"auto\">Innovation thrives in an environment that encourages the exchange of ideas from all perspectives. 
While this is true any day of the year, International Women&rsquo;s Day is a great opportunity to hear from some of our female leaders about how they see their work on some of Mobileye&rsquo;s core technologies and products.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">&ldquo;I love that we do something meaningful and that I feel involved in so much of what Mobileye does.&rdquo; That is how Moran Molnar, Mobileye's newly appointed Chief of Product within the CTO Office, sees the benefit of her work. &ldquo;There is never a dull moment, and our goals are always growing,&rdquo; she said. As an engineer who works in product realms, Moran describes her day-to-day as the best of both worlds. &ldquo;You need to be able to go into the technological side while also understanding the strategy,&rdquo; she explained. &ldquo;It's about being able to tell our story and what we bring to the world, but also being open to what our customers want and need and adapt accordingly.&rdquo;&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-ccp-props=\"{}\">\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/aaf69656e29fd378fdf751978201c91d_1709796736644.jpg\" alt=\"Chief of Product within the CTO Office, Moran Molnar\" width=\"600\" height=\"389\" />\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"font-size: 10pt;\">Moran Molnar, Chief Product in the CTO Office, in the autonomous vehicle workshop\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Moran has been at Mobileye for seven years, including when autonomous driving has gone from a side project, through proof of concept, to a spectrum of AV products we now know as \u003C/span>\u003Ca href=\"https://www.mobileye.com/solutions/super-vision/\" target=\"_blank\" rel=\"noopener\">Mobileye SuperVision&trade;\u003C/a>\u003Cspan data-contrast=\"auto\">, \u003C/span>\u003Ca 
href=\"https://www.mobileye.com/solutions/chauffeur/\" target=\"_blank\" rel=\"noopener\">Mobileye Chauffeur\u003Cspan data-contrast=\"none\">&trade;\u003C/span>\u003C/a>\u003Cspan data-contrast=\"auto\">, and \u003C/span>\u003Ca href=\"https://www.mobileye.com/solutions/drive/\" target=\"_blank\" rel=\"noopener\">\u003Cspan data-contrast=\"none\">Mobileye Drive\u003C/span>\u003Cspan data-contrast=\"none\">&trade;\u003C/span>\u003C/a>\u003Cspan data-contrast=\"auto\">. Since then, Mobileye has become an established leader in the autonomous driving field. Last year, research groups \u003C/span>\u003Ca href=\"https://www.mobileye.com/news/mobileye-named-av-leader-by-two-industry-research-groups/\" target=\"_blank\" rel=\"noopener\">\u003Cspan data-contrast=\"auto\">Guidehouse\u003C/span>\u003C/a>\u003Cspan data-contrast=\"none\">\u003Ca href=\"https://www.mobileye.com/news/mobileye-named-av-leader-by-two-industry-research-groups/\" target=\"_blank\" rel=\"noopener\"> Insights and ABI Research\u003C/a>\u003C/span>\u003Cspan data-contrast=\"auto\"> both placed Mobileye in the number one position in competitive assessments of AV technology suppliers. &ldquo;I have had the privilege of watching this system develop and grow, and I feel I have made a meaningful contribution to it,&rdquo; she added. &ldquo;AV is the future of this company in my mind, along with other technologies, and being a part of that means something to me.&rdquo;&nbsp;&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Moran is not the only one whose role is part of some of our biggest projects. Director of R&amp;D Production Programs, Talia Berkowitz, relies on her background in computer science and business, and leads a team working alongside many automakers. 
&ldquo;We work daily with engineers both in Mobileye and on the customer side, and our team's expertise is to deep dive into the technical details of multiple hardware, software and algorithmic domains.&rdquo; she said.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-ccp-props=\"{}\">\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/6aecdc1ae96bfbecc568957c3d2bff9f_1709796973216.jpg\" alt=\"Director of R&amp;D Production Programs, Talia Berkowitz\" width=\"600\" height=\"389\" />\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"font-size: 10pt;\">Talia Berkowitz, Director of R&amp;D Production Programs\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Talia works on \u003C/span>\u003Ca href=\"https://www.mobileye.com/blog/what-does-adas-stand-for/\" target=\"_blank\" rel=\"noopener\">\u003Cspan data-contrast=\"none\">Mobileye ADAS\u003C/span>\u003C/a>\u003Cspan data-contrast=\"auto\"> solutions, meaning her team works closely with car manufacturers and tier 1 suppliers, offering them new features to make driving safer. &ldquo;These are not some plans for five years from now. Products we worked on two years ago are in cars right now, and the products we are working on now will soon be hitting the road,&rdquo; she noted as she emphasized the importance of the customer-facing role. &ldquo;We have to make adjustments to the products based on many factors. Specific automaker, geography, driving culture, etc., all play a role in our strategy and in matching our products to the customer&rsquo;s needs.&rdquo;&nbsp; Today, Mobileye&rsquo;s ADAS technology has been integrated into more than 170 million vehicles worldwide.&nbsp; \u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Behind Mobileye&rsquo;s driver-assist products is a host of unique technologies based on our expertise in machine learning and AI. 
Senior Director of Algorithms Dr. Ayelet Akselrod-Ballin is one of the leaders behind the software that goes into Mobileye-equipped vehicles. Ayelet joined Mobileye two years ago after over two decades in the industry and academia. During her time at the Weizmann Institute and Harvard, she focused on complex computer vision and machine learning problems, and today she applies her know-how to the algorithms behind Mobileye&rsquo;s \u003C/span>\u003Ca href=\"https://www.mobileye.com/technology/rem/\" target=\"_blank\" rel=\"noopener\">\u003Cspan data-contrast=\"none\">REM&trade;\u003C/span>\u003C/a>\u003Cspan data-contrast=\"auto\"> mapping system. She, together with the technological team, works toward building efficient, innovative algorithms that push the system forward, making it more accurate and providing a high-quality solution. \u003C/span>\u003Cspan data-contrast=\"auto\">&ldquo;I have continued working in a similar machine learning and deep learning domain as I did in academia. However, I enjoy the execution mode of the industry, the ability to influence a wide project, work with a large group of talented people, and see things you conjure become a reality,&rdquo; Ayelet said.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-ccp-props=\"{}\">\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/d65a9ab3e1a45b573393f45270c969e2_1709797870634.jpg\" alt=\"Senior Director of Algorithms Dr. Ayelet Akselrod-Ballin\" width=\"600\" height=\"389\" />\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"font-size: 10pt;\">Dr. 
Ayelet Akselrod-Ballin, Senior Director of Algorithms in Mobileye mapping group\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Another one of Mobileye&rsquo;s impressive pieces of technology is its lidar project, which is dedicated to crafting a cutting-edge and cost-effective solution for the future of autonomous driving. This future is set to be enabled by integrating \u003Ca href=\"https://www.youtube.com/watch?v=6cP-0Oo0FuM\" target=\"_blank\" rel=\"noopener\">FMCW lidar\u003C/a> with our surround camera and radar sensors for hands-off/eyes-off driving. &ldquo;It is important to see the big picture; even if you work on only one detail, you have to see where it fits in,&rdquo; said Principal Software Engineer Uvi (Ahuva) Kroizer. &ldquo;I was one of the first people on this project and I am very proud that I designed and implemented key pieces of its software.&rdquo; Uvi admits that when she was younger it was harder for her to make her voice heard, but today she sees herself as someone who brings people together, which she sees as one of the key enablers of a successful product. &ldquo;I work in an interdisciplinary field, so I might not know everything there is to know,&rdquo; she said jokingly, &ldquo;but thanks to my experience I know how to bring people from different teams together. Interpersonal relations are important here; I need to represent my team, but also remember we all have the same goal of creating the best product.&rdquo;\u003C/span>\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/fd7a32749ff06d00788450f542f99d14_1709797345149.jpg\" alt=\"Principal Software Engineer Uvi (Ahuva) Kroizer\" width=\"600\" height=\"389\" />\u003C/p>\n\u003Cp>\u003Cspan style=\"font-size: 10pt;\">Uvi (Ahuva) Kroizer, Principal Engineer in Mobileye lidar group, in front of art installation made with real EyeQ chips\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">These four exceptional women provided us with a fascinating glimpse into what it takes to build the bridge towards autonomous driving. At Mobileye, our success is not only in our industry expertise, but also in guiding product development from concept to production &ndash; largely thanks to the bright minds behind the technology. Our success is in threading technological knowledge and business experience together, while transforming theoretical concepts into tangible reality. Every day, we navigate through many challenges together in pursuit of our vision: safer vehicles worldwide.\u003C/span>\u003C/p>","2024-03-07T08:00:00.000Z","News, Mobileye Inside, ADAS, Autonomous Driving, Mapping & REM, AV 
Safety",{"id":653,"type":654,"url":655,"title":656,"description":657,"primary_tag":658,"author_name":429,"is_hidden":11,"lang":12,"meta_description":657,"image":659,"img_alt":660,"content":661,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":662,"tags":663},254,"opinion","mobileye-dxp-as-a-novel-approach","The customization crossroads: Mobileye DXP as a novel approach","How we merge the universal needs of an automated driving system with the unique desires of a brand.",9,"https://static.mobileye.com/website/us/corporate/images/d3e61ddfe04ec9f19f25187580a6c1f7_1708003321461.jpg","Mobileye's DXP addresses the tension between the need to customize and time-to-market and performance risks.","\u003Cp>Over the past decade, as we have embarked on the gradual transition from driver assist towards autonomous driving, there have been several crossroads along the way where the conventional industry approach has overlooked an inconvenient reality and thus limited the potential for safer and ubiquitous autonomous vehicles &ndash; the shared goal in the industry. The path of least resistance often leads to a choice of what seems doable now, but isn&rsquo;t practical or scalable later.\u003C/p>\n\u003Cp>We saw this when it was understood that AVs needed HD maps on a level that regular navigation maps could not support, and the common approach was to use dedicated lidar-equipped vehicles to manually map an area for AV driving. This was slow, costly, geographically limited, and generated maps that were quickly out of date. 
This was when we came up with the idea of REM crowdsourced mapping using cameras already onboard ADAS-equipped vehicles travelling on their typical routes.\u003C/p>\n\u003Cp>We also saw this when the industry came to a consensus that cameras, radars, and lidars were all necessary for redundant sensing for AVs, but overlooked how equipping vehicles with that many sensors would make them highly expensive, dramatically limiting profitable business models. This is why we are investing in developing imaging radar with lidar-like output, to reduce the number of lidars included, thereby significantly lowering the hardware costs per vehicle. We even see today that our then-radical early 2000s decision to design our own chip and tightly couple hardware with software is proving central to cost-effective high-performance delivery, while others are only now considering this route.\u003C/p>\n\u003Cp>The industry is now at another crossroads, where we must not overlook an important reality, even if it is inconvenient or hard to overcome, and it has to do with the inherent tension between, on one hand, automakers' desire to customize their offerings and, on the other hand, the time-to-market and performance risks involved in full development of automated driving systems from scratch.\u003C/p>\n\u003Ch3>\u003Cstrong>The differentiability-scalability-risk tradeoff \u003C/strong>\u003C/h3>\n\u003Cp>For automated driving to become commonplace, automakers will want to customize their driving experience for their particular consumers' expectations. How to do this optimally must take into consideration three key factors: the ability for the automaker to differentiate from other brands, the ability for the supplier to scale to many automakers, and the need to minimize the risk of not executing to the desired performance, cost, and timeline. 
Given the numerous announcements about ambitious projects in recent years that never came, or have yet to come, to fruition, this execution risk cannot be overlooked. &nbsp;\u003C/p>\n\u003Cp>&nbsp;&nbsp;\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/1381038603fa2dbc89d8fcbcd240c2ef_1708001820543.jpg\" alt=\"\" width=\"650\" height=\"300\" />\u003C/p>\n\u003Ch3>\u003Cstrong>Where to draw the line between sense, plan, and act?\u003C/strong>\u003C/h3>\n\u003Cp>The foundation of all robotics is sense-plan-act: perceive the environment, make a plan, and execute the plan. If differentiability is desired, the automaker must control some part of this stack, but the hard question is where to draw the line between what the supplier controls and what the automaker controls, with all three aspects of the tradeoff in mind. If the supplier provides part of the perception and the automaker develops the rest, differentiation is obtained, but the risk to the automaker is very high; development is difficult, time-consuming, and costly. Integrating everything into one complete and robust system is extremely hard.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/945570dd6474b8083bf5dfaf1dbcdcd6_1708002027160.jpg\" alt=\"\" width=\"650\" height=\"337\" />\u003C/p>\n\u003Cp>On the other hand, if the supplier provides all of the driving policy and the automaker handles only the actions taken by the vehicle, either not enough differentiation is achieved on the part of the automaker, \u003Cem>or\u003C/em> the supplier needs to handle all of the automaker&rsquo;s driving experience requirements, which is a challenge to scalability.&nbsp;&nbsp;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/f23a6c4da27193d5421b3baceea322c2_1708002148211.jpg\" alt=\"\" width=\"650\" height=\"332\" />\u003C/p>\n\u003Cp>So then the answer seems easy &ndash; draw the line between perception and planning. 
Get all your sensing from the supplier, and the automaker can build their own driving policy and actuation stack.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/bce90c5eac3727c9ca12d4f1ab1a58e3_1708002252931.jpg\" alt=\"\" width=\"650\" height=\"331\" />\u003C/p>\n\u003Cp>While tantalizing, this too is highly problematic, and where a key reality of AV development gets overlooked: perception is never perfect, and a driving policy must be robust enough to anticipate those imperfections, requiring an intimate integration of perception and planning. Many attempts have been made with this approach and are at various stages of failure, due to what we call the \"underestimation plague\" &ndash; a tendency to vastly underestimate how hard driving policy really is.\u003C/p>\n\u003Cp>Driving policy must deal with predictions, intentions, uncertainties, and risks of decision-making errors and is therefore highly complex. This approach is therefore also not scalable for the automaker, as driving policy is intimately integrated with perception and must be continually adapted and revalidated as perception changes.&nbsp;\u003C/p>\n\u003Ch3>\u003Cstrong>Separating the universal from the unique \u003C/strong>\u003C/h3>\n\u003Cp>Given the above, the better path to enable differentiation while also minimizing execution risk and enabling scalability is to draw the line \u003Cspan style=\"font-style: normal !msorm;\">\u003Cem>in the middle of\u003C/em>\u003C/span> the policy stack, with the goal of keeping the perception and sensing integrated, while offering space to the automaker to define behavioral elements of the vehicle.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/d694ea364ba1e8f44b1ca1fb90b6eaff_1708003677023.jpg\" alt=\"\" width=\"650\" height=\"366\" />\u003C/p>\n\u003Cp>In order to do this, we must define which aspects are universal and should be the same for all systems, and which 
aspects are unique &ndash; where differentiation can and should be possible. Perception is clearly universal, and the way the vehicle performs actions is clearly unique. Driving policy, however, is partially universal and partially unique. On the one hand, it must be tailored to sensing, and on the other hand, it is responsible for the look and feel of the driving experience. The art is to find the right granularity of abstractions that will enable us to make the separation between the universal content (which we want to standardize) and the unique content (which we want to customize around).\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/c7d9afa73d33066de2b13baa1b7d31be_1708003813431.jpg\" alt=\"\" width=\"650\" height=\"360\" />\u003C/p>\n\u003Ch3>\u003Cstrong>Mobileye DXP: When, what, and how\u003C/strong>\u003C/h3>\n\u003Cp>Mobileye Driving Experience Platform, or DXP, is a programming language that separates between the universal and the unique, by organizing the decision-making according to when, what, and how. The \"when\" and \"what\" are universal, and the \"how\" is unique. For example, \"when\" approaching a stop sign, the vehicle must brake to stop (the \"what\"). But \"how\" each vehicle model brakes is unique &ndash; later and stronger, or earlier and milder, for example. Or, when approaching a roundabout, the vehicle must decide whether to yield or to proceed in front of another vehicle. Assuming both are safe, what should the vehicle do? 
This is the &ldquo;how&rdquo; category, and different automakers will want to tune their systems differently.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/a9661f51c4a3e05dadefe3c330c7d06e_1708003943206.jpg\" alt=\"\" width=\"650\" height=\"367\" />\u003C/p>\n\u003Ch3>\u003Cstrong>How it works inside DXP\u003C/strong>\u003C/h3>\n\u003Cp>Within a given scenario &ndash; the \"what\" and \"when\" &ndash; the automaker defines packages or families of \"how\" &ndash; a variety of implementations of how to brake to stop, for example. The automaker constructs packages of &ldquo;how instances&rdquo; out of the platform&rsquo;s &ldquo;how families&rdquo;. The platform provides online and offline tools for creating these packages. It also offers reference designs for required packages, in order to reduce execution risk, such that the automaker doesn't need to implement all packages from day one, but can focus on where it specifically wants to provide differentiation.\u003C/p>\n\u003Cp>Automakers then create code that selects the appropriate package during online driving, based on application parameters like locality, road type, regulation, driving mode, and weather conditions. This solves the differentiability-scalability-risk tradeoff by offering differentiability without breaking the intimate integration between sensing and policy, by using the right abstractions, and, accordingly, reducing execution risk, because the platform will be based on a working product out of the box, and all efforts can focus on differentiation. 
Automakers can also make post-production tweaks to the driving experience in response to consumer feedback.\u003C/p>\n\u003Cp>The backbone of this platform is based on redundancy in perception engines, and driving policy using Responsibility-Sensitive Safety combined with analytical calculations and intentions (you can learn more about that in \u003Ca href=\"https://www.youtube.com/watch?v=_z3qBZ6vQL8&amp;t=1349s\" target=\"_blank\" rel=\"noopener\">this video)\u003C/a>.\u003C/p>\n\u003Cp>We see great potential in DXP, and in the short time since we unveiled it at CES, several automakers have expressed interest in learning more. The tangible benefits of automated driving systems should help an automaker further define its brand. DXP offers a better roadmap to that future.\u003C/p>","2024-02-21T08:00:00.000Z","Opinion, Amnon Shashua, Industry, From our CEO, Autonomous Driving",{"id":665,"type":5,"url":666,"title":667,"description":668,"primary_tag":9,"author_name":16,"is_hidden":11,"lang":12,"meta_description":668,"image":669,"img_alt":670,"content":671,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":672,"tags":673},245,"what-does-adas-stand-for","What does ADAS stand for? And what are its features? ","Most new cars on the market today come equipped with advanced driver-assistance systems. Here are some of the features you're likely to encounter. ","https://static.mobileye.com/website/us/corporate/images/c82a41e403606f2e8dd6b665566fba2e_1703494945804.jpg","ADAS features share the common goals of making driving easier and our roads safer. 
","\u003Cp>\u003Cspan data-contrast=\"auto\">ADAS stands for Advanced Driver-Assistance System &mdash; and chances are that if your car was made in the past decade or two, it has some kind of ADAS features on board.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Mobileye has, for the past quarter of a century, been developing and supplying the computer vision technology that enables many ADAS features. Dozens of automakers sell hundreds of car models around the world with ADAS features powered by our technology, totaling more than 140 million vehicles to date (and counting). And we're continuously improving our core technologies to deliver ever more advanced capabilities.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">In this, the second installment of our \u003Ca href=\"https://www.mobileye.com/blog/what-is-advanced-driver-assistance-system-adas/\" target=\"_blank\" rel=\"noopener\">Mobileye 101 series\u003C/a>\u003C/span>\u003Cspan data-contrast=\"auto\">, we'll take you through some of the ADAS features you're likely to see in new cars on the road today, and how the state of driver assistance technology is evolving towards the future.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>\u003Cspan data-contrast=\"auto\">ADAS features today\u003C/span>\u003C/strong>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/h3>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/24e2a66e429df0bb0757150a1eddd762_1703494968277.png\" alt=\"\" width=\"650\" height=\"306\" 
/>\u003C/p>\n\u003Cp>\u003Cstrong>\u003Cspan data-contrast=\"auto\">Forward Collision Warning\u003C/span>\u003C/strong>\u003Cspan data-contrast=\"auto\"> is one of the most basic forms of driver assistance, monitoring the area in front of the vehicle and warning the driver of an imminent collision.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong>\u003Cspan data-contrast=\"auto\">Automatic Emergency Braking\u003C/span>\u003C/strong>\u003Cspan data-contrast=\"auto\"> builds upon the passive function of Forward Collision Warning by actively actuating the brakes to avoid collision.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong>\u003Cspan data-contrast=\"auto\">Blind Spot Detection\u003C/span>\u003C/strong>\u003Cspan data-contrast=\"auto\"> (also known as Blind Spot Warning or Blind Spot Monitor) alerts the driver to any obstacles in the hardest-to-see sections of the driver's field of view.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong>\u003Cspan data-contrast=\"auto\">Lane Departure Warning\u003C/span>\u003C/strong>\u003Cspan data-contrast=\"auto\"> tracks the boundaries of the lane and alerts the driver if they're drifting over the line.\u003C/span>\u003Cstrong>\u003Cspan data-contrast=\"auto\"> Lane Keep Assist\u003C/span>\u003C/strong>\u003Cspan data-contrast=\"auto\"> goes further by actively intervening to steer the vehicle back into its lane.\u003C/span>\u003Cstrong>\u003Cspan data-contrast=\"auto\"> Lane Centering\u003C/span>\u003C/strong>\u003Cspan data-contrast=\"auto\"> goes yet further to actively keep the vehicle centered within its lane. 
And \u003C/span>\u003Cstrong>\u003Cspan data-contrast=\"auto\">Lane Change Assist \u003C/span>\u003C/strong>\u003Cspan data-contrast=\"auto\">goes further still by switching lanes automatically.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong>\u003Cspan data-contrast=\"auto\">Adaptive Cruise Control\u003C/span>\u003C/strong>\u003Cspan data-contrast=\"auto\"> maintains a set speed on the highway while monitoring the distance to the vehicle ahead, slowing down and speeding up to keep a safe following distance.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong>\u003Cspan data-contrast=\"auto\">Highway Assist\u003C/span>\u003C/strong>\u003Cspan data-contrast=\"auto\"> is one of the most advanced forms of ADAS, combining such features as Adaptive Cruise Control and Lane Centering to partially automate cruising on interurban highways.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong>\u003Cspan data-contrast=\"auto\">Traffic Jam Assist\u003C/span>\u003C/strong>\u003Cspan data-contrast=\"auto\"> is similar to Highway Assist, but is designed to operate at lower speeds in stop-and-go traffic, partially automating one of the most mundane aspects of driving.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong>\u003Cspan data-contrast=\"auto\">Traffic Sign Recognition\u003C/span>\u003C/strong>\u003Cspan data-contrast=\"auto\"> detects and identifies all manner of traffic signs posted alongside the road, informing the driver and vehicle systems of relevant signals.\u003C/span>\u003Cspan 
data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong>\u003Cspan data-contrast=\"auto\">Intelligent Speed Assist\u003C/span>\u003C/strong>\u003Cspan data-contrast=\"auto\"> monitors traffic signs and other parameters of the driving environment to help drivers stay within the speed limit.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>\u003Cspan class=\"TextRun SCXW132396832 BCX0\" lang=\"EN-US\" xml:lang=\"EN-US\" data-contrast=\"auto\">\u003Cspan class=\"NormalTextRun SCXW132396832 BCX0\">The e\u003C/span>\u003Cspan class=\"NormalTextRun SCXW132396832 BCX0\">ffects of ADAS\u003C/span>\u003C/span>\u003Cspan class=\"EOP TrackedChange SCXW132396832 BCX0\" data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/strong>\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Mobileye technology supports a broad range of additional features, from emergency vehicle detection to automatic high beams. Other suppliers specialize in technology such as backup cameras, anti-lock braking systems, and tire pressure monitoring systems. Some automakers even offer specialized systems for towing trailers or driving off-road.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">These features all share the common goals of making driving easier and our roads safer. 
And they've proven effective: the \u003Ca href=\"https://www.iihs.org/media/290e24fd-a8ab-4f07-9d92-737b909a4b5e/oOlxAw/Topics/ADVANCED%20DRIVER%20ASSISTANCE/IIHS-HLDI-CA-benefits.pdf\" target=\"_blank\" rel=\"noopener\">Insurance Institute for Highway Safety\u003C/a> \u003C/span>\u003Cspan data-contrast=\"auto\">(IIHS) found passive Blind Spot Detection to reduce the incidence of lane-change collisions resulting in injuries by 23 percent, for example, and the active intervention of Automatic Emergency Braking to reduce rear-end collisions by 50 percent (resulting in 56 percent fewer injuries).\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">As these features grow more commonplace, capable, and effective, automotive safety standards have increased. Organizations like IIHS and \u003Ca href=\"https://www.euroncap.com/en/about-euro-ncap/timeline/\" target=\"_blank\" rel=\"noopener\">Euro NCAP\u003C/a> \u003C/span>\u003Cspan data-contrast=\"auto\">have instituted additional testing (and award higher marks) for vehicles equipped with key ADAS features. 
In many markets, government authorities require all new vehicles on the market to come with certain ADAS features fitted as standard, such as the \u003Ca href=\"https://www.mobileye.com/news/mobileye-launches-the-first-camera-only-intelligent-speed-assist-to-meet-new-eu-standards/\" target=\"_blank\" rel=\"noopener\">Intelligent Speed Assist\u003C/a> \u003C/span>\u003Cspan data-contrast=\"auto\">required under the EU's latest General Safety Regulation.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>\u003Cspan data-contrast=\"auto\">Evolving beyond basic ADAS\u003C/span>\u003C/strong>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/h3>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/681ee34ca3b9b876269d4b79c4d4a3d5_1703495009120.png\" alt=\"\" width=\"650\" height=\"306\" />\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Now in our 25th year, Mobileye continues to develop and improve the technologies to support ADAS features such as these. Our computer vision technology leads the industry, especially in its renowned ability to derive three-dimensional information from a single camera. Our latest algorithms are trained to not only detect other road users, but determine their intent &ndash; which can be particularly useful in predicting, for example, if a \u003Ca href=\"https://www.mobileye.com/blog/pedestrian-safety-month-protection-detection/\" target=\"_blank\" rel=\"noopener\">pedestrian\u003C/a> \u003C/span>\u003Cspan data-contrast=\"auto\">is waiting by the curb or is about to step out into the street. 
And our latest EyeQ&trade; systems-on-chip \u003C/span>\u003Ca href=\"https://www.mobileye.com/blog/enhanced-computer-vision-driver-assistance/\" target=\"_blank\" rel=\"noopener\">support more powerful camera sensors\u003C/a> \u003Cspan data-contrast=\"auto\">(with greater resolution and field of view) to see farther and wider, along with highly sophisticated artificial intelligence capabilities to derive more insights from the driving environment.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">The next step in evolving the capabilities of driver assistance is adding REM&trade;-generated maps. Our \u003Ca href=\"https://www.mobileye.com/blog/cloud-enhanced-driver-assist/\" target=\"_blank\" rel=\"noopener\">Cloud-Enhanced Driver-Assist&trade;\u003C/a> \u003C/span>\u003Cspan data-contrast=\"auto\">solution brings to the ADAS sphere the mapping tech initially designed for autonomous vehicles, enabling such features as lane centering (even in the absence of visible lane markers) and traffic-light relevancy (which can alert drivers, for instance, if they're about to run a red light).\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">With additional sensors, processing power, and RSS&trade;-based driving policy, \u003Ca href=\"https://www.mobileye.com/blog/mobileye-supervision-bridge-to-consumer-autonomous-vehicles/\" target=\"_blank\" rel=\"noopener\">Mobileye SuperVision&trade;\u003C/a> \u003C/span>\u003Cspan data-contrast=\"auto\">takes driver assistance to the next level, enabling eyes-on/hands-off automated driving capabilities. 
From there, we're \u003Ca href=\"https://www.mobileye.com/blog/when-will-self-driving-cars-be-available/\" target=\"_blank\" rel=\"noopener\">stepping up\u003C/a> \u003C/span>\u003Cspan data-contrast=\"auto\">to eyes-off/hands-off operation with Mobileye Chauffeur&trade; and driverless mobility services with Mobileye Drive&trade; as we strive to incrementally evolve from driver assistance to the autonomous future.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>","2024-02-14T08:00:00.000Z","ADAS, Mapping & REM",{"id":675,"type":24,"url":676,"title":677,"description":678,"primary_tag":190,"author_name":16,"is_hidden":11,"lang":12,"meta_description":678,"image":679,"img_alt":680,"content":681,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":682,"tags":683},253,"mobileye-and-project-3-mobility-announce-collaboration-for-scalable-urban-autonomous-mobility-project","Mobileye and Project 3 Mobility announce collaboration for scalable urban autonomous mobility project","Project 3 Mobility is developing a new mobility service leveraging Mobileye Drive and a unique electric vehicle platform","https://static.mobileye.com/website/us/corporate/images/a3432d1bf7442959befeecc389c5ee8a_1707767336699.jpg","Logos of Project 3 and Mobileye on a circuit board","\u003Cp>Jerusalem and Zagreb, February 13, 2024 &ndash; Mobileye (Nasdaq: MBLY), a global leader in autonomous driving solutions, and Project 3 Mobility (P3), a Croatian company building a new ecosystem for urban autonomous mobility, today announced a collaboration to explore a new mobility service, utilizing Mobileye&rsquo;s scalable self-driving technology, \u003Ca href=\"https://www.mobileye.com/solutions/drive/\">Mobileye Drive&trade;\u003C/a>.\u003C/p>\n\u003Cp>Project 3 Mobility aims to significantly improve transportation in urban 
areas and revolutionize mobility in cities. The first P3 service is planned to launch in Zagreb in 2026, with testing and validation of Mobileye's AV solution on the streets of the Croatian capital targeted to start in 2024.\u003C/p>\n\u003Cp>Project 3 Mobility's mission is to redefine urban mobility into a safe, effortless, and premium user experience. By leveraging technology and innovation, P3 is creating a ground-breaking ecosystem consisting of three key elements: an autonomous electric vehicle, specialized infrastructure, and a mobility service platform. Project 3 Mobility is developing a fully autonomous electric vehicle that is fundamental to implementing this new urban mobility ecosystem. The vehicle is built on a completely new platform designed around safety and comfort, fully utilizing the benefits of autonomous driving. Mobileye technology will be integrated into the P3 vehicle, which will use the Mobileye Drive autonomous driving solution that has also been adopted by some of the world&rsquo;s leading automotive companies.\u003C/p>\n\u003Cp>For the past two decades, Mobileye has used the power of computer vision and artificial intelligence to improve automotive safety, with approximately 170 million vehicles integrating Mobileye technology to date. The Mobileye Drive combination of redundant camera and lidar-radar sensing, REM&trade; crowdsourced mapping and transparent driving policy &ndash; all powered by the EyeQ&trade; system-on-chip &ndash; enables Mobileye to offer scalable, adaptable autonomous driving for robotaxis and Mobility-as-a-Service (MaaS) providers. 
The big benefits of Mobileye Drive are that it can be deployed in a wide range of geographic locations, on many different road types; that it can drive under varying weather conditions; and that it can adapt itself to local driving styles, all of which are crucial for Project 3 Mobility&rsquo;s service scalability.&nbsp;\u003C/p>\n\u003Cp>P3 has already signed agreements with 9 cities across the EU, UK and GCC to provide its urban autonomous service, and 30 more cities are planned to follow. In addition, Project 3 Mobility is about to establish a production facility in Croatia for the large-scale production of autonomous vehicles that will be deployed worldwide.\u003C/p>\n\u003Cp>&ldquo;We welcome this innovative collaboration with Project 3 Mobility to bring an exciting and promising new mobility service to life,&rdquo; said Prof. Amnon Shashua, founder and CEO of Mobileye. &ldquo;We share the vision of making urban mobility safer and more accessible, and with the flexibility of Mobileye Drive to adapt to different vehicle types, we&rsquo;re excited to explore new frontiers in mobility services.&rdquo;\u003C/p>\n\u003Cp>&ldquo;We are committed to improving how people move in cities. Our collaboration with Mobileye is a cornerstone for achieving that vision. Mobileye has proved to be a great partner that has the right experience, the right technology, and the right team. Together we will make that vision of urban autonomous mobility a reality. I am very excited to share more details in the next few months,&rdquo; said Marko Pejković, CEO of Project 3 Mobility.\u003C/p>\n\u003Cp>Project 3 Mobility&rsquo;s planned Mobility-as-a-Service project aims to provide customers with more than just transportation from point A to point B. The goal is to give customers an improved user experience tailored to their needs, by giving back time spent in traffic, opening possibilities for leisure or work. 
The realization of P3&rsquo;s project aims to increase efficiency and safety in traffic and achieve a positive impact on the environment and city infrastructure. The project aims to create significant benefits for Zagreb and many other cities that will provide this service.\u003C/p>\n\u003Cp>Project 3 Mobility will share further details about its project and service in the months ahead.\u003C/p>\n\u003Cp>\u003Cstrong>About Mobileye\u003C/strong>&nbsp;\u003C/p>\n\u003Cp>Mobileye (Nasdaq: MBLY) leads the mobility revolution with its autonomous driving and driver-assistance technologies, harnessing world-renowned expertise in computer vision, artificial intelligence, mapping, and data analysis. Since its founding in 1999, Mobileye has pioneered such groundbreaking technologies as REM&trade; mapping, True Redundancy&trade; sensing, and Responsibility Sensitive Safety&trade; (RSS&trade;). These technologies are driving the ADAS and AV fields towards the future of mobility &ndash; enabling self-driving vehicles and mobility solutions, powering industry-leading advanced driver-assistance systems and delivering valuable intelligence to optimize mobility infrastructure. To date, about 170 million vehicles worldwide have been built with Mobileye technology inside. In 2022 Mobileye was listed as an independent company separate from Intel (Nasdaq: INTC), which retains majority ownership. For more information, visit \u003Ca href=\"https://cts.businesswire.com/ct/CT?id=smartlink&amp;url=https%3A%2F%2Fwww.mobileye.com&amp;esheet=53539903&amp;newsitemid=20230817465845&amp;lan=en-US&amp;anchor=https%3A%2F%2Fwww.mobileye.com&amp;index=2&amp;md5=dc09cc0681f253ba2b178bfdedda4ac0\">https://www.mobileye.com\u003C/a>.&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>About Project 3 Mobility\u003C/strong>\u003C/p>\n\u003Cp>Project 3 Mobility is a Croatia-based company developing a new ecosystem for urban autonomous mobility. 
The innovative ecosystem consists of three key elements that P3 is creating: the urban autonomous vehicle, specialized infrastructure, and a complete service platform. By focusing on a safe, effortless, and premium user experience, the P3 service will significantly improve transportation in urban areas and revolutionize the way we move in our cities. The project's cutting-edge solution has attracted experts from all around the world. Currently, Project 3 Mobility has a team of more than 240 people with experts from more than 20 different industries and nationalities in two offices &ndash; in Croatia and the UK.\u003C/p>\n\u003Cp>The company was founded in 2019 by Croatian innovator and entrepreneur Mate Rimac, Marko Pejković, CEO of Project 3 Mobility, and Adriano Mudri, CDO at Project 3 Mobility.&nbsp;\u003C/p>\n\u003Cp>For more information, visit&nbsp;\u003Ca href=\"https://p3m.com/\">https://p3m.com/\u003C/a>\u003C/p>","2024-02-13T08:00:00.000Z","Autonomous Driving, Mobileye Inside, Driverless MaaS, News",{"id":685,"type":24,"url":686,"title":687,"description":688,"primary_tag":28,"author_name":16,"is_hidden":11,"lang":12,"meta_description":688,"image":689,"img_alt":690,"content":691,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":692,"tags":693},251,"hirain-to-mass-produce-first-mobileye-eyeq6-based-adas-system-in-china","HiRain to mass produce first Mobileye EyeQ6-based ADAS system in China ","HiRain will produce the first ADAS systems in China to use the Mobileye EyeQ6L system-on-chip","https://static.mobileye.com/website/us/corporate/images/235895a44626fcbbd580e8fb1631ac65_1734858410808.png","HiRain and Mobileye","\u003Cp>BEIJING, 22 January 2024 -- HiRain Technologies, a system provider of intelligent driving solutions to automakers in China, proudly announces the upcoming production of the Mobileye EyeQ\u003Csup>&trade;\u003C/sup>6 Lite based ADAS system, 
poised to debut in China in the second quarter of 2024. This marks a significant stride in advancing automotive safety and comfort features.\u003C/p>\n\u003Cp>The EyeQ6, the newest member of Mobileye's systems-on-chip portfolio, is engineered to redefine performance and efficiency in both core and premium ADAS offerings. Featuring Mobileye's sophisticated vision-based sensing technology, the EyeQ6 Lite excels in real-time detection and analysis of its surroundings. Its advanced AI algorithms process this information, enabling precise decision-making in diverse driving conditions. The EyeQ6 Lite also supports Cloud-Enhanced ADAS, providing automakers with a single solution for both the Chinese and international markets.\u003C/p>\n\u003Cp>Working with HiRain, Mobileye has optimized the EyeQ6 Lite for a variety of vehicles. Its compact design and low power consumption make it ideally suited for integration into current vehicle platforms, transforming it into a pivotal component in intelligent transportation.\u003C/p>\n\u003Cp>&ldquo;Our collaboration with Mobileye since 2012 has now brought forth the EyeQ6 Lite-based ADAS, HiRain's 7th generation in this series,&rdquo; said Dr. Chengjian Fan, Deputy General Manager of HiRain. 
&ldquo;Its outstanding performance, including L2+ features like AEB, ACC, LKA, iACC, TJA, and Euro NCAP 2023 5-star compliance, offers our OEM clients a reliable, cost-effective product.&rdquo;\u003C/p>\n\u003Cp>\u003Cimg style=\"display: block; margin-left: auto; margin-right: auto;\" src=\"https://static.mobileye.com/website/us/corporate/images/d665e0d903e15712cfe3e146e279333c_1705924213120.jpg\" alt=\"A picture of the EyeQ6L system-on-chip from Mobileye\" width=\"620\" height=\"339\" />\u003C/p>\n\u003Cp>The EyeQ6 line builds on more than 170 million EyeQ units delivered globally since 2007, enabling key safety features such as automatic emergency braking for dozens of automakers worldwide.\u003C/p>\n\u003Cp>\"The EyeQ6 Lite project is a symbol of our steadfast dedication to safety and automotive-grade excellence,\" stated Elie Luskin, Mobileye Vice President and Managing Director of Mobileye China. \"Our robust partnership with HiRain, spanning projects from EyeQ\u003Csup>&trade;\u003C/sup>3 to SuperVision\u003Csup>&trade;\u003C/sup>, reflects our shared vision for high-quality, reliable automotive innovations. This collaboration has been key in advancing both safety standards and technological sophistication in the automotive industry.\"\u003C/p>\n\u003Cp>\u003Cstrong>About HiRain \u003C/strong>\u003C/p>\n\u003Cp>Founded in 2003, HiRain focuses on providing customers in the fields of automobile and unmanned transportation with electronic products, R&amp;D services and overall solutions for high-level intelligent driving. Headquartered in Beijing, HiRain has established modern production plants in Tianjin and Nantong, forming a complete R&amp;D, production, marketing, and service system. 
Based on the concept of \"value innovation and serve customers\", the company adheres to the strategies of \"professional focus\", \"technology leadership\" and \"platform development\", and is committed to becoming a world-class comprehensive electronic system technology service provider, a full stack solution supplier for intelligent networked vehicles and a leader in high-level intelligent driving MaaS solutions. For more information, visit \u003Ca href=\"https://www.hirain.com\">https://www.hirain.com\u003C/a>\u003C/p>\n\u003Cp>\u003Cstrong>About Mobileye\u003C/strong>\u003C/p>\n\u003Cp>Mobileye (Nasdaq: MBLY) leads the mobility revolution with its autonomous driving and driver-assistance technologies, harnessing world-renowned expertise in computer vision, artificial intelligence, mapping, and data analysis. Since its founding in 1999, Mobileye has pioneered such groundbreaking technologies as REM&trade; crowdsourced mapping, True Redundancy&trade; sensing, and Responsibility Sensitive Safety (RSS). These technologies are driving the ADAS and AV fields towards the future of mobility &ndash; enabling self-driving vehicles and mobility solutions, powering industry-leading advanced driver-assistance systems and delivering valuable intelligence to optimize mobility infrastructure. To date, more than 170 million vehicles worldwide have been built with Mobileye technology inside. In 2022 Mobileye listed as an independent company separate from Intel (Nasdaq: INTC), which retains majority ownership. 
For more information, visit \u003Ca href=\"https://www.mobileye.com\">https://www.mobileye.com\u003C/a>\u003C/p>","2024-01-22T08:00:00.000Z","Industry, News, ADAS, Mobileye Inside",{"id":695,"type":5,"url":696,"title":697,"description":698,"primary_tag":51,"author_name":16,"is_hidden":11,"lang":12,"meta_description":698,"image":699,"img_alt":700,"content":701,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":692,"tags":702},252,"mobileye-ces-2024","What happened in Vegas: Mobileye at CES 2024  ","Mobileye made its presence felt at CES again, showcasing our latest technologies and introducing Mobileye DXP—our new Driving Experience Platform.","https://static.mobileye.com/website/us/corporate/images/c30e532299c9f589ff8c9dd811c038ec_1705931822539.jpg","Entering CES 2024 at the Las Vegas Convention Center.","\u003Cp>\u003Cspan data-contrast=\"auto\">It was all about experiencing brand-new technologies at CES 2024, and, just as in previous years, Mobileye was a main attraction on the show floor in Las Vegas, giving visitors an opportunity to witness our vision of an autonomous future.&nbsp; \u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">New collaborations and a new platform took center stage as Mobileye presented its latest developments at the biggest in-person tech event in the world, with over 135,000 people in attendance. 
\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">From first glimpses of vehicles equipped with our cutting-edge technologies to the unveiling of our latest breakthroughs, here is a quick recap of all the action in and around Mobileye at CES 2024.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003Ciframe title=\"YouTube video player\" src=\"https://www.youtube.com/embed/hrEpvBG4u14?si=aLA9H38kj7bsQgb6\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Ch3>\u003Cstrong>\u003Cspan data-contrast=\"auto\">Professor Amnon Shashua at CES 2024: Now. Next. Beyond.&nbsp;\u003C/span>\u003C/strong>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/h3>\n\u003Cp>\u003Cspan data-teams=\"true\">On the first day of CES, \u003Ca id=\"menur1ru\" class=\"fui-Link ___1q1shib f2hkw1w f3rmtva f1ewtqcl fyind8e f1k6fduh f1w7gpdv fk6fouc fjoy568 figsok6 f1s184ao f1mk8lai fnbmjn9 f1o700av f13mvf36 f1cmlufx f9n3di6 f1ids18y f1tx3yz7 f1deo86v f1eh06m1 f1iescvh fhgqx19 f1olyrje f1p93eir f1nev41a f1h8hb77 f1lqvz6u f10aw75t fsle3fq f17ae5zn\" title=\"https://www.mobileye.com/amnon-shashua/\" href=\"https://www.mobileye.com/amnon-shashua/\" target=\"_blank\" rel=\"noreferrer noopener\" aria-label=\"Link Mobileye's CEO\">Mobileye's CEO\u003C/a> delivered his much-anticipated \u003Ca id=\"menur1s0\" class=\"fui-Link ___1q1shib f2hkw1w f3rmtva f1ewtqcl fyind8e f1k6fduh f1w7gpdv fk6fouc fjoy568 figsok6 f1s184ao f1mk8lai fnbmjn9 f1o700av f13mvf36 f1cmlufx f9n3di6 f1ids18y f1tx3yz7 f1deo86v f1eh06m1 f1iescvh fhgqx19 f1olyrje f1p93eir f1nev41a f1h8hb77 f1lqvz6u f10aw75t fsle3fq f17ae5zn\" title=\"https://www.mobileye.com/news/prof-amnon-shashua-at-ces-2024/\" href=\"https://www.mobileye.com/news/prof-amnon-shashua-at-ces-2024/\" 
target=\"_blank\" rel=\"noreferrer noopener\" aria-label=\"Link annual press conference\">annual press conference\u003C/a>. Prof. Shashua analyzed the present and future of Mobileye&rsquo;s assisted and autonomous driving systems. He also elaborated on our recent announcements that Mobileye has been awarded \u003Ca id=\"menur1s2\" class=\"fui-Link ___1q1shib f2hkw1w f3rmtva f1ewtqcl fyind8e f1k6fduh f1w7gpdv fk6fouc fjoy568 figsok6 f1s184ao f1mk8lai fnbmjn9 f1o700av f13mvf36 f1cmlufx f9n3di6 f1ids18y f1tx3yz7 f1deo86v f1eh06m1 f1iescvh fhgqx19 f1olyrje f1p93eir f1nev41a f1h8hb77 f1lqvz6u f10aw75t fsle3fq f17ae5zn\" title=\"https://www.mobileye.com/news/mobileye-reveals-new-wins-for-key-tech-platforms-with-large-global-automaker/\" href=\"https://www.mobileye.com/news/mobileye-reveals-new-wins-for-key-tech-platforms-with-large-global-automaker/\" target=\"_blank\" rel=\"noreferrer noopener\" aria-label=\"Link production design wins\">production design wins\u003C/a> by a major Western automaker.&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">In addition, Prof. 
Shashua \u003C/span>\u003Cspan data-contrast=\"auto\">introduced Mobileye DXP, a new platform that offers automakers the ability to customize the driving experience, preserving their brand and style.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">\u003Ciframe title=\"YouTube video player\" src=\"https://www.youtube.com/embed/uco1z54FAdA?si=g8QYKWHglkow14Lv\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>\u003Cspan data-contrast=\"auto\">Stealing the show\u003C/span>\u003C/strong>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"auto\">The new DXP platform was at the fore in Mobileye's booth as well, which drew thousands of attendees who were wowed by everything on display. 
\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">With an assist in the form of a \u003Ca href=\"https://www.youtube.com/watch?v=Aw_R7zqs6Sk\" target=\"_blank\" rel=\"noopener\">presentation\u003C/a> by our \u003C/span>\u003Cspan data-contrast=\"none\">Executive VP of Strategy &amp; Business Development, Nimrod Nehushtan\u003C/span>\u003Cspan data-contrast=\"auto\">, our main show took the audience on a journey through Mobileye&rsquo;s history, by way of its here and now, all the way to what lies ahead.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">\u003Ciframe title=\"YouTube video player\" src=\"https://www.youtube.com/embed/IBXl4kJ2mfQ?si=0dNe5SfIGgoQjKwz\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>\u003Cspan data-contrast=\"auto\">A deep dive into Mobileye DXP \u003C/span>\u003C/strong>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/h3>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Our new Driving Experience Platform was also the subject of Mobileye CTO Prof. Shai Shalev-Shwartz's CES 2024 presentation. \u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Explaining where the idea behind the system came from and how it balances safety and scale, Prof. 
Shalev-Shwartz highlighted the platform's ability to provide automakers with AI tools to program the vehicle&rsquo;s functionality, allowing them to customize the driving experience of their vehicles to their preferences.&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">\u003Ciframe title=\"YouTube video player\" src=\"https://www.youtube.com/embed/_z3qBZ6vQL8?si=QQ96oUSBpsUyVzJr\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/span>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Mobileye&rsquo;s yearly visit to CES was thrilling and inspiring. Although the big event has powered down, rest assured that plans for next year are already underway, as we continue to work toward creating a bright autonomous future.\u003C/span>\u003C/p>","News, Autonomous Driving, Industry, Video, Events",{"id":704,"type":24,"url":705,"title":706,"description":707,"primary_tag":32,"author_name":16,"is_hidden":11,"lang":12,"meta_description":707,"image":708,"img_alt":709,"content":710,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":711,"tags":712},250,"prof-amnon-shashua-at-ces-2024","Prof. Amnon Shashua at CES 2024: Now. Next. Beyond. 
","On the first day of a record-breaking year at CES, Amnon took the stage at his annual press conference to discuss the current state of autonomous driving—and its future possibilities.","https://static.mobileye.com/website/us/corporate/images/62e61593274d34538d7719105558a450_1705490954008.jpg","Mobileye CEO Prof. Amnon Shashua speaking at his annual CES press conference.","\u003Cp>Mobileye&rsquo;s annual CES press conference was again an industry focal point, with Mobileye CEO Professor Amnon Shashua presenting his insights into key issues.\u003C/p>\n\u003Cp>During his presentation, Prof. Shashua discussed the current state of self-driving car technologies and future possibilities for the industry, zeroing in on two main issues: (1) creating a safer eyes-off system, and (2) providing a platform for carmakers that allows them to customize their driving experience, which was unveiled as Mobileye&rsquo;s new DXP platform.\u003C/p>\n\u003Cp>\u003Cspan data-teams=\"true\">Prof. Shashua also introduced new and exciting collaborations between Mobileye and several OEMs, including Mobileye securing a series of \u003Ca id=\"menur1av\" class=\"fui-Link ___1q1shib f2hkw1w f3rmtva f1ewtqcl fyind8e f1k6fduh f1w7gpdv fk6fouc fjoy568 figsok6 f1s184ao f1mk8lai fnbmjn9 f1o700av f13mvf36 f1cmlufx f9n3di6 f1ids18y f1tx3yz7 f1deo86v f1eh06m1 f1iescvh fhgqx19 f1olyrje f1p93eir f1nev41a f1h8hb77 f1lqvz6u f10aw75t fsle3fq f17ae5zn\" title=\"https://www.mobileye.com/news/mobileye-reveals-new-wins-for-key-tech-platforms-with-large-global-automaker/\" href=\"https://www.mobileye.com/news/mobileye-reveals-new-wins-for-key-tech-platforms-with-large-global-automaker/\" target=\"_blank\" rel=\"noreferrer noopener\" aria-label=\"Link production design wins\">production design wins\u003C/a> from a major Western automaker and Chery's approaching launch of Mobileye&rsquo;s Cloud-Enhanced ADAS on its Exeed VX model.\u003C/span>\u003C/p>\n\u003Cp>During the event, held in Las Vegas, Prof. 
Shashua highlighted breakthroughs in AI and the opportunities to apply them to autonomous driving. He also updated the expected timeline for Mobileye&rsquo;s EyeQ&trade; 6H SoC and noted the next generation of Mobileye's imaging radars.\u003C/p>\n\u003Cp>Watch the video below to see the full press conference and learn about Mobileye's bridge from hands-on to no-driver systems.\u003C/p>\n\u003Cp>\u003Ciframe src=\"https://www.youtube.com/embed/uco1z54FAdA?si=CaV3jgH9LqWAiXUT&amp;amp\" width=\"560\" height=\"314\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>","2024-01-17T08:00:00.000Z","Video, Amnon Shashua, News, Autonomous Driving, ADAS",{"id":714,"type":24,"url":715,"title":716,"description":717,"primary_tag":28,"author_name":16,"is_hidden":11,"lang":12,"meta_description":717,"image":718,"img_alt":719,"content":720,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":721,"tags":722},249,"exeed-chery-autos-luxury-brand-chooses-mobileye-and-wbtl-to-launch-first-cloud-enhanced-driver-assist-in-china","EXEED, Chery Auto’s luxury brand chooses Mobileye and WBTL to launch first cloud-enhanced driver-assist in China","The system uses continuous updates to enable vehicles to adapt to road conditions.","https://static.mobileye.com/website/us/corporate/images/dc97cfe61a5c00b94f9afe09d5c163b8_1705322793883.jpg","The Exeed VX luxury SUV","\u003Cp>Wuhu, January 15, 2024 &ndash; EXEED, Chery Auto's luxury brand, announced today it will be the first automaker in China to launch a Cloud-Enhanced Driver-Assist&trade;\u003Csup>&nbsp;\u003C/sup>system, joining with Mobileye (Nasdaq: MBLY) and WBTL ADAS to debut the technology on the Exeed VX.\u003C/p>\n\u003Cp>The EXEED VX with the \u003Ca href=\"https://www.mobileye.com/solutions/cloud-enhanced-driver-assist/\" target=\"_blank\" rel=\"noopener\">Cloud-Enhanced Driver-Assist\u003C/a> system can leverage continuous updates to enable 
vehicles to adapt to various driving situations. This system is particularly effective in challenging scenarios, such as poor visibility or complex traffic situations, demonstrating Chery/EXEED&rsquo;s commitment to safety and technological innovation.\u003C/p>\n\u003Cp>\"The introduction of the Exeed VX equipped with Mobileye Cloud-Enhanced Driver-Assist technology represents a major leap forward,\" said Dr. GAO Jiabing, Head of Automated Driving, Assistant General Manager of Chery Auto. \"This collaboration with Mobileye and WBTL underscores our dedication to setting new standards in automotive safety and technological advancements.\"\u003C/p>\n\u003Cp>The cloud-enhanced functions leverage Mobileye&rsquo;s REM\u003Csup>&trade;\u003C/sup> technology and will launch in February 2024; by the first quarter of 2024, they will support more than 37 cities across China, soon reaching wider coverage across the country. They enable unique features such as lane-keeping on roads without lane markings and adaptation to the applicable driving speed.\u003C/p>\n\u003Cp>\"Launching our Cloud-Enhanced Driver-Assist technology with Chery/EXEED, for the first time in China, is a significant step,&rdquo; said Elie Luskin, VP and China Managing Director at Mobileye. &ldquo;Chery/EXEED's focus on safety and their reach in both domestic and international markets align perfectly with Mobileye's crowd-based technology, offering a unified solution across different regions. Implementing Mobileye REM technology in Chery/EXEED vehicles will not only enhance driver safety but also contribute to overall road safety.\"\u003C/p>\n\u003Cp>\u003Cstrong>About EXEED &amp; Chery Auto\u003C/strong>\u003C/p>\n\u003Cp>EXEED, launched by Chery Auto in 2017, represents the pinnacle of luxury and innovation in Chery's diverse automotive portfolio. As a premium SUV division, EXEED has quickly become synonymous with high-end design and performance, showcasing models like the TX, VX, and LX. 
Meanwhile, Chery Auto, renowned for its wide range of vehicles catering to various market segments, continues to make significant strides in the industry. Both EXEED and Chery embody a commitment to advanced technology and sustainable mobility, with EXEED's introduction of the electric car concept EXLANTIX highlighting this shared vision. Together, they demonstrate a blend of sophistication, innovation, and a forward-thinking approach in the global automotive market.\u003C/p>\n\u003Cp>Since its inception, EXEED has been an automotive brand that meets the travel needs and tastes of high-end customers, built on more than 20 years of European engineering heritage and a team of experts who have provided high-quality services to well-known automotive companies around the world. With its R&amp;D center located in Frankfurt, Germany, EXEED has established R&amp;D partnerships with top European and American companies.\u003C/p>\n\u003Cp>In recent years, EXEED has steadily expanded its product matrix, with models such as the LX, TXL, VX, and RX covering the compact, mid-size, and full-size segments. Rooted in multiple markets including Central Europe, the Middle East, and South America, it has gradually become a leading choice for high-end users.\u003C/p>\n\u003Cp>Nowadays, facing the new-energy trend, EXEED holds a forward-looking vision and has launched the highly anticipated high-end pure electric series, EXLANTIX. EXEED is poised to redefine future travel with a brand-new attitude.\u003C/p>\n\u003Cp>\u003Cstrong>About Mobileye\u003C/strong>\u003C/p>\n\u003Cp>Mobileye (Nasdaq: MBLY) leads the mobility revolution with its autonomous driving and driver-assistance technologies, harnessing world-renowned expertise in computer vision, artificial intelligence, mapping, and data analysis. 
Since its founding in 1999, Mobileye has pioneered such groundbreaking technologies as REM&trade; crowdsourced mapping, True Redundancy&trade; sensing, and Responsibility Sensitive Safety (RSS). These technologies are driving the ADAS and AV fields towards the future of mobility &ndash; enabling self-driving vehicles and mobility solutions, powering industry-leading advanced driver-assistance systems and delivering valuable intelligence to optimize mobility infrastructure. To date, more than 150 million vehicles worldwide have been built with Mobileye technology inside. In 2022 Mobileye listed as an independent company separate from Intel (Nasdaq: INTC), which retains majority ownership. For more information, visit \u003Ca href=\"https://www.mobileye.com\">https://www.mobileye.com\u003C/a>\u003C/p>\n\u003Cp>\u003Cstrong>About WBTL ADAS\u003C/strong>:\u003C/p>\n\u003Cp>WBTL is a tier-1 supplier providing active and passive safety systems for passenger and commercial vehicles, delivering leading solutions in ADAS, Braking, Steering, and Lightweight applications to more than 30 OEM customers around the globe.\u003C/p>\n\u003Cp>WBTL ADAS aims to accelerate next generation ADAS/AD technology, allowing wide and efficient deployment of life-saving ADAS systems on vehicles in more than 15 countries. 
WBTL ADAS&rsquo; deep, AI-based knowhow in chassis control and vehicle dynamics uniquely positions it to push toward a more robust automated driving future.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>","2024-01-15T08:00:00.000Z","ADAS, Industry, News",{"id":724,"type":24,"url":725,"title":726,"description":727,"primary_tag":16,"author_name":16,"is_hidden":11,"lang":12,"meta_description":727,"image":728,"img_alt":729,"content":730,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":731,"tags":732},247,"mobileye-reveals-new-wins-for-key-tech-platforms-with-large-global-automaker","Mobileye reveals new wins for key tech platforms with large global automaker ","Mobileye automated driving tech slated for use across multiple global brands and 17 new models, starting from 2026.","https://static.mobileye.com/website/us/corporate/images/66263c67cd7ee9163cb270e6fdeee5e4_1704644722450.jpg","At CES 2024, Mobileye revealed new wins for its three key platforms – Mobileye SuperVision™, Mobileye Chauffeur™ and Mobileye Drive™.","\u003Cp>\u003Cstrong>\u003Cspan data-contrast=\"auto\">LAS VEGAS, JAN 8, 2024 &ndash; \u003C/span>\u003C/strong>\u003Cspan data-contrast=\"auto\">Mobileye (Nasdaq: MBLY) announced today that it has been awarded a series of production design wins by a major Western automaker. 
Under these design wins, multiple global brands are expected to implement new automated driving solutions using Mobileye&rsquo;s three key platforms &ndash; Mobileye SuperVision\u003C/span>\u003Cspan data-contrast=\"none\">&trade;\u003C/span>\u003Cspan data-contrast=\"auto\">, Mobileye Chauffeur\u003C/span>\u003Cspan data-contrast=\"none\">&trade;\u003C/span>\u003Cspan data-contrast=\"auto\"> and Mobileye Drive\u003C/span>\u003Cspan data-contrast=\"none\">&trade;\u003C/span>\u003Cspan data-contrast=\"auto\"> &ndash; for 17 internal combustion and electric vehicle models, which are set to begin rolling out in 2026.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">The extensive set of awards includes Mobileye's unique and innovative software tool that will ensure each brand can maintain the utmost level of customization and personalization in their driving experiences. The premium ADAS and automated solutions are expected to be offered on multiple vehicle platforms across a broad range of geographies and various powertrain types, and can be easily expanded to additional models based on demand.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">&ldquo;These design wins represent an historic milestone in the development of automated driving, and will greatly increase its availability to customers globally,&rdquo; said Mobileye CEO Prof. Amnon Shashua. 
&ldquo;Execution of these production programs will set the standard for software-driven intelligent driving, leveraging the expertise of both companies at volume to serve customers around the world.&rdquo;&nbsp;&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Mobileye will work with the various brands as a Tier 1 to develop new services for hands-off, eyes-on driving on the Mobileye SuperVision platform, leveraging AI-powered surround computer vision and radar that enable navigate-on-pilot functions for highway, rural and urban roads in defined operational domains. These services are currently expected to begin rolling out across multiple markets and regions in 2026.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Mobileye will also work with these automotive brands to implement the Mobileye Chauffeur platform to select models, offering eyes-off, hands-off advanced driving solutions in specified operating design domains. Mobileye enables Chauffeur by adding a second, independent perception system leveraging radar and lidar sensor outputs, as well as additional computing power as needed, to the SuperVision platform, creating a naturally scalable upgrade path for automakers.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">The two companies also agreed to bring fully autonomous vehicles into series production. Powered by the Mobileye Drive platform, this program is designed to produce purpose-built vehicles utilized in robotaxi and mobility-as-a-service operations. 
The Drive-enabled vehicles leverage computer vision, lidar and Mobileye imaging radar, with initial driverless deployments targeted for 2026.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">All systems will use the Mobileye EyeQ\u003C/span>\u003Cspan data-contrast=\"none\">&trade;\u003C/span>\u003Cspan data-contrast=\"auto\">6H systems-on-chip designed for powerful but efficient computing to integrate all sensing and REM crowdsourced mapping with safe driving policy.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">&ldquo;Since our founding, we have focused on delivering the safety and convenience benefits of advanced computer vision technology around the world,&rdquo; said Shashua. &ldquo;The pace of innovation has undoubtedly increased and the breadth of this agreement serves as a blueprint for the scalability and customizability of our technology stack, with SuperVision serving as a bridge to eyes-off systems for both consumer-owned vehicle and mobility-as-a-service markets.&rdquo;\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>","2024-01-08T08:00:00.000Z","ADAS, Autonomous Driving",{"id":734,"type":69,"url":735,"title":736,"description":737,"primary_tag":73,"author_name":16,"is_hidden":11,"lang":12,"meta_description":737,"image":738,"img_alt":739,"content":740,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":741,"tags":742},243,"mobileye-at-ces-2024","Mobileye at CES 2024","Visit our press kit throughout the show for our latest news, videos, photos and 
more.","https://static.mobileye.com/website/us/corporate/images/3b11d58d5ed1e6627e8ae0adb746a603_1703018865471.jpg","Mobileye: Now. Next. Beyond. with President & CEO Prof. Amnon Shashua will be Presented on January 9, 2024, at 11:00 a.m. PT. ","\u003Cp>Mobileye returns to Las Vegas for CES 2024, demonstrating progress and innovation on our path to delivering an evolutionary vision of autonomy.\u003C/p>\n\u003Cp>Visit \u003Ca href=\"https://www.mobileye.com/ces-2024/\" target=\"_blank\" rel=\"noopener\">mobileye.com/ces-2024\u003C/a> for complete information on all of our CES activities.\u003C/p>\n\u003Ch3 style=\"font-size: 21px; font-weight: bold;\">&nbsp;\u003C/h3>\n\u003Ch3 style=\"font-size: 21px; font-weight: bold;\">\u003Cstrong>Mobileye CES 2024 News\u003C/strong>\u003C/h3>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/news/mobileye-announces-ces-2024-press-conference-with-prof-amnon-shashua/\" target=\"_blank\" rel=\"noopener\">Mobileye Announces CES 2024 Press Conference with Prof. Amnon Shashua\u003C/a> (Press Release, December 20, 2023)\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/news/mobileye-reveals-new-wins-for-key-tech-platforms-with-large-global-automaker/\" target=\"_blank\" rel=\"noopener\">Mobileye Reveals New Wins for Key Tech Platforms With Large Global Automaker\u003C/a> (Press Release, January 8, 2024)\u003C/p>\n\u003Ch3 style=\"font-size: 21px; font-weight: bold;\">&nbsp;\u003C/h3>\n\u003Ch3 style=\"font-size: 21px; font-weight: bold;\">Presentations\u003C/h3>\n\u003Cp>\u003Ca href=\"https://static.mobileye.com/website/corporate/docs/product_portfolio_2024.pdf\" target=\"_blank\" rel=\"noopener\">The full spectrum of Mobileye solutions, from driver assistance to autonomous driving.\u003C/a> (Product Brief)\u003C/p>\n\u003Cp>\u003Ca href=\"https://static.mobileye.com/website/corporate/ces-2024/CES_2024_Now._Next._Beyond.pdf\" target=\"_blank\" rel=\"noopener\">Mobileye: Now. Next. Beyond. CES 2024 Press Conference with Prof. 
Amnon Shashua\u003C/a> (Presentation)\u003C/p>\n\u003Cp>\u003Ca href=\"https://static.mobileye.com/website/corporate/ces-2024/Mobileye_DXP_Prof._Shai_Shalev-Shwartz.pdf\" target=\"_blank\" rel=\"noopener\">Mobileye's Driving Experience Platform: Architecture, Abstractions, and APIs, presented by Mobileye CTO Prof. Shai Shalev-Shwartz\u003C/a> (Presentation)\u003C/p>\n\u003Ch3 style=\"font-size: 21px; font-weight: bold;\">&nbsp;\u003C/h3>\n\u003Ch3 style=\"font-size: 21px; font-weight: bold;\">Photos\u003C/h3>\n\u003Cp>\u003Cstrong>Mobileye at CES 2024\u003C/strong>\u003C/p>\n\u003Cp>[**]gallery:ces-2024-updated[**]\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>Mobileye's Advanced Platforms in the Driver's Seat\u003C/strong>\u003C/p>\n\u003Cp>[**]gallery:mobileyes-advanced-platforms-in-the-drivers-seat[**]\u003C/p>\n\u003Ch3 style=\"font-size: 21px; font-weight: bold;\">&nbsp;\u003C/h3>\n\u003Ch3 style=\"font-size: 21px; font-weight: bold;\">Videos\u003C/h3>\n\u003Cp>\u003Ca href=\"https://vimeo.com/901092712/422cd0f7d0?share=copy\" target=\"_blank\" rel=\"noopener\">Mobileye 2024 Broll\u003C/a>\u003C/p>\n\u003Cp>\u003Ca href=\"https://vimeo.com/901109646/b688a364d8?share=copy\" target=\"_blank\" rel=\"noopener\">Introducing Mobileye DXP\u003C/a>\u003C/p>\n\u003Cp>\u003Ca href=\"https://vimeo.com/901643395?share=copy\" target=\"_blank\" rel=\"noopener\">Mobileye: Now. Next. Beyond. CES 2024 Press Conference with Prof. 
Amnon Shashua\u003C/a> (Replay)\u003C/p>\n\u003Cp>\u003Ca href=\"https://vimeo.com/902025604\" target=\"_blank\" rel=\"noopener\">Mobileye's Driving Experience Platform: Architecture, Abstractions, and APIs.\u003C/a> (Replay)\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Ch3 style=\"font-size: 21px; font-weight: bold;\">Mobileye Brand\u003C/h3>\n\u003Cp>[**]gallery:mobileye's-logo[**]\u003C/p>\n\u003Cp>&nbsp;\u003C/p>","2024-01-02T08:00:00.000Z","Autonomous Driving, Press Kit, Events, ADAS, News, Industry",{"id":744,"type":5,"url":745,"title":746,"description":747,"primary_tag":9,"author_name":16,"is_hidden":11,"lang":12,"meta_description":747,"image":748,"img_alt":749,"content":750,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":751,"tags":752},246,"drivers-in-china-post-reaction-videos-to-supervision","Drivers in China react to Mobileye's SuperVision™","Drivers in China upload their own content to showcase the success of Mobileye's SuperVision™ system inside the ZEEKR 001.","https://static.mobileye.com/website/us/corporate/images/d1846ded0582c424828c3a5340b27b38_1704116939277.png","Drivers who received the OTA update to Mobileye's SuperVision™ documented ten impressive driving scenarios.","\u003Cp>This last fall, Chinese automotive brand ZEEKR launched a major \u003Ca href=\"https://www.mobileye.com/news/mobileye-supervision-pilot-functions-added-to-110000-zeekr-vehicles/\" target=\"_blank\" rel=\"noopener\">over-the-air (OTA) update\u003C/a> to 110,000 of its 001 models that support highly automated driving features powered by Mobileye SuperVision&trade;. 
The update delivered full navigate-on-pilot capabilities in specified domains to the Navigation ZEEKR Pilot (NZP) driving assistance system, and drivers responded by uploading videos documenting their experiences.\u003C/p>\n\u003Cp>The update leverages SuperVision&rsquo;s 11-camera 360&deg; surround vision sensing capability and additional Mobileye technology, all of which provides the system with the ability to construct an environmental model and extract semantic information such as common speed and average drivable path. These advanced capabilities allow the system to &ldquo;think&rdquo; ahead and make predictions based on a full understanding of the driving environment.\u003C/p>\n\u003Cp>This video compilation covers ten real-life driving scenarios (numbered from 10 to 1) showcasing NZP&rsquo;s impressive new features. You can witness how Mobileye SuperVision&trade; drives safely and naturally, mimicking how human drivers negotiate the road in challenging situations that are difficult even for them, including evading a bus entering the vehicle's lane on the highway, avoiding stopped vehicles, and successfully navigating complex construction zones.\u003C/p>\n\u003Cp>Click on the video below for more.\u003C/p>\n\u003Cp>\u003Ciframe src=\"https://player.vimeo.com/video/915097455?h=73613131d2&amp;title=0&amp;byline=0&amp;portrait=0\" width=\"640\" height=\"360\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>","2024-01-01T08:00:00.000Z","ADAS, Industry, Mapping & REM, Video, Mobileye Inside",{"id":754,"type":24,"url":755,"title":756,"description":757,"primary_tag":16,"author_name":16,"is_hidden":11,"lang":12,"meta_description":757,"image":758,"img_alt":759,"content":760,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":761,"tags":762},244,"mobileye-announces-ces-2024-press-conference-with-prof-amnon-shashua","Mobileye Announces CES 2024 Press 
Conference with Prof. Amnon Shashua  ","Mobileye: Now. Next. Beyond. and additional events presented live from Las Vegas  ","https://static.mobileye.com/website/us/corporate/images/70e29fa9085f2151107ae35044c5a59b_1703020097934.jpg","President & CEO Prof. Amnon Shashua will deliver his annual CES address on January 9, 2024, at 11:00 a.m. PT.","\u003Cp>\u003Cspan data-contrast=\"none\">Mobileye (Nasdaq: MBLY) will present its 2024 CES Press Conference, \u003C/span>\u003Cem>\u003Cspan data-contrast=\"none\">Mobileye: \u003C/span>\u003C/em>\u003Cem>\u003Cspan data-contrast=\"none\">Now. Next. Beyond. \u003C/span>\u003C/em>\u003Cspan data-contrast=\"none\">with President &amp; CEO Prof. Amnon Shashua, on January 9, 2024, at 11:00 a.m. PT.&nbsp;&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"none\">The annual CES address will explore the state of self-driving, highlighting Mobileye&rsquo;s advancements in delivering an evolutionary vision of autonomy, including a spectrum of scalable solutions from hands-off/eyes-on ADAS to eyes-off autonomous vehicles. Prof. Shashua will also introduce a new technological breakthrough that unlocks automakers&rsquo; ability to configure the driving experience in ways that align with their brand and consumer tastes.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"none\">As a follow up to\u003C/span>\u003Cem>\u003Cspan data-contrast=\"none\"> Now. Next. Beyond., \u003C/span>\u003C/em>\u003Cspan data-contrast=\"none\">Mobileye CTO Prof. 
Shai Shalev-Shwartz will present \u003C/span>\u003Cem>\u003Cspan data-contrast=\"none\">Mobileye's Driving Experience Platform: Architecture, Abstractions, and APIs \u003C/span>\u003C/em>\u003Cspan data-contrast=\"none\">on Wednesday, January 10 at 1:00 p.m. PT.\u003C/span> \u003Cspan data-contrast=\"none\">Autonomous driving demands weaving a complex AI tapestry with the precision vital for safety-focused systems. This talk will delve into Mobileye's specialized programming platform engineered specifically to meet this challenge. Prof. Shalev-Shwartz will showcase how the platform's APIs provide automakers with the tools to program the AI's functionality to the unique desired driver characteristics&nbsp;of each car model.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"none\">Also on Wednesday, January 10, Mobileye Senior Vice President of AV Johann &ldquo;JJ&rdquo; Jungwirth will appear on the CES-presented Conference Session panel titled \u003C/span>\u003Cem>\u003Cspan data-contrast=\"none\">The Middle Lane: Self-Driving Cars Today \u003C/span>\u003C/em>\u003Cspan data-contrast=\"none\">at 9:00 a.m. PT.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"none\">For updated information on Mobileye&rsquo;s CES news and events, visit: \u003C/span>\u003Ca href=\"https://www.mobileye.com/ces-2024/\">www.mobileye.com/ces-2024/\u003C/a>\u003Cspan data-contrast=\"none\">. 
\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong>\u003Cspan data-contrast=\"none\">\u003Cu>Mobileye CES 2024 Events\u003C/u>\u003C/span>\u003C/strong>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong>\u003Cspan data-contrast=\"none\">CES 2024 Press Conference, \u003C/span>\u003C/strong>\u003Cstrong>\u003Cem>\u003Cspan data-contrast=\"none\">Mobileye: \u003C/span>\u003C/em>\u003C/strong>\u003Cstrong>\u003Cem>\u003Cspan data-contrast=\"none\">Now. Next. Beyond. \u003C/span>\u003C/em>\u003C/strong>\u003Cstrong>\u003Cspan data-contrast=\"none\">w\u003C/span>\u003C/strong>\u003Cstrong>\u003Cspan data-contrast=\"none\">ith Prof. Amnon Shashua\u003C/span>\u003C/strong>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"none\">Tuesday, January 9, 2024, 11:00 a.m. 
PT\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/ces-2024/\">Visit here for registration and livestream details\u003C/a>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong>\u003Cem>\u003Cspan data-contrast=\"none\">M\u003C/span>\u003C/em>\u003C/strong>\u003Cstrong>\u003Cem>\u003Cspan data-contrast=\"none\">obileye's Driving\u003C/span>\u003C/em>\u003C/strong>\u003Cstrong>\u003Cem> \u003C/em>\u003C/strong>\u003Cstrong>\u003Cem>\u003Cspan data-contrast=\"none\">Experience\u003C/span>\u003C/em>\u003C/strong>\u003Cstrong>\u003Cem> \u003C/em>\u003C/strong>\u003Cstrong>\u003Cem>\u003Cspan data-contrast=\"none\">Platform: Architecture, Abstractions, and \u003C/span>\u003C/em>\u003C/strong>\u003Cstrong>\u003Cem>\u003Cspan data-contrast=\"none\">APIs \u003C/span>\u003C/em>\u003C/strong>\u003Cstrong>\u003Cspan data-contrast=\"none\">with CTO Shai Shalev-Shwartz\u003C/span>\u003C/strong>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"none\">Wednesday, Jan 10, 2024: 1:00-1:40 p.m. 
PT\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/ces-2024/\">Visit here for registration and livestream details\u003C/a>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong>\u003Cspan data-contrast=\"auto\">CES Session \u003C/span>\u003C/strong>\u003Cstrong>\u003Cem>\u003Cspan data-contrast=\"auto\">The Middle Lane: Self-Driving Cars\u003C/span>\u003C/em>\u003C/strong>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Featuring Mobileye&rsquo;s JJ Jungwirth\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"none\">Wednesday, January 10, 9:00-9:40 a.m. 
PT\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.ces.tech/sessions-events/auto/auto01.aspx\">Visit here for more information\u003C/a>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"none\">+++\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335551550&quot;:2,&quot;335551620&quot;:2,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong>\u003Cspan data-contrast=\"none\">Contacts\u003C/span>\u003C/strong>\u003Cspan data-contrast=\"none\">&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"none\">Dan Galves&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"none\">Investor Relations&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Ca href=\"mailto:investors@mobileye.com\">\u003Cspan data-contrast=\"none\">investors@mobileye.com\u003C/span>\u003C/a>\u003Cspan data-contrast=\"none\">&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"none\">Justin Hyde&nbsp;\u003C/span>\u003Cspan 
data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"none\">Media Relations&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Ca href=\"mailto:justin.hyde@mobileye.com\">\u003Cspan data-contrast=\"none\">justin.hyde@mobileye.com\u003C/span>\u003C/a>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559737&quot;:-6,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong>\u003Cspan data-contrast=\"none\">About Mobileye\u003C/span>\u003C/strong>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559737&quot;:-6,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"none\">Mobileye (Nasdaq: MBLY) leads the mobility revolution with its autonomous driving and driver-assistance technologies, harnessing world-renowned expertise in computer vision, artificial intelligence, mapping, and data analysis. Since its founding in 1999, Mobileye has pioneered such groundbreaking technologies as REM&trade; mapping, True Redundancy&trade; sensing, and Responsibility Sensitive Safety (RSS). These technologies are driving the ADAS and AV fields towards the future of mobility &ndash; enabling self-driving vehicles and mobility solutions, powering industry-leading advanced driver-assistance systems and delivering valuable intelligence to optimize mobility infrastructure. To date, more than 150 million vehicles worldwide have been built with Mobileye technology inside. In 2022 Mobileye listed as an independent company separate from Intel (Nasdaq: INTC), which retains majority ownership. 
For more information, visit \u003C/span>\u003Ca href=\"https://www.mobileye.com/\">\u003Cspan data-contrast=\"none\">https://www.mobileye.com\u003C/span>\u003C/a>\u003Cspan data-contrast=\"none\">.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559737&quot;:-6,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>","2023-12-20T08:00:00.000Z","Autonomous Driving, Amnon Shashua, ADAS, AV Safety, Events, News",{"id":764,"type":24,"url":765,"title":766,"description":767,"primary_tag":28,"author_name":16,"is_hidden":11,"lang":12,"meta_description":767,"image":768,"img_alt":769,"content":770,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":771,"tags":722},242,"valeo-has-produced-its-20-millionth-front-camera-system-integrating-mobileye-eyeq-technology","Valeo has produced its 20 millionth front camera system integrating Mobileye EyeQ® technology","The milestone comes only 12 months after producing the 10 millionth front camera.","https://static.mobileye.com/website/us/corporate/images/48ef07ed4d860edf1fef6e91a5cc1eed_1702488159134.jpg","Valeo's production line integrating Mobileye's EyeQ system-on-chip.","\u003Cp>On November 15, 2023, Valeo produced its 20 millionth front camera system containing Mobileye technology. This milestone comes only 12 months after reaching the 10 millionth front camera in November 2022. Up to 40,000 front cameras a day are manufactured in three production sites in Europe and China serving 12 OEMs.\u003C/p>\n\u003Cp>Valeo, a world leader in advanced driving assistance systems (ADAS), started its collaboration with Mobileye in 2015, choosing to integrate the Mobileye EyeQ&reg; system-on-chip (SoC) into its front camera system hardware and software. Mobileye, with partners like Valeo, revolutionized driving assistance systems by taking computer vision technology to the next level in the automotive industry. 
Together, Valeo and Mobileye have combined their best-in-class technology and have developed and manufactured multiple generations of front camera systems. The strong collaboration between Valeo and Mobileye is now focused on the integration of the latest generation of the Mobileye SoC into the Valeo front camera system and also into Valeo&rsquo;s cutting-edge centralized computer and software.\u003C/p>\n\u003Cp>Valeo's front camera system using the Mobileye EyeQ contributes to the safety of road users by supporting key features such as autonomous emergency braking, adaptive cruise control, lane keeping assist, and traffic sign recognition. A front camera system is the key enabler in reaching safety requirements defined by authorities around the world, and we expect that by 2030 about 90% of all new cars will be equipped with advanced driver assistance technology.\u003C/p>\n\u003Cp>Marc Vrecko, President of Valeo&rsquo;s Comfort and Driving Assistance Systems Business Group, said: &ldquo;Valeo offers the largest portfolio of sensors and perception software available on the market. Front cameras are one of the key technologies for enhanced safety and will soon be set up on all vehicles. We are very proud to have reached this new milestone in the ramp-up of our production and in our partnership with Mobileye. At Valeo, we strive to make our technology efficient, but also available to all and we are proud to be contributing to the safety of road users.&rdquo;\u003C/p>\n\u003Cp>\u003Cem>&ldquo;\u003C/em>Since our founding we have focused on improving vehicle safety for drivers globally. Our ongoing success with Valeo demonstrates how far we&rsquo;ve come together and the opportunities that lie ahead to bring even more advanced technologies to millions of vehicles over the next few years. 
We look forward to working with Valeo on the next-generation Mobileye EyeQ6 system-on-chip going into full production next year,\u003Cem>&rdquo;\u003C/em>&nbsp;said Nimrod Nehushtan, Senior Vice President of Strategy and Business Development at Mobileye.\u003C/p>\n\u003Cp>Today, more than 90% of road accidents are caused by human error. ADAS are at the heart of the transformation of mobility, driven by enhancements in safety and comfort. The ADAS market is expected to grow by 17% per year to reach 60 billion euros in 2030. Valeo has the market&rsquo;s most comprehensive portfolio of sensors (ultrasonic sensors, cameras, radars and LiDARs), software and associated intelligence. Beyond their collaboration for cameras, Mobileye and Valeo announced in September a new collaboration on software-defined imaging radars to support the next generation of driver-assist and automated driving features.\u003C/p>\n\u003Cp>\u003Cem>About Valeo: \u003C/em>\u003C/p>\n\u003Cp>As a technology company and partner to all automakers and new mobility players, Valeo is innovating to make mobility cleaner, safer and smarter. Valeo enjoys technological and industrial leadership in electrification, driving assistance systems, reinvention of the interior experience and lighting everywhere. These four areas, vital to the transformation of mobility, are the Group's growth drivers.\u003C/p>\n\u003Cp>Valeo in figures: 20 billion euros in sales in 2022 | 109,900 employees at December 31, 2022 | 29 countries, 183 plants, 21 research centers, 44 development centers, 18 distribution platforms.\u003C/p>\n\u003Cp>Valeo is listed on the Paris Stock Exchange.\u003C/p>\n\u003Cp>\u003Cem>About Mobileye:\u003C/em>\u003C/p>\n\u003Cp>Mobileye (Nasdaq: MBLY) is driving the autonomous vehicle evolution with its autonomous driving and driver-assistance technologies, harnessing world-renowned expertise in computer vision, artificial intelligence, mapping, and data analysis. 
Since its founding in 1999, Mobileye has pioneered such groundbreaking technologies as REM&trade; crowdsourced mapping, True Redundancy&trade; sensing, and Responsibility Sensitive Safety (RSS). These technologies are driving the ADAS and AV fields towards the future of mobility &ndash; enabling self-driving vehicles and mobility solutions, powering industry-leading advanced driver-assistance systems and delivering valuable intelligence to optimize mobility infrastructure. To date, more than 125 million vehicles worldwide have Mobileye technology inside. In 2022, Mobileye was listed as an independent company, separate from Intel (Nasdaq: INTC) which retains majority ownership of Mobileye.\u003C/p>","2023-12-13T08:00:00.000Z",{"id":773,"type":24,"url":774,"title":775,"description":776,"primary_tag":32,"author_name":16,"is_hidden":11,"lang":12,"meta_description":776,"image":777,"img_alt":778,"content":779,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":780,"tags":781},240,"mobileye-ceo-amnon-shashua-named-2023-automotive-all-star","Mobileye CEO Professor Amnon Shashua Named 2023 Automotive News All-Star","Mobileye is honored that CEO Professor Amnon Shashua has been recognized as a 2023 Automotive News All-Star.","https://static.mobileye.com/website/us/corporate/images/c3ece88ac9b3ddf800b4db0127698987_1702298590764.png","Professor Amnon Shashua, Founder and CEO of Mobileye","\u003Cp>Professor Amnon Shashua has been recognized as a 2023 \u003Cem>Automotive News \u003C/em>All-Star, marking the second time he has received this award from \u003Cem>Automotive News\u003C/em>. 
The award recognizes automotive industry leaders who are \"making a significant impact on their companies and the industry at large\".\u003C/p>\n\u003Cp>The award, in the Automated Vehicles category, highlights Shashua&rsquo;s leadership as Mobileye incrementally builds the bridge towards autonomous driving with products like Mobileye SuperVision&trade; and Mobileye Chauffeur&trade;. Having contracts in place with automakers representing 34 percent of global auto production, Mobileye is at the forefront of the industry's evolution.\u003C/p>\n\u003Cp>In the context of his receiving this award, Shashua sat down with Pete Bigelow from the \u003Cem>Automotive News\u003C/em> podcast \u003Cem>Shift\u003C/em> and discussed the broader challenges of the autonomous vehicle industry, including regulatory considerations, market-specific adaptations, and the delicate balance between safety, efficiency, and cost.\u003C/p>\n\u003Cp>\u003Cem>\u003Ca href=\"https://soundcloud.com/user-383952226/mobileyes-amnon-shashua-eyes-profits-on-road-to-full-autonomy?si=3efb4a875a2141c0acda67e3fe1886cf&amp;utm_source=clipboard&amp;utm_medium=text&amp;utm_campaign=social_sharing\">Listen to the podcast now\u003C/a>\u003C/em>\u003C/p>","2023-12-11T08:00:00.000Z","Amnon Shashua, Industry, News, Awards",{"id":783,"type":5,"url":784,"title":785,"description":786,"primary_tag":190,"author_name":16,"is_hidden":11,"lang":12,"meta_description":786,"image":787,"img_alt":788,"content":789,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":790,"tags":304},238,"mobileye-supervision-zeekr-19th-asian-games-china","Mobileye SuperVision™ performs at 19th Asian Games in China","Zeekr vehicles equipped with Mobileye hands-off/eyes-on automated driving technology shuttled visitors around Hangzhou during the international sporting 
event.","https://static.mobileye.com/website/us/corporate/images/664ad7038fb5e45b46ed745347d29b7e_1699532117828.jpg","Zeekr 001 with Mobileye SuperVision™ at the 19th Asian Games in Hangzhou, China.","\u003Cp>Last month the 19\u003Csup>th\u003C/sup> Asian Games concluded in Hangzhou, China, bringing together some 12,000 athletes from 45 countries to compete in 481 events across 40 different sports. While the athletes competed in the stadiums and sports complexes across the city, Mobileye technology performed impressively on the surrounding roadways.\u003C/p>\n\u003Cp>Mobilized in support of the games was a fleet of Zeekr 001 and 009 vehicles equipped with \u003Ca href=\"https://www.mobileye.com/blog/mobileye-supervision-bridge-to-consumer-autonomous-vehicles/\">Mobileye SuperVision&trade;\u003C/a>. The vehicles shuttled athletes, organizers, and media between the athletes' village on the east bank of the Qiantang River and the organizing committee headquarters on the west. The 30-minute route covered 12.6 kilometers (7.8 miles) of hands-off/eyes-on urban driving, including tunnels, intersections, and traffic lights.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/e7686af3910f797ec580f1bdcb0fb486_1699532858781.jpg\" alt=\"Mobileye SuperVision&trade; enables hands-off/eyes-on driving capabilities in vehicles like the Zeekr 001.\" width=\"1650\" height=\"777\" />\u003C/p>\n\u003Cp>The demo drives were conducted by Zeekr with support from Mobileye's AV team on the ground in China. The Zeekrs were among some \u003Ca href=\"https://zgh.com/media-center/news/2023-09-03/?lang=en\">2,000 vehicles dedicated for use during the games\u003C/a> by parent company Geely, which served as Official Mobility Partner for the 19\u003Csup>th\u003C/sup> Asian Games in Hangzhou. 
The Asian Games are regarded as the second-largest multi-sport event in the world after the Olympics.\u003C/p>\n\u003Cp>The event followed just weeks after \u003Ca href=\"https://www.mobileye.com/news/mobileye-supervision-pilot-functions-added-to-110000-zeekr-vehicles/\">Zeekr delivered an over-the-air update to 110,000 of its vehicles\u003C/a> already in the hands of customers, bringing a major upgrade to their Navigation Zeekr Pilot (NZP) assisted driving system. The upgraded system, based on Mobileye SuperVision, enables point-to-point automated highway navigation, lane changes, on/off-ramp assist, and intelligent traffic safety functions within identified operational design domains. The Asian Games' host city Hangzhou was among the first locations (alongside Shanghai) in which the feature was unlocked.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/f161e816b30baa702b80ffbb10cbefcb_1699532917658.jpg\" alt=\"The Zeekr 001 and 009 were the first vehicles to reach the market with Mobileye SuperVision&trade;.\" width=\"1650\" height=\"777\" />\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/solutions/super-vision/\">Mobileye SuperVision\u003C/a> is the most advanced of our solutions currently on the market, bridging from today's driver-assistance systems to the self-driving vehicles of tomorrow. It's designed to enable the hands-off driving capabilities of an autonomous vehicle while under the driver's watchful eye. The system incorporates dual EyeQ&trade; systems-on-chip and 11 cameras for 360-degree computer vision coverage, along with our REM&trade;-generated maps and RSS&trade;-based driving policy.\u003C/p>\n\u003Cp>The \u003Ca href=\"https://www.mobileye.com/blog/mobileye-supervision-zeekr-ota-update/\">Zeekr 001\u003C/a> and \u003Ca href=\"https://www.mobileye.com/news/zeekr-mobileye-supervision/\">Zeekr 009\u003C/a> were the first vehicles with Mobileye SuperVision to reach the market. 
The technology has since been announced for integration into future models from \u003Ca href=\"https://www.mobileye.com/news/porsche-mobileye-supervision-collaboration/\">Porsche\u003C/a>, \u003Ca href=\"https://www.mobileye.com/news/polestar-selects-mobileye-to-bring-autonomous-technology-to-polestar-4/\" target=\"_blank\" rel=\"noopener\">Polestar\u003C/a>, \u003Ca href=\"https://www.mobileye.com/news/smart-chooses-mobileye-supervision-for-advanced-driving-automation/\">Smart\u003C/a>, and \u003Ca href=\"https://www.mobileye.com/news/faw-group-and-mobileye-forge-strategic-alliance-in-autonomous-driving/\">FAW Group\u003C/a>.\u003C/p>","2023-11-20T08:00:00.000Z",{"id":792,"type":24,"url":793,"title":794,"description":795,"primary_tag":28,"author_name":16,"is_hidden":11,"lang":12,"meta_description":795,"image":796,"img_alt":797,"content":798,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":799,"tags":800},239,"polestar-4-to-integrate-luminar-lidar-with-mobileye-chauffeur","Polestar 4 to integrate Luminar lidar with Mobileye Chauffeur","A future version of the Polestar 4 will feature Mobileye Chauffeur eyes-off, hands-off technology using Luminar lidar.","https://static.mobileye.com/website/us/corporate/images/ae921496490c0c06f0fe706d2806110b_1699555981959.jpg","The Polestar 4, which will feature Mobileye technology inside","\u003Cp>GOTHENBURG, SWEDEN &ndash;\u003Cstrong>&nbsp;\u003C/strong>Polestar is working with Luminar, a leading automotive technology company, and Mobileye, a global leader in autonomous driving solutions, to enhance safety and the future autonomous driving capabilities of Polestar 4 with the integration of Luminar&rsquo;s next-generation LiDAR technology with Mobileye's Chauffeur platform.\u003C/p>\n\u003Cp>\u003Ca href=\"https://media.polestar.com/global/en/media/pressreleases/673327\">Announced in August\u003C/a>, Polestar 4 is planned to be the first 
production car to feature Mobileye Chauffeur, now with Luminar LiDAR, which builds upon the full-surround camera-based SuperVision platform available in Polestar 4 from launch. Together, the three companies aim to offer eyes-off, point-to-point autonomous driving on highways, as well as eyes-on, hands-off driving for other environments.\u003C/p>\n\u003Cp>With Mobileye Chauffeur, Polestar 4 is set to feature three Mobileye EyeQ6 processors, a front-facing LiDAR from Luminar, and Mobileye&rsquo;s front-facing imaging radar to provide the extra layer of sensing and artificial intelligence needed to enable eyes-off, hands-off driving.\u003C/p>\n\u003Cp>LiDAR (Light Detection and Ranging) uses lasers to create a highly detailed 3D map of the surrounding environment. Luminar&rsquo;s LiDAR is uniquely engineered from the chip level up and operates at a higher wavelength, enabling the greatest level of performance and safety capabilities for production cars. When coupled with Mobileye&rsquo;s Chauffeur platform, the result will be a turnkey, safer and high-performing automated system.\u003C/p>\n\u003Cp>Building on the existing relationship between Luminar and Mobileye, the integration of Luminar LiDAR into Polestar 4 also expands the \u003Ca href=\"https://media.polestar.com/global/en/media/pressreleases/663533\">partnership between Luminar and Polestar\u003C/a>, which was announced in January 2023.\u003C/p>\n\u003Cp>Thomas Ingenlath, Polestar CEO, says: &ldquo;Polestar 4 comes with the highly advanced Mobileye SuperVision ADAS from the start, and we look forward to expanding that with Mobileye Chauffeur in the future. Being able to add Luminar&rsquo;s industry-leading LiDAR to the platform&rsquo;s development increases the strong link between our companies and brings even more world-class technology to Polestar 4.&rdquo;\u003C/p>\n\u003Cp>Prof. 
Amnon Shashua, CEO of Mobileye, says: &ldquo;Combining our base SuperVision with an independent second redundant perception system &ndash; consisting of Luminar LiDAR, radars and an imaging radar &ndash; enables true redundancy and a level of accuracy that lays the foundation for fully autonomous driving.&rdquo;\u003C/p>\n\u003Cp>Austin Russell, founder and CEO of Luminar, says: &ldquo;After collaborating with Mobileye on a solution since 2019, the true fruits of our labour are being realised for the first time by transitioning out of R&amp;D and into a production vehicle with Polestar. Together, we look forward to raising the benchmark in the industry for what a safe and autonomous future can look like.&rdquo;\u003C/p>","2023-11-09T08:00:00.000Z","Industry, News, Autonomous Driving",{"id":802,"type":5,"url":803,"title":804,"description":805,"primary_tag":9,"author_name":16,"is_hidden":11,"lang":12,"meta_description":805,"image":806,"img_alt":807,"content":808,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":809,"tags":563},237,"intelligent-speed-assist-isa-computer-vision-adas-solution","Intelligent speed assist shows the power of Mobileye's vision","Intelligent speed assistance in Europe: Our computer vision technology goes far beyond identifying speed limit signs to surpass the requirements of the EU's latest General Safety Regulation.","https://static.mobileye.com/website/us/corporate/images/24929c228c9cfb7de391db538376a7fc_1696853690263.png","Mobileye’s ISA solution employs an array of cutting-edge AI technologies to identify all manner of traffic signage.","\u003Cp>Mobileye is known throughout the automotive industry for our experience and leadership in computer vision technology for driver assistance. 
The latest product of that expertise is our new Intelligent Speed Assist solution.\u003C/p>\n\u003Cp>Announced just a few months ago, \u003Ca href=\"https://www.mobileye.com/news/mobileye-launches-the-first-camera-only-intelligent-speed-assist-to-meet-new-eu-standards/\">Mobileye's new Intelligent Speed Assist (ISA) system\u003C/a> has been certified to surpass the rigorous new standards set out under \u003Ca href=\"https://www.mobileye.com/blog/intelligent-speed-assist-general-safety-regulation/\">the European Union's General Safety Regulation\u003C/a>. It's the first (and so far the only) such solution to do so based purely on camera-based computer vision technology&hellip; without relying on a third-party map or requiring any additional hardware.\u003C/p>\n\u003Cp>That makes Mobileye's new ISA system an exceptionally proficient, cost-effective, and easy-to-implement solution for automobile manufacturers, and a safety-enhancing feature for drivers of vehicles equipped with Mobileye technology.\u003C/p>\n\u003Ch3>\u003Cstrong>What is intelligent speed assist?\u003C/strong>\u003C/h3>\n\u003Cp>Intelligent Speed Assist is designed to help drivers stay within the legal speed limit by either passively alerting the driver or actively intervening to reduce the vehicle's speed.\u003C/p>\n\u003Cp>The cornerstone of any such system is traffic sign recognition (TSR), which employs \u003Ca href=\"https://www.mobileye.com/blog/camera-first-approach-for-assisted-autonomous-driving/\">computer vision technology\u003C/a> to identify and interpret traffic signs &ndash; including speed limit signs &ndash; and relays the information to the vehicle and its driver.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/4f0f1a21f5f1121d5331d016b40b0158_1696853806010.jpg\" alt=\"Mobileye's Intelligent Speed Assist solution is designed to help drivers stay within the legal speed limit.\" width=\"1611\" height=\"1070\" />\u003C/p>\n\u003Cp>Traffic sign 
recognition is one of the essential types of signals provided by Mobileye's \u003Ca href=\"https://www.mobileye.com/technology/eyeq-chip/\">EyeQ&trade; systems-on-chip\u003C/a>, which have been fitted into over 150 million vehicles to date across hundreds of model lines from dozens of the world's leading automakers. The system is trained to recognize both explicit signs (which clearly post the speed limit in kilometers or miles per hour) and implicit signs (such as those indicating construction areas, school zones, and highway entrances and exits) that imply a change in the applicable speed limit. And it can recognize both permanent printed signs and temporary or variable digital signs.\u003C/p>\n\u003Cp>Our ISA solution goes beyond traffic sign recognition, however, to encompass traffic sign relevancy, signature-based classification, optical character recognition, and road-type classification technologies &ndash; all of which combine to deliver more robust speed-assistance capabilities.\u003C/p>\n\u003Ch3>\u003Cstrong>Beyond recognition\u003C/strong>\u003C/h3>\n\u003Cp>With traffic sign relevancy, Mobileye's Intelligent Speed Assist (ISA) solution is programmed not only to recognize traffic signs, but to determine to which lane they apply. That can help prevent errors that commonly trip up basic traffic sign recognition systems &ndash; like speed limit signs on highway offramps, for example. Such signs might apply only to vehicles exiting the highway, but are prone to being picked up by the cameras on vehicles remaining on the highway as they cruise by offramps in other lanes as well. In a passive system, false alerts might well prompt the driver to ignore future alerts or even switch off the system entirely, thereby undermining (if not totally defeating) its utility. 
In an active system, however, the problem could be even worse: a \"false positive\" might trigger actuation of the brakes in order to bring the vehicle down to a misidentified speed limit &ndash; potentially causing rear-end collisions from unsuspecting following traffic.\u003C/p>\n\u003Cp>Meanwhile, signature-based classification \"future-proofs\" the technology by allowing vehicles equipped with our ISA solution to be updated with the visual \"signatures\" of new types of road signs that might be introduced long after the vehicle has left the factory. This feature can be particularly useful in Europe, where individual countries often implement different types of signage. (Mobileye's ISA solution also supports other countries outside the EU, such as Turkey, which has also adopted the GSR.)\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/8395f8b628dd7be826f13ebbb0b448e6_1696853965562.jpg\" alt=\"Mobileye's computer vision technology is trained to identify signage used in countries across Europe and around the world.\" width=\"1605\" height=\"1070\" />\u003C/p>\n\u003Cp>With optical character recognition (OCR) capabilities, our ISA solution can also \"read\" textual signs that cannot be recognized by shape alone. That can help the system discern whether a reduced speed limit is in effect only during school hours, for example, or if a particular sign applies only to vehicles above a specific weight classification. It can also be especially useful in identifying the city name signs used in some parts of Europe to indicate the application of an urban speed limit.\u003C/p>\n\u003Cp>Road-type classification renders the system even more robust. This layer of the solution takes visual cues from other aspects of the driving environment and analyzes them using deep neural networks to determine what type of road is being traveled &ndash; and by extension, what the appropriate speed limit should be. 
That can be particularly helpful under conditions where signage may be missing, weathered, occluded, or otherwise difficult to read.\u003C/p>\n\u003Cp>One of the key factors that differentiates Mobileye technology is the \u003Ca href=\"https://www.mobileye.com/blog/mobileye-ces-2022-self-driving-secret-data/\">database of driving clips\u003C/a> that we've assembled over the course of many years of leadership in computer vision driver-assistance technology. Believed to be the largest in the industry, this database consists of tens of millions of video clips, encompassing an enormous variety of parameters and features of roadways from around the world &ndash; including local traffic signs.\u003C/p>\n\u003Cp>Training our algorithms on this enormous dataset means that when a vehicle equipped with Mobileye technology encounters a rare traffic sign, for example, chances are it has nonetheless \"seen\" it before, and knows what it's there to indicate. It also means that the system had already been internally validated on a wealth of real-world data before being independently certified by external parties.\u003C/p>\n\u003Ch3>\u003Cstrong>Seeing is believing\u003C/strong>\u003C/h3>\n\u003Cp>So far, six independent labs located in five separate European countries have tested Mobileye's Intelligent Speed Assist system and certified that it meets or exceeds the new standard set out under this latest version of the EU's General Safety Regulation. The system was designed based on extensive consultation and close coordination with the \u003Ca href=\"https://www.acea.auto/\">European Automobile Manufacturers&rsquo; Association (ACEA)\u003C/a> and the \u003Ca href=\"https://clepa.eu/\">European Association of Automotive Suppliers (CLEPA)\u003C/a>. 
And several major automakers are already implementing the solution in vehicles reaching markets across Europe and around the world.\u003C/p>\n\u003Cp>\"The successful launch, certification, and implementation of this new ISA solution stem from a great deal of hard work put in by many members of multiple teams across the company,\" says Amit Shainkopf, Driving Semantics Product Manager in Mobileye's R&amp;D department. \"We're proud of the performance we've achieved through vision alone &ndash; but we're not stopping there. We're forging ahead to expand these capabilities to additional geographies. And Intelligent Speed Assist is just one of many driver-assistance products we aim to launch in the coming years &ndash; all with the goal of making cars, and the roads they drive on around the world, safer for everyone.\"\u003C/p>","2023-10-31T07:00:00.000Z",{"id":811,"type":24,"url":812,"title":813,"description":814,"primary_tag":32,"author_name":16,"is_hidden":11,"lang":12,"meta_description":814,"image":815,"img_alt":816,"content":817,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":818,"tags":819},236,"shanghai-jiaotong-university-names-amnon-honorary-professor","Shanghai Jiaotong University Names Prof. 
Amnon Shashua Honorary Professor","SJTU recognized Amnon’s outstanding contributions to the field of AI before his lecture to 300 students and faculty members","https://static.mobileye.com/website/us/corporate/images/f9dbf90e71827e17f021f3d17a64e829_1696158764985.jpg","Amnon Shashua Receives Honorary Professorship From SJTU","\u003Cp>In a ceremony held on Tuesday, September 12, Professor Amnon Shashua was appointed as an honorary professor at the Global Institute of Future Technology and a guest professor at Shanghai Jiaotong University (SJTU).\u003C/p>\r\n\u003Cp>SJTU, renowned for its expertise in nurturing top engineers and scientists, welcomed Amnon to its campus, where he delivered a lecture titled \"The State of AI: Opportunities, Limitations and Dangers\" to an overflowing room of approximately 300 students, faculty, and experts.\u003C/p>\r\n\u003Cp>During the lecture, Professor Shashua spoke about the emergence of reasoning and abstraction in large language models (LLMs). While acknowledging advancements in reasoning, he explained the formidable challenge LLMs face in abstracting from data, a capability inherent to human cognition. He provided examples from joint work with Prof. 
Shai Shalev-Shwartz, demonstrating the ongoing limitations in LLMs' ability to perform abstraction, indicating that this aspect of development does not look like a natural evolution in the field and may necessitate more substantial breakthroughs.\u003C/p>\r\n\u003Cp>The lecture concluded with an exploration of the pressing issue of AI alignment, going over some recent results from his university research lab and from work with Shai, and the ongoing endeavors to establish safeguards for AI systems.\u003C/p>\r\n\u003Cp>You can watch the full lecture below:\u003Cbr>\u003Cbr>\u003C/p>\r\n\u003Ciframe title=\"YouTube video player\" src=\"https://www.youtube.com/embed/OiN9Jz27NKE?si=3QzG332h1KXqUmaC\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>","2023-10-02T07:00:00.000Z","Amnon Shashua, News",{"id":821,"type":5,"url":822,"title":823,"description":824,"primary_tag":190,"author_name":16,"is_hidden":11,"lang":12,"meta_description":824,"image":825,"img_alt":826,"content":827,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":828,"tags":829},229,"volkswagen-id-buzz-mobileye-drive-iaa-mobility-2023","Volkswagen ID. Buzz showcases Mobileye Drive™ at IAA 2023","Electric passenger van on display in Munich incorporates Mobileye's turnkey self-driving solution for autonomous mobility-as-a-service.","https://static.mobileye.com/website/us/corporate/images/28dd1409e7c45b5d99ee67e3544ceeac_1695134996240.jpg","Showcased at IAA 2023, the VW ID. Buzz with Mobileye Drive™ is testing in Germany and the US. (Photo courtesy of Volkswagen.) ","\u003Cp>The Volkswagen ID. Buzz was one of the stars of this year's IAA Mobility show. 
On VW's show stand sat a bright yellow camper-van conversion for visitors to explore, while directly across the aisle at our booth, and in the test-drive area outside, appeared the self-driving version you see here.\u003C/p>\n\u003Cp>The ID. Buzz showcased by \u003Ca href=\"https://www.mobileye.com/blog/driving-evolution-autonomous-mobility-iaa-2023-munich/\" target=\"_blank\" rel=\"noopener\">Mobileye at the IAA\u003C/a> is one of the latest vehicles to incorporate Mobileye Drive&trade;, our turnkey self-driving system, and it is \u003Ca href=\"https://www.mobileye.com/news/volkswagen-commercial-vehicles-begins-av-testing-with-mobileye-drive/\" target=\"_blank\" rel=\"noopener\">currently undergoing testing\u003C/a> on both sides of the Atlantic.\u003C/p>\n\u003Cp>\u003Ciframe title=\"YouTube video player\" src=\"https://www.youtube.com/embed/DtM1ahkoR4A?si=45tNQ_Vomzo5XsI8\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>To deliver its autonomous driving capabilities, the vehicle features an array of sensors &ndash; including nine high-resolution cameras, four parking cameras, three long-range lidars, and six short-range lidars. To operate safely and effectively, it encompasses Mobileye&rsquo;s unique tech innovations: True Redundancy&trade; sensing systems, REM&trade; crowdsourced AV maps, and the RSS&trade;-based driving policy. \u003Cspan dir=\"ltr\">For efficient processing power, it utilizes four EyeQ&trade;6 High systems-on-chip. 
Future versions are slated to upgrade the EyeQ chips and augment their sensing suite with \u003Ca href=\"https://www.mobileye.com/news/mobileye-and-valeo-launch-partnership-for-world-class-imaging-radars/\" target=\"_blank\" rel=\"noreferrer noopener\">our new imaging radars\u003C/a>.\u003C/span>\u003C/p>\n\u003Cp>The scope of the technologies on board and the autonomous driving capabilities they deliver make this one of the most advanced in a long line of projects Mobileye has undertaken with the Volkswagen Group to date. Previously announced collaborations include the industry's first camera-only Automatic Emergency Braking system with Audi, the first \u003Ca href=\"https://www.mobileye.com/blog/cloud-enhanced-driver-assist/\">Cloud-Enhanced Driver-Assist&trade;\u003C/a> application with Volkswagen, and the integration of the eyes-on/hands-off driving capabilities of \u003Ca href=\"https://www.mobileye.com/news/porsche-mobileye-supervision-collaboration/\">Mobileye SuperVision&trade; into new Porsche models\u003C/a>.\u003C/p>\n\u003Cp>This latest project has Volkswagen Commercial Vehicles \u003Ca href=\"https://www.vwpress.co.uk/en-gb/releases/4984\" target=\"_blank\" rel=\"noopener\">testing ID. Buzz autonomous vehicles in Munich\u003C/a>, Germany, and Volkswagen Group of America \u003Ca href=\"https://media.vw.com/en-us/releases/1750\" target=\"_blank\" rel=\"noopener\">testing in Austin\u003C/a>, Texas. 
Once the vehicle is ready for widespread deployment, VW aims to commence autonomous ridesharing services through its own MOIA division in Europe, and with independent transport operators in North America.\u003C/p>\n\u003Cp>The ID. Buzz was one of several Mobileye-driven autonomous vehicles on display at IAA this year. It was joined there by driverless shuttles from \u003Ca href=\"https://www.mobileye.com/news/holon-mover-ces-mobileye-drive/\">HOLON\u003C/a> and \u003Ca href=\"https://www.schaeffler.com/en/media/press-releases/press-releases-detail.jsp?id=87942401\">Schaeffler\u003C/a> &ndash; both of which are also enabled by \u003Ca href=\"https://www.mobileye.com/solutions/drive/\" target=\"_blank\" rel=\"noopener\">Mobileye Drive\u003C/a>, the self-driving solution that's leading the industry towards the future of autonomous mobility.\u003C/p>\n\u003Cp>[**]gallery:volkswagen-id-buzz-at-iaa-2023[**]\u003C/p>","2023-09-28T07:00:00.000Z","Events, Autonomous Driving, Driverless MaaS, Video",{"id":831,"type":5,"url":832,"title":833,"description":834,"primary_tag":397,"author_name":16,"is_hidden":11,"lang":12,"meta_description":834,"image":835,"img_alt":836,"content":837,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":838,"tags":839},234,"driving-evolution-autonomous-mobility-iaa-2023-munich","Driving the Evolution of Autonomous Mobility at IAA 2023","From assisted to autonomous driving, Mobileye showcased a full range of technologies and solutions at this year's mobility show in Munich.","https://static.mobileye.com/website/us/corporate/images/c7b7ec2873d746f683aa0f9aee268a55_1695133339014.jpg","Mobileye's booth at IAA Mobility 2023 showcased the Zeekr 001 with Mobileye SuperVision™ and VW ID. 
Buzz with Mobileye Drive™.","\u003Cp>The future of the automobile was on display this month at \u003Ca href=\"https://www.mobileye.com/iaa-2023/\">IAA Mobility\u003C/a>, Europe's premier exposition for automotive technology. Mobileye \u003Ca href=\"https://www.mobileye.com/blog/iaa-mobility-2021-munich-wrap-up/\">returned\u003C/a> to the event this year to demonstrate the technological innovations that are driving the evolution from assisted to autonomous driving.\u003C/p>\n\u003Cp>On stage, Johann Jungwirth, Mobileye's Senior Vice President of Autonomous Vehicles, delivered a keynote presentation on \"\u003Ca href=\"https://www.mobileye.com/blog/johann-jungwirth-the-road-to-self-driving-mobility-iaa-2023/\">The Road to Self-Driving Mobility\u003C/a>\". Meanwhile, on both the show floor and the surrounding roadways, we showcased a broad range of cutting-edge technologies and solutions.\u003C/p>\n\u003Cp>\u003Ciframe title=\"YouTube video player\" src=\"https://www.youtube.com/embed/fAZtAgYARjY?si=3Hlan1CHU6yUdg2c\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>For a taste of what we lined up for this year's show, you can watch the highlights video above, and take a tour of the Mobileye booth at the Messe M&uuml;nchen convention center in the video below.\u003C/p>\n\u003Cp>\u003Ciframe title=\"YouTube video player\" src=\"https://www.youtube.com/embed/7OqvT2uxJgI?si=waOTPPF8uFJyisjy\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>Together with our partners, we announced some major news developments leading up to and during the show. 
In the video below, you'll hear about unlocking the potential of \u003Ca href=\"https://www.mobileye.com/news/mobileye-supervision-pilot-functions-added-to-110000-zeekr-vehicles/\">Mobileye SuperVision&trade; with Zeekr\u003C/a> in China, a new eyes-on/hands-off \u003Ca href=\"https://www.mobileye.com/news/smart-chooses-mobileye-supervision-for-advanced-driving-automation/\">SuperVision project with Smart\u003C/a>, deploying \u003Ca href=\"https://www.mobileye.com/news/polestar-selects-mobileye-to-bring-autonomous-technology-to-polestar-4/\">true imaging radar with Valeo\u003C/a>, and the new \u003Ca href=\"https://www.schaeffler.com/en/media/press-releases/press-releases-detail.jsp?id=87942401\">roboshuttle from Schaeffler and VDL\u003C/a>.\u003C/p>\n\u003Cp>\u003Ciframe title=\"YouTube video player\" src=\"https://www.youtube.com/embed/199Kme9sovM?si=rglgbyfSYs1cd5ww\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>Read about the \u003Ca href=\"https://www.mobileye.com/blog/volkswagen-id-buzz-mobileye-drive-iaa-mobility-2023/\" target=\"_blank\" rel=\"noopener\">ID. 
Buzz autonomous vehicle featuring Mobileye Drive&trade;\u003C/a>, and watch this space for more of what Mobileye had to show at IAA Mobility 2023.\u003C/p>","2023-09-20T07:00:00.000Z","Events, Video",{"id":841,"type":24,"url":842,"title":843,"description":844,"primary_tag":16,"author_name":16,"is_hidden":11,"lang":12,"meta_description":844,"image":845,"img_alt":846,"content":847,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":848,"tags":849},232,"faw-group-and-mobileye-forge-strategic-alliance-in-autonomous-driving","FAW Group and Mobileye Forge Strategic Alliance in Autonomous Driving","FAW Group aims to initially equip FAW Hongqi vehicles with Mobileye SuperVision™ and Mobileye Chauffeur™ technology, and build a deeper technological alliance.","https://static.mobileye.com/website/us/corporate/images/a046fddaac665842923777c85e3987a5_1694684898541.png","Logos of FAW Group and Mobileye","\u003Cp>CHANGCHUN AND JERUSALEM &mdash; FAW Group, one of the largest Chinese automotive groups, and Mobileye (Nasdaq: MBLY), a global leader in autonomous driving solutions, have announced a new strategic partnership leveraging their respective industry advantages in software, hardware and technology products. The two sides will work together to create new products based on Mobileye SuperVision&trade; and Mobileye Chauffeur&trade; platforms, to empower customers with safe, enjoyable driving experiences.\u003C/p>\n\u003Cp>Mr. Qiu Xian Dong, chairman of BOD &amp; secretary of the CPC FAW Group Committee , and Professor Amnon Shashua, President and CEO of Mobileye, held the signing ceremony on September 13, in Changchun. The two companies will start working together on FAW Hongqi brand vehicles, bringing the Mobileye SuperVision highly advanced automated driving assist platform to key models of the Hongqi brand. 
That will be followed closely by work toward integrating the Mobileye Chauffeur &ldquo;eyes-off, hands-off&rdquo; driving technology for specified domains into the much-anticipated E701 vehicle. The Mobileye SuperVision projects are expected to be deployed by the end of 2024, with the Mobileye Chauffeur projects targeted for launch by the end of 2025.\u003C/p>\n\u003Cp>Meanwhile, other brands of FAW Group are in line to gradually follow, using Mobileye&rsquo;s platform solutions. In addition, the parties will conduct comprehensive discussions on feasible plans to further deepen their cooperation.\u003C/p>\n\u003Cp>As a pioneer of China's automotive industry, FAW Group has always been committed to building a world-class, green and intelligent mobility service company that consumers love. Throughout its 70 years of development, FAW Group has been constantly refreshing the user experience through technological innovation and service upgrades.\u003C/p>\n\u003Cp>Mobileye SuperVision will be an integral part of FAW's advanced driver assistance systems, promising an unparalleled driving experience in its category. Upon its integration, Mobileye Chauffeur can provide &ldquo;hands-off, eyes-off&rdquo; autonomy on specified operational domains, with adaptive eyes-on driving in other settings, using the Mobileye SuperVision cameras paired with a front-facing lidar and Mobileye imaging radar. Both systems will utilize Mobileye&rsquo;s EyeQ&trade;6 SoC, RSS&trade;-based driving policy, a comprehensive 360-degree camera system, and precision mapping.\u003C/p>\n\u003Cp>&ldquo;The target of the cooperation between FAW Group and Mobileye, both leaders in their respective fields, is to provide cutting edge ADAS and AV solutions in the Chinese market. This cooperation will deepen the partnership and alliance between the companies. Mobileye&rsquo;s products will be first equipped on the Hongqi brand, and later expand to the other brands within the FAW Group,&rdquo; said Mr. 
Qiu Xian Dong, Chairman of BOD &amp; secretary of the CPC FAW Group Committee.\u003C/p>\n\u003Cp>&ldquo;FAW's selection of Mobileye for this venture underscores the magnitude of this collaboration,&rdquo; said Prof. Amnon Shashua, President and CEO of Mobileye. &ldquo;Between the scale and expertise FAW has built over the past seven decades and our world-class technology, this alliance could shape the future of autonomous driving in the region and beyond. Drawing from our extensive experience of delivering our technology to more than 150 million vehicles globally, we see a future where autonomous driving improves transportation safety, convenience and accessibility for millions.&rdquo;\u003C/p>\n\u003Cp>More details about the projects will be shared in the months ahead.\u003C/p>","2023-09-14T07:00:00.000Z","ADAS, News, Autonomous Driving, Industry",{"id":851,"type":654,"url":852,"title":853,"description":854,"primary_tag":32,"author_name":855,"is_hidden":11,"lang":12,"meta_description":854,"image":856,"img_alt":857,"content":858,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":848,"tags":859},233,"are-we-on-the-edge-of-a-chat-gpt-moment-for-autonomous-driving","Are we on the edge of a “ChatGPT moment” for autonomous driving?","Prof. Shai Shalev-Shwartz and Prof. Amnon Shashua discuss the end-to-end approach for solving self-driving vehicle problems, asking if it is both sufficient and necessary.","Prof. Shai Shalev-Shwartz and Prof. Amnon Shashua ","https://static.mobileye.com/website/us/corporate/images/f25b156deca6571a0ee6b607a893ba29_1694719403818.jpg","     ","\u003Cp>Large Language Models (LLMs) such as ChatGPT have revolutionized the world of natural language understanding and generation. Building such models generally has two stages: a general unsupervised pre-training step, and a specific Reinforcement Learning from Human Feedback (RLHF) step. 
In the pre-training phase, the system &ldquo;reads&rdquo; a large portion of the internet (trillions of words) and aims at predicting the next word given previous words, thereby learning the distribution of text over the internet. In the RLHF phase, humans are asked to rank the quality of different answers from the model to given questions, and the neural network is fine-tuned to prefer better answers.\u003C/p>\n\u003Cp>Recently, Tesla has \u003Ca href=\"https://www.cnbc.com/2023/09/09/ai-for-cars-walter-isaacson-biography-of-elon-musk-excerpt.html\" target=\"_blank\" rel=\"noopener\">indicated\u003C/a> they will adopt this approach for end-to-end solving of the self-driving problem. The premise is to switch from a well-engineered system composed of data-driven components interconnected by many lines of code to a purely data-driven approach consisting of a single end-to-end neural network. When examining a technological solution for a given problem, the two questions one should ask are whether the solution is \u003Cstrong>sufficient\u003C/strong> and whether it is \u003Cstrong>necessary\u003C/strong>:\u003C/p>\n\u003Cul style=\"padding-left: 43px;\">\n\u003Cli>Sufficiency: does this approach tackle all of the requirements of self-driving?\u003C/li>\n\u003Cli>Necessity: is this the best approach, or is it overkill (trying to kill a fly with a rocket)?\u003C/li>\n\u003C/ul>\n\u003Cp>Some critical requirements of self-driving systems are transparency, controllability and performance:\u003C/p>\n\u003Col>\n\u003Cli>Transparency and Explainability: Driving systems must perceive and plan. Perception means creating a factual description of reality (the location of lanes, vehicles, pedestrians and so forth). Planning involves leveraging perception into driving decisions that must balance a tradeoff between usefulness and safety. For example, what speed should the car drive on a residential road with parked vehicles on its sides? 
Driving slower will be safer &ndash; say, in case a child runs into the street from between parked cars. But driving too slowly will compromise the usefulness of the function and impede other vehicles. We believe how a self-driving vehicle balances such tradeoffs must be transparent so that society, through regulation, can have a say in decisions that affect all road users.\u003C/li>\n\u003Cli>Controllability: Reproducible system mistakes should be captured and fixed immediately, while not compromising the overall performance of the system. Furthermore, while human drivers make bad decisions from time to time, like not yielding properly or driving while impaired, society will not tolerate &ldquo;lapses of judgement&rdquo; from a self-driving system, and every decision should be controllable.\u003C/li>\n\u003Cli>Performance: The Mean-Time-Between-Failures (MTBF) must be extremely high and non-reproducible errors (&ldquo;black swans&rdquo;) should be extremely rare.\u003C/li>\n\u003C/ol>\n\u003Cp>Now, let us judge the end-to-end solution in light of the above requirements.\u003C/p>\n\u003Cp>For transparency, while it may be possible to steer an end-to-end system towards satisfying some regulatory rules, it is hard to see how to give regulators the option to dictate the exact behavior of the system in all situations. In fact, the most recent trend in LLMs is to combine them with symbolic reasoning elements &ndash; also known as good, old-fashioned coding. See for example \u003Ca href=\"https://arxiv.org/abs/2305.10601\" target=\"_blank\" rel=\"noopener\">Tree of Thoughts\u003C/a>, \u003Ca href=\"https://arxiv.org/abs/2308.09687\" target=\"_blank\" rel=\"noopener\">Graph-of-Thoughts\u003C/a> and \u003Ca href=\"https://arxiv.org/abs/2211.10435\" target=\"_blank\" rel=\"noopener\">PAL\u003C/a>.\u003C/p>\n\u003Cp>For controllability, end-to-end approaches are an engineering nightmare. 
Evidence shows that the performance of GPT-4 over time \u003Ca href=\"https://arxiv.org/abs/2307.09009\" target=\"_blank\" rel=\"noopener\">deteriorates\u003C/a> as a result of attempts to keep improving the system. This can be attributed to phenomena like catastrophic forgetting and other \u003Ca href=\"https://arxiv.org/abs/2212.09251\" target=\"_blank\" rel=\"noopener\">artifacts of RLHF\u003C/a>. Moreover, there is no way to guarantee &ldquo;no lapse of judgement&rdquo; for a fully neuronal system. The trend in LLMs is to combine LLMs with external, code-based tools in order to have guarantees on elements of the system (e.g. &ldquo;calculator&rdquo; in \u003Ca href=\"https://arxiv.org/abs/2302.04761\" target=\"_blank\" rel=\"noopener\">Toolformer\u003C/a> and the \u003Ca href=\"https://www.ai21.com/blog/jurassic-x-crossing-the-neuro-symbolic-chasm-with-the-mrkl-system\" target=\"_blank\" rel=\"noopener\">Jurassic-X neuro-symbolic system\u003C/a>).\u003C/p>\n\u003Cp>Regarding performance (i.e., the high MTBF requirement), while it may be possible that with massive amounts of data and compute an end-to-end approach will converge to a sufficiently high MTBF, the current evidence does not look promising. Even the most advanced LLMs make embarrassing mistakes quite often. Will we trust them to make safety-critical decisions? It is well known to machine learning experts that the most difficult problem of statistical methods is the long tail. The end-to-end approach might look very promising to reach a mildly large MTBF (say, of a few hours), but this is orders of magnitude smaller than the requirement for safe deployment of a self-driving vehicle, and each increase of the MTBF by one order of magnitude becomes \u003Ca href=\"https://arxiv.org/abs/1604.06915\" target=\"_blank\" rel=\"noopener\">harder and harder\u003C/a>. 
It is not surprising that the recent live demonstration of Tesla&rsquo;s latest FSD by Elon Musk shows an \u003Ca href=\"https://nypost.com/2023/08/29/elon-musk-almost-runs-red-light-livestreaming-tesla-software/\" target=\"_blank\" rel=\"noopener\">MTBF of roughly one hour\u003C/a>.\u003C/p>\n\u003Cp>Taken together, we see many concerns regarding the ability of an end-to-end approach to fully tackle the self-driving challenge. What we would further argue is that an end-to-end approach is overkill. The premise of a fully end-to-end approach is &ldquo;no lines of code, everything should be done by a single gigantic neural network.&rdquo; Such a system requires maintaining a huge model, with every single update carefully balanced &ndash; yet this approach goes against current trends in utilizing LLMs as components within real systems. One such trend is the neural-symbolic approach, in which the fully neuronal LLM is one component within a larger system that uses code-based tools (for example \u003Ca href=\"https://arxiv.org/abs/2302.04761\" target=\"_blank\" rel=\"noopener\">Toolformer\u003C/a>). Another trend is the expert approach, in which LLMs are fine-tuned to specific, well-defined tasks; and the evidence so far is that small dedicated models outperform significantly larger models (e.g. the \u003Ca href=\"https://arxiv.org/abs/2308.12950\" target=\"_blank\" rel=\"noopener\">Code Llama project\u003C/a>). These trends have implications for the data and compute requirements, showing that quality of data, architecture, and system design may be far more important than sheer quantity.\u003C/p>\n\u003Cp>In summary, we argue that an end-to-end approach is neither necessary nor sufficient for self-driving systems. 
There is no argument that data-driven methods, including convolutional networks and transformers, are crucial elements of self-driving systems; however, they must be carefully embedded within a well-engineered architecture.\u003C/p>","Autonomous Driving, Amnon Shashua",{"id":861,"type":5,"url":862,"title":863,"description":864,"primary_tag":397,"author_name":16,"is_hidden":11,"lang":12,"meta_description":864,"image":865,"img_alt":866,"content":867,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":868,"tags":869},230,"johann-jungwirth-the-road-to-self-driving-mobility-iaa-2023","Johann Jungwirth on the road to self-driving mobility","Watch the keynote presentation delivered by Mobileye's Senior Vice President of Autonomous Vehicles at the 2023 IAA Mobility show in Munich.","https://static.mobileye.com/website/us/corporate/images/321c6156a782b93b7bed223e1473d7ba_1694085306154.jpg","Johann \"JJ\" Jungwirth addressed the industry at IAA 2023 on the bridge we're building from assisted to autonomous driving.","\u003Cp>This week the automotive industry descended on the Bavarian capital of Munich, Germany, for \u003Ca href=\"https://www.mobileye.com/iaa-2023/\">IAA Mobility 2023\u003C/a> &ndash; one of the biggest automotive tech events of the year. Mobileye was there on the ground to showcase our latest technologies and solutions on the show floor, with demonstration vehicles on the surrounding roadways, and a keynote presentation delivered by Mobileye&rsquo;s Senior Vice President of Autonomous Vehicles, \u003Ca href=\"https://www.mobileye.com/blog/johann-jungwirth-driverless-tech-business-faz-podcast/\">Johann \"JJ\" Jungwirth\u003C/a>.\u003C/p>\n\u003Cp>JJ is a 25-year veteran of the industry and a leading authority on self-driving cars and the technological advancements required to make autonomous mobility a reality. 
In his keynote, entitled \"The Road to Self-Driving Mobility,\" JJ outlined Mobileye's established leadership in driver-assistance technologies and the bridge we're building to autonomous vehicles &ndash; from hands-off through eyes-off to completely driverless solutions.\u003C/p>\n\u003Cp>\"We take an incremental approach\" to autonomous mobility, \"looking at how these products can build on top of each other,\" JJ outlined in his presentation. \"We have solved the hard problems with \u003Ca href=\"https://www.mobileye.com/solutions/super-vision/\" target=\"_blank\" rel=\"noopener\">Mobileye SuperVision&trade;\u003C/a>, our hands-off product. And that's the baseline for our eyes-off and driverless solutions\" &ndash; Mobileye Chauffeur&trade; and Mobileye Drive&trade;.\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/news/polestar-selects-mobileye-to-bring-autonomous-technology-to-polestar-4/\" target=\"_blank\" rel=\"noopener\">Vehicles equipped with Mobileye Chauffeur\u003C/a>, he projected, \"will be, probably for many of us, the first time we'll actually experience AV technology in consumer vehicles.\"\u003C/p>\n\u003Cp>Watch the full recording in the video below, and watch this space for more from IAA Mobility 2023.\u003C/p>\n\u003Cp>\u003Ciframe title=\"YouTube video player\" src=\"https://www.youtube.com/embed/UdQo7oncZbY?si=O-XtenmqSrnl2eME\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>","2023-09-07T07:00:00.000Z","Autonomous Driving, Video, Events",{"id":871,"type":24,"url":872,"title":873,"description":874,"primary_tag":16,"author_name":16,"is_hidden":11,"lang":12,"meta_description":874,"image":875,"img_alt":876,"content":877,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":878,"tags":879},231,"smart-chooses-mobileye-supervision-for-advanced-driving-automation","smart chooses Mobileye SuperVision™ for 
advanced driving automation","Special edition model to feature SuperVision-based features for point-to-point automatic navigation on highways and urban expressways","https://static.mobileye.com/website/us/corporate/images/b276d53cdcc31545018c4b49dc242be4_1693990413776.png","Smart and Mobileye join forces for SuperVision","\u003Cp>\u003Cem>\u003Cspan data-contrast=\"none\">HANGZHOU AND JERUSALEM, 6 September 2023 &mdash; \u003C/span>\u003C/em>\u003Cspan data-contrast=\"none\">smart, the new-premium intelligent all-electric auto brand, announced today a special upcoming model will feature Navigation smart Pilot (NSP) and a series of highly advanced driver-assistance features built upon the Mobileye SuperVision&trade; system.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"none\">The special edition EV will offer the smart Pilot Assist 2.0 version, which will be designed to gradually enable point-to-point automatic navigation on highways and urban expressways, automated lane changes, automated on/off-ramp assist and intelligent traffic safety functions within identified operational design domains, building off the existing strong safety performance in the smart #1. This special edition will also highlight smart&rsquo;s global ambitions.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"none\">Mr. Yang Jun, Vice President of Global Research and Development at smart Automobile, said: \"Since the birth of the brand, smart has always been led by the vision of exploring the best solutions for future urban mobility. We believe that the smarter ADAS technology will further enhance the competitiveness of the new generation of smart all-electric product family. 
Leveraging the R&amp;D capability and the collaboration with industry-leading partners, such as Mobileye, smart will continue elevating the brand&rsquo;s technology label, providing a more intelligent, safer urban mobility experience to users.&rdquo;\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"none\">The SuperVision-based system leverages 11 cameras &ndash; including seven 8-megapixel cameras &ndash; and radar, along with a robust driving policy. \u003C/span>\u003Cspan data-contrast=\"none\">The system runs on two Mobileye EyeQ5&trade; systems-on-chip, an advanced custom 7-nanometer ADAS chipset, building on Mobileye&rsquo;s two decades of experience in applied AI and machine learning to handle AI tasks with high energy efficiency, a key consideration for electric vehicles.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"none\">&ldquo;We&rsquo;re excited to launch this new era of collaboration with the smart brand, offering new capabilities built from the foundation of our SuperVision technology,&rdquo; said Johann Jungwirth, senior vice president of autonomous driving at Mobileye. &ldquo;SuperVision complements smart&rsquo;s commitment to offering the future of urban mobility globally.&rdquo; \u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:240}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"none\">With today&rsquo;s announcement, smart joins ZEEKR and Polestar in offering SuperVision-based services to customers. 
More details about the model will be released in the fourth quarter of 2023.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:240}\">&nbsp;\u003C/span>\u003C/p>","2023-09-06T07:00:00.000Z","News, Industry, Autonomous Driving, ADAS",{"id":881,"type":24,"url":882,"title":883,"description":884,"primary_tag":28,"author_name":16,"is_hidden":11,"lang":12,"meta_description":884,"image":885,"img_alt":886,"content":887,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":888,"tags":889},227,"mobileye-supervision-pilot-functions-added-to-110000-zeekr-vehicles","Mobileye SuperVision™ Pilot functions added to 110,000 ZEEKR vehicles","OTA update brings innovative, point-to-point automated driving features to customers in China, and has already generated strong word-of-mouth reviews","https://static.mobileye.com/website/us/corporate/images/zeekr_3054.jpg","ZEEKR 001 in Shanghai","\u003Cp>Jerusalem, 4 September 2023 &mdash; Global mobility technology brand ZEEKR has launched a major over-the-air update for 110,000 owners of the ZEEKR 001 electric vehicle with the global debut of new, highly automated driving assistance features built on the Mobileye SuperVision&trade; platform. The upgraded Navigation ZEEKR Pilot (NZP) driving assistant system has already received strong reviews from over 1,000 beta users, with class-leading performance.\u003C/p>\n\u003Cp>NZP leverages SuperVision&rsquo;s 11 cameras &ndash; including seven 8-megapixel cameras &ndash; and surround fisheye cameras, along with a front radar and robust driving policy. 
Key features of NZP include point-to-point automated highway navigation, lane changes, automated on/off-ramp assist and intelligent traffic safety functions in identified operational design domains.\u003C/p>\n\u003Cp>The system runs on two Mobileye EyeQ&trade;5 systems-on-chip, an advanced custom 7-nanometer ADAS chipset, building on Mobileye&rsquo;s two decades of experience in applied AI and machine learning to handle AI tasks with high energy efficiency, a key factor for electric vehicles.\u003C/p>\n\u003Cp>Thanks to these enabling technologies, NZP can react much as a human driver would to everyday driving scenarios within its operational design domains. It can sense speed limit changes, merge into or overtake traffic, and navigate with an appropriate safety margin for construction zones, pedestrians and other road hazards, even in low-light conditions. It also reacts intelligently to other drivers on the road, understanding their behavior, using human-like motions to efficiently merge and mimicking how drivers tend to negotiate key road features. These functions have been recognized by leading industry media for their safety and efficiency performance.\u003C/p>\n\u003Cp>The system will first be available in the cities of Shanghai and Hangzhou, with multiple cities being added over the next few months. ZEEKR 009 multi-purpose vehicle owners are expected to receive a similar OTA update later this year.\u003C/p>\n\u003Cp>&ldquo;We&rsquo;re proud of the work we&rsquo;ve done with ZEEKR to launch this successful update of NZP built on the Mobileye SuperVision&trade; platform,&rdquo; said Prof. Amnon Shashua, President and CEO of Mobileye. &ldquo;This jump forward points toward the full power of SuperVision that will enable ZEEKR to provide a seamless and reliable highly assisted driving experience in highway and urban settings. 
From what we&rsquo;ve seen, NZP powered by SuperVision has become a market leader, raising the bar not only in China but worldwide.&rdquo;\u003C/p>\n\u003Cp>&ldquo;As a strategic partner of Mobileye, ZEEKR is pleased to provide the industry-leading NZP solution to users to make travel safer and more efficient,&rdquo; said Andy An, CEO of ZEEKR. &ldquo;We will continue to further advance technologies to maintain our industry leadership globally.&rdquo;\u003C/p>\n\u003Cp>The SuperVision platform enables advanced driver-assist features at up to 130 kilometers per hour on all road types, monitored by the driver. It builds on Mobileye&rsquo;s heritage of automotive safety features and driver-assist technologies like automatic emergency braking, and paves the way toward fully autonomous consumer vehicles and robotaxis in the near future.\u003C/p>\n\u003Cp>Under the expanded collaboration with Geely Group, three additional brands under Geely Group&rsquo;s umbrella are due to leverage Mobileye SuperVision for advanced ADAS, including Polestar. 
ZEEKR&rsquo;s new ZEEKR 001 FR quad-motor sports car, which can accelerate from 0 to 100 kph in just 2.08 seconds with a rolling start, is the latest model to launch equipped with the Mobileye SuperVision&trade; platform.\u003C/p>\n\u003Cp>\u003Cstrong>Media Contact: \u003C/strong>Justin Hyde, \u003Ca href=\"mailto:Justin.hyde@mobileye.com\">Justin.hyde@mobileye.com\u003C/a>\u003C/p>","2023-09-04T07:00:00.000Z","Industry, Autonomous Driving, News",{"id":891,"type":24,"url":892,"title":893,"description":894,"primary_tag":28,"author_name":16,"is_hidden":11,"lang":12,"meta_description":894,"image":895,"img_alt":896,"content":897,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":888,"tags":898},228,"mobileye-and-valeo-launch-partnership-for-world-class-imaging-radars","Mobileye and Valeo Launch Partnership for World-Class Imaging Radars","Valeo and Mobileye will work together to bring high-definition imaging radars to production for global automakers.","https://static.mobileye.com/website/us/corporate/images/1a4350d1a77e6c5a7d503dc7fdd64728_1693574090523.png","Bounding boxes illustrate how imaging radar helps AV sensing","\u003Cp>At IAA Mobility 2023 in Munich today, Mobileye and Valeo announced a new partnership to deliver software-defined, best-in-class imaging radars for next-generation driver assist and automated driving features.\u003C/p>\n\u003Cp>By joining forces, Mobileye and Valeo can quickly bring a promising new technology to automakers worldwide that enables more intelligent vehicles. 
As a key part of sensing systems for automated driving, imaging radar will be an enabling element for more advanced hands-off ADAS solutions and eyes-off automated driving features on highways and urban streets.\u003C/p>\n\u003Cp>&ldquo;Mobileye and Valeo&rsquo;s imaging radar collaboration significantly advances a new and exciting phase in automotive safety and performance,&rdquo; said Nimrod Nehushtan, Mobileye&rsquo;s Executive Vice President of Business Development and Strategy. &ldquo;In this collaboration, automakers gain access to the latest cutting-edge technology from Mobileye that they can trust will exceed industry expectations as we have proven before, while benefiting from the customization, industrialization, testing and support capabilities brought by Valeo. Our collaboration to deliver imaging radar to automakers benefits the industry, and ultimately, drivers globally.&rdquo;\u003C/p>\n\u003Cp>Marc Vrecko, President of Valeo&rsquo;s Comfort and Driving Assistance Systems Business Group, said: &ldquo;This partnership reinforces the strong relationship between Valeo and Mobileye. We are proud to collaborate together on imaging radar technology, which will be essential in the future of autonomous mobility. This is a great illustration of Valeo&rsquo;s technological leadership in ADAS and of its capability to produce innovative technologies at scale. This collaboration will contribute to Valeo&rsquo;s commitment to offer affordable, smarter and safer mobility.&rdquo;\u003C/p>\n\u003Cp>Mobileye&rsquo;s imaging radars use advanced architecture, including Massive MIMO (multiple-input, multiple-output) antenna design, a high-end radio frequency design developed in-house, and high-fidelity sampling &ndash; all enabling accurate object detection and higher dynamic range. 
Thanks to an integrated system-on-chip design that maximizes processor efficiency, and world-leading algorithms for interpreting radar data, the imaging radar delivers a detailed, four-dimensional image of surroundings up to 300 meters away and beyond. With a 140-degree field-of-view at medium range and 170-degree field-of-view at close range, the radar enables more accurate detection of pedestrians, vehicles or obstructions that other sensors might miss &ndash; even on crowded urban streets.\u003C/p>\n\u003Cp>Mobileye has already seen strong market interest in its imaging radar from the industry as automakers look to expand the operational design domains of their automated driving features. Valeo has simultaneously received indications of strong demand from the market for imaging radar that achieves optimal performance.\u003C/p>\n\u003Cp>Valeo, world leader in Advanced Driver Assistance Systems (ADAS), has been developing and mass-producing radar technologies since 2006. Valeo will lead the system design of the new imaging radar product by integrating Mobileye&rsquo;s groundbreaking imaging radar technology and corresponding software and algorithms embedded in the Mobileye Radar chipset into Valeo&rsquo;s automotive software and hardware radar solutions. Valeo will meet and adapt to the latest and most stringent software and hardware requirements from automotive players, including functional safety, cybersecurity, fast communication protocols with vehicle networks, electromagnetic robustness, and validation of overall system performance and endurance during vehicle lifetime. Leveraging Valeo&rsquo;s expertise in producing the latest automotive technologies at scale and its global industrial footprint, the complete imaging radar solution will be produced by Valeo.\u003C/p>\n\u003Cp>This new partnership expands Mobileye and Valeo&rsquo;s successful collaboration on front-facing cameras and other driver-assist solutions. 
Since 2015, the partners have delivered more than 15 million Smart Front Cameras worldwide.\u003C/p>","Industry, Mobileye Inside, Autonomous Driving, ADAS, News",{"id":900,"type":24,"url":901,"title":902,"description":903,"primary_tag":190,"author_name":16,"is_hidden":11,"lang":12,"meta_description":903,"image":904,"img_alt":905,"content":906,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":907,"tags":908},226,"ecarx-announces-mobileye-collaboration-for-polestar-future-products","ECARX announces Mobileye collaboration for Polestar, future products ","ECARX Holdings plans to integrate Mobileye Chauffeur™ in the Polestar 4","https://static.mobileye.com/website/us/corporate/images/a8a3701e248dd7d6cc1e7981ddee5bfc_1693230024446.png","The Polestar 4","\u003Cp>Today, ECARX Holdings announced a collaboration with Mobileye for future autonomous and advanced driver-assist technology. ECARX plans to serve as the integrator for the Mobileye Chauffeur&trade; platform in the Polestar 4 that will offer hands-off, eyes-off autonomous driving on controlled-access highways in defined operational design domains. ECARX and Mobileye also intend to collaborate on an EyeQ6L-based driver-assist solution, using the latest EyeQ&trade;6 automotive-grade system-on-chip, to serve the diverse needs of the Chinese market.\u003C/p>\n\u003Cp>Mobileye&rsquo;s&nbsp;Chauffeur adds a layer of active radar and lidar sensing to Mobileye&rsquo;s class-leading computer vision perception to create two subsystems working separately and in parallel to ensure redundancy. 
It also includes&nbsp;Mobileye&rsquo;s proprietary driving policy platform that can be customized by the automaker.&nbsp;All&nbsp;these elements are designed to&nbsp;run on the ultra-efficient EyeQ6&reg; SoC.\u003C/p>\n\u003Cp>\u003Ca href=\"https://ir.ecarxgroup.com/news-releases/news-release-details/ecarx-collaborate-mobileye-build-integrated-driver-assist\" target=\"_blank\" rel=\"noopener\">Read more about the announcement here\u003C/a>.\u003C/p>","2023-08-28T07:00:00.000Z","Industry, News, Autonomous Driving, ADAS",{"id":910,"type":24,"url":911,"title":912,"description":913,"primary_tag":28,"author_name":16,"is_hidden":11,"lang":12,"meta_description":913,"image":914,"img_alt":915,"content":916,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":32,"publish_date":917,"tags":918},224,"polestar-selects-mobileye-to-bring-autonomous-technology-to-polestar-4","Polestar selects Mobileye to bring autonomous technology to Polestar 4","Mobileye Chauffeur™ technology platform to be integrated into Polestar 4","https://static.mobileye.com/website/us/corporate/images/9c648af2ec5ed4a529249523564dbc08_1692898654016.png","The Polestar 4 electric SUV Coupe. ","\u003Cp>HONG KONG AND JERUSALEM, 25 August 2023 &mdash; Polestar and Mobileye (Nasdaq: MBLY and PSNY) are to collaborate to bring autonomous technology to Polestar 4 with potential for other vehicles, using the Mobileye Chauffeur&trade; AV platform that will be manufactured and integrated by ECARX.\u003C/p>\n\u003Cp>The intended collaboration reflects Polestar&rsquo;s mission to not only deliver world-class design and sustainability in its cars, but cutting-edge innovation as well. 
Mobileye Chauffeur will add an extra layer of on-demand convenience to Polestar&rsquo;s performance electric vehicles that are primarily designed for driver engagement and exciting dynamics.\u003C/p>\n\u003Cp>Polestar 4, which goes on sale in China today and globally in 2024, lays the foundation for this technology by featuring a Mobileye SuperVision&trade;-based advanced driver assistance system from the start.\u003C/p>\n\u003Cp>At launch, Chauffeur will offer hands-off and eyes-off, point-to-point autonomous driving on highways, as well as eyes-on automated driving for other environments, in identified operational design domains.\u003C/p>\n\u003Cp>Thomas Ingenlath, Polestar CEO, comments: &ldquo;We are very keen to push innovation in our performance electric vehicles together with Mobileye. We know that driving yourself is not always fun and exciting &ndash; this technology means our customers could enable autonomous driving when they want, making all future journeys enjoyable.&rdquo;\u003C/p>\n\u003Cp>Both SuperVision and Chauffeur feature Mobileye&rsquo;s EyeQ&trade; systems-on-chip, RSS&trade;-based driving policy, 360-degree surround camera system, and REM&trade;-powered Mobileye Roadbook&trade; map. Chauffeur upgrades SuperVision with the newest EyeQ6 system-on-chip along with next-generation active radar and lidar sensors, providing the additional sensing layer needed for eyes-off autonomous operation &ndash; demonstrating how existing eyes-on systems build a bridge to fully autonomous driving.\u003C/p>\n\u003Cp>&ldquo;We congratulate Polestar on innovating in consumer vehicles through this program and are proud of our continuing work with the Geely Group in adopting our technology portfolio,&rdquo; said Prof. Amnon Shashua, President and CEO of Mobileye. &ldquo;Mobileye Chauffeur will offer consumers a safer, accessible way to enjoy autonomous vehicles as the next revolution in personal transportation. 
It&rsquo;s the pinnacle of two decades of our experience applying AI in more than 150 million vehicles worldwide.&rdquo;\u003C/p>\n\u003Cp>More details about the integration will be released closer to production launch.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>About Mobileye\u003C/strong>\u003C/p>\n\u003Cp>Mobileye (Nasdaq: MBLY) leads the mobility revolution with its autonomous driving and driver-assistance technologies, harnessing world-renowned expertise in computer vision, artificial intelligence, mapping, and data analysis. Since its founding in 1999, Mobileye has pioneered such groundbreaking technologies as REM&trade; crowdsourced mapping, True Redundancy&trade; sensing, and Responsibility Sensitive Safety (RSS). These technologies are driving the ADAS and AV fields towards the future of mobility &ndash; enabling self-driving vehicles and mobility solutions, powering industry-leading advanced driver-assistance systems and delivering valuable intelligence to optimize mobility infrastructure. To date, more than 150 million vehicles worldwide have been built with Mobileye technology inside. In 2022 Mobileye listed as an independent company separate from Intel (Nasdaq: INTC), which retains majority ownership. For more information, visit&nbsp;\u003Ca href=\"https://cts.businesswire.com/ct/CT?id=smartlink&amp;url=https%3A%2F%2Fwww.mobileye.com&amp;esheet=53539903&amp;newsitemid=20230817465845&amp;lan=en-US&amp;anchor=https%3A%2F%2Fwww.mobileye.com&amp;index=2&amp;md5=dc09cc0681f253ba2b178bfdedda4ac0\">https://www.mobileye.com\u003C/a>.\u003C/p>\n\u003Cp>\u003Cstrong>About Polestar\u003C/strong>\u003C/p>\n\u003Cp>Polestar (Nasdaq: PSNY) is the Swedish electric performance car brand determined to improve society by using design and technology to accelerate the shift to sustainable mobility. 
Headquartered in Gothenburg, Sweden, its cars are available online in 27 markets globally across North America, Europe and Asia Pacific.\u003C/p>\n\u003Cp>Polestar plans to have a line-up of five performance EVs by 2026. Polestar 2, the electric performance fastback, launched in 2019. Polestar 3, the SUV for the electric age, launched in late 2022. Polestar 4, the SUV coup&eacute; transformed, is launching in phases through 2023 and into 2024. Polestar 5, an electric four-door GT, and Polestar 6, an electric roadster, are coming soon.\u003C/p>\n\u003Cp>The Polestar 0 project is the company&rsquo;s ambitious goal of creating a truly climate-neutral production car by 2030. The research initiative also aims to create a sense of urgency to act on the climate crisis, by challenging employees, suppliers and the wider automotive industry to drive towards zero.\u003C/p>\n\u003Cp>\u003Cstrong>Media Contact: \u003C/strong>Justin Hyde, \u003Ca href=\"mailto:justin.hyde@mobileye.com\">justin.hyde@mobileye.com\u003C/a>, +01 202-656-6749\u003C/p>","2023-08-25T07:00:00.000Z","Autonomous Driving, News, Industry",{"id":920,"type":24,"url":921,"title":922,"description":923,"primary_tag":190,"author_name":10,"is_hidden":11,"lang":12,"meta_description":923,"image":924,"img_alt":925,"content":926,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":927,"tags":928},220,"volkswagen-commercial-vehicles-begins-av-testing-with-mobileye-drive","Volkswagen Commercial Vehicles Begins AV Testing with Mobileye Drive™️ ","New VW Pilots Kick Off in Munich, Germany and Austin, Texas","https://static.mobileye.com/website/us/corporate/images/487800bab83702bc31049912c78a3e01_1690726252678.png","An all-electric Volkswagen ID. Buzz equipped with Mobileye Drive™️ self-driving technology in Munich, Germany","\u003Cp>Volkswagen Commercial Vehicles has begun testing a self-driving version of the ID. 
Buzz electric vehicle with Mobileye Drive technology on two continents. In Munich, Germany, and Austin, Texas, Volkswagen will validate the Mobileye Drive-equipped ID. Buzz vehicles with safety drivers on public roads, towards a goal of series production.&nbsp;\u003C/p>\n\u003Cp>Mobileye Drive incorporates advanced EyeQ&trade;️ Systems-on-Chip (SoCs), as well as Mobileye&rsquo;s sensing, mapping, and driving policy technologies, to create a unique, full-stack autonomous driving system that can adapt to new locations. &nbsp;\u003Cbr />&nbsp;\u003Cbr />You can read more about Volkswagen's testing in Austin \u003Ca href=\"https://media.vw.com/en-us/releases/1750\">here\u003C/a> and Munich \u003Ca href=\"https://www.vwpress.co.uk/en-gb/releases/4984\">here\u003C/a>. &nbsp;&nbsp;\u003C/p>","2023-07-31T07:00:00.000Z","Autonomous Driving, News",{"id":930,"type":5,"url":931,"title":932,"description":933,"primary_tag":934,"author_name":16,"is_hidden":11,"lang":12,"meta_description":933,"image":935,"img_alt":936,"content":937,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":32,"publish_date":938,"tags":939},219,"nimrod-nehushtan-rem-cloud-enhanced-adas-autotech-detroit","The road to the future of mobility is being mapped by REM™","\"REM crowdsourced mapping is the foundation on which all our advanced products are built,\" Mobileye SVP Nimrod Nehushtan outlined at AutoTech: Detroit 2023.",7,"https://static.mobileye.com/website/us/corporate/images/27128f45322c483eaa0cad2e9774c409_1690275739608.jpg","Road Experience Management™ harnesses the power of the crowd to enhance a range of assisted and autonomous driving solutions.","\u003Cp>For more than two decades, Mobileye has been providing the automotive industry with computer-vision technology for advanced driver-assistance systems (ADAS). Today, we offer a full range of mobility solutions, from enhanced ADAS to turnkey self-driving systems. 
The unique element enabling that expansion and paving the road to the future of mobility, arguably more than any other, is REM&trade;.\u003C/p>\n\u003Cp>Short for \u003Ca href=\"https://www.mobileye.com/technology/rem/\">Road Experience Management&trade;\u003C/a>, REM is Mobileye's proprietary crowdsourced, cloud-connected mapping technology. Nimrod Nehushtan, Mobileye's Senior Vice President of Business Development &amp; Strategy and co-General Manager of REM, spoke at \u003Ca href=\"https://wardsauto.informa.com/autotech-detroit-agenda/\">AutoTech: Detroit 2023\u003C/a> about this pivotal technology &ndash; which we believe is not only essential to tomorrow's autonomous vehicles, but game-changing for today's driver-assistance systems as well.\u003C/p>\n\u003Ch3>\u003Cstrong>Question: Why do vehicles need maps?\u003C/strong>\u003C/h3>\n\u003Cp>Answer: \"A safe and comfortable driving experience relies on a correct and consistent understanding of a lot of different features and attributes of the road,\" Nehushtan explains &ndash; from lane marks and \u003Ca href=\"https://www.mobileye.com/blog/intelligent-speed-assist-general-safety-regulation/\">traffic signs\u003C/a> to the curvature of the road. \"There is a lot of variation between geographies, between cities &ndash; even within the same country. And different conventions of how traffic lights, for example, are positioned, where they are positioned, how they are associated to driving lanes, and so on. And understanding this through onboard sensors alone [in real time] is a very complicated task.\"\u003C/p>\n\u003Cp>That's fundamentally why autonomous (among other) vehicles need maps, which provide the vehicle with such information ahead of time. But REM goes beyond the details of the road itself to encompass knowledge of how those roads are used.\u003C/p>\n\u003Cp>\"When you drive from home to work, you're familiar with the road. 
You can anticipate what's coming, so you're much less likely to do something dangerous. On the other hand, if you're driving in a new place you've never been before, you're much more likely to do something dangerous, because you don't even understand the rules of the game\" as they're applied in the local driving culture.\u003C/p>\n\u003Cp>\"That preexisting memory is very useful when driving. And it is very important for us to understand properly, to conceive the driving rules, the driving structure of the road, the dos and don'ts for each and every road in order to drive safely and comfortably everywhere.\" REM is engineered to provide both autonomous vehicles and (through our Cloud-Enhanced Driver-Assist&trade; solution detailed below) human drivers as well with that familiarity of any road they might drive on&hellip; as if they&rsquo;ve driven there before, even if they haven&rsquo;t.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/9c7590a4ad84a17d7eba1ad576af71d1_1690276739495.jpg\" alt=\"REM aggregates data from millions of vehicles on the road equipped with Mobileye technology.\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Ch3>\u003Cstrong>Q: How does REM crowdsourced mapping work?\u003C/strong>\u003C/h3>\n\u003Cp>A: \"The common approach today to building high-definition maps is to build a survey fleet that has a lot of different sensors &ndash; like lidars and inertial navigation systems and so on. You build this fleet that is relatively expensive and you operate this fleet in areas you want to map,\" explains Nehushtan. \"The problem with this approach is that it is very, very hard to scale\" to cover areas large enough to be useful. 
In addition, \"without an inherent capability to update the maps and keep them fresh, the product itself will maybe be useful on day one, but as time goes by, it will start to deteriorate.\"\u003C/p>\n\u003Cp>Mobileye takes a decidedly more efficient and scalable approach to mapping that builds upon our \u003Ca href=\"https://www.mobileye.com/blog/100-million-eyeq-chips/\">established leadership in driver-assistance technology\u003C/a>. \"We have been delivering safety products in the automotive industry for over two decades now.\" In 2022 alone, \"we delivered more than 33 million products, 33 million \u003Ca href=\"https://www.mobileye.com/blog/eyeq6-system-on-chip/\">chips\u003C/a>, which means 33 million cars\" driving around with our computer vision technology on board, monitoring and detecting the parameters of their surroundings.\u003C/p>\n\u003Cp>\"Our idea was to leverage this capability and to simply upload data from these systems to a cloud. It's like uploading a lot of pieces of a puzzle to a cloud,\" where the pieces are put together to create a complete picture of the driving environment. 
Running all this data through a complex array of proprietary algorithms renders a detailed map of roadways around the world that can be used in \u003Ca href=\"https://www.mobileye.com/solutions/\">a range of ADAS and AV applications\u003C/a>.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/7806008bec15a0ea019257cf57e32a53_1690276876264.jpg\" alt=\"Information like lane-level traffic-light relevancy helps REM enhanced driver-assistance systems with crowdsourced data.\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Ch3>\u003Cstrong>Q: What kinds of mobility solutions benefit from REM?\u003C/strong>\u003C/h3>\n\u003Cp>A: \"Having all this history and all this knowledge of how hundreds and thousands of different people drove on [a given] road over a long period of time allows us to give a kind of &lsquo;superpower&rsquo; to the system using this map,\" says Nehushtan. \"This database can be used for whatever features we want to offer, whether it's autonomous driving or enhanced driver-assist features.\"\u003C/p>\n\u003Cp>That starts with \u003Ca href=\"https://www.mobileye.com/blog/cloud-enhanced-driver-assist/\">Cloud-Enhanced Driver-Assist\u003C/a>, which marries our proven experience in ADAS tech with the benefits of REM.\u003C/p>\n\u003Cp>Take, for example, alerting a driver that they're approaching a red light. What might seem like a relatively straightforward function is anything but simple to implement in practice, due especially to the complexity and local variability of traffic-light placement.\u003C/p>\n\u003Cp>\"You cannot think of a car [today] that offers a safety solution for red light crossing, like emergency braking or alerting when you are about to enter a junction [through] a red light,\" notes Nehushtan. \"Understanding which traffic light is relevant to the lane you're in is a complicated problem. 
Now, having this database in the cloud allows us to solve this problem by simply connecting the system today to a cloud.\" Armed with the knowledge of which traffic signals are relevant to the lane it's in, the system can alert the driver if they're about to run a red light, helping to avoid potential collisions with cross-traffic, pedestrians, and other road users.\u003C/p>\n\u003Cp>With Cloud-Enhanced Driver-Assist, several automakers are already using REM to enhance the ADAS in their vehicles. And we're building an array of \u003Ca href=\"https://www.mobileye.com/blog/hands-off-eyes-off-taxonomy-for-automated-driving/\">increasingly automated solutions\u003C/a> incorporating the technology. As Nehushtan points out, \"REM crowdsourced mapping is the foundation on which all of these products are built.\"\u003C/p>\n\u003Cp>Watch the full presentation below.\u003C/p>\n\u003Cp>\u003Ciframe title=\"YouTube video player\" src=\"https://www.youtube.com/embed/3ldGty5Uz0k\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>","2023-07-25T07:00:00.000Z","Video, Autonomous Driving, ADAS, Mapping & REM",{"id":941,"type":24,"url":942,"title":943,"description":944,"primary_tag":28,"author_name":16,"is_hidden":11,"lang":12,"meta_description":944,"image":945,"img_alt":946,"content":947,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":948,"tags":444},218,"mobileye-launches-the-first-camera-only-intelligent-speed-assist-to-meet-new-eu-standards","Mobileye Launches the First Camera-Only Intelligent Speed Assist to Meet New EU Standards","Mobileye’s new sign-detection technology has received formal homologation across Europe – the first vision-only solution that meets the new EU requirements – and is expected to go into production from Q4 
2023.","https://static.mobileye.com/website/us/corporate/images/304f1e110bd310ae8841cb388957ae92_1689672627940.jpg","Using only cameras, Mobileye technology can detect a wide array of traffic signs to support Intelligent Speed Assist systems.","\u003Cp>Mobileye has introduced the world&rsquo;s first vision-only Intelligent Speed Assist (ISA) solution for automakers, following testing and certification across Europe. The camera-only solution, launching in production vehicles this year, helps global automakers meet new European Union (EU) General Safety Regulation (GSR) standards requiring automatic sensing of speed limits in all new vehicle models, without the need to rely on third-party map and GPS data.\u003C/p>\n\u003Cp>The new software, designed for Mobileye&rsquo;s EyeQ platform, has been certified for use in all 27 EU countries as well as Norway, Switzerland, and Turkey. The EyeQ4 and EyeQ6-based ISA system allows OEMs whose vehicles already integrate these chips to meet the new standards merely by updating the EyeQ&rsquo;s existing software, without any new hardware requirements.\u003C/p>\n\u003Cp>The Mobileye ISA system is expected to be integrated by a major global auto group into two vehicle brands for models going on sale in Europe later this year, with three other global automakers following close behind in 2024 and beyond.\u003C/p>\n\u003Cp>&ldquo;This is a major accomplishment for Mobileye, because we&rsquo;ve proven to the industry not only that achieving GSR-compatible vision-only ISA is possible, but also that it performs better than traditional map-based solutions,&rdquo; said Dr. Gaby Hayon, Executive Vice President of Research and Development at Mobileye.\u003C/p>\n\u003Cp>The certified solution, resulting from more than two years of work building on Mobileye&rsquo;s two decades of experience in computer vision and machine learning, is the industry&rsquo;s first of its kind. 
Current alternatives rely on a combination of cameras and low-resolution maps to meet the EU standards, a solution that typically brings higher cost, complexity, and integration effort, while providing less reliable performance.\u003C/p>\n\u003Cp>Mobileye has developed several cutting-edge technologies that upgrade legacy traffic sign recognition to meet GSR requirements, including:\u003C/p>\n\u003Cul>\n\u003Cli>\u003Cstrong>Traffic sign relevancy\u003C/strong> technology that identifies whether a speed sign is relevant to a specific lane,\u003C/li>\n\u003Cli>\u003Cstrong>Signature-based classification\u003C/strong> that loads the \"signature\" of a new traffic sign to the vehicle, even for signs introduced after the vehicle&rsquo;s manufacture,\u003C/li>\n\u003Cli>\u003Cstrong>OCR\u003C/strong>-based city entrance identification for European-style signs,\u003C/li>\n\u003Cli>\u003Cstrong>Advanced search engines\u003C/strong> that enable finding examples of rare signs in Mobileye&rsquo;s huge database of driving clips and integrating them into the system, and\u003C/li>\n\u003Cli>\u003Cstrong>A road-type classifier\u003C/strong> that can work out the right speed, even when traffic signs are missing, by using different cues in the scene to detect the road type.\u003C/li>\n\u003C/ul>\n\u003Cp>&ldquo;Mobileye&rsquo;s 400-petabyte database of driving footage, gathered from around the world, enables us to rapidly meet the growing requirements of automotive safety regulators with new software designed for our existing driver-assist platforms,&rdquo; said Hayon. 
&ldquo;After successfully \u003Cem>surpassing\u003C/em> GSR ISA standards during stringent testing, we look forward to collaborating with automakers to implement this lifesaving technology in Europe and beyond.&rdquo;\u003C/p>\n\u003Cp>As of July 2024, all new passenger vehicles sold in the EU must meet specific GSR ISA requirements, as confirmed by rigorous testing, such as being able to detect static and dynamic-message speed-limit signage across hundreds of signs, with thousands of country-specific variants, including both explicit and implicit signs, and in harsh weather and adverse lighting conditions. They must also understand temporary speed limits for construction, accidents, or other issues, often given by digital signage, as well as implicit speed limits such as those at city entrances.\u003C/p>\n\u003Cp>Speeding contributes to one-third of fatal vehicle crashes in EU Member States, according to the European Road Safety Observatory, and experts say the new regulations could reduce collisions by as much as 30 percent and fatalities by up to 20 percent. Under the new regulation, all systems will be required to let drivers know what speed limits are in effect either actively, in which the vehicle automatically slows gently toward a posted limit, or passively, in which the ISA system alerts drivers when they exceed posted limits.\u003C/p>\n\u003Cp>Six independent labs across five different European countries have tested and confirmed that Mobileye&rsquo;s ISA software meets or exceeds the EU's required standards, with additional testing and certification underway. 
In the future, Mobileye will continue advancing the system to ensure new sign recognition over the next 14 years per GSR certification requirements.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cdiv style=\"padding: 56.25% 0 0 0; position: relative;\">\u003Ciframe style=\"position: absolute; top: 0; left: 0; width: 100%; height: 100%;\" title=\"Detection of road signs by Mobileye's ISA\" src=\"https://player.vimeo.com/video/846189161?h=9443c2300b&amp;badge=0&amp;autopause=0&amp;player_id=0&amp;app_id=58479\" width=\"425\" height=\"350\" frameborder=\"0\" data-mce-fragment=\"1\">\u003C/iframe>\u003C/div>\n\u003Cp>\u003Cstrong>About Mobileye Global Inc.\u003C/strong>\u003C/p>\n\u003Cp>Mobileye (Nasdaq: MBLY) is a leader of the mobility revolution with its autonomous driving and driver-assistance technologies, harnessing world-renowned expertise in computer vision, artificial intelligence, mapping, and data analysis. Since its founding in 1999, Mobileye has pioneered such groundbreaking technologies as REM&trade; crowdsourced mapping, True Redundancy&trade; sensing, and Responsibility Sensitive Safety (RSS). These technologies are driving the ADAS and AV fields towards the future of mobility &ndash; enabling self-driving vehicles and mobility solutions, powering industry-leading advanced driver-assistance systems and delivering valuable intelligence to optimize mobility infrastructure. To date, more than 140 million vehicles worldwide have been built with Mobileye technology inside. In 2022 Mobileye listed as an independent company separate from Intel (Nasdaq: INTC), which retains majority ownership.\u003C/p>\n\u003Cp>&ldquo;Mobileye,&rdquo; the Mobileye logo and Mobileye product names are registered trademarks of Mobileye Global. All other marks are the property of their respective owners.\u003C/p>\n\u003Cp>\u003Cstrong>Forward-Looking Statements\u003C/strong>\u003C/p>\n\u003Cp>This press release contains forward-looking statements. 
Statements in this release, including statements with respect to the offering, that are not statements of historical fact are forward-looking statements and should be evaluated as such. These statements often include words such as &ldquo;anticipate,&rdquo; &ldquo;expect,&rdquo; &ldquo;suggests,&rdquo; &ldquo;plan,&rdquo; &ldquo;believe,&rdquo; &ldquo;intend,&rdquo; &ldquo;estimates,&rdquo; &ldquo;targets,&rdquo; &ldquo;projects,&rdquo; &ldquo;should,&rdquo; &ldquo;could,&rdquo; &ldquo;would,&rdquo; &ldquo;may,&rdquo; &ldquo;will,&rdquo; &ldquo;forecast,&rdquo; or the negative of these terms, and other similar expressions, although not all forward-looking statements contain these words. We base these forward-looking statements or projections on our current expectations, plans and assumptions that we have made in light of our experience in the industry, as well as our perceptions of historical trends, current conditions, expected future developments and other factors we believe are appropriate under the circumstances and at such time. You should understand that these statements are not guarantees of performance or results. The forward-looking statements are subject to and involve risks, uncertainties and assumptions and you should not place undue reliance on these forward-looking statements. Although we believe that these forward-looking statements are based on reasonable assumptions at the time they are made, you should be aware that many factors could affect our actual financial results or results of operations and could cause actual results to differ materially from those expressed in the forward-looking statements. Detailed information regarding these and other factors that could affect Mobileye&rsquo;s business and results is included in Mobileye&rsquo;s SEC filings, including the company&rsquo;s Annual Report on Form 10-K for fiscal year 2022, particularly in the section entitled &ldquo;Item 1A. 
Risk Factors,&rdquo; and in the preliminary prospectus and in any subsequent filings with the SEC relating to the offering. Copies of these filings may be obtained by visiting our Investor Relations website at \u003Ca href=\"https://ir.mobileye.com\" target=\"_blank\" rel=\"noopener\">ir.mobileye.com\u003C/a> or the SEC&rsquo;s website at \u003Ca href=\"https://www.sec.gov\" target=\"_blank\" rel=\"noopener\">www.sec.gov\u003C/a>.\u003C/p>\n\u003Cp>Media Contact: Justin Hyde, \u003Ca href=\"mailto:Justin.Hyde@Mobileye.com\">Justin.Hyde@Mobileye.com\u003C/a> +1 202-656-6749\u003C/p>\n\u003Cp>&nbsp;\u003C/p>","2023-07-18T07:00:00.000Z",{"id":950,"type":5,"url":951,"title":952,"description":953,"primary_tag":954,"author_name":16,"is_hidden":11,"lang":12,"meta_description":953,"image":955,"img_alt":956,"content":957,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":958,"tags":959},217,"see-supervision-handle-difficult-real-world-driving-scenarios","#SeeSuperVision Handle Difficult Real-World Driving Scenarios","In these short videos, see how Mobileye SuperVision™ negotiates some of the most challenging driving scenarios – with eyes on the road but hands off the wheel.",8,"https://static.mobileye.com/website/us/corporate/images/827744854229a167094fe4b022a01141_1688558391555.jpg","Mobileye SuperVision is engineered to handle difficult maneuvers in real-world driving environments.","\u003Cp>We've written a lot about Mobileye SuperVision&trade; on these pages. But we also know that a picture can be worth a thousand words, and a moving picture that much more.\u003C/p>\n\u003Cp>That's why we've been releasing a series of short videos showing SuperVision in action, handling a series of challenging driving maneuvers &ndash; on real roads, in actual traffic. 
And you can find them all in the playlist below.\u003C/p>\n\u003Cp>\u003Ciframe title=\"YouTube video player\" src=\"https://www.youtube.com/embed/videoseries?list=PLWCfS_Yhbvs5O_PHbKZwHDEib3-ALxs9N\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Ch3>\u003Cstrong>When the Going Gets Tough\u003C/strong>\u003C/h3>\n\u003Cp>In these videos, you can see how a vehicle equipped with \u003Ca href=\"https://www.mobileye.com/blog/mobileye-supervision-bridge-to-consumer-autonomous-vehicles/\">Mobileye SuperVision\u003C/a> handles a variety of precarious situations, such as suddenly obstructed roadways, busy roundabouts, unprotected left turns, slow-moving trucks, construction zones, and pedestrian crosswalks.\u003C/p>\n\u003Cp>These are all difficult situations that drivers regularly encounter on the road &ndash; situations we've all had to learn how to negotiate when driving. And much like the instructors who taught us these vital skills, we've trained our hands-off/eyes-on solution to handle them safely and effectively &ndash; so you won't have to.\u003C/p>\n\u003Ch3>\u003Cstrong>The Tech to Enable Eyes-On/Hands-Off Driving\u003C/strong>\u003C/h3>\n\u003Cp>Mobileye SuperVision is capable of handling these situations thanks to years of development and decades of expertise in assisted and autonomous driving technologies. 
SuperVision incorporates our 360-degree surround \u003Ca href=\"https://www.mobileye.com/blog/camera-first-approach-for-assisted-autonomous-driving/\">camera system\u003C/a>, along with our \u003Ca href=\"https://www.mobileye.com/blog/rem-mapping-avs/\">REM&trade;\u003C/a>-generated maps and \u003Ca href=\"https://www.mobileye.com/blog/responsibility-sensitive-safety-unwritten-rules-of-the-road/\">RSS&trade;\u003C/a>-based driving policy &ndash; all running on a pair of \u003Ca href=\"https://www.mobileye.com/blog/eyeq6-system-on-chip/\">our latest EyeQ&trade; systems-on-chip\u003C/a>.\u003Cbr />\u003Ciframe src=\"https://player.vimeo.com/video/842482816?h=6f8c5805f0&amp;autoplay=1&amp;loop=1&amp;title=0&amp;byline=0&amp;portrait=0\" width=\"640\" height=\"360\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003Cbr />These highly advanced technological building blocks enable a vehicle equipped with SuperVision to change lanes, cruise on the highway, move through traffic jams, track other road users, and avoid collisions&hellip; even where, as the videos above demonstrate, driving gets tricky.\u003C/p>\n\u003Ch3>\u003Cstrong>Coming to a Road Near You\u003C/strong>\u003C/h3>\n\u003Cp>Mobileye SuperVision is already on the road in more than 100,000 vehicles from our launch partner \u003Ca href=\"https://www.mobileye.com/news/mobileye-zeekr-expand-future-cars-partnership/\">Zeekr\u003C/a>. And with further collaborations already in place with automakers like \u003Ca href=\"https://newsroom.porsche.com/en/2023/company/porsche-mobileye-collaboration-automated-driver-assistance-functions-32250.html\">Porsche\u003C/a> and \u003Ca href=\"https://media.polestar.com/global/en/media/pressreleases/666140\">Polestar\u003C/a>, it's slated to be integrated into even more. So, before long, you might find yourself riding in a vehicle equipped with SuperVision. 
And when you do, we hope you'll feel that you're in good hands &ndash; so you can comfortably take yours off the wheel.\u003C/p>\n\u003Cp>\u003Cstrong>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/96280afd0d2a4408149cedc1455fbb81_1688558449454.jpg\" alt=\"The Zeekr 001 is the first vehicle on the market to incorporate the hands-off/eyes-on capabilities of Mobileye SuperVision.\" width=\"1650\" height=\"777\" />\u003C/strong>\u003C/p>","2023-07-10T07:00:00.000Z","ADAS, Video, Autonomous Driving",{"id":961,"type":5,"url":962,"title":963,"description":964,"primary_tag":9,"author_name":16,"is_hidden":11,"lang":12,"meta_description":964,"image":965,"img_alt":966,"content":967,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":968,"tags":464},216,"making-safety-seen-international-women-in-engineering-day","Making Safety Seen on International Women in Engineering Day","Women make vital contributions to a company's growth and culture. Meet some of the women engineers who drive advancements in our products and technologies.","https://static.mobileye.com/website/us/corporate/images/d901ba75499d608e388f1e4390e42ca6_1687431455849.jpg","Shira Hirsh and Tomer Hochbaum are two of the many woman engineers making a meaningful professional impact at Mobileye.","\u003Cp>It wasn&rsquo;t until the Second World War that serious attention was paid to women&rsquo;s education in technical fields, and women began to enter the male-dominated workforce. In America, Rosie the Riveter became an enduring symbol for the physical and intellectual capabilities of American women engineers. 
As of 2023, the proportion of women engineers stands at 24% in the United States, 17% in the European Union, 16% in Japan, and 14% in India.\u003C/p>\n\u003Cp>While stereotypes persist, especially in STEM fields, a growing number of women are contributing to the growth and culture of companies that, in turn, benefit from their talents. At Mobileye, women engineers are making a meaningful impact on our \u003Ca href=\"https://www.mobileye.com/solutions/\">products\u003C/a> and \u003Ca href=\"https://www.mobileye.com/technology/responsibility-sensitive-safety/\">technologies\u003C/a>. Here, read about two women who embody the theme of this year&rsquo;s \u003Ca href=\"https://www.inwed.org.uk/\">International Women in Engineering Day\u003C/a>: &ldquo;make safety seen.&rdquo;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/c03ae56c199365298bda634143e89b61_1687431542718.jpg\" alt=\"Woman engineers like Shira Hirsh and Tomer Hochbaum work alongside men in Mobileye&rsquo;s various engineering departments.\" width=\"1605\" height=\"1070\" />\u003C/p>\n\u003Ch3>\u003Cstrong>Saving lives \u003C/strong>\u003C/h3>\n\u003Cp>There are numerous factors that contribute to road accidents, with speeding, distractions, and non-adherence to \u003Ca href=\"https://www.mobileye.com/technology/responsibility-sensitive-safety/\">traffic rules\u003C/a> ranking among the top five causes. At Mobileye, the core value of saving lives and promoting accessible, sustainable mobility drives the work of our engineers. Among them is Shira Hirsh, a senior principal engineer specializing in deep learning and computer optimization. 
Her focus is on integrating numerous algorithms into \u003Ca href=\"https://www.mobileye.com/technology/eyeq-chip/\">EyeQ&trade;\u003C/a> &ndash; Mobileye&rsquo;s proprietary System-on-Chip (SoC) &ndash; which aims to ensure an efficient and very quick response time, ultimately leading to a safe and seamless driving experience.\u003C/p>\n\u003Cp>Since Shira was young, she has always been curious and drawn to problem solving. But before starting her career as an engineer, she faced her own stereotypes about the field.\u003C/p>\n\u003Cp>&ldquo;I had this vision that computer engineers spend most of their days in front of a computer, devoid of human interaction, which is the opposite of the social person that I am,&rdquo; Shira reflects. &ldquo;But after careful contemplation and great support from my family and friends, I embraced the decision to pursue a degree in computer science.&rdquo;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/e76096cc79a4085c11591f80a564e8cf_1687431574825.jpg\" alt=\"Woman engineers are an integral part of the workforce at Mobileye, helping to make our roads safer through technology.\" width=\"1605\" height=\"1070\" />\u003C/p>\n\u003Cp>&ldquo;My choice,&rdquo; she explains, &ldquo;was motivated by my passion for problem-solving and the realization that engineering necessitates exceptional social skills for effective collaboration, communication, and understanding people&rsquo;s needs. 
My strength in connecting with people is an asset in my field.&rdquo;\u003C/p>\n\u003Cp>Together with her expertise in \u003Ca href=\"https://www.mobileye.com/blog/moving-our-machine-learning-to-the-cloud-inspired-innovation/\">machine learning\u003C/a> and computer optimization, she plays a pivotal role in the development and integration of advanced algorithms, ensuring that Mobileye&rsquo;s \u003Ca href=\"https://www.mobileye.com/blog/what-is-advanced-driver-assistance-system-adas/\">driver-assistance\u003C/a> technologies deliver the highest level of safety and reliability.\u003C/p>\n\u003Cp>Like Shira, Mobileye algorithm and machine learning team manager Tomer Hochbaum had the same at-home support to break the gender barrier as a female engineer.\u003C/p>\n\u003Cp>&ldquo;Both of my parents are in similar fields, so to focus on math and science was second nature to me,&rdquo; she explains. &ldquo;At home, I grew up with a big sister and mother whom I both admire, and I was always told that gender is not a barrier. It helped me overlook the obvious when I arrived at university: that all the teachers and my classmates in my engineering department were male. On top of that, studying did not come easily to me, and I found university to be very difficult. I worked very hard and found myself at Mobileye towards the end of my university experience. Despite managing a team of five, I still don&rsquo;t see myself as an engineer only &ndash; I volunteer a lot, I work with animals. I am a lot of things outside of my work.&rdquo;\u003C/p>\n\u003Ch3>\u003Cstrong>The pursuit of safety\u003C/strong>\u003C/h3>\n\u003Cp>Tomer and Shira&rsquo;s work at Mobileye aligns perfectly with this year&rsquo;s theme, &ldquo;make safety seen&rdquo;. For Tomer, she and her team strive to improve safety by developing technologies that can outperform human drivers. By accurately detecting objects, their goal is to reduce accidents and create a safer environment for all. 
&nbsp;\u003C/p>\n\u003Cp>For Shira, her cross-project role focuses on accelerating and optimizing algorithms across various domains, including solution-based \u003Ca href=\"https://www.mobileye.com/blog/ceo-amnon-shashua-on-the-technological-megashifts-impacting-our-world/\">AI technology\u003C/a> such as machine and deep learning, computer vision, and model-based approaches on Mobileye's proprietary EyeQ chip. Her team&rsquo;s main goal is to enhance the algorithms, enabling them to function in real-time and deliver rapid response times for ADAS systems, ensuring utmost safety.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/0c3f1e981a7066c947c2dcabd415f775_1687431615069.jpg\" alt=\"The autonomous vehicle garage at Mobileye is where our self-driving technologies are put to the test in various vehicles.\" width=\"1605\" height=\"1070\" />\u003C/p>\n\u003Ch3>\u003Cstrong>The impact on the world\u003C/strong>\u003C/h3>\n\u003Cp>In \u003Ca href=\"https://www.mobileye.com/blog/mobileye-campus-jerusalem-leed-platinum-environmental-rating/\">the office\u003C/a>, the creative collaboration between \u003Ca href=\"https://www.mobileye.com/blog/we-are-mobileye-by-kutiman/\">their colleagues\u003C/a> is precisely what is needed to push through and achieve breakthroughs. Shira explains: &ldquo;As projects progress and \u003Ca href=\"https://www.mobileye.com/blog/intelligent-speed-assist-general-safety-regulation/\">safety regulations\u003C/a> become stricter, there are instances where it may appear that we have reached maximum efficiency and cannot fit in even one more needle. However, together with others on the algorithm teams, we consistently conquer the challenge. We discover innovative approaches to reduce cycles and optimize new algorithms, enabling them to fit within our constraints. These efforts often yield results that allow us to continually enhance performance and functionality. 
&rdquo;\u003C/p>\n\u003Cp>&ldquo;Mobileye is a company that works towards improving safety,&rdquo; Tomer highlights. &ldquo;Everything we do is towards that goal of what&rsquo;s going to be safer and better for people. Every check is to make sure that our technology can become more accurate than a human.&rdquo;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/7d4ac1ab1a4ba4f512d5afe8d09e7c18_1687431651240.jpg\" alt=\"Mobileye&rsquo;s autonomous driving technologies are tested and deployed in vehicles like the Zeekr 001 and Nio ES8.\" width=\"1605\" height=\"1070\" />\u003C/p>\n\u003Cp>For Tomer and Shira, improving safety is just one of the many impacts engineers can make in the world. These exemplary women engineers are not limited to the work they do at Mobileye &ndash; their expertise allows them to address smaller, yet no-less-significant problems that align with their personal values and passions. Tomer highlights this type of opportunity through her volunteer work with animal adoption, where she uses her expertise to identify committed owners and prevent abandonment.\u003C/p>\n\u003Cp>By playing pivotal roles in the development and integration of advanced algorithms, women engineers, like those we&rsquo;ve spotlighted here, ensure that Mobileye's driver-assistance technologies deliver unmatched safety and reliability. 
The road for women engineers may indeed be long, but it's paved with the stories like those of Shira and Tomer who break stereotypes along the way.\u003C/p>","2023-06-22T07:00:00.000Z",{"id":970,"type":5,"url":971,"title":972,"description":973,"primary_tag":954,"author_name":16,"is_hidden":11,"lang":12,"meta_description":973,"image":974,"img_alt":975,"content":976,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":977,"tags":978},215,"mobileye-cto-prof-shai-shalev-shwartz-ecomotion-2023","Mobileye CTO: “AVs Are Part of the Future and Are Here to Stay”","Speaking at EcoMotion 2023, Prof. Shai Shalev-Shwartz outlined two obstacles on the road to the autonomous future – and Mobileye’s approach to overcoming both.","https://static.mobileye.com/website/us/corporate/images/48ab1e7f7ac60427ac8b776d8d2a6f60_1687166825763.jpg","Mobileye CTO Prof. Shai Shalev-Shwartz spoke about the autonomous future at EcoMotion 2023. (Photos: Noga Shadmi van de Reep)","\u003Cp>Are autonomous vehicles &ldquo;dead or inevitable?&rdquo; That&rsquo;s one question which the \u003Ca href=\"https://www.ecomotionweek.com/\">EcoMotion\u003C/a> conference sought to tackle this year. And to answer, they brought in our Chief Technology Officer, Prof. Shai Shalev-Shwartz, who spoke on a panel together with former Florida state senator Jeff Brandes, moderated by \u003Ca href=\"https://www.mobileye.com/interview/autonocast-prof-amnon-shashua-taxonomy/\">\u003Cem>Autonocast\u003C/em>\u003C/a> co-host Alex Roy.\u003C/p>\n\u003Cp>&ldquo;By now there is evidence that self-driving is possible,&rdquo; said Prof. Shalev-Shwartz. &ldquo;We see self-driving cars in the streets of San Francisco and Austin. So I think that we are beyond this obstacle. 
The problems are different.&rdquo;\u003C/p>\n\u003Cp>Those problems are the same that Shalev-Shwartz and his colleagues \u003Ca href=\"https://arxiv.org/abs/1708.06374\">laid out in a research paper six years ago\u003C/a>, namely: safety and scalability.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/f844b86f4334aff4258d9ddaf01125d6_1687181105168.jpg\" alt=\"EcoMotion 2023 featured a panel discussion between Prof. Shai Shalev-Shwartz, Jeff Brandes, and Alex Roy.\" width=\"1650\" height=\"777\" />\u003C/p>\n\u003Cp>To tackle the former, Mobileye formulated the \u003Ca href=\"https://www.mobileye.com/blog/responsibility-sensitive-safety-unwritten-rules-of-the-road/\">Responsibility-Sensitive Safety&trade;\u003C/a> model (RSS&trade;). As Shalev-Shwartz put it: &ldquo;There is no way to guarantee absolute safety. So the question is, if we cannot guarantee absolute safety, what can we guarantee? And how can we have a language that enables us to talk about exactly what is &lsquo;safe&rsquo; and what is &lsquo;not safe&rsquo;?&rdquo; That&rsquo;s what RSS seeks to define, he noted. &ldquo;If everybody would act according to RSS, then there would be no accidents at all.&rdquo;\u003C/p>\n\u003Cp>The key to scalability and economic viability, according to Shalev-Shwartz, is to take a long view towards the autonomous future, maintain the profitability needed to stay the course, and \u003Ca href=\"https://www.mobileye.com/blog/hands-off-eyes-off-taxonomy-for-automated-driving/\">clearly and honestly communicate the technology&rsquo;s capabilities\u003C/a> along the way. &ldquo;We view AVs as a marathon and not a short sprint. We are trying to be very honest about what the technology is capable of, and what the technology is not capable of.&rdquo;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/75355ee65bca48eafa0fb8514c373ae0_1687181324356.jpg\" alt=\"Mobileye Chief Technology Officer Prof. 
Shai Shalev-Shwartz speaks about autonomous vehicles at EcoMotion Week 2023.\" width=\"1650\" height=\"777\" />\u003C/p>\n\u003Cp>&ldquo;We believe that the right way to win a marathon is to start with the foundations and then make incremental changes, incremental improvements, while having a solid business all the time, rather than trying to make promises or to go all the way to something that you cannot hold your breath for long enough,&rdquo; said Shalev-Shwartz. &ldquo;This was Mobileye&rsquo;s approach from day one. We have a solid ADAS business, we are profitable all the time. We're developing autonomous vehicles and we are profitable all the time. Then we make the next step to \u003Ca href=\"https://www.mobileye.com/blog/mobileye-supervision-bridge-to-consumer-autonomous-vehicles/\">SuperVision\u003C/a>,&rdquo; Mobileye&rsquo;s hands-off/eyes-on solution and a vital step along the path to \u003Ca href=\"https://www.mobileye.com/blog/when-will-self-driving-cars-be-available/\">the incremental deployment of autonomous-driving technologies\u003C/a>.\u003C/p>\n\u003Cp>&ldquo;This is the approach. No hype &ndash; just be honest about what you're giving and do useful products that people can enjoy today, have a solid business, and make progress.&rdquo;\u003C/p>\n\u003Cp>Ultimately, Prof. Shalev-Shwartz concluded, &ldquo;I think the future is going to be very, very interesting. 
And I think AVs are part of the future and are here to stay, and we will all enjoy them in the future.&rdquo;\u003C/p>\n\u003Cp>Watch the full session in the video below.\u003C/p>\n\u003Cp>\u003Ciframe title=\"YouTube video player\" src=\"https://www.youtube.com/embed/ZdWne6VnCyw\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>","2023-06-20T07:00:00.000Z","Autonomous Driving, Video",{"id":980,"type":5,"url":981,"title":982,"description":983,"primary_tag":9,"author_name":16,"is_hidden":11,"lang":12,"meta_description":983,"image":984,"img_alt":985,"content":986,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":32,"publish_date":987,"tags":563},214,"what-is-advanced-driver-assistance-system-adas","What is ADAS? (Advanced Driver-Assistance Systems)","ADAS features like adaptive cruise control and forward collision warning use technology to assist drivers and enhance road safety. Learn more from Mobileye. ","https://static.mobileye.com/website/us/corporate/images/91454a36d68f6445a115fbab02d00267_1686726590249.jpg","Mobileye supplies the technology to support driver-assistance systems in hundreds of car models sold around the world.","\u003Cp>Short for advanced driver-assistance systems, ADAS is a catch-all term for any type of technological feature that makes driving safer, easier, or more comfortable. It can encompass everything from basic functions like warning you of a potential collision to more advanced features like changing lanes automatically.\u003C/p>\n\u003Cp>Most new cars today incorporate some measure of ADAS features, and many of those are supported by \u003Ca href=\"https://www.mobileye.com/technology/\">Mobileye technology\u003C/a>. 
In this first installment of our new Mobileye 101 series, we&rsquo;ll drive you through the different types of driver-assist features and the technology that makes them possible.\u003C/p>\n\u003Ch3>\u003Cstrong>Passive vs Active ADAS\u003C/strong>\u003C/h3>\n\u003Cp>Designed to alert the driver of a potential hazard or impending collision, passive driver-assist features are the most fundamental form of ADAS system. These include such common features as lane-departure warning (which alerts the driver in case the vehicle is veering out of its lane) and blind-spot monitor (which watches for obstacles the driver might not be able to readily see).\u003C/p>\n\u003Cp>\u003Cstrong>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/13c2be1167df7049b013e63d1df1db8b_1686726804414.jpg\" alt=\"Mobileye technology uses cameras and advanced computer vision algorithms to detect and safeguard vulnerable road users.\" width=\"1650\" height=\"776\" />\u003C/strong>\u003C/p>\n\u003Cp>Active safety systems go further than passive ones by actively intervening to help avoid collisions. 
A feature like \u003Ca href=\"https://www.jdpower.com/cars/shopping-guides/what-is-automatic-emergency-braking\">Automatic Emergency Braking\u003C/a> (AEB), for example, has the ability not only to detect an impending collision and warn the driver accordingly, but also to slow down or even stop the vehicle if the driver doesn&rsquo;t react in time.\u003C/p>\n\u003Cp>Another such feature is \u003Ca href=\"https://www.caranddriver.com/research/a32813983/adaptive-cruise-control/\">Adaptive Cruise Control\u003C/a> (ACC), which not only maintains the vehicle speed set by the driver, but will slow the vehicle down if it comes up on slower-moving traffic, and accelerate back up to the preset speed once the way is clear again.\u003C/p>\n\u003Ch3>\u003Cstrong>Incremental Evolution from Assisted to Autonomous Driving\u003C/strong>\u003C/h3>\n\u003Cp>Automakers have even begun offering more comprehensive systems that combine the functions of several active safety systems to automate some of the more mundane driving tasks.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/f919798a57cc4026cd594f45020fc7f0_1686726931610.png\" alt=\"ADAS features like adaptive cruise control and lane-keep assist combine to form advanced highway assist systems.\" width=\"1650\" height=\"776\" />\u003C/p>\n\u003Cp>Combine Adaptive Cruise Control with Lane-Keep Assist, for example, and you get a highway-assist system that can control the vehicle&rsquo;s accelerator, brakes, and steering to keep the vehicle moving safely on the highway without veering out of its lane or rear-ending other vehicles in its path. A traffic-jam assist system essentially does the same, but at low speeds in stop-and-go traffic.\u003C/p>\n\u003Cp>The more of these systems a vehicle has, the closer it gets to autonomous capabilities. 
\u003Ca href=\"https://www.mobileye.com/solutions/super-vision/\">Mobileye SuperVision&trade;\u003C/a>, for example, combines advanced features such as highway and traffic-jam assist with autonomous lane changing, evasive maneuver assist, blind-spot monitoring, front and rear collision avoidance, and more, to deliver eyes-on/hands-off autonomous driving capabilities on a variety of road types.\u003C/p>\n\u003Ch3>\u003Cstrong>Industry-Leading Computer Vision Technology\u003C/strong>\u003C/h3>\n\u003Cp>To enable driver-assist features, vehicles need sensors and processors running the right software. Dozens of automakers around the world opt for cameras coupled with Mobileye&rsquo;s industry-leading \u003Ca href=\"https://www.mobileye.com/technology/eyeq-chip/\">EyeQ&trade; systems-on-chip\u003C/a>.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/ddada6efb79b1aa135434bec0856ca08_1686727074170.jpg\" alt=\"Mobileye&rsquo;s EyeQ systems-on-chip support the ADAS features in more than 140 million vehicles to date.\" width=\"1650\" height=\"776\" />\u003C/p>\n\u003Cp>Highly efficient and built from the ground up for the job, EyeQ encapsulates the computer-vision technology that vehicles need to enable their core ADAS features. It&rsquo;s coded with complex algorithms that interpret camera feeds to detect and identify other road users, obstacles, hazards, weather conditions, traffic signals, lane markers, the shape of the roadway, and more, in real time. Based on the signals it provides, the vehicle can alert the driver or take active measures to avoid collisions, all in order to operate as safely as possible in an often complex and fast-changing driving environment.\u003C/p>\n\u003Cp>Now in its sixth generation, EyeQ has gone into more than 140 million vehicles to date. 
That number keeps growing, while we continuously develop ever more advanced technologies to make driving easier and transportation safer &ndash; from crowdsourced maps in our \u003Ca href=\"https://www.mobileye.com/blog/cloud-enhanced-driver-assist/\">Cloud-Enhanced Driver-Assist&trade;\u003C/a> solution to 360-degree surround camera coverage in Mobileye SuperVision&trade; to active \u003Ca href=\"https://www.mobileye.com/blog/radar-lidar-next-generation-active-sensors/\">radar and lidar\u003C/a> sensors in our autonomous-driving solutions: Mobileye Chauffeur&trade; (for consumer vehicles) and Mobileye Drive&trade; (for commercial vehicles).\u003C/p>\n\u003Cp>At this point, we&rsquo;re evolving well beyond the scope of driver assistance and into the realm of autonomous driving. But no matter how far our solutions advance, they&rsquo;ll all owe their origins to the building blocks of our lifesaving core ADAS technology.\u003C/p>","2023-06-15T07:00:00.000Z",{"id":989,"type":5,"url":990,"title":991,"description":992,"primary_tag":190,"author_name":10,"is_hidden":11,"lang":12,"meta_description":992,"image":993,"img_alt":994,"content":995,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":32,"publish_date":996,"tags":997},213,"autonomous-vehicle-day-the-self-driving-stack","How Autonomous Vehicles Work: the Self-Driving Stack","To celebrate Autonomous Vehicle Day, we’re going back to basics and looking at the foundational technology that makes autonomous driving possible.","https://static.mobileye.com/website/us/corporate/images/f136d554be484a39fc8e368eccfc090e_1685551843254.png","Mobileye develops a full range of technologies to enable autonomous driving on all road types, from highways to city streets.","\u003Cp>In one of the most popular of our \u003Ca href=\"https://www.youtube.com/watch?v=pDyMzz8HMIc&amp;t=13s\">unedited self-driving videos\u003C/a>, an autonomous vehicle (AV) takes on the task of 
driving through one of the most challenging urban driving environments for any driver&mdash;human or otherwise: Jerusalem.\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/blog/robotaxi-night-drive-jerusalem-unedited-video/\" target=\"_blank\" rel=\"noopener\">Jerusalem\u003C/a> is a 5,000-year-old labyrinth of unplanned ancient pathways that have transformed into modern roadways. Traffic can be unpredictable and intense, presenting a completely different driving experience from the planned organization of a modern city.\u003C/p>\n\u003Cp>Autonomous Vehicle Day offers a perfect occasion to both appreciate and examine the enormity of the technological achievement involved in driving autonomously in such a difficult environment.\u003C/p>\n\u003Ch3>\u003Cstrong>The &ldquo;Brain&rdquo; of a Self-Driving Car\u003C/strong>\u003C/h3>\n\u003Cp>&ldquo;A self-driving car must work flawlessly and be able to navigate through obstacles and other road users. To do that, it needs a very smart brain,&rdquo; explains Mobileye&rsquo;s Chief Technology Officer, Prof. Shai Shalev-Shwartz.\u003C/p>\n\u003Cp>That &ldquo;smart brain&rdquo; in an AV, consisting of software running on \u003Ca href=\"https://www.mobileye.com/technology/eyeq-chip/\" target=\"_blank\" rel=\"noopener\">powerful microprocessors\u003C/a>, is called the &ldquo;self-driving stack&rdquo;. 
This is what allows an AV to successfully complete such delicate maneuvers as making natural unprotected left turns and veering slightly to avoid an open door (to name just a couple).\u003C/p>\n\u003Cp>Just as we can divide the human brain into various functional layers, we can also divide the brain of a self-driving car into the following functional layers:&nbsp;\u003Cbr />- Sensing\u003Cbr />- Perception\u003Cbr />- Localization\u003Cbr />- Planning\u003Cbr />- Control\u003C/p>\n\u003Ch3>\u003Cstrong>How a Self-Driving Car Senses\u003C/strong>\u003C/h3>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/5a9d96f8400e0c5f402a91106d4b1985_1685610377231.jpg\" alt=\"Active radar and lidar sensors complement cameras to complete the sensing layer in Mobileye&rsquo;s autonomous-driving solutions.\" width=\"1650\" height=\"776\" />\u003C/p>\n\u003Ch4>\u003Cstrong>It Starts with a Camera\u003C/strong>\u003C/h4>\n\u003Cp>Self-driving cars are equipped with a variety of sensors that serve as the &ldquo;eyes&rdquo; of the vehicle and comprise the sensing layer. Not surprisingly, the main sensors that AVs use are the sensors that are most similar to the eyes of a human driver&mdash;\u003Ca href=\"https://www.mobileye.com/blog/camera-first-approach-for-assisted-autonomous-driving/\">cameras\u003C/a>.\u003C/p>\n\u003Cp>Cameras of various resolutions, sizes, and angles are typically mounted on the windshield, bumpers, and side mirrors. Working together, the cameras are able to capture a 360&deg; surround view of the vehicle.\u003C/p>\n\u003Cp>Cameras are superior to other sensors in detecting colors and shapes, so they are good at detecting lane markers, road signs, other vehicles, and various other objects in the driving environment.\u003C/p>\n\u003Ch4>\u003Cstrong>Another Layer of Accuracy\u003C/strong>\u003C/h4>\n\u003Cp>Of course, cameras have their limitations&mdash;especially in poor lighting or weather conditions. 
That&rsquo;s why the cameras in AVs are supplemented with inputs from other sensors, such as \u003Ca href=\"https://www.mobileye.com/blog/radar-lidar-next-generation-active-sensors/\">radar and lidar\u003C/a>, to provide a more comprehensive view of the vehicle&rsquo;s surroundings.\u003C/p>\n\u003Cp>Compared to camera sensors, radar works reliably well in low-visibility conditions such as rain, snow, and fog. However, unlike cameras, radar is not good at modeling the precise shape of an object.\u003C/p>\n\u003Cp>Lidar, an acronym for Light Detection and Ranging, is a technology that is similar to radar, but uses laser light pulses instead of radio waves to measure distance and create a 3D map of its environment.\u003C/p>\n\u003Ch3>\u003Cstrong>Putting it All Together\u003C/strong>\u003C/h3>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/97d05301af3b438b972736b0c2bac5c7_1685552940517.jpg\" alt=\"Through our True Redundancy&trade; approach to sensing, Mobileye utilizes a variety of sensors to enable autonomous driving.\" width=\"1650\" height=\"776\" />\u003C/p>\n\u003Cp>The process of combining inputs from all sensors (cameras, radar, and lidar) to create an image of the world is called &ldquo;sensor fusion&rdquo;.\u003C/p>\n\u003Cp>There are two main types of sensor fusion, called early and late sensor fusion, which indicate at what point data received from the various sensors is combined. In early sensor fusion, raw data is fused and then object detection algorithms are applied. In late sensor fusion, object detection algorithms are applied to the data before fusing the resulting 3D maps.\u003C/p>\n\u003Cp>While early sensor fusion is the industry standard, Mobileye&rsquo;s self-driving system uses late sensor fusion. 
We also employ \u003Ca href=\"https://www.mobileye.com/technology/true-redundancy/\">sensor redundancy\u003C/a> to create independent models of the road environment that can each provide the perceptual information needed to drive the AV if one system fails.\u003C/p>\n\u003Ch3>\u003Cstrong>Making Sense of it All \u003C/strong>\u003C/h3>\n\u003Cp>Once sensor data has been received, the car&rsquo;s onboard computer needs to make meaning of it. Is that a billboard or a truck? Where is the edge of the road?\u003C/p>\n\u003Cp>A whole subfield of \u003Ca href=\"https://www.mobileye.com/blog/ceo-amnon-shashua-on-the-technological-megashifts-impacting-our-world/\" target=\"_blank\" rel=\"noopener\">artificial intelligence\u003C/a>, called \u003Ca href=\"https://www.mobileye.com/blog/computer-vision-eccv-2022/\" target=\"_blank\" rel=\"noopener\">computer vision\u003C/a>, is dedicated to this task. This has been one of the main areas of focus at Mobileye for over two decades.\u003C/p>\n\u003Cp>Computer vision algorithms&mdash;which include \u003Ca href=\"https://www.youtube.com/watch?v=GjmZyMWo7YY\">object recognition\u003C/a>, pattern recognition, clustering algorithms, and more&mdash;are applied to the data, allowing the car to detect lane markings, street signs, traffic lights, and other objects in the environment.\u003C/p>\n\u003Cp>Custom algorithms are even developed to detect such situations as open car doors and the hand signals of traffic police.\u003C/p>\n\u003Ch3>\u003Cstrong>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/fe4450c54de59cab0969905a1f71a6b2_1685552815408.jpg\" alt=\"Mobileye employs proprietary computer-vision algorithms to identify objects and other road users in its driving environment.\" width=\"1650\" height=\"776\" />\u003C/strong>\u003C/h3>\n\u003Ch3>\u003Cstrong>Precisely Locating Itself on a Map\u003C/strong>\u003C/h3>\n\u003Cp>Localization, another essential layer of the self-driving stack, is the process of 
determining a vehicle's precise location relative to its surroundings, which is essential to enabling the car to make informed driving decisions and plan a trip.\u003C/p>\n\u003Cp>GPS is commonly used for initial localization, but it has limitations in terms of accuracy, especially in urban environments with tall buildings that can interfere with satellite signals. When safety is critical, localization has to be accurate within centimeters, not meters; inertial sensors, cameras, lidar, and radar provide more precise location information.\u003C/p>\n\u003Cp>The use of high-definition maps is another important step in the process of localization because they contain exact information about the location of static features of the road such as curbs and pedestrian crossings. This allows the vehicle to more accurately determine its position relative to the road and &ldquo;fill in the gaps&rdquo; where sensor data may be incomplete.\u003C/p>\n\u003Cp>Once it obtains an accurate map of the road environment, the AV uses various algorithms to determine its exact location and orientation. The algorithms take into account factors such as the car's velocity, steering angle, and other sensor readings to calculate the vehicle's precise location.\u003C/p>\n\u003Cp>Mobileye&rsquo;s approach to mapping, \u003Ca href=\"https://www.mobileye.com/technology/rem/\">Road Experience Management&trade;\u003C/a>&nbsp;(or REM&trade;), rather than using a static HD map, uses anonymous data crowdsourced from around the world, resulting in dynamic, continuously updated maps.\u003C/p>\n\u003Cp>The maps generated by Mobileye&rsquo;s REM technology provide a richness of map semantics, which means that our mapping technology goes an important step further than most mapping technologies &ndash; it also captures how drivers use roads and the environment around them. 
For example, using REM, a vehicle is not only able to perceive a traffic light, but &ldquo;understands&rdquo; which lane (or lanes) is associated with it.\u003C/p>\n\u003Ch3>\u003Cstrong>The Planning Layer of Autonomous Driving\u003C/strong>\u003C/h3>\n\u003Cp>If you are a self-driving car and you know what the world looks like and where you are exactly, you&rsquo;ve come a long way. Now, you just need to make decisions based on this world model and create a plan to get to your destination. That&rsquo;s where the next layer of the self-driving stack comes in.\u003C/p>\n\u003Cp>When planning a trip, the AV needs to plan a route to get from point A to point B. However, in AVs, the planning layer consists of many types of planning&mdash;including motion planning, decision-making, collision avoidance, and behavioral planning.\u003C/p>\n\u003Cp>For instance, collision avoidance algorithms ensure the vehicle avoids moving and static objects on the roadway to keep everyone safe. Other algorithms that are part of the planning layer ensure that occupants perceive the trip as safe and comfortable (i.e., no sudden accelerations or braking) and that the car drives according to traffic laws.\u003C/p>\n\u003Cp>Motion planning algorithms make decisions about such maneuvers as lane changes and merges. 
The AV also needs to take into account what objects in the roadway might do, which is referred to as behavioral planning.\u003C/p>\n\u003Cp>Mobileye has added an extra layer of safety to the planning process with \u003Ca href=\"https://www.mobileye.com/blog/responsibility-sensitive-safety-unwritten-rules-of-the-road/\">Responsibility-Sensitive Safety&trade;\u003C/a>&nbsp;(or RSS&trade;).&nbsp; Our open mathematical model for AV safety and the basis for our driving policy, RSS is based on transparent and verifiable rules for balancing cautious and assertive driving.\u003C/p>\n\u003Ch3>\u003Cstrong>Controlling the Car\u003C/strong>\u003C/h3>\n\u003Cp>Once the car&rsquo;s computer has determined how to safely move based on sensor information, perception, and planning, it now needs to control the vehicle in the same way that a human driver would. Just as a human brain controls the movements of our limbs to allow us to navigate our world, an AV&rsquo;s computer also controls the brakes, steering wheel, accelerator, and other components.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>In celebration of AV Day, we've delved into some of the inner workings of this life-changing technology. From sensors and algorithms to crowd-sourced mapping and behavioral planning, \u003Ca href=\"https://www.mobileye.com/blog/history-autonomous-vehicles-renaissance-to-reality/\">self-driving cars have come a long way\u003C/a>.\u003C/p>\n\u003Cp>Imagine a future where road accidents are a thing of the past and time spent driving is transformed into productivity, leisure, and family time. 
\u003C/p>\n\u003Cp>At Mobileye, this vision is fast becoming reality, and we&rsquo;re excited about what that means for the future of mobility.\u003C/p>","2023-05-31T07:00:00.000Z","Autonomous Driving",
So, we created our own language which says that there is eyes-on/eyes-off, there is hands-on/hands-off, there is a driver or no driver. That&rsquo;s it.&rdquo;\u003C/p>\n\u003Cp>\u003Ciframe title=\"YouTube video player\" src=\"https://www.youtube.com/embed/UCBlR4QFQCA?clip=UgkxB-CV4Be0J8rQHU7cOTxW7dY_KUTn5y5e&amp;clipt=EJCfKRjbwSs\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>Here&rsquo;s the logic behind this new taxonomy, and how it applies to&nbsp;\u003Ca href=\"https://www.mobileye.com/solutions/\">various types of driving systems\u003C/a>.\u003C/p>\n\u003Ch3>\u003Cstrong>Simplifying the Relationship Between Driver and Vehicle\u003C/strong>\u003C/h3>\n\u003Cp>Until now, the capabilities of assisted and autonomous driving technologies have been categorized into six levels of driving automation. That taxonomy was first defined in 2014 under \u003Ca href=\"https://www.sae.org/blog/sae-j3016-update\">SAE J3016\u003C/a>, a standard issued by SAE International (formerly known as the Society of Automotive Engineers). At one end of the spectrum sits Level 0, with no significant form of driver assistance whatsoever. At the other end, Level 5 autonomous driving describes vehicles capable of operating autonomously anywhere and everywhere. Everything else falls into one of the levels in between.\u003C/p>\n\u003Cp>The SAE levels of automation have been adopted widely across the industry, and arguably represent the most useful taxonomy we&rsquo;ve had until now. But do these autonomous driving levels clearly and effectively convey a vehicle&rsquo;s capabilities? 
Will the layperson understand where their responsibilities as a driver end and where the vehicle&rsquo;s begin (without either a chart or an in-depth understanding of the technology)?\u003C/p>\n\u003Cp>As the technologies have developed and evolved, the autonomous driving levels are no longer the most effective way of characterizing a vehicle&rsquo;s automation &ndash; particularly with \u003Ca href=\"https://www.mobileye.com/blog/understanding-l2-in-five-questions/\">the emergence of L2+\u003C/a>, the lack of clarity regarding the man/machine interaction under L3, and the practical differences between L4 and L5 having been diminished by \u003Ca href=\"https://www.mobileye.com/blog/rem-mapping-avs/\">widespread mapping\u003C/a>.\u003C/p>\n\u003Cp>So instead of levels of automation defined by engineers for engineers, we set out to characterize the relationship between human and machine based on the simple questions that matter most to drivers (among others), namely:\u003Cbr />1. Does the driver need to keep hands on the wheel? \u003Cbr />2. Does the driver need to keep a watchful eye on the road? \u003Cbr />3. Does the vehicle even need a driver at all?\u003C/p>\n\u003Cp>The answers to these questions clearly define which responsibilities rest with the driver and which with the vehicle, and in what types of driving environments.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/4b7e967e74b128bff373a8c1847d7c2f_1675343014441.png\" alt=\"One of our most advanced solutions, Mobileye SuperVision is a hands-off/eyes-on driver-assistance system.\" width=\"1650\" height=\"776\" />\u003C/p>\n\u003Ch3>\u003Cstrong>Terminology in Action\u003C/strong>\u003C/h3>\n\u003Cp>The baseline assumption governing automobile operation over most of its history has been that a human driver is solely responsible for controlling the vehicle and watching the road at all times &ndash; hands-on, eyes-on. 
But that&rsquo;s begun to change with the advancement of driver-assistance systems and the development of autonomous vehicles.\u003C/p>\n\u003Cp>With a solution like \u003Ca href=\"https://www.mobileye.com/solutions/super-vision/\">Mobileye SuperVision&trade;\u003C/a>, for example, the driver can take hands off the wheel and let the vehicle operate itself on all regular road types. But responsibility and overall control still rest with the driver, who must supervise the vehicle&rsquo;s operation at all times. Mobileye SuperVision, then, is a hands-off/eyes-on system.\u003C/p>\n\u003Cp>For Mobileye Chauffeur&trade;, we add \u003Ca href=\"https://www.mobileye.com/blog/radar-lidar-next-generation-active-sensors/\">active sensors\u003C/a> (such as radar and lidar) to the computer vision, \u003Ca href=\"https://www.mobileye.com/technology/rem/\">specialized crowdsourced maps\u003C/a>, and \u003Ca href=\"https://www.mobileye.com/technology/responsibility-sensitive-safety/\">lean driving policy\u003C/a> that go into Mobileye SuperVision. These \u003Ca href=\"https://www.mobileye.com/technology/true-redundancy/\">redundant\u003C/a> active sensors will allow drivers not only to take their hands off the wheel, but their eyes off the road as well &ndash; within specific driving environments, or what engineers call Operational Design Domains. (Just as a vacuum cleaner may be designed to work indoors and a lawnmower outdoors, a system might be restricted to driving autonomously only on certain road types as its Operational Design Domain expands.)\u003C/p>\n\u003Cp>Mobileye Drive&trade;, meanwhile, further builds upon the capabilities of Mobileye Chauffeur with the addition of a tele-operation system. 
That added capability is incorporated to handle the few remaining rare cases where human involvement may be required, allowing for the removal of the driver from the equation entirely.\u003C/p>\n\u003Cp>\u003Ca href=\"https://static.mobileye.com/website/us/corporate/images/f0360b340e9082d4e3e79c4e9760f080_1675342547123.png\">\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/f0360b340e9082d4e3e79c4e9760f080_1675342547123.png\" alt=\"Classifying the capabilities of Mobileye's advanced solutions according to our new taxonomy for automated driving technologies.\" width=\"1650\" height=\"776\" />\u003C/a>\u003C/p>\n\u003Cp>Sounds simple enough, right? We certainly hope so. Because while the technologies that go into these systems are highly complex, we believe their capabilities need to be expressed as simply and clearly as possible &ndash; not just for the benefit of those developing the technologies, but for the general public who will be using them as well.\u003C/p>","2023-05-22T07:00:00.000Z",{"id":1008,"type":24,"url":1009,"title":1010,"description":1011,"primary_tag":190,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1011,"image":1012,"img_alt":1013,"content":1014,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1015,"tags":1016},212,"mobileye-and-man-truck-bus-launch-collaboration-for-av-city-buses"," Mobileye and MAN Truck & Bus launch collaboration for AV city buses  ","Together, we're launching a multi-level cooperation to expand Mobileye Drive™ into city buses  ","https://static.mobileye.com/website/us/corporate/images/e82f7961494b551f4dbdb22ab063a582_1687163362038.png","MAN Truck & Bus and Mobileye collaborate on autonomy in public transit","\u003Cp>\u003Cspan data-contrast=\"auto\">Mobileye and MAN Truck &amp; Bus today announced a collaboration to explore autonomy in public transit, integrating state-of-the-art autonomous vehicle 
technology into the award-winning city buses of MAN. This builds on Mobileye&rsquo;s strategy of working closely with vehicle manufacturers to integrate autonomous technology seamlessly across platforms.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">This new collaboration marks a milestone for MAN on the path to urban bus automation. AVs will be able to significantly improve the total cost of ownership and lower the impact of expected driver shortages.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">&ldquo;We are excited to work with MAN Truck &amp; Bus to demonstrate the benefits of self-driving buses. Our Mobileye Drive&trade; system offers a flexible and scalable AV solution that can be used in various purpose-built vehicle platforms,&rdquo; said Johann Jungwirth, Senior Vice President, Autonomous Vehicles at Mobileye.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Mobileye Drive is a cutting-edge, comprehensive self-driving solution designed for commercial use by vehicle manufacturers. It includes Mobileye's advanced EyeQ&trade; systems-on-chip, as well as sensing, mapping, and driving policy technologies to build a full-stack autonomous driving system for designated design domains. 
Mobileye Drive's sensor configuration consists of an array of cameras, radars, and lidars that can be adapted to a variety of vehicle platforms &ndash; with MAN as its first application in bus transit.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"auto\">Mobileye will apply the knowledge gained through initial test operations with Mobileye Drive in Germany to the MAN city bus. The program&rsquo;s first milestone will be to integrate Drive into a MAN Lion&rsquo;s City E bus with a safety driver as part of a larger research project in Munich, Germany, targeted for 2025. Following that trial, further test projects would be launched, building toward a goal of series production by the end of the decade.\u003C/span>\u003Cspan data-ccp-props=\"{&quot;201341983&quot;:0,&quot;335559739&quot;:160,&quot;335559740&quot;:259}\">&nbsp;\u003C/span>\u003C/p>","2023-05-17T07:00:00.000Z","Autonomous Driving, News, Driverless MaaS",{"id":1018,"type":5,"url":1019,"title":1020,"description":1021,"primary_tag":954,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1021,"image":1022,"img_alt":1023,"content":1024,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1025,"tags":1026},211,"we-are-mobileye-by-kutiman","The Sounds, Sights, Voices, and Values of Mobileye","Take a look inside Mobileye and hear from the people who are driving the evolution of mobility in this video by Kutiman.","https://static.mobileye.com/website/us/corporate/images/a1346d6d41daed3ae688626548db341a_1683801543169.jpg","Mobileye employees have a lot to say about the principles, ideas, and spirit that drive us forward in the evolution of automotive transportation.","\u003Cp>Mobileye is a tech company, a chipmaker, and an \u003Ca 
href=\"https://www.mobileye.com/news/porsche-mobileye-supervision-collaboration/\">automotive supplier\u003C/a>, dedicated to making the world&rsquo;s roads safer. But arguably more than anything else, we&rsquo;re a collection of individuals: scientists, engineers, programmers, and developers &ndash; unabashed geeks who are harnessing the power of \u003Ca href=\"https://www.mobileye.com/blog/computer-vision-eccv-2022/\">computer vision\u003C/a> and \u003Ca href=\"https://www.mobileye.com/blog/ceo-amnon-shashua-on-the-technological-megashifts-impacting-our-world/\">AI\u003C/a> to tackle some of the hardest problems facing the automotive and mobility industries.\u003C/p>\n\u003Cp>While we typically focus here on the \u003Ca href=\"https://www.mobileye.com/solutions/\">products\u003C/a> of their combined efforts, today we&rsquo;re giving you a glimpse inside Mobileye to meet some of the people behind the \u003Ca href=\"https://www.mobileye.com/technology/\">technology\u003C/a> and the principles that motivate us.\u003C/p>\n\u003Cp>This video was created by world-renowned audio-visual artist \u003Ca href=\"https://youtube.com/@kutiman\">Kutiman\u003C/a> to celebrate \u003Ca href=\"https://www.intel.com/content/www/us/en/newsroom/news/mobileye-news-oct-2022.html#gs.wxojvy\">our recent IPO\u003C/a>. It was compiled from sounds sampled and footage captured in our \u003Ca href=\"https://www.mobileye.com/blog/mobileye-campus-jerusalem-leed-platinum-environmental-rating/\">offices\u003C/a>, workshops, labs, and development vehicles. Best of all, it features a cross section of our employees speaking about \u003Ca href=\"https://www.mobileye.com/blog/what-drives-us/\">the values that drive us forward\u003C/a>. 
Turn up your speakers, click &ldquo;play,&rdquo; and meet just a few of the \u003Ca href=\"https://www.mobileye.com/blog/international-day-of-persons-with-disabilities-shekel-perfects-data-team/\">people\u003C/a> who are proud to declare: We Are Mobileye.\u003C/p>\n\u003Cp>\u003Ciframe title=\"YouTube video player\" src=\"https://www.youtube.com/embed/w8n_1Blsduk\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>\u003Cspan style=\"font-size: 10pt;\">Directed, edited, and performed by Kutiman\u003C/span>\u003Cbr />\u003Cspan style=\"font-size: 10pt;\">Trumpet: Tal Avraham\u003C/span>\u003Cbr />\u003Cspan style=\"font-size: 10pt;\">Photography: Yair Elder\u003C/span>\u003Cbr />\u003Cspan style=\"font-size: 10pt;\">Mixing: Ronen \"Nenor\" Sabo\u003C/span>\u003Cbr />\u003Cspan style=\"font-size: 10pt;\">Writing &amp; Creative: Tami Tisch\u003C/span>\u003C/p>","2023-05-11T07:00:00.000Z","Video",{"id":1028,"type":24,"url":1029,"title":1030,"description":1031,"primary_tag":28,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1031,"image":1032,"img_alt":1033,"content":1034,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1035,"tags":1036},210,"porsche-mobileye-supervision-collaboration","Porsche and Mobileye launch SuperVision collaboration","“We're excited to collaborate with Porsche on bringing the next generation of driving technology to customers worldwide,” said Mobileye CEO Prof. 
Amnon Shashua.","https://static.mobileye.com/website/us/corporate/images/68105afe6f0e6404ca5b3b599d3c2c08_1683625841363.png","Collaboration is expected to make Mobileye SuperVision™ available as a platform solution within the overall Volkswagen Group.","\u003Cp>Mobileye today announced its strategic collaboration with Porsche, one of the world&rsquo;s greatest sports car builders, to provide Mobileye&rsquo;s SuperVision&trade; premium advanced driver assistance systems in future Porsche production models.&nbsp;\u003C/p>\n\u003Cp>This new effort builds on our strategy of advancing autonomy through evolution, starting from today&rsquo;s eyes-on, hands-on driver assist systems through SuperVision-based systems that enable hands-off operation for identified use cases, leading to eventual eyes-off, hands-off autonomy.&nbsp;\u003C/p>\n\u003Cp>&ldquo;We are excited to collaborate with Porsche on bringing the next generation of driving technology to customers worldwide,&rdquo; said Prof. Amnon Shashua, President and CEO of Mobileye. 
&ldquo;We share Porsche&rsquo;s goal of improving the driving experience through world-class technological innovation.&nbsp;\u003C/p>\n\u003Cp>&ldquo;Mobileye SuperVision&trade;&nbsp;system was designed to enhance safety through the synergetic interaction of driver and vehicle, as well as enhance the driving experience itself, by giving drivers greater freedom to choose how they want to engage with the road, and when they want to let the vehicle handle basic driving tasks.&rdquo;&nbsp;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/345801264461614ae1c8b46525d5a133_1683625971035.png\" alt=\"Future Porsche models will employ Mobileye SuperVision&trade; with surround-camera array, radar sensors, and EyeQ&trade;6 High systems-on-a-chip.\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Cp>Future Porsche models will benefit from the system&rsquo;s 11 cameras and radar-based sensing system to deliver supervised hands-free operation for identified use cases. Our cloud-connected Road Experience Management&trade;&nbsp;maps will provide up-to-date and highly localized intelligence about not just road features like lanes and signals, but how other drivers interact with their surroundings.\u003C/p>\n\u003Cp>SuperVision&rsquo;s combination of advantages &mdash; a tightly integrated package of software and hardware based on our EyeQ&trade;6 High systems-on-a-chip on a new SV62 domain controller &mdash; make it a uniquely flexible solution. Porsche will use Mobileye SuperVision&trade; as a key ingredient in its premium driver-assist platform, including key driver monitoring systems, to deliver a Porsche-worthy experience behind the wheel. 
And the collaboration is expected to make Mobileye SuperVision&trade;&nbsp;available as a platform solution within the overall Volkswagen Group.\u003C/p>\n\u003Cp>Mobileye SuperVision&trade; technology has been developed to work at scale across multiple geographies and operational domains. The Mobileye SuperVision&trade; approach builds upon Mobileye&rsquo;s two decades of proven experience as a leader in camera-based driver safety and assistance technology, powered by integrated EyeQ SoCs. More than 135 million vehicles globally have been built with Mobileye assist technology to date, powering tangible increases in traffic safety for all road users.\u003C/p>","2023-05-09T07:00:00.000Z","ADAS, News",{"id":1038,"type":5,"url":1039,"title":1040,"description":1041,"primary_tag":190,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1041,"image":1042,"img_alt":1043,"content":1044,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1045,"tags":1046},205,"autonomous-vehicle-frequently-asked-questions-faq-answers","Autonomous Vehicle Questions Get the Answers They Deserve ","Today we’re answering some of your most pressing questions on our self-driving technology, as we advance towards the future of autonomous vehicles.","https://static.mobileye.com/website/us/corporate/images/fd47f7ecb883325f8839a342e42e6215_1679308858729.jpg","Though highly complex, Mobileye develops the advanced technologies for autonomous vehicles to be as transparent as possible.","\u003Cp>Mobileye periodically releases \u003Ca href=\"https://youtube.com/playlist?list=PLWCfS_Yhbvs5MtQIjNfN-xLU30mc1qjEc\">footage of our autonomous vehicles (AVs) out testing around the world\u003C/a>. 
We&rsquo;ve published numerous videos of our camera-only developmental AVs driving in locations across Asia, Europe, and North America, and most recently released another showing our \u003Ca href=\"https://www.mobileye.com/news/autonomous-vehicle-detroit-united-states/?trk=organization_guest_main-feed-card_feed-article-content\">Mobileye Drive&trade; test vehicle\u003C/a> (with cameras, radars, and lidars) navigating the complex streets of \u003Ca href=\"https://www.mobileye.com/blog/robotaxi-night-drive-jerusalem-unedited-video/\">Jerusalem at night\u003C/a>.\u003C/p>\n\u003Cp>The videos show vehicles equipped with our self-driving technologies maneuvering in real-world conditions, alongside actual traffic, in densely packed city centers, tackling the same challenges a human driver would face. The unvarnished glimpse they provide into our technologies at work has made them some of our most popular videos. They&rsquo;ve also sparked some excellent questions across various online platforms, and today we&rsquo;re answering some of them.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/b768f774a0b45beb1b29d3d6a018426d_1679309792882.jpg\" alt=\"\" width=\"1650\" height=\"930\" />\u003C/p>\n\u003Ch3>\u003Cstrong>Question: In what locations and under what conditions is Mobileye testing its autonomous-vehicle technologies?\u003C/strong>\u003C/h3>\n\u003Cp>Answer: Our testing of autonomous vehicles began (and continues) in our hometown of Jerusalem. 
The city presents a particularly challenging set of driving conditions, including narrow streets, heavy stop-and-go traffic, a large volume of pedestrians (including lots of baby strollers), frequent jaywalking, ongoing roadworks, and often-aggressive, hurried drivers.\u003C/p>\n\u003Cp>These conditions make Jerusalem an excellent testing environment, but we&rsquo;re not limiting ourselves to that one location; rather, over the past few years, we&rsquo;ve expanded testing to other environments around the world, including Tokyo, Shanghai, \u003Ca href=\"https://www.mobileye.com/blog/paris-ratp-autonomous-vehicle-testing-pilot/\">Paris\u003C/a>, \u003Ca href=\"https://www.mobileye.com/blog/munich-av-video/\">Munich\u003C/a>, \u003Ca href=\"https://www.mobileye.com/news/autonomous-vehicle-testing-miami-stuttgart/\">Stuttgart\u003C/a>, \u003Ca href=\"https://www.mobileye.com/news/autonomous-vehicle-detroit-united-states/\">Detroit\u003C/a>, \u003Ca href=\"https://www.mobileye.com/press-kit/press-kit-mobileye-new-york-city/\">New York\u003C/a>, and \u003Ca href=\"https://www.mobileye.com/news/autonomous-vehicle-testing-miami-stuttgart/\">Miami\u003C/a>. We&rsquo;re testing on a variety of road types, both by day and at night, and in a multitude of driving conditions.\u003C/p>\n\u003Cp>\u003Cem>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/254e2f53ba72e83e2d6dd10d414e4d96_1681912999070.jpg\" alt=\"A Mobileye autonomous development vehicle opposite the Eiffel Tower, pictured while undergoing testing in Paris.\" width=\"1650\" height=\"928\" />\u003C/em>\u003C/p>\n\u003Cp>Key to our geographic scalability is our \u003Ca href=\"https://www.mobileye.com/technology/rem/\">Road Experience Management&trade; (REM&trade;)\u003C/a> technology. 
REM crowdsources data from millions of vehicles around the globe equipped with Mobileye technology, and creates a map of all the roads they&rsquo;re traveling to inform the AV about what to &ldquo;expect&rdquo; on those roads.\u003C/p>\n\u003Cp>By undertaking such rigorous testing, we aim to better prepare our self-driving solutions to handle whatever conditions they may face out there on the road &ndash; regardless of the location, driving environment, weather, or other conditions in which they&rsquo;re operating.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/768d83e49473ade5f08e00cfdda052f2_1679309854709.jpg\" alt=\"\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Ch3>\u003Cstrong>Q: \u003C/strong>\u003Cstrong>How does the autonomous vehicle handle narrow streets where visibility may be limited?\u003C/strong>\u003C/h3>\n\u003Cp>A: Just as a human driver would instinctively drive more cautiously on a narrow urban street where pedestrians might suddenly jump onto the road, so must an AV make similar assumptions and be cautious in such areas.\u003C/p>\n\u003Cp>As you can see in the video below, the AV is following the fourth rule of our \u003Ca href=\"https://www.mobileye.com/technology/responsibility-sensitive-safety/\">Responsibility-Sensitive Safety&trade;\u003C/a> model (RSS&trade;) &ndash; the framework on which its driving policy is based. 
RSS rule #4 instructs the vehicle to always be cautious in areas of limited visibility.\u003C/p>\n\u003Cp>\u003Ciframe title=\"YouTube video player\" src=\"https://www.youtube.com/embed/pDyMzz8HMIc?clip=Ugkx_DSY31Bw2Y-spgfTV2rDS4xRN8XOalmc&amp;clipt=EJD2GBj9xRs\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Ch3>\u003Cstrong>Q: H\u003C/strong>\u003Cstrong>ow does the AV handle inclement weather conditions?\u003C/strong>\u003C/h3>\n\u003Cp>A: Visibility on the road can be compromised by a long list of weather events, including heavy rain, snow, fog, strong crosswinds, and more. We prepare our autonomous vehicles for such conditions by equipping them with a full array of sensor types so that they&rsquo;ll be able to detect and handle whatever weather they encounter. The sensor inputs allow an AV to determine, for example, whether it should reduce speed and proceed with greater caution due to weather conditions.\u003C/p>\n\u003Cp>Importantly, our AV featured in the \u003Ca href=\"https://www.mobileye.com/blog/robotaxi-night-drive-jerusalem-unedited-video/\">night-drive video in Jerusalem\u003C/a> incorporates a radar/lidar subsystem (in addition to a camera-based subsystem). We call this approach to sensing \u003Ca href=\"https://www.mobileye.com/technology/true-redundancy/\">True Redundancy&trade;\u003C/a>, which specifically provides for incorporation of both subsystems operating independently of one another. 
So, if bad weather limits the cameras&rsquo; visibility, the vehicle will still be able to operate safely and effectively on radar and lidar alone (since these active sensors are not impeded in the same way by bad weather as cameras are).\u003C/p>\n\u003Cp>A compressed-air system also keeps the lenses on our AV&rsquo;s cameras clean from dirt, grime, rain, snow, and ice it might pick up from the driving environment.\u003C/p>\n\u003Cp>\u003Cem>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/05e667c8b18ab869fd7d7f0a8e986c97_1681913091101.jpg\" alt=\"A Zeekr 001 with Mobileye SuperVision testing our camera-only self-driving system in the snow in Detroit.\" width=\"1650\" height=\"928\" />\u003C/em>\u003C/p>\n\u003Cp>In addition, the REM maps mentioned earlier provide an additional rich layer of information on the driving environment, supplementing what the vehicle&rsquo;s onboard sensors pick up, which is especially useful if any of the sensors are impaired by reduced visibility conditions.\u003C/p>\n\u003Cp>In the unlikely event that the AV determines for any reason that it cannot proceed safely, it&rsquo;s programmed to either pull over to the side of the road and stop (if equipped with an eyes-off/hands-off or driverless system), or slow down and remain within its lane (in a camera-only, eyes-on/hands-off system).\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>These are some of the excellent questions we&rsquo;ve received, and we hope the answers we&rsquo;ve provided here have helped you better understand how our self-driving technologies work. 
For a closer look at each of the unedited autonomous drive videos captured in locations around the world, watch the videos in the playlist below.\u003C/p>\n\u003Cp>\u003Ciframe title=\"YouTube video player\" src=\"https://www.youtube.com/embed/videoseries?list=PLWCfS_Yhbvs5MtQIjNfN-xLU30mc1qjEc\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>","2023-04-24T07:00:00.000Z","Video, Autonomous Driving",{"id":1048,"type":5,"url":1049,"title":1050,"description":1051,"primary_tag":9,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1051,"image":1052,"img_alt":1053,"content":1054,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1055,"tags":1056},208,"camera-first-approach-for-assisted-autonomous-driving","Our Camera-First Approach to Enhancing Mobility","For capability and cost, cameras can’t be beat. That’s why Mobileye embraces a camera-first approach towards sensing for both assisted and autonomous driving.","https://static.mobileye.com/website/us/corporate/images/ca2e5c9e96edcd608dd05b9deb731679_1680467545684.jpg","Mobileye pioneered the use of a single camera to enable advanced driver-assistance systems.","\u003Cp>There are many things we take for granted, but when it comes to cars at least, things weren&rsquo;t that way until someone made them so.\u003C/p>\n\u003Cp>Cars today are a fact of life, for example, but they didn&rsquo;t exist before Gottlieb Daimler and Carl Benz effectively invented them. By now we see cars everywhere, but they weren&rsquo;t mass-produced until Henry Ford came up with the assembly line. These days almost all cars come with some degree of advanced driver-assistance system, and most such systems are based on cameras... but that too wasn&rsquo;t the case until Mobileye pioneered the concept. 
And that camera-first approach remains at the heart of everything we do.\u003C/p>\n\u003Ch3>\u003Cstrong>Historical Vision\u003C/strong>\u003C/h3>\n\u003Cp>When Mobileye began nearly a quarter-century ago, driver-assistance technology was still in its infancy (and autonomous driving, by extension, was little more than a pipe dream). Some automakers were starting to introduce basic ADAS features, but the industry had yet to settle on what type of sensors to use for them.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/3a4d7c7e97f877e6d44c955b38e2dbec_1680467160654.jpg\" alt=\"Mobileye CEO &amp; founder Professor Amnon Shashua pioneered computer vision technology for advanced driver-assistance systems.\" width=\"1650\" height=\"789\" />\u003C/p>\n\u003Cp>One major manufacturer was working on a camera system for measuring the distance to the vehicle ahead, and invited a computer science professor named \u003Ca href=\"https://www.mobileye.com/amnon-shashua/\">Amnon Shashua\u003C/a> to provide his input. Shashua showed that he could do with just one camera what they were trying to do with two. That successful demonstration led to his founding Mobileye and paved the way for advanced driver-assistance systems that could (and indeed would) be deployed at a scale great enough to revolutionize automotive safety.\u003C/p>\n\u003Cp>That automaker had called upon Shashua because of the expertise for which he was already becoming known in the emerging field of \u003Ca href=\"https://www.intel.com/content/www/us/en/internet-of-things/computer-vision/overview.html\">computer vision\u003C/a>. 
This form of artificial intelligence employs highly sophisticated algorithms to interpret the inputs from simple cameras in order to understand their surroundings.\u003C/p>\n\u003Cp>To this day, that same core competence \u003Ca href=\"https://www.mobileye.com/blog/computer-vision-eccv-2022/\">still drives us here at Mobileye\u003C/a>, where teams of engineers and developers are constantly \u003Ca href=\"https://www.mobileye.com/blog/enhanced-computer-vision-driver-assistance/\">honing our computer vision algorithms\u003C/a> to better identify a wider range of parameters in the driving environment, and developing \u003Ca href=\"https://www.mobileye.com/technology/eyeq-chip/\">the chips to put that technology into cars and out on the road\u003C/a>.\u003C/p>\n\u003Cp>So why do we still embrace a camera-first approach above all other types of sensors? For two main reasons, as our CTO Prof. Shai Shalev-Shwartz outlines in the video below.\u003C/p>\n\u003Cp>\u003Ciframe title=\"YouTube video player\" src=\"https://www.youtube.com/embed/9yfuOlEyJyk\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Ch3>\u003Cstrong>Driving by Vision\u003C/strong>\u003C/h3>\n\u003Cp>As drivers, we make use of nearly all our senses when behind the wheel. We feel how the car interacts with the road, hear other cars coming and horns honking, and can even smell if something has gone wrong (like an engine overheating). But no sense is as essential to us as sight, and cameras are the sensors that function most like the human eye.\u003C/p>\n\u003Cp>Cameras can &ldquo;see&rdquo; the parameters of the road surface and the various obstacles, hazards, and other users on and around us &ndash; including other cars, trucks, bikes, and pedestrians. 
Cameras also provide rich semantic understanding of key details in the driving environment &ndash; such as identifying lane markings, recognizing colors on traffic lights, and even reading the text on two-dimensional traffic signs.&nbsp;\u003C/p>\n\u003Cp>For their part, radar and lidar deliver certain key advantages over cameras, such as detecting objects beyond line of sight and in lower-lighting or bad weather conditions. But they cannot replace the essential functions that cameras perform.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/f2f6ccbbb100c4704eb8a448100e2ffd_1680467003332.jpg\" alt=\"Mobileye's surround-view camera array provides 360-degree computer-vision capabilities for advanced mobility solutions.\" width=\"1650\" height=\"776\" />\u003C/p>\n\u003Ch3>\u003Cstrong>Made to Scale\u003C/strong>\u003C/h3>\n\u003Cp>There&rsquo;s a huge range in the prices of different types of sensors. Lidars, for example, are expensive, but cameras are relatively affordable. Building our technology around low-cost cameras affords us the rare simultaneous opportunity to increase both the availability and the performance of \u003Ca href=\"https://www.mobileye.com/solutions/\">our solutions\u003C/a>.\u003C/p>\n\u003Cp>For the mass market, the low cost of camera sensors enables integration of our driver-assistance technology into more vehicles (without disproportionately affecting their purchase prices). Indeed, to date, more than 135 million vehicles have been equipped with our technologies, and that number is growing at a quickening pace.\u003C/p>\n\u003Cp>For higher-end vehicles, the low cost of camera sensors means that more of them can be cost-effectively integrated into advanced solutions. 
For example, \u003Ca href=\"https://www.mobileye.com/blog/mobileye-supervision-bridge-to-consumer-autonomous-vehicles/\">Mobileye SuperVision&trade;\u003C/a> (our eyes-on/hands-off solution) incorporates 11 cameras to provide 360-degree surround coverage.\u003C/p>\n\u003Cp>Mobileye Chauffeur&trade; (our eyes-off/hands-off solution for consumer autonomous vehicles) and Mobileye Drive&trade; (our driverless solution for autonomous commercial vehicles) are similarly being developed under our camera-first approach. Only these solutions also feature a secondary, independent \u003Ca href=\"https://www.mobileye.com/blog/radar-lidar-next-generation-active-sensors/\">radar/lidar\u003C/a> suite to back up the cameras (under our \u003Ca href=\"https://www.mobileye.com/blog/av-safety-demands-true-redundancy/\">True Redundancy&trade;\u003C/a> approach to sensing).\u003C/p>\n\u003Cp>Even our \u003Ca href=\"https://www.mobileye.com/technology/rem/\">REM&trade; mapping technology\u003C/a> owes its crowdsourcing capabilities to the proliferation of our camera-based solutions. And that mapping data is itself being implemented into a range of applications, from our turnkey systems for self-driving vehicles through to human-driven cars augmented by our \u003Ca href=\"https://www.mobileye.com/blog/cloud-enhanced-driver-assist/\">Cloud-Enhanced Driver-Assistance&trade;\u003C/a> solution.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>Mobileye will, of course, continue developing additional inputs like the maps and active sensors mentioned above. And there&rsquo;s no telling what novel developments might emerge in the future. 
But between their inherent capability and the scale afforded by their low cost, cameras have always been &ndash; and remain still &ndash; at the heart of everything we do.\u003C/p>","2023-04-16T07:00:00.000Z","ADAS, Video, Mapping & REM, Autonomous Driving",{"id":1058,"type":5,"url":1059,"title":1060,"description":1061,"primary_tag":28,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1061,"image":1062,"img_alt":1063,"content":1064,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1065,"tags":1066},209,"mobileye-campus-jerusalem-leed-platinum-environmental-rating","Mobileye’s New HQ Receives Top Environmental Rating","We are honored to have earned the highest-possible LEED Platinum® rating from the U.S. Green Building Council for the new Mobileye campus in Jerusalem.","https://static.mobileye.com/website/us/corporate/images/5d451c8e4bdbb54323f9b68200ce2567_1707220750915.jpg","The new Mobileye Campus Jerusalem will serve as our global headquarters and principal research & development facility.","\u003Cp>Mobileye has new headquarters opening soon in Jerusalem, and we&rsquo;re proud to report that it has received the highest-possible LEED Platinum&reg; environmental rating for new construction.\u003C/p>\n\u003Cp>Issued by the \u003Ca href=\"https://www.usgbc.org/\">U.S. Green Building Council\u003C/a> (USGBC), the Leadership in Energy and Environmental Design (LEED) program is one of the most respected and widely recognized environmental certification standards for construction. LEED projects are graded in nine categories for environmental impact, from integrative design and material use to human health and energy efficiency. Qualifying projects receive one of four grades, of which Platinum is the highest.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/d8b23b8ec1da3302c4cacc63574ba201_1680686828219.png\" alt=\"U.S. 
Green Building Council LEED Platinum environmental construction certification for the new Mobileye Campus Jerusalem.\" width=\"1650\" height=\"776\" />\u003C/p>\n\u003Cp>Mobileye set out in the planning phase to earn a (second-highest) Gold rating as a minimum, but has exceeded that goal with a Platinum rating. Of particular note, the ground-up, purpose-built \u003Ca href=\"https://www.usgbc.org/projects/mobileye-campus-jerusalem\">Mobileye Campus Jerusalem\u003C/a> received the highest possible marks for Water Efficiency, Innovation, and Regional Priorities (such as energy usage and water-efficient landscaping). The LEED Platinum rating underlines Mobileye&rsquo;s focus on ambitious environmental, social, and governance goals.\u003C/p>\n\u003Ch3>\u003Cstrong>Efficiency by Design\u003C/strong>\u003C/h3>\n\u003Cp>In building the facility, Mobileye has enacted a wide array of \u003Ca href=\"https://www.mobileye.com/blog/earth-day-autonomous-electric-vehicles-environment/\">environmentally friendly\u003C/a> features. For example, we selected water-cooled (instead of air-cooled) chillers for the HVAC system, with a high coefficient of performance (CoP) between 13 and 19 &ndash; far exceeding the CoP of 3 at which most chillers are rated. External fins and internal blinds on all external fa&ccedil;ades help reduce incoming solar radiation (and therefore HVAC usage), which is especially significant in light of our extremely hot summers. And the underground on-premises \u003Ca href=\"https://www.mobileye.com/blog/mobileye-ces-2022-self-driving-secret-data/\">data center\u003C/a> was designed for peak power-usage effectiveness (PUE). Such features help reduce overall energy usage by 68 percent &ndash; more than double our initial target &ndash; with the remaining energy usage completely offset by renewable energy credits.\u003C/p>\n\u003Cp>As much of the building material as possible was sourced from recycled content, including 100% of the steel. 
Nearly half of the open space is vegetated, fed by recycled condensed HVAC water. The facility includes underground \u003Ca href=\"https://www.mobileye.com/news/zeekr-mobileye-supervision/\">electric vehicle\u003C/a> charging stations and ample parking for bicycles, and is situated in a highly developed area near many public mass-transit lines &ndash; with a new light-rail line currently being constructed alongside it, too.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/dd9111e1ef629f35347aea3185cd6cef_1707220777419.jpg\" alt=\"\" width=\"919\" height=\"1070\" />\u003C/p>\n\u003Cp>&ldquo;Environmental impact was a top priority for us in building Mobileye Campus Jerusalem,&rdquo; notes Kobi Ohayon, Chief Operations Officer at Mobileye. &ldquo;In planning the new facility, we looked at every metric possible and all the resources available to us in order to minimize our carbon footprint and create as &lsquo;clean&rsquo; a facility as possible &ndash; both for those who&rsquo;d be working there and for the city and environment surrounding us. We&rsquo;re incredibly proud and honored that our efforts have earned Mobileye the top LEED Platinum rating from USGBC.&rdquo;\u003C/p>\n\u003Ch3>\u003Cstrong>Green Building Leadership\u003C/strong>\u003C/h3>\n\u003Cp>Though issued by an American organization, the LEED program is being adopted by building projects around the world. 
The new Mobileye facility is one of just seven here (and only the second in \u003Ca href=\"https://www.mobileye.com/blog/robotaxi-night-drive-jerusalem-unedited-video/\">Jerusalem\u003C/a>) to date to receive a LEED Platinum rating, alongside such noteworthy constructions as the new National Library, the Porter School of Environmental Studies at Tel Aviv University, and \u003Ca href=\"https://www.usgbc.org/projects/intel-rd-campus-central-israel\">Intel&rsquo;s new R&amp;D facility in Petah Tikva\u003C/a>.\u003C/p>\n\u003Cp>&ldquo;Mobileye&rsquo;s LEED certification demonstrates tremendous green building leadership,&rdquo; said Peter Templeton, president and CEO of the U.S. Green Building Council. &ldquo;LEED was created to make the world a better place and revolutionize our buildings and communities by providing everyone with access to healthy, green, and high-performing buildings. The new Mobileye Campus Jerusalem is a prime example of how the innovative work of project teams can create local solutions that contribute to making a global difference.&rdquo;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/287b9a7a0e3d1a1eb43ab91b3cf04805_1680630981501.jpg\" alt=\"Mobileye&rsquo;s CEO Prof. Amnon Shashua, chairman Pat Gelsinger, and COO Kobi Ohayon survey construction of our new campus.\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Cp>Once it opens later this year, the campus will serve as Mobileye&rsquo;s global headquarters and principal R&amp;D facility. The 130,000-square-meter (~1.4-million-square-foot) facility has been built in Jerusalem&rsquo;s Har Hotzvim technological hub, near Mobileye&rsquo;s current headquarters. 
The state-of-the-art campus has ten floors above ground and seven underground, with office space, conference rooms, and other facilities for over 2,500 \u003Ca href=\"https://www.mobileye.com/blog/international-day-of-persons-with-disabilities-shekel-perfects-data-team/\">employees\u003C/a> &ndash; along with specialized labs and a vehicle workshop custom-built to support our \u003Ca href=\"https://www.mobileye.com/technology/\">development of technologies for assisted and autonomous driving\u003C/a>.\u003C/p>\n\u003Cp>In addition to our new headquarters and our other facilities in Jerusalem, Mobileye operates offices and workshops across the country and around the world &ndash; including locations in Germany, China, Japan, and the United States.\u003C/p>","2023-04-05T07:00:00.000Z","News, Awards",{"id":1068,"type":5,"url":1069,"title":1070,"description":1071,"primary_tag":190,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1071,"image":1072,"img_alt":1073,"content":1074,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1075,"tags":997},207,"when-will-self-driving-cars-be-available","The Multiple Lanes on the Road to the Autonomous Future","When will self-driving cars be available? Mobileye has more than one answer, from hands-off to eyes-off and even completely driverless solutions.","https://static.mobileye.com/website/us/corporate/images/f9f2700e1135d9967437197a1ec011bf_1681140041144.jpg","Mobileye is developing a broad spectrum of advanced solutions to put our self-driving technologies on the road.","\u003Cp>Let&rsquo;s cut to the chase here for a moment. You&rsquo;ve been hearing about self-driving vehicles for \u003Ca href=\"https://www.mobileye.com/blog/history-autonomous-vehicles-renaissance-to-reality/\">a long time now\u003C/a>, and you want to know where they are already. 
Reasonable enough, considering how much time and money have already gone into pursuing them. Yet even with some vehicles incorporating varying degrees of autonomous features, the dream of fully autonomous vehicles often seems just out of reach &ndash; tantalizingly close, but always a few years away.\u003C/p>\n\u003Cp>Well, we&rsquo;re here to tell you that self-driving technology is coming, and it&rsquo;s coming soon. Just when, exactly, is a matter of the degree of autonomy you&rsquo;re looking for, and what form you want it to take.\u003C/p>\n\u003Ch3>\u003Cstrong>Multiple Technologies, Multiple Solutions\u003C/strong>\u003C/h3>\n\u003Cp>\u003Cstrong>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/83877fdae92f3ea904afe246cb12429b_1679908831728.jpg\" alt=\"Mobileye's vision for what self-driving vehicles will look like and how they will function has evolved and expanded.\" width=\"1650\" height=\"777\" />\u003C/strong>\u003C/p>\n\u003Cp>The trouble with self-driving vehicle technology is that it&rsquo;s not monolithic, however it might seem from the outside. In order for self-driving vehicles to function safely and effectively, they&rsquo;ll require a wide array of technologies working in unison &ndash; including various sensors, maps, processors, and software... 
all of which we&rsquo;re hard at work perfecting here at Mobileye.\u003C/p>\n\u003Cp>The upside to that complexity, however, is that we can put those building blocks out on the road in a variety of configurations to incrementally deliver a \u003Ca href=\"https://www.mobileye.com/solutions/\">broad range of solutions\u003C/a>.\u003C/p>\n\u003Ch3>\u003Cstrong>The Increasing Capabilities of Driver Assistance\u003C/strong>\u003C/h3>\n\u003Cp>\u003Cstrong>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/0ab72e520b424eb08286c8ba872014a8_1679908953250.jpg\" alt=\"Mobileye SuperVision is an eyes-on/hands-off solution derived directly from our self-driving development program.\" width=\"1650\" height=\"776\" />\u003C/strong>\u003C/p>\n\u003Cp>Under Mobileye&rsquo;s approach, the first step in rolling out the benefits of self-driving technology is to enable drivers to take their hands off the wheel (while keeping their eyes on the road).\u003C/p>\n\u003Cp>Some automakers have already begun offering this kind of functionality with increasingly sophisticated driver-assist feature sets &ndash; carrying names like Highway Pilot and Traffic-Jam Assist &ndash; many of which are enabled by our technology.\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/blog/mobileye-supervision-bridge-to-consumer-autonomous-vehicles/\">Mobileye SuperVision&trade;\u003C/a> takes those capabilities to the next level. 
It is derived directly from our self-driving development program and enables a vehicle to function almost like a fully autonomous vehicle: cruise on the highway, inch along through traffic, change lanes, overtake slower-moving vehicles, navigate around obstacles on the shoulder, and more &ndash; all by itself, while the driver supervises.\u003C/p>\n\u003Cp>This highly advanced eyes-on/hands-off solution is already on the road in China inside a \u003Ca href=\"https://youtu.be/R8qTOPpQ2-I\">growing variety of vehicles\u003C/a> from our launch partner Zeekr, whose parent company \u003Ca href=\"https://www.mobileye.com/news/geely-holding-group-expands-mobileye-collaboration/\">Geely is slated to expand Mobileye SuperVision\u003C/a> to additional markets and further models from other brands under its vast umbrella. And other automakers are due soon to follow. To date, there are already more than 90,000 vehicles equipped with Mobileye SuperVision on the road &ndash; a number we expect to reach 150,000 by the end of 2023, and to top 1.2 million by 2026.\u003C/p>\n\u003Ch3>\u003Cstrong>The Coming Wave of Consumer AVs\u003C/strong>\u003C/h3>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/079309dc521e94e5b734c26d89dbbd59_1679909463586.jpg\" alt=\"Mobileye Chauffeur is our eyes-off/hands-off solution for the coming wave of consumer autonomous vehicles.\" width=\"1650\" height=\"776\" />\u003C/p>\n\u003Cp>The next step will be to allow drivers to take not only their hands off the wheel, but their \u003Ca href=\"https://www.mobileye.com/opinion/defining-a-new-taxonomy-for-consumer-autonomous-vehicles/\">eyes off the road\u003C/a> as well.\u003C/p>\n\u003Cp>Fortunately, the bulk of the technology required for this high degree of autonomy is in fact already present in Mobileye SuperVision &ndash; including the computer-vision system, maps, driving policy, and processors. 
To take the next step up to Mobileye Chauffeur&trade; (our eyes-off/hands-off solution), we will replace the driver&rsquo;s guidance with redundant processors and active sensors. This will allow consumer passenger vehicles to operate autonomously in a variety of (but not all) situations. And we&rsquo;re glad to report that development and integration of these components is already well underway.\u003C/p>\n\u003Cp>Mobileye Chauffeur is slated to go into production in 2026. Based on the progress we&rsquo;ve made to date and the path we&rsquo;ve charted forward, automakers are already lining up to implement it in cars they&rsquo;ll sell to the public. Once you get your hands on (or off) one of these vehicles, you&rsquo;ll be able to sit back and enjoy the ride as the vehicle handles all the driving for you &ndash; first on highways, then on arterial and rural roads, and ultimately in the city as well.\u003C/p>\n\u003Ch3>\u003Cstrong>Autonomous Mobility on Demand\u003C/strong>\u003C/h3>\n\u003Cp>\u003Cstrong>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/979ae3214eadc09d897ae657815d8248_1679909488138.jpg\" alt=\"Mobileye Drive is our completely driverless solution for a full range of autonomous Mobility-as-a-Service applications.\" width=\"1650\" height=\"776\" />\u003C/strong>\u003C/p>\n\u003Cp>Suppose you don&rsquo;t need your own car, though, or don&rsquo;t want to be saddled with the burdens of vehicle ownership. What if all you want is to be able to order a ride in a self-driving vehicle that will take you where you need to go?\u003C/p>\n\u003Cp>That&rsquo;s what&rsquo;s known as \u003Ca href=\"https://www.mobileye.com/solutions/drive/\">autonomous mobility-as-a-service (MaaS)\u003C/a>. We&rsquo;re already seeing small fleets of robotaxis operating on limited bases in parts of the United States and China. 
With \u003Ca href=\"https://www.mobileye.com/blog/mobileye-drive-self-driving-system/\">Mobileye Drive&trade;\u003C/a>, we aim to take autonomous MaaS even further.\u003C/p>\n\u003Cp>Mobileye Drive incorporates everything found in Mobileye SuperVision and Mobileye Chauffeur. Only here, the circumstances under which it&rsquo;s capable of operating autonomously are defined by geographic area, not by road type. And with a built-in teleoperation system (in case, say, the vehicle gets a flat tire or is pulled over by police), vehicles equipped with this solution won&rsquo;t need a driver on board at all &ndash; and in some cases won&rsquo;t even have a steering wheel. All you&rsquo;ll need to do is open up an app (like \u003Ca href=\"https://moovit.com/\">Moovit\u003C/a>) and order a ride &ndash; like you would with any taxi or ride-hailing service, but without a driver.\u003C/p>\n\u003Cp>We&rsquo;ve already begun testing our robotaxis in multiple locations around the world. In the future, Mobileye Drive will be optimized for use in robotaxis, ride-pooling, public transport, and goods delivery, empowering manufacturers and transport operators to offer a variety of autonomous vehicles incorporating our reliable and road-tested self-driving technology.\u003C/p>\n\u003Cp>\u003Ciframe title=\"YouTube video player\" src=\"https://www.youtube.com/embed/GZ9Dsm-oqi8\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>In short, there are still a few steps ahead of us on the road to the future of completely autonomous vehicles, and we can&rsquo;t skip any of them. 
But we can enjoy the benefits of their ongoing development along the way as we incrementally transfer the responsibility of driving from the hands of humans to the advancing capabilities of self-driving technology.\u003C/p>","2023-03-30T07:00:00.000Z",{"id":1077,"type":24,"url":1078,"title":1079,"description":1080,"primary_tag":32,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1080,"image":1081,"img_alt":1082,"content":1083,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1084,"tags":1085},206,"israel-prize-awarded-to-mobileye-founder-prof-amnon-shashua","Prof. Amnon Shashua Awarded Israel Prize","The nation's highest civilian honor, the recognition for lifetime achievement cites our CEO's contributions to philanthropy, industry, and automotive safety.","https://static.mobileye.com/website/us/corporate/images/94c089545fd9312bb2e582198a23aecd_1679518410534.jpg","Mobileye president and CEO Prof. Amnon Shashua","\u003Cp>JERUSALEM, March 22 &mdash; Today, the State of Israel&rsquo;s Ministry of Education named Prof. Amnon Shashua, Mobileye&rsquo;s founder, president and CEO, as the recipient of the Israel Prize for Lifetime Achievement, the nation&rsquo;s highest civilian honor.\u003C/p>\n\u003Cp>The Israel Prize, now in its 70\u003Csup>th\u003C/sup> year, celebrates excellence in arts, culture, business and sciences, along with recognizing those who have made lifelong contributions to Israeli society. 
Shashua was honored for his groundbreaking contributions to the tech industry in Israel, his global impact on automotive safety and applied artificial intelligence, and his philanthropy.\u003C/p>\n\u003Cp>&ldquo;I have the privilege of working with thousands of people in Israel and around the world, and together we have achieved scientific and technological innovations with tremendous impact,\" said Shashua.&nbsp;\u003C/p>\n\u003Cp>&ldquo;This moment also offers a chance to emphasize an important mission for me &ndash; fostering social cohesion across the communities that make Israel unique, a cause that my family pursues through our foundation&rsquo;s work in several fields. Congratulations to the other recipients of the Israel Prize on their achievements.&rdquo;\u003C/p>\n\u003Cp>Shashua is a world-renowned expert in AI, computer vision, natural language processing, and other related fields. He is a 2020 Dan David Prize laureate in the field of artificial intelligence and was selected as the 2022 Mobility Innovator by the Automotive Hall of Fame. Shashua has founded and actively leads four companies using applied AI in various fields from automotive to assisted wearables to fintech: Mobileye, OrCam, AI21 Labs, and \"One Zero,\" the first digital bank in Israel.\u003C/p>\n\u003Cp>Shashua and the other two recipients of this year&rsquo;s lifetime achievement prizes will receive their award in a special ceremony on April 26.\u003C/p>\n\u003Cp>___________________________________\u003C/p>\n\u003Cp>Mobileye (Nasdaq: MBLY) leads the mobility revolution with its autonomous driving and driver-assistance technologies, harnessing world-renowned expertise in computer vision, artificial intelligence, mapping, and data analysis. Since its founding in 1999, Mobileye has pioneered such groundbreaking technologies as REM&trade; crowdsourced mapping, True Redundancy&trade; sensing, and Responsibility Sensitive Safety (RSS). 
These technologies are driving the ADAS and AV fields towards the future of mobility &ndash; enabling self-driving vehicles and mobility solutions, powering industry-leading advanced driver-assistance systems and delivering valuable intelligence to optimize mobility infrastructure. To date, more than 130 million vehicles worldwide have been built with Mobileye technology inside. In 2022 Mobileye listed as an independent company separate from Intel (Nasdaq: INTC), which retains majority ownership. For more information, visit \u003Ca href=\"https://www.mobileye.com\">https://www.mobileye.com\u003C/a>.\u003C/p>\n\u003Cp>&ldquo;Mobileye,&rdquo; the Mobileye logo and Mobileye product names are registered trademarks of Mobileye Global. All other marks are the property of their respective owners.\u003C/p>","2023-03-22T07:00:00.000Z","Awards, Amnon Shashua",{"id":1087,"type":5,"url":1088,"title":1089,"description":1090,"primary_tag":40,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1090,"image":1091,"img_alt":1092,"content":1093,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1094,"tags":1095},204,"johann-jungwirth-driverless-tech-business-faz-podcast","Mobileye’s SVP of AVs on Driverless Tech & Business","Explore the latest developments in autonomous driving technology and business models with insights from Mobileye's Senior VP of AV, Johann “JJ” Jungwirth. 
","https://static.mobileye.com/website/us/corporate/images/56029497307b31e319f9a42612fd2a6d_1678708468949.png","Johann “JJ” Jungwirth, Senior Vice President of Autonomous Vehicles at Mobileye.","\u003Cp>Mobileye's Senior Vice President of AV Johann &ldquo;JJ&rdquo; Jungwirth recently appeared on the Frankfurter Allgemeine Zeitung (FAZ) podcast to discuss the state of self-driving development and what&rsquo;s still needed to put autonomous vehicles on the road.\u003C/p>\n\u003Cp>In the interview, JJ details the level of AI required to support the \u003Ca href=\"https://www.mobileye.com/technology/true-redundancy/\">camera, radar, and lidar sensors\u003C/a> that are needed to model the driving environment, classify detected objects, and aid in \u003Ca href=\"https://www.mobileye.com/technology/responsibility-sensitive-safety/\">decision-making for autonomous driving\u003C/a>. But he also points out that autonomous driving has already begun to hit the streets with ride-hailing services in several cities.\u003C/p>\n\u003Cp>As part of the episode, JJ also discusses the business models for autonomous driving and driverless services &ndash; such as \u003Ca href=\"https://www.mobileye.com/blog/robotaxi-night-drive-jerusalem-unedited-video/\">robotaxis\u003C/a>, which will be the first to adopt the technology, before private vehicles. JJ believes that OEMs will most likely adapt the systems that fit the driving styles of their customers to deliver a unique autonomous driving experience. 
He also discusses the issue of liability, and notes that insurers have signaled their readiness to provide coverage for vehicles and service providers.\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.faz.net/podcasts/f-a-z-kuenstliche-intelligenz-podcast/autonomes-fahren-und-robo-taxis-von-2025-an-auf-deutschen-strassen-18730179.html\" target=\"_blank\" rel=\"noopener\">Click here\u003C/a> to listen to the full interview (in German).\u003C/p>","2023-03-15T07:00:00.000Z","Autonomous Driving, Driverless MaaS",{"id":1097,"type":5,"url":1098,"title":1099,"description":1100,"primary_tag":190,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1100,"image":1101,"img_alt":1102,"content":1103,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1104,"tags":997},202,"autonomous-vehicle-test-drivers-pioneers-mobility","Autonomous Vehicle Test Drivers: Pioneers of the Future of Mobility","A glimpse into the life of our test drivers, who are ensuring that the self-driving cars of the future are safe, reliable, and comfortable. ","https://static.mobileye.com/website/us/corporate/images/6a89554e4d540da12aa463c684695806_1678182142326.jpg","Mobileye’s autonomous-vehicle test drivers are professional driving instructors who test our technologies in real-world traffic.","\u003Cp>\"I love being in the AV workshop. I feel like I am an astronaut working at NASA.&rdquo;\u003Cbr />&ndash; Tomer Horoszowski, Mobileye AV Test Driver\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>The average American or European spends at least \u003Ca href=\"https://hbr.org/2021/05/that-dreaded-commute-is-actually-good-for-your-health\">300 hours a year\u003C/a> in a car. 
But as autonomous vehicle technology advances, we&rsquo;re moving into a new era of mobility, in which that time behind a steering wheel will be completely transformed.\u003C/p>\n\u003Cp>Without the burden of driving a car, drivers will have new options for how to spend their commuting time. They might attend a virtual meeting with colleagues halfway across the world, play a game of chess, eat their lunch in a mobile restaurant, or take a much-needed nap. The possibilities are endless.\u003C/p>\n\u003Cp>As more and more people come to enjoy the benefits of autonomous technology, few will know about the pioneers who had a special role in ensuring that these vehicles were safe, reliable, and ready for widespread consumer adoption.&nbsp;\u003C/p>\n\u003Cp>At Mobileye, those pioneers are called our AV Test Driver Team.\u003C/p>\n\u003Ch3>\u003Cstrong>Pioneers of the Future \u003C/strong>\u003Cstrong>of Mobility\u003C/strong>\u003C/h3>\n\u003Cp>Mobileye&rsquo;s autonomous-vehicle (AV) test drivers are an elite group of professional drivers who are playing an essential role in making Mobileye&rsquo;s &ldquo;\u003Ca href=\"https://www.mobileye.com/opinion/defining-a-new-taxonomy-for-consumer-autonomous-vehicles/\">eyes-off/hands-off\u003C/a>&rdquo; vision a reality.\u003C/p>\n\u003Cp>&ldquo;We&rsquo;re all race-car driving instructors,&rdquo; Horoszowski points out. &ldquo;It&rsquo;s something that makes us the best AV test drivers.&rdquo;\u003C/p>\n\u003Cp>Like the pilots who graduated from the U.S. 
Air Force Academy in the 1950s to become the first American astronauts at the dawn of the Space Age, the race car driving instructors who make up Mobileye&rsquo;s AV testing team have found that their driving expertise provides them with the unique set of skills required for honing one of the greatest technological advancements of our time&mdash;self-driving cars.\u003C/p>\n\u003Ch3>\u003Cstrong>Safety \u003C/strong>\u003Cstrong>First\u003C/strong>\u003C/h3>\n\u003Cp>Gilad Galili, our head test driver, explains that as a \u003Ca href=\"https://www.cnbc.com/2017/11/24/advanced-driving-lessons-why-you-should-take-them.html\">driving instructor\u003C/a>, your first priority is safety&mdash;and that the same principle applies to testing autonomous vehicles.\u003C/p>\n\u003Cp>&ldquo;Safety is number one and we don&rsquo;t take chances,&rdquo; says Galili. &ldquo;A good test driver is able to perceive all drivers around the vehicle and understand their intentions,&rdquo; he adds. &ldquo;He should be able to &lsquo;read the street&rsquo; and always be several steps ahead of all the other drivers in the environment.&rdquo;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/3763e4946b13323b7b41f1fec08bb5d7_1677677131497.jpg\" alt=\"Mobileye test drivers in the autonomous-vehicle workshop at our headquarters in Jerusalem, Israel.\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Cp>Racing drivers possess excellent hand-eye coordination, car control, and situational awareness, allowing them to respond to changes in a split second. 
And race car driving also requires them to acquire expert skills in navigating challenging racetracks and making critical decisions under pressure.&nbsp;\u003C/p>\n\u003Cp>The AV test drivers of Mobileye emphasize that the art of car racing is what has taught them to drive at a level of safety that is much higher than most drivers.&nbsp;&nbsp;\u003C/p>\n\u003Cp>&ldquo;As a race-car driving instructor,&rdquo; says Horoszowski, &ldquo;you learn in-depth about how the elements of driving come together&mdash;for example, you must know precisely how to use the gas pedal and the brake to control the handling of the car. Racing also teaches you to plan. You are always planning your race and thinking about possible moves of other cars. The distances are very close&mdash;most of the race, you might be less than a meter from another car. So, you have to be very aware of other cars and know how to prevent an accident.&rdquo;&nbsp;&nbsp;\u003C/p>\n\u003Cp>\u003Cspan class=\"ui-provider xf b c d e f g h i j k l m n o p q r s t u v w x y z ab ac ae af ag ah ai aj ak\" dir=\"ltr\">\"It&rsquo;s important to understand that we have to have very high-level driving skills,&rdquo; adds AV test driver David Polak. &ldquo;And not only advanced technical driving skills, but we actually have to understand the physics of driving as well.&rdquo;\u003C/span>\u003C/p>\n\u003Ch3>\u003Cstrong>A More Comfort\u003C/strong>\u003Cstrong>able Ride\u003C/strong>\u003C/h3>\n\u003Cp>Besides their main focus on safety, Mobileye&rsquo;s AV test drivers also ensure that AVs using Mobileye technology feel comfortable and natural. 
For example, passengers should feel no difference between how an AV drives and how a human-driven vehicle drives.&nbsp;&nbsp;&nbsp;\u003C/p>\n\u003Cp>&ldquo;Because we train them on the streets of \u003Ca href=\"https://www.youtube.com/watch?v=pDyMzz8HMIc\">Jerusalem\u003C/a>, which are challenging due to the mix of different types of drivers, our cars drive very naturally and are able to adapt easily to other cities around the world,&rdquo; says Galili. In fact, Mobileye AV testing has taken place&mdash;successfully&mdash;in some of the most challenging urban driving environments in the world, like \u003Ca href=\"https://www.youtube.com/watch?v=2H0UIkur1K0&amp;t=74s\">Tokyo\u003C/a>, \u003Ca href=\"https://www.youtube.com/watch?v=Q69tBNCVJa0&amp;t=16s\">Paris\u003C/a>, and \u003Ca href=\"https://www.youtube.com/watch?v=50NPqEla0CQ&amp;t=1415s\">New York\u003C/a>.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/ff9893bac05bf161ea530ee4caa5e11f_1677749158035.jpg\" alt=\"A Mobileye self-driving vehicle operates autonomously while testing on the Champs-&Eacute;lys&eacute;es in Paris, France.\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Cp>This aspect of driving is so important that Mobileye has also created another dedicated team that focuses exclusively on refining the comfort and natural feeling of the self-driving experience.&nbsp;&nbsp;\u003C/p>\n\u003Ch3>\u003Cstrong>To the Limit\u003C/strong>\u003C/h3>\n\u003Cp>Veteran race-car drivers have years of experience handling a vehicle under pressure and know how to push their vehicles&rsquo; limits to achieve their goal of winning races.\u003C/p>\n\u003Cp>When the goal is to make autonomous driving far safer than human driving, Mobileye AV drivers face a different challenge: to understand and test the limits of an autonomous vehicle in a public environment while protecting pedestrians, passengers, other drivers, and themselves.\u003C/p>\n\u003Cp>&ldquo;They have a 
really intense job,&rdquo; says Kevin Rosenblum, Mobileye&rsquo;s vice-president of AV sensing and one of the developers who works closely with the AV test team. &ldquo;They have to have incredible focus during the whole ride. I admire them.&rdquo;&nbsp;&nbsp;\u003C/p>\n\u003Ch3>\u003Cstrong>Valuable Feedback\u003C/strong>\u003C/h3>\n\u003Cp>After every drive, the AV test drivers upload data, their notes, and video footage from the drive to the issue-tracking system. Years of advanced driving in challenging situations as professional racing drivers have helped the AV test drivers to become sensitive to all aspects of driving a vehicle, a skill which helps them provide valuable feedback to the development team.\u003C/p>\n\u003Cp>&ldquo;I feel I am very sensitive to how the car drives&mdash;you could say I am &lsquo;one&rsquo; with it,&rdquo; says Gilad Heskia, another racing instructor who joined Mobileye as an AV test driver. This valuable feedback enables Mobileye&rsquo;s development team to consistently improve the technology, making it essential to getting Mobileye&rsquo;s technology ready for market.&nbsp;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/c16abc73ab89975ee93af3f8407d4d88_1677677205203.png\" alt=\"Inside one of Mobileye&rsquo;s developmental autonomous vehicles as it operates hands-free in a real-world driving situation.\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Ch3>\u003Cstrong>Connecting to the Big Picture\u003C/strong>\u003C/h3>\n\u003Cp>&ldquo;Being an AV test driver is a dream come true,&rdquo; adds Polak. &ldquo;I&rsquo;ve always been in love with cars, since I was a kid. I&rsquo;ve always loved reading about concept cars. And now, I am actually driving the \u003Ca href=\"https://www.bbc.com/news/business-45900484\">concept cars\u003C/a> that will one day become the standard cars of the future. 
It&rsquo;s amazing.&rdquo;&nbsp;&nbsp;\u003C/p>\n\u003Cp>&ldquo;I think that having AVs on the road is the most effective way to reduce accidents,&rdquo; says Heskia. That&rsquo;s because self-driving cars, unlike humans, never get drowsy, drunk, or distracted. In fact, Mobileye&rsquo;s aim is for its \u003Ca href=\"https://www.mobileye.com/technology/true-redundancy/\">redundant sensor subsystems\u003C/a> to ensure a level of safety that is one full order of magnitude higher than human drivers.\u003C/p>\n\u003Cp>&ldquo;Having them on the road will also help \u003Ca href=\"https://link.springer.com/article/10.1007/s11116-021-10241-0\">traffic flow\u003C/a> and minimize traffic jams&rdquo; adds Heskia. &ldquo;With \u003Ca href=\"https://www.mobileye.com/solutions/drive/\">MaaS\u003C/a>&nbsp;(Mobility-as-a-Service), if you need a car, you order one. Commutes will become much easier&mdash;no one likes to be stuck in traffic.&rdquo;&nbsp;&nbsp;&nbsp;\u003C/p>\n\u003Cp>But the ease and safety of the daily commute is not the only factor driving the AV test drivers.\u003C/p>\n\u003Cp>&ldquo;AVs will save everyone a lot of energy and reduce emissions&mdash;they will have a big positive impact on the environment,&rdquo; comments Polak.&nbsp;&nbsp;\u003C/p>\n\u003Cp>With improved traffic flow and fewer cars on the road (since each car will carry more occupants than currently), AVs should reduce not only traffic congestion but also air pollution. And since self-driving cars will almost universally be electric and use smart technology to optimize their energy consumption, they will help to even further reduce greenhouse gas emissions.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>At Mobileye, our vision of a future with smarter automobiles has been evolving since 1999. 
And with every day that passes, we&rsquo;re moving closer to that vision.\u003C/p>\n\u003Cp>Our trailblazing AV test drivers play an instrumental role in realizing that vision, ensuring that when our autonomous driving technology is rolled out, it&rsquo;s safe and reliable for everyone.\u003C/p>","2023-03-07T08:00:00.000Z",{"id":1106,"type":24,"url":1107,"title":1108,"description":1109,"primary_tag":190,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1109,"image":1110,"img_alt":1111,"content":1112,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1113,"tags":1114},203,"mobileye-named-av-leader-by-two-industry-research-groups","Research Groups Name Mobileye AV Leader","Guidehouse Insights & ABI Research both placed Mobileye in number one position across categories in recent competitive assessments of AV technology suppliers.","https://static.mobileye.com/website/us/corporate/images/b45a2e3d4810325ca074f6c0751cec09_1677702040136.png","A NIO ES8 equipped with Mobileye’s self-driving hardware and software. ","\u003Cp>\u003Cstrong>\u003Cspan data-contrast=\"none\">JERUSALEM, March 2, 2023\u003C/span>\u003C/strong>\u003Cspan data-contrast=\"none\"> &mdash; Mobileye has been recognized as the leader in the development of autonomous vehicle technology by two leading research groups, Guidehouse Insights and ABI Research. 
In assessing several technology companies pursuing the AV market, both the \u003C/span>\u003Ca href=\"https://guidehouseinsights.com/reports/guidehouse-insights-leaderboard-automated-driving-systems\">Guidehouse Insights Leaderboard: Automated Driving Systems\u003C/a>\u003Cspan data-contrast=\"none\"> and the inaugural \u003C/span>\u003Ca href=\"https://www.abiresearch.com/market-research/product/7780562-autonomous-vehicle-platforms/\">ABI Research Autonomous Vehicle Platforms\u003C/a>\u003Cspan data-contrast=\"none\"> reports concluded that Mobileye deserved the top score.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"none\">Mobileye achieved the top spot in both rankings within frameworks that assessed multiple companies both quantitatively and qualitatively across a broad range of criteria covering technology, innovation, strategy, implementation, customer bases and more. In the Guidehouse Leaderboard, Mobileye moved forward six positions since the last report published in 2021, an accomplishment that reflects the significant strides Mobileye has made in recent years demonstrating and delivering on its autonomous vision.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"none\">\u003Cimg style=\"display: block; margin-left: auto; margin-right: auto;\" src=\"https://static.mobileye.com/website/us/corporate/images/82455a5ee279086d58a1cc050edd01e8_1677773046696.png\" alt=\"Graphic of ABI Autonomous Vehicle Platform rankings chart\" width=\"600\" height=\"600\" />\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"none\">Each report highlighted four companies leading the AV space, and Mobileye was the only company to appear in both lists.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"none\">&ldquo;We appreciate the independent recognition of our tech leadership from these two 
leading analytical firms,&rdquo; said Mobileye President and CEO Prof. Amnon Shashua. &ldquo;It&rsquo;s a testament to the hard work of thousands of Mobileye employees who are developing the future of transportation every day. It&rsquo;s also a reminder that this industry continues to develop quickly, and we should never take our success for granted.&rdquo;\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"none\">\u003Cimg style=\"display: block; margin-left: auto; margin-right: auto;\" src=\"https://static.mobileye.com/website/us/corporate/images/275a9574c58e1af9a90b4bda35dd5590_1677773156031.png\" alt=\"Graphic of Guidehouse Insights Autonomous Platforms ranking\" width=\"600\" height=\"600\" />\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"none\">With AV research and development having matured in recent years, both research groups emphasized the importance of companies having viable business models for future AV deployment. 
Each report recognized that Mobileye&rsquo;s established leading position and long track-record of execution in the vision-based ADAS market supports a strong path to realizing automated driving of increasing capabilities in both the consumer and mobility-as-a-service markets.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"none\">Another key advantage in both reports was Mobileye&rsquo;s approach to scalable, deployable AV tech through its Mobileye SuperVision\u003C/span>\u003Cspan data-contrast=\"none\">&trade;\u003C/span>\u003Cspan data-contrast=\"none\"> eyes-on, hands-off technology which is already in production, along with the Mobileye Drive\u003C/span>\u003Cspan data-contrast=\"none\">&trade;\u003C/span>\u003Cspan data-contrast=\"none\"> robotaxi solution and Mobileye Chauffeur\u003C/span>\u003Cspan data-contrast=\"none\">&trade;\u003C/span>\u003Cspan data-contrast=\"none\"> consumer AV offering; all built on nearly two decades of success in camera-based safety and driver-assist systems installed in more than 135 million vehicles globally.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"none\">Media Contact: Justin Hyde\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003Cbr />\u003C/span>\u003Ca href=\"mailto:Justin.Hyde@Mobileye.com\">\u003Cspan data-contrast=\"none\">Justin.Hyde@Mobileye.com\u003C/span>\u003C/a>\u003Cspan data-contrast=\"none\">&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003Cbr />\u003C/span>\u003Cspan data-contrast=\"none\">+1 202-656-6749\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"none\">Analyst Relations: Luca Gervasoni\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003Cbr />\u003C/span>\u003Ca href=\"mailto:Luca.Gervasoni@mobileye.com\">\u003Cspan 
data-contrast=\"none\">Luca.Gervasoni@mobileye.com\u003C/span>\u003C/a>\u003Cspan data-contrast=\"none\">&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003Cbr />\u003C/span>\u003Cspan data-contrast=\"none\">+ 1 954-600-3373\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"none\">___________________________________\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"none\">Mobileye (Nasdaq: MBLY) leads the mobility revolution with its autonomous driving and driver-assistance technologies, harnessing world-renowned expertise in computer vision, artificial intelligence, mapping, and data analysis. Since its founding in 1999, Mobileye has pioneered such groundbreaking technologies as REM&trade; crowdsourced mapping, True Redundancy&trade; sensing, and Responsibility Sensitive Safety (RSS). These technologies are driving the ADAS and AV fields towards the future of mobility &ndash; enabling self-driving vehicles and mobility solutions, powering industry-leading advanced driver-assistance systems and delivering valuable intelligence to optimize mobility infrastructure. To date, more than 135 million vehicles worldwide have been built with Mobileye technology inside. In 2022 Mobileye listed as an independent company separate from Intel (Nasdaq: INTC), which retains majority ownership. For more information, visit \u003C/span>\u003Ca href=\"https://www.mobileye.com/\">\u003Cspan data-contrast=\"none\">https://www.mobileye.com\u003C/span>\u003C/a>\u003Cspan data-contrast=\"none\">.&nbsp;\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan data-contrast=\"none\">&ldquo;Mobileye,&rdquo; the Mobileye logo and Mobileye product names are registered trademarks of Mobileye in various jurisdictions. 
All other marks are the property of their respective owners.\u003C/span>\u003Cspan data-ccp-props=\"{}\">&nbsp;\u003C/span>\u003C/p>","2023-03-02T08:00:00.000Z","Industry, Autonomous Driving, Driverless MaaS, Mapping & REM, News, Awards",{"id":1116,"type":5,"url":1117,"title":1118,"description":1119,"primary_tag":190,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1119,"image":1120,"img_alt":1121,"content":1122,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1123,"tags":997},187,"history-autonomous-vehicles-renaissance-to-reality","A Brief History of Autonomous Vehicles – from Renaissance to Reality","Mankind has been dreaming of self-driving vehicles for decades – even centuries. Mobileye is proud to be driving those dreams across the finish line.","https://static.mobileye.com/website/us/corporate/images/f451465cac4b08769db2c21cefb6d700_1676211731836.png","Automobiles, and autonomous vehicles especially, have evolved significantly from da Vinci's initial design to today's robotaxis.","\u003Cp>Someday, when historians look back at this point in history, how will they characterize it? If we at Mobileye have anything to say on the matter, they&rsquo;ll reflect on this as the turning point just before the dawn of self-driving vehicles. A time before anyone could order a ride in a robotaxi or purchase a fully autonomous vehicle of their own. A time when automobiles were still driven by humans, parked by humans, goods delivered by humans, and commutes endured by humans for long hours spent behind the wheel... but a time when great strides were being made to translate the long-held dream of autonomous mobility into reality.\u003C/p>\n\u003Cp>That places us today at the precipice of a new era in automotive history. But just how long, exactly, have people been dreaming of autonomous vehicles? 
And what are the events that have brought us to where we are today, at this potential turning point in the evolution of human mobility?\u003C/p>\n\u003Cp>Keep reading for a brief history of the evolution of autonomous vehicles, and the part that Mobileye has come to play in ushering in their arrival.\u003C/p>\n\u003Ch3>\u003Cstrong>An Early Dreamer\u003C/strong>\u003C/h3>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/d658353fa03e68f11f9f535d10c72486_1677404199034.jpg\" alt=\"This rendering depicts a self-driving, self-propelled cart designed by Leonardo da Vinci in the 16th century.\" width=\"1604\" height=\"1070\" />\u003C/p>\n\u003Cp>The concept of an autonomous vehicle (AV) dates back to long before the advent of the automobile itself &ndash; to the 16\u003Csup>th\u003C/sup> century, if you can believe it. That&rsquo;s when \u003Ca href=\"https://www.museogalileo.it/en/museum/explore/temporary-exhibitions/275-leonardo-s-automobile-en.html\" target=\"_blank\" rel=\"noopener noreferrer\">Leonardo da Vinci designed a small, three-wheeled, self-propelled cart\u003C/a> regarded as not only the first self-driving vehicle, but the first robot of any kind. More clockwork than motor vehicle, it incorporated a series of springs for propulsion, a pre-programmable steering system, and a parking brake released remotely by rope.\u003C/p>\n\u003Cp style=\"text-align: left;\">In 2016, historians in Florence built a prototype from da Vinci&rsquo;s original sketches. Da Vinci himself surely would not have been surprised \u003Ca href=\"https://youtu.be/NLvoOkeCGrw\" target=\"_blank\" rel=\"noopener noreferrer\">to see it actually work\u003C/a> as it did. 
But we can only imagine what he would have thought if he&rsquo;d seen \u003Ca href=\"https://www.mobileye.com/blog/mobileye-supervision-test-demo-road-trip/\" target=\"_blank\" rel=\"noopener noreferrer\">our AV recently driving itself through Milan\u003C/a>, where he spent part of his life and career.\u003C/p>\n\u003Ch3 style=\"text-align: left;\">\u003Cstrong>The Magnetism of the Future\u003C/strong>\u003C/h3>\n\u003Cp style=\"text-align: left;\">It would take centuries for technology to catch up to where da Vinci left off. But when it did, the dream of autonomous vehicles rode along with it.\u003C/p>\n\u003Cp style=\"text-align: left;\">In 1925, electrical engineer \u003Ca href=\"https://jalopnik.com/the-man-who-tested-the-first-driverless-car-in-1925-had-1792312207\" target=\"_blank\" rel=\"noopener noreferrer\">Francis P. Houdina\u003C/a> radio-controlled a full-size automobile through the streets of New York. The car crashed and the project failed. But as automobiles proliferated in the decades that followed, so too did efforts to automate their operation.\u003C/p>\n\u003Cp style=\"text-align: left;\">At the 1939 New York World&rsquo;s Fair, industrial designer Norman Bel Geddes showcased a \u003Ca href=\"https://blog.mcny.org/2013/11/26/i-have-seen-the-future-norman-bel-geddes-and-the-general-motors-futurama/\" target=\"_blank\" rel=\"noopener noreferrer\">dazzling array of visionary transportation concepts\u003C/a>. 
His &ldquo;Futurama&rdquo; exhibit featured rooftop helipads and interstate highways &ndash; as well as semi-autonomous vehicles to travel along them, using a combination of radio control and magnets embedded in the pavement.\u003C/p>\n\u003Ch5 style=\"text-align: left;\">\u003Cem>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/d4aec9c0fe628cf71bb3adb97f8f7251_1676214431534.jpg\" alt=\"Norman Bel Geddes' Futurama exhibit at the 1939 New York World's Fair.\" width=\"1604\" height=\"1070\" />\u003Cbr />\u003C/em>\u003Cem>Norman Bel Geddes&rsquo; &ldquo;Magic Motorways&rdquo;\u003C/em>\u003C/h5>\n\u003Cp style=\"text-align: left;\">General Motors sponsored the exhibit and continued developing the idea over the following couple of decades with its \u003Ca href=\"https://www.gm.com/stories/firebird\" target=\"_blank\" rel=\"noopener noreferrer\">Firebird series of concept cars\u003C/a>. Others experimented with magnetic guidance as well, but upgrading infrastructure proved too costly and restrictive to present a viable and scalable solution for autonomous mobility.\u003C/p>\n\u003Ch3>\u003Cstrong>An Academic Pursuit\u003C/strong>\u003C/h3>\n\u003Cp>Academia led the next stage in AV development. In 1961, researchers at \u003Ca href=\"https://youtu.be/8Mxk2L3lu9Q\" target=\"_blank\" rel=\"noopener noreferrer\">Stanford University developed a small cart\u003C/a> to navigate on the surface of the moon using a basic form of computer vision. In 1977, mechanical engineers at the University of Tsukuba in Japan took the idea further with a passenger vehicle capable of driving autonomously at up to 20 miles per hour. 
Further progress was made in the mid-90s at the University of Parma in Italy &ndash; also not far from where da Vinci lived (and dreamed) centuries prior.\u003C/p>\n\u003Cp>One of the most significant efforts came under the aegis of the \u003Ca href=\"https://www.autonomousvehicleinternational.com/features/the-prometheus-project.html\" target=\"_blank\" rel=\"noopener\">Eureka PROMETHEUS Project\u003C/a>, which brought together a consortium of universities, automakers, and tech companies in the late 1980s and early &rsquo;90s. Funded by (mostly European) governments to the tune of &euro;749 million (roughly &euro;1.3 billion in today&rsquo;s money), the project culminated in 1994 with a thousand-kilometer (~620-mile) drive on Parisian highways that included operating hands-free, driving in convoy, and automatically changing lanes while tracking and passing other vehicles.\u003C/p>\n\u003Cp>Perhaps the greatest strides, however, were made once the Pentagon brought its enormous resources to bear.\u003C/p>\n\u003Ch3>\u003Cstrong>The Best Defense\u003C/strong>\u003C/h3>\n\u003Cp>Through its Defense Advanced Research Projects Agency (DARPA), the U.S. Department of Defense began supporting autonomous-vehicle research in the 1980s. 
Its first major undertaking in the self-driving sphere saw it fund the Autonomous Land Vehicle (ALV) project, which employed an early form of lidar to navigate off-road.\u003C/p>\n\u003Ch5>\u003Cem>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/9c88c92a3a8850ad15ad9aa385e12928_1676214226278.jpg\" alt=\"Stanford University's winning Volkswagen Touareg prototype at the DARPA Grand Challenge.\" width=\"1605\" height=\"1070\" />\u003Cbr />\u003C/em>\u003Cem>Photo courtesy of DARPA\u003C/em>\u003C/h5>\n\u003Cp>DARPA also supported \u003Ca href=\"https://www.cs.cmu.edu/afs/cs/project/alv/www/\" target=\"_blank\" rel=\"noopener noreferrer\">Carnegie Mellon University&rsquo;s Navigation Laboratory\u003C/a> (or &ldquo;Navlab&rdquo;), which produced a series of experimental AVs. In 1986, Navlab 1 &ndash; a full-size van packed with computers, sensors, and cooling systems &ndash; drove itself (slowly) around suburban neighborhoods. In 1995, the Navlab 5 minivan drove from Pittsburgh all the way to San Diego under the banner of &ldquo;No Hands Across America.&rdquo; The onboard researchers handled the throttle and brakes while the vehicle steered itself for over 98% of the 2,850-mile (~4,600-kilometer) journey.\u003C/p>\n\u003Cp>The floodgates really opened, however, with the \u003Ca href=\"https://www.darpa.mil/about-us/timeline/-grand-challenge-for-autonomous-vehicles\" target=\"_blank\" rel=\"noopener noreferrer\">DARPA Grand Challenge\u003C/a> &ndash; a series of competitions that saw Stanford and Carnegie Mellon (among others) sparring for AV leadership. None of the contestants finished the first desert competition in 2004, but five did the following year &ndash; led by Stanford&rsquo;s Volkswagen Touareg (using Intel processors). Two years later, Carnegie Mellon won the urban competition with a specially equipped Humvee.\u003C/p>\n\u003Cp>DARPA&rsquo;s competitions proved that self-driving technology could work. 
And that milestone in turn spurred a slew of startups, tech firms, and automakers to start their own AV development programs over the course of the following decade-plus &ndash; some of them employing former Grand Challenge contestants.\u003C/p>\n\u003Ch3>\u003Cstrong>Building Blocks\u003C/strong>\u003C/h3>\n\u003Cp>While others were grappling with solving the AV question as a whole, Mobileye was already establishing itself as a leader in what would become the building blocks of self-driving technology.\u003C/p>\n\u003Ch5>\u003Ciframe src=\"https://www.youtube.com/embed/L1Bmc61l99A\" width=\"560\" height=\"314\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003Cbr />\u003Cem>Archival footage: Mobileye&rsquo;s first developmental AV began testing in 2013.\u003C/em>\u003C/h5>\n\u003Cp>From our founding in 1999, Mobileye focused on developing and supplying computer-vision technology for advanced driver-assistance systems, and our experience and track record grew steadily from there. By 2013, our technology was already incorporated into more than a million vehicles, and the driver-assist features we were enabling kept getting more advanced. But as far as we&rsquo;d come, we were convinced we could go even further.\u003C/p>\n\u003Cp>A self-driving system, we concluded, wasn&rsquo;t really one system at all. It was a collection of individual functions working in unison &ndash; driver-assist functions like those our technology was already supporting. So we fitted an array of those systems to an Audi A7 and connected them to create our first self-driving car. It worked so well that our customers, who had already come to rely on us for increasingly sophisticated driver-assist technologies, began asking us to furnish them with the hands-free driving features that our prototype demonstrated.\u003C/p>\n\u003Ch3>\u003Cstrong>From Vision to Reality\u003C/strong>\u003C/h3>\n\u003Cp>As demand grew, so too did our AV development program. 
Our test fleet expanded and evolved from the original internal-combustion Audis to Ford Fusion hybrid sedans and NIO ES8 electric crossovers &ndash; each with an ever-increasing range of autonomous capabilities. And we dispatched those vehicles for testing in a variety of challenging \u003Ca href=\"https://www.mobileye.com/news/autonomous-vehicle-testing-miami-stuttgart/\" target=\"_blank\" rel=\"noopener\">locations around the world\u003C/a>.\u003C/p>\n\u003Cp>\u003Ciframe title=\"YouTube video player\" src=\"https://www.youtube.com/embed/5obbgj5gIug\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>Along the way, we upgraded from a single forward-facing camera to surround vision. We added \u003Ca href=\"https://www.mobileye.com/blog/radar-lidar-next-generation-active-sensors/\" target=\"_blank\" rel=\"noopener\">radar and lidar sensors\u003C/a>, specialized \u003Ca href=\"https://www.mobileye.com/blog/rem-mapping-avs/\" target=\"_blank\" rel=\"noopener\">AV maps\u003C/a>, and the \u003Ca href=\"https://www.mobileye.com/blog/responsibility-sensitive-safety-unwritten-rules-of-the-road/\" target=\"_blank\" rel=\"noopener\">driving policy\u003C/a> to share the road safely and effectively with other drivers. 
And we've applied the results of our ongoing AV development program to a \u003Ca href=\"https://www.mobileye.com/solutions/\">broad range of solutions\u003C/a> &ndash; from \u003Ca href=\"https://www.mobileye.com/blog/cloud-enhanced-driver-assist/\" target=\"_blank\" rel=\"noopener\">enhanced\u003C/a> and \u003Ca href=\"https://www.mobileye.com/blog/mobileye-supervision-bridge-to-consumer-autonomous-vehicles/\" target=\"_blank\" rel=\"noopener\">premium driver-assist\u003C/a> to complete \u003Ca href=\"https://www.mobileye.com/blog/mobileye-drive-self-driving-system/\" target=\"_blank\" rel=\"noopener\">self-driving systems\u003C/a>.\u003C/p>\n\u003Cp>As far as we&rsquo;ve come, however, we know there&rsquo;s still so much more to come. The dream of truly autonomous vehicles, held for so long by so many, is on the verge of becoming reality. And we&rsquo;re proud of the role we&rsquo;re playing in driving it across the line &ndash; and in teaching it to drive itself even further.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Ch6>\u003Ca href=\"https://static.mobileye.com/website/us/corporate/images/all_timeline_1676366973.png\" target=\"_blank\" rel=\"noopener\">\u003Cimg style=\"width: 100%;\" src=\"https://static.mobileye.com/website/us/corporate/images/all_timeline_1676366973.png\" alt=\"The timeline of autonomous-vehicle development, from Da Vinci to today\" width=\"423\" height=\"1070\" />\u003C/a>\u003C/h6>","2023-02-27T08:00:00.000Z",{"id":1125,"type":5,"url":1126,"title":1127,"description":1128,"primary_tag":140,"author_name":10,"is_hidden":11,"lang":12,"meta_description":1128,"image":1129,"img_alt":1130,"content":1131,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1132,"tags":1133},189,"responsibility-sensitive-safety-unwritten-rules-of-the-road","The Unwritten Rules of the Road, Codified in RSS™","Our drive for autonomous vehicle safety led Mobileye to formulate the 
Responsibility-Sensitive Safety™ model and share it with the world.\n\n ","https://static.mobileye.com/website/us/corporate/images/7c7cc092c168c2aaa405916a1d7bef4f_1672933384808.png"," Responsibility-Sensitive Safety™ is an open, comprehensive, and verifiable mathematical model for autonomous vehicle safety.","\u003Cp>Picture yourself pulling up at an intersection. The right of way is yours, and the intersection looks clear &ndash; but still you proceed cautiously, with your foot covering the brake pedal. And good thing you did, because the other driver didn&rsquo;t yield, and then a pedestrian you couldn&rsquo;t see stepped off the curb from behind a parked van.\u003C/p>\n\u003Cp>Had you simply acted according to what you saw when approaching the intersection and legitimately followed the written rules of the road, you might have collided with one or both of them. But as an experienced driver, you followed your intuition and heeded the unwritten rules of the road as well.\u003C/p>\n\u003Cp>At Mobileye, we understand that those informal &ldquo;rules&rdquo; are no less important than the formal ones, so we&rsquo;ve codified the former into the \u003Ca href=\"https://www.mobileye.com/technology/responsibility-sensitive-safety/\" target=\"_blank\" rel=\"noopener\">Responsibility-Sensitive Safety&trade;\u003C/a> model &ndash; an open, comprehensive, and verifiable mathematical model for \u003Ca href=\"https://www.mobileye.com/blog/tag/av-safety/\" target=\"_blank\" rel=\"noopener\">autonomous vehicle safety\u003C/a>. 
It&rsquo;s a formula for enabling self-driving cars to safely share the road with human drivers and a crucial element in the effort to deploy AVs at scale around the world.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/2c4b23d1842f9a42d8bffe01a946e938_1672933725530.png\" alt=\"For autonomous vehicles to interact safely with human-driven vehicles, they need a decision-making framework like RSS.\" width=\"1650\" height=\"776\" />\u003C/p>\n\u003Ch3>\u003Cstrong>Why Do We Need an Autonomous Vehicle Safety Model?\u003C/strong>\u003C/h3>\n\u003Cp>The Responsibility-Sensitive Safety model (RSS&trade;) is based on the premise that for self-driving technology to be both safe and useful, it must be able to handle not only the majority of everyday driving tasks, but the &ldquo;edge cases&rdquo; as well. That is, it must be able to deal with unexpected events in an inherently unpredictable driving environment &ndash; at least for as long as autonomous vehicles (AVs) share the road with human drivers. &nbsp;\u003C/p>\n\u003Cp>What an AV needs, then, is a set of parameters by which to evaluate all of its decisions. 
Not simply to control the vehicle&rsquo;s movements or to strictly abide by traffic laws, but an all-encompassing framework for autonomous vehicle safety allowing the AV to deal with whatever situation might arise &ndash; even those we cannot predict.&nbsp;\u003C/p>\n\u003Ch3>\u003Cstrong>Five Rules to Live and Drive By\u003C/strong>\u003C/h3>\n\u003Cp>RSS rises to that challenge by codifying the intuitive skills of driving into \u003Ca href=\"https://www.mobileye.com/blog/rss-explained-the-five-rules-for-autonomous-vehicle-safety/\" target=\"_blank\" rel=\"noopener\">five all-encompassing rules\u003C/a>:\u003Cbr />1) \u003Cstrong>Do not hit the car in front of you\u003C/strong> &ndash; based on a mathematical formulation of the &ldquo;two-second rule&rdquo; we all learn in driving school.\u003Cbr />2) \u003Cstrong>Do not cut in recklessly\u003C/strong> &ndash; similar to the longitudinal safe distance established by Rule #1, but applied here to the lateral distance between vehicles.\u003Cbr />3) \u003Cstrong>Right of way is given, not taken\u003C/strong> &ndash; which tells the AV to watch out for other vehicles that may not yield the right of way to it, even when they should.\u003Cbr />4) \u003Cstrong>Be cautious in areas with limited visibility\u003C/strong> &ndash; just because the vehicle&rsquo;s sensors can&rsquo;t detect a potential hazard doesn&rsquo;t mean the way is clear.\u003Cbr />5) \u003Cstrong>If you can avoid a crash without causing another one, you must\u003C/strong> &ndash; because ultimately, the goal is to prevent collisions.\u003C/p>\n\u003Cp>\u003Ciframe src=\"https://www.youtube.com/embed/pn88uJbkQqc\" width=\"560\" height=\"314\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>We believe that any AV following the five rules of RSS will never cause an accident, and will respond safely and appropriately to any potentially dangerous situation it might encounter. 
To substantiate that position, we&rsquo;ve opened RSS up to independent scrutiny, and continue to invite outside members of industry, academia, and government to identify any scenario that an AV operating under RSS would not be able to handle.&nbsp;\u003C/p>\n\u003Ch3>\u003Cstrong>The Leanest Possible Driving Policy\u003C/strong>\u003C/h3>\n\u003Cp>RSS stems from a \u003Ca href=\"https://arxiv.org/abs/1708.06374\" target=\"_blank\" rel=\"noopener\">research paper on autonomous vehicle safety published in 2017\u003C/a> by three of \u003Ca href=\"https://www.mobileye.com/about/management/\" target=\"_blank\" rel=\"noopener\">Mobileye&rsquo;s leading minds\u003C/a>, but its real-world applications extend far beyond the academic.\u003C/p>\n\u003Cp>RSS forms the basis of the singularly lean driving policy by which our AVs operate. Rather than trying to determine every possible scenario that could potentially happen, our RSS-based driving policy focuses on what is likely to happen. It does so by continually calculating reasonable worst-case assumptions within the range of realistic possibilities &ndash; not unlike how a human drives.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/3562e87a9e8ff414c63f5e334583c9af_1672933433454.png\" alt=\"Responsibility-Sensitive Safety codifies safe driving into five rules to govern all of an autonomous vehicle&rsquo;s decisions.\" width=\"1650\" height=\"776\" />\u003C/p>\n\u003Cp>This &ldquo;lean&rdquo; computing process takes place on our highly efficient and compact \u003Ca href=\"https://www.mobileye.com/technology/eyeq-chip/\" target=\"_blank\" rel=\"noopener\">EyeQ&trade; Systems-on-Chip\u003C/a> and enables the AV to make sound, split-second decisions on how to operate in real-world scenarios &ndash; not just in limited geographic areas, but in \u003Ca href=\"https://www.mobileye.com/news/autonomous-vehicle-testing-miami-stuttgart/\" target=\"_blank\" rel=\"noopener\">locations around 
the world\u003C/a>, where driving conditions and behavior differ widely.&nbsp;\u003C/p>\n\u003Ch3>\u003Cstrong>Opening Roads to Autonomous Mobility\u003C/strong>\u003C/h3>\n\u003Cp>However good and trustworthy it is, developing the technology required for an AV to function correctly is only part of the equation. To deploy autonomous vehicles at scale, AVs will need roads on which they&rsquo;re allowed to drive. And that means that government regulators will need to adapt the rulebooks developed over the course of decades for human-driven vehicles to include self-driving vehicles as well.\u003C/p>\n\u003Cp>RSS supports that process by not merely telling, but showing regulators that self-driving technology based on RSS can be trusted to safely operate on public roadways. And that empowers regulators, policymakers, and other \u003Ca href=\"https://www.mobileye.com/blog/responsibility-sensitive-safety-gains-traction-worldwide/\" target=\"_blank\" rel=\"noopener\">stakeholders\u003C/a> to make informed decisions in paving the way for autonomous mobility.\u003C/p>\n\u003Cp>The adaptability of RSS even allows for individual jurisdictions to customize the model to suit their local roadways and driving cultures. One country might prioritize safety, for example, by mandating that AVs maintain a greater distance from other vehicles, while another might emphasize fitting more seamlessly into traffic. 
RSS allows for such fine-tuning &ndash; much in the same way that speed limits are determined according to local conditions.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/2c34229cd8ee4f4cd998e05a3af9c63b_1677141578538.png\" alt=\"RSS formulates the unwritten rules of the road into an open, comprehensive, and verifiable mathematical model.\" width=\"1650\" height=\"776\" />\u003C/p>\n\u003Cp>Arguably more than any individual application or parameter, however, the Responsibility-Sensitive Safety framework is about engendering trust through transparency. And we believe that trust is precisely what&rsquo;s needed for the world to embrace the future of transportation whose time has come.\u003C/p>","2023-02-23T08:00:00.000Z","Video, AV Safety",{"id":1135,"type":1136,"url":1137,"title":1138,"description":1139,"primary_tag":32,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1139,"image":1140,"img_alt":1141,"content":1142,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1143,"tags":1144},201,"interview","autonocast-prof-amnon-shashua-taxonomy","Prof. 
Amnon Shashua: Changing How We Talk About Automated Driving","On the latest episode of the Autonocast, Mobileye’s founder and CEO discussed our new taxonomy for highly automated and fully autonomous vehicles.","https://static.mobileye.com/website/us/corporate/images/212a38ecdfd1db6e1045804f68b7b143_1676988542835.jpg","Professor Amnon Shashua, founder and CEO, Mobileye","\u003Cp>In episode #273 of the popular podcast \u003Cem>Autonocast\u003C/em>, Mobileye founder and CEO \u003Ca href=\"https://www.mobileye.com/amnon-shashua/\">Professor Amnon Shashua\u003C/a> explains how Mobileye&rsquo;s \u003Ca href=\"https://www.mobileye.com/opinion/defining-a-new-taxonomy-for-consumer-autonomous-vehicles/\">new taxonomy for autonomous driving\u003C/a> creates a clearer picture as the automotive industry moves forward with the productization of autonomous and highly automated systems.\u003C/p>\n\u003Cp>In this episode he and co-hosts Alex Roy, Ed Niedermeyer, and Kirsten Korosec covered a broad range of subjects related to mobility, from validation and the role of the driver to the future of driver assistance.\u003C/p>\n\u003Cp>Listen to the full episode at \u003Ca href=\"http://www.autonocast.com/blog/2023/2/16/273-mobileye-ceo-amnon-shashua-on-changing-how-we-talk-about-automated-driving\">Autonocast.com\u003C/a>, on \u003Ca href=\"https://open.spotify.com/episode/2nbqhVVaQt60rfk8KL8Qva?si=cKI5JTnrQZyB3xe9_d3Lvg\">Spotify\u003C/a>, \u003Ca href=\"https://podcasts.apple.com/us/podcast/273-mobileye-ceo-amnon-shashua-on-changing-how-we-talk/id1168333433?i=1000600049822\">Apple Podcasts\u003C/a>, \u003Ca href=\"https://podcasts.google.com/feed/aHR0cDovL3d3dy5hdXRvbm9jYXN0LmNvbS9ibG9nP2Zvcm1hdD1yc3M/episode/NTgwOTEzZTZkNDgyZTlkMmRhMjk2NzAxOjU4MDkyN2NlZTU4YzYyZDQ4MDFkNWVlNjo2M2VlODY2ODQxZjcwNDRjNjdjYWQxZTg?sa=X&amp;ved=0CAYQuIEEahcKEwiY8b-F3Kb9AhUAAAAAHQAAAAAQAQ\">Google Podcasts\u003C/a>, or your other favorite podcast platform.\u003C/p>","2023-02-21T08:00:00.000Z","Autonomous 
Driving, From our CEO",{"id":1146,"type":5,"url":1147,"title":1148,"description":1149,"primary_tag":9,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1149,"image":1150,"img_alt":1151,"content":1152,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1153,"tags":1154},200,"mobileye-supervision-bridge-to-consumer-autonomous-vehicles","Bridging to the Autonomous Future with Mobileye SuperVision™","Incorporating the essential building blocks for consumer AVs, Mobileye SuperVision™ is our bridge from assisted to autonomous driving.","https://static.mobileye.com/website/us/corporate/images/21a6e12d0e849bbf17263e0f0770df82_1676885917761.png","Derived from our camera-based self-driving system, Mobileye  SuperVision is our most advanced driver-assistance system yet.","\u003Cp>Of all the iconic landmarks in all the cities around the world, the Brooklyn Bridge is one that stands out. Not only for its stoic design or its impressively long span, but for the extent to which it transformed mobility and the city around it. The bridge&rsquo;s construction opened up ground transportation between Brooklyn and Manhattan for the first time and (quite literally) paved the way for their unification into boroughs of the same metropolis.\u003C/p>\n\u003Cp>Now more than a century and a half since construction began to bridge that divide, we&rsquo;re gazing across another. On one side is the driver-assistance technology we&rsquo;ve been pioneering for decades; on the other, the future of fully autonomous mobility. 
\u003Ca href=\"https://www.mobileye.com/solutions/super-vision/\">Mobileye SuperVision&trade;\u003C/a> bridges the distance between them.\u003Cstrong>&nbsp;\u003C/strong>\u003C/p>\n\u003Cp>\u003Cstrong>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/536311f8f5f148bfd3136200b3a6bbbd_1676550949128.jpg\" alt=\"Tested in some of the most challenging cities in the world, Mobileye SuperVision is our bridge to consumer autonomous vehicles.\" width=\"1605\" height=\"1070\" />\u003C/strong>\u003C/p>\n\u003Ch3>\u003Cstrong>Eyes on the Road, Hands off the Wheel \u003C/strong>\u003C/h3>\n\u003Cp>The act of driving regular cars requires the driver to watch the road and steer the vehicle. Completely self-driving cars will relieve the human driver of both those obligations (if they&rsquo;ll even require human drivers at all). With Mobileye SuperVision, you still have to watch the road, but you can take your hands off the wheel and let the system do most of the driving for you (within its specified Operational Design Domains &ndash; that is, the types of road on which it is capable of self-operating).\u003C/p>\n\u003Cp>Mobileye SuperVision is essentially the production version of the camera-based autonomous driving system that we&rsquo;ve been testing and honing for years in real-world traffic on public roadways in some of the world&rsquo;s most challenging driving environments &ndash; from Jerusalem to \u003Ca href=\"https://www.mobileye.com/press-kit/press-kit-mobileye-new-york-city/\">New York\u003C/a> and \u003Ca href=\"https://www.mobileye.com/blog/paris-ratp-autonomous-vehicle-testing-pilot/\">Paris\u003C/a> to \u003Ca href=\"https://youtu.be/2H0UIkur1K0\">Tokyo\u003C/a>. 
And it brings to bear all that we&rsquo;ve learned over the course of nearly a quarter century in developing advanced driver-assistance systems, which has seen our technology integrated into more than 125 million vehicles (and counting).\u003C/p>\n\u003Cp>Based on such extensive experience, Mobileye SuperVision delivers eyes-on but hands-off operation of standard driving functions on all regular road types at up to 80 mph (130 km/h). In other words, a vehicle equipped with Mobileye SuperVision can function largely like an autonomous vehicle &ndash; albeit while under the driver&rsquo;s guidance.\u003C/p>\n\u003Cp>\u003Ciframe src=\"https://www.youtube.com/embed/mvOqxOeO0pI\" width=\"560\" height=\"314\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Ch3>\u003Cstrong>Incorporating the Latest Technologies\u003C/strong>\u003C/h3>\n\u003Cp>To enable such self-driving functionality, Mobileye has developed (and continues to develop) a broad range of technologies &ndash; including sensors, maps, driving policy, and processors.\u003C/p>\n\u003Cp>Mobileye SuperVision incorporates all of these &ndash; including 11 cameras, \u003Ca href=\"https://www.mobileye.com/technology/rem/\">REM&trade;\u003C/a>-powered Mobileye Roadbook&trade; maps, \u003Ca href=\"https://www.mobileye.com/technology/responsibility-sensitive-safety/\">RSS&trade;\u003C/a>-based driving policy, and two of our latest \u003Ca href=\"https://www.mobileye.com/technology/eyeq-chip/\">EyeQ&trade; Systems-on-Chip\u003C/a> in an integrated ECU. 
And with \u003Ca href=\"https://www.mobileye.com/blog/mobileye-supervision-zeekr-ota-update/\">over-the-air updates\u003C/a>, it has the built-in capacity for further upgrades as development advances.\u003C/p>\n\u003Cp>The sum of all these parts makes Mobileye SuperVision both a highly advanced driver-assistance system and the baseline for eyes-off solutions (within certain ODDs) and for completely driverless solutions (without need for a driver in the vehicle at all). With the cameras, maps, and driving policy required for eyes-off operation already proven in the context of the eyes-on/hands-off system, all that will be needed to move from eyes-on to eyes-off is to add \u003Ca href=\"https://www.mobileye.com/technology/true-redundancy/\">redundancies\u003C/a> (such as \u003Ca href=\"https://www.mobileye.com/blog/radar-lidar-next-generation-active-sensors/\">active sensors\u003C/a> and additional processing power).\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/solutions/super-vision/\" target=\"_blank\" rel=\"noopener\">\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/ab6850f3abe51847503f9afb8bf2168c_1676559702049.png\" alt=\"Mobileye SuperVision incorporates a full range of cutting-edge technologies, including sensors, processors, and software.\" width=\"1650\" height=\"777\" />\u003C/a>\u003C/p>\n\u003Ch3>\u003Cstrong>Driving Today on the Bridge to the Future\u003C/strong>\u003C/h3>\n\u003Cp>This cutting-edge solution is already on the road and is destined for further implementation. Our \u003Ca href=\"https://www.mobileye.com/opinion/our-new-deal-with-geely-is-a-game-changer-says-shashua/\">launch partner \u003C/a>Zeekr (part of the Geely group) has already put over 70,000 \u003Ca href=\"https://youtu.be/R8qTOPpQ2-I\">Zeekr 001\u003C/a> electric vehicles on the road equipped with Mobileye SuperVision. 
It recently revealed the \u003Ca href=\"https://www.mobileye.com/news/zeekr-mobileye-supervision/\">Zeekr 009\u003C/a>, also featuring Mobileye SuperVision. And Geely has announced that \u003Ca href=\"https://www.mobileye.com/news/geely-holding-group-expands-mobileye-collaboration/\">three more brands\u003C/a> under its vast umbrella will incorporate this system into additional models.\u003C/p>\n\u003Cp>Based on orders already in place, 150,000 vehicles incorporating Mobileye SuperVision are due to be on the road by the end of 2023, and by 2026, we expect to have nine different models from six manufacturers &ndash; amounting to an anticipated 1.2 million vehicles &ndash; featuring this highly advanced system on roads around the world.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/989ddfceb9b1ef037c04d5cc3f711a26_1676551140698.jpg\" alt=\"Mobileye SuperVision allows for eyes-on/hands-off capabilities on a full range of Operational Design Domains.\" width=\"1650\" height=\"838\" />\u003C/p>\n\u003Cp>Between the capabilities it delivers today and the baseline it establishes for the future, Mobileye SuperVision serves as our bridge between assisted driving on one side and autonomous driving on the other. 
In ushering in its arrival, we get a glimpse at what it must have felt like to drive across the East River for the first time, and at the promise such a monumental feat of engineering holds to transform the scope of human mobility.\u003C/p>","2023-02-16T08:00:00.000Z","ADAS, Video",{"id":1156,"type":654,"url":1157,"title":1158,"description":1159,"primary_tag":32,"author_name":589,"is_hidden":11,"lang":12,"meta_description":1159,"image":1160,"img_alt":1161,"content":1162,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1163,"tags":1164},199,"defining-a-new-taxonomy-for-consumer-autonomous-vehicles","Defining a New Taxonomy for Consumer Autonomous Vehicles","Mobileye’s CEO and CTO detail the new taxonomy revealed at CES 2023 for deploying eyes-off/hands-off self-driving consumer vehicles.","https://static.mobileye.com/website/us/corporate/images/848b0bf9c27dbd2df10eaa07a2a2c790_1675702827035.png","Mobileye's new taxonomy for autonomous vehicles is based on defining the interaction between man and machine.","\u003Cp>Tech and auto companies had a very turbulent 2022 in almost every aspect. In spite of great turmoil, the automobile industry has shifted gears in its pursuit to adopt and deploy consumer-level autonomy in the near future. A number of the industry conventions around autonomous driving have again become unclear and ambiguous as a result of this development. The confusion threatens to obscure the real benefits of autonomy in terms of safety, convenience, and efficiency. &nbsp;\u003C/p>\n\u003Cp>We see a growing need for a new way of talking and thinking about consumer AVs (CAV) that recognizes how they will work in the real world, and ensures the usefulness, safety, and scalability of consumer-level AVs. 
To achieve that, Mobileye has laid down a new taxonomy alongside a set of basic requirements for CAVs, which were presented in our annual address at \u003Ca href=\"https://www.mobileye.com/blog/ces-2023-recap/\">the last CES\u003C/a>.\u003C/p>\n\u003Ch3>The Need for Clarity\u003C/h3>\n\u003Cp>Autonomous driving is viewed mainly through the prism of SAE Levels of autonomy, also known as J3016, which is widely accepted as the industry standard. When introduced in 2014, SAE J3016 provided a very good reference for AV development and regulation while everybody was still wrapping their minds around the question of &ldquo;what is autonomous driving, exactly?&rdquo; However, as we move forward with the productization of autonomous and highly automated systems, it is evident that the current Level 1-5 taxonomy cannot form a basis for a product-oriented description that is clear for both the engineer and the end customer.\u003C/p>\n\u003Cp>When looking at the current industry discourse, we see two issues that need to be addressed. The first is vague, unclear definitions from an end-user perspective. The second, and more important, is the unnecessary distinction between Level 3 and Level 4. According to J3016, Level 3 and Level 4 differ in the Minimum Risk Maneuver (MRM) requirements and the vigilance level of the human driver. This may lead to &ldquo;\u003Cem>failures by design\u003C/em>&rdquo; of the autonomous system, as demonstrated in an \u003Ca href=\"https://amnon-shashua.medium.com/on-black-swans-failures-by-design-and-safety-of-automated-driving-systems-1401076e9027\">opinion paper\u003C/a> we published in 2021.\u003C/p>\n\u003Ch3>Simpler Language\u003C/h3>\n\u003Cp>To address the deficiencies described above, we propose a simplified language that defines the levels of autonomy based on four&nbsp;axes:&nbsp;(i)&nbsp;Eyes-on/Eyes-off,&nbsp;(ii) Hands-on/Hands-off,&nbsp;(iii) Driver versus No-driver, and (iv) MRM requirement. 
As seen in the chart below, this creates four product categories covering the entire spectrum of automated driving.\u003C/p>\n\u003Cp>\u003Ca href=\"https://static.mobileye.com/website/us/corporate/images/84fb9e3a75b5cc9da909240510812dc9_1675683237023.png\" target=\"_blank\" rel=\"noopener\">\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/84fb9e3a75b5cc9da909240510812dc9_1675683237023.png\" alt=\"Mobileye's new taxonomy for driving automation, based on the respective roles of man and machine.\" width=\"1650\" height=\"928\" />\u003C/a>\u003C/p>\n\u003Cp>\u003Cstrong>1) Eyes-on/Hands-on\u003C/strong>: this category covers all the basic driver-assist functions, such as Autonomous Emergency Braking (AEB) and Lane Keep Assist (LKA). The driver is still responsible for the entire driving task while the system monitors the human driver (Level 1-2 according to SAE).\u003C/p>\n\u003Cp>\u003Cstrong>2) Eyes-on/Hands-off\u003C/strong>: this is a driver-assistance function where the driver&rsquo;s hands can be off the steering wheel while the system takes control of the driving and the driver supervises the system (hence, Eyes-on) within a specified Operational Design Domain (ODD). With a proper driver monitoring system (DMS), one can create a very useful human/machine synergetic interaction (analogous to pilots supervising the auto-pilot system in an aircraft) and increase the overall safety of driving. This is usually referred to as Level 2+, a term that was first coined by Mobileye and not formally defined by SAE. Due to the absence of the Eyes-on/Hands-off category from the SAE taxonomy, it is usually wrongly classified as Level 3-4. It is important to emphasize that the role of the &ldquo;supervisor&rdquo; changes from (1) to (2): in an Eyes-on/Hands-on system it is the \u003Cem>system\u003C/em> that is supervising the driver and intervening (rarely) to avoid an accident (like applying the brakes to avoid collision). 
It is important that \u003Cem>interventions\u003C/em> by the system happen rarely and in emergency situations, rather than as continuous interference with the human driver. In an Eyes-on/Hands-off system it is the \u003Cem>human driver\u003C/em> who is supervising the system. To be effective, the interventions should happen rarely, and in order to keep the human driver vigilant, a proper DMS should be in place. An Eyes-on/Hands-off setting increases safety since the failure modes of the human and the system are very different: human failures stem mostly from inattention and distraction, which can occur even in good weather and mundane road conditions, whereas system failures mostly occur in challenging environments (weather, road types, and traffic maneuvers).\u003C/p>\n\u003Cp>\u003Cstrong>3) Eyes-off/Hands-off\u003C/strong>: the system controls the driving function within a specified ODD (say, highways with on/off ramp transitions) without the human driver needing to supervise the driving (hence, Eyes-off). Once the ODD comes to an end, and if the driver does not take back control, the system is able to conduct a full MRM and stop safely on the shoulder of the road. This category can be classified as either Level 3 or Level 4 according to SAE J3016. It is worth noting that J3016 exempts the Level 3 system from performing full MRM by allowing it to stop in-lane, but we argue that a \u003Ca href=\"https://theintercept.com/2023/01/10/tesla-crash-footage-autopilot/\">stop-in-lane emergency maneuver is not safe\u003C/a>. 
In addition, the Eyes-off/Hands-off system still requires a qualified driver sitting in the driver&rsquo;s seat so that he/she is able to take control in non-safety-related situations happening at zero velocity, in order not to jeopardize the flow of traffic (e.g., deadlocks, a police officer directing traffic).\u003C/p>\n\u003Cp>\u003Cstrong>4) No Driver\u003C/strong>: when there is no human driver present, say in a Robotaxi, the role of the human driver is replaced by a \u003Cem>teleoperator\u003C/em> who can intervene to resolve non-safety situations like those mentioned above.\u003C/p>\n\u003Cp>In the simplified language we propose above, the requirements of the driver are well defined, so there are no ambiguities from the end-user perspective. The human driver is either supervised by the system or is supervising the system or is allowed to disconnect attention entirely without being bothered by the system. The Eyes-off category translates to a full MRM capability of safely stopping on the shoulder of the road without blocking traffic. The value proposition of an Eyes-off setting is \u003Cstrong>time\u003C/strong>, i.e., the human in the driving seat can legally attend to non-driving matters, within the prescribed ODD, without the need to supervise the system.\u003C/p>\n\u003Ch3>Usefulness, Safety, Scalability\u003C/h3>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/4fae36a27d981a211498c92aaf56312a_1675683373282.png\" alt=\"The process of moving from human operation to autonomous driving in an ODD should be clear.\" width=\"1650\" height=\"704\" />\u003C/p>\n\u003Cp>We contend that an Eyes-off system should be governed by three principles: (i) usefulness, (ii) safety, and (iii) scalability.\u003C/p>\n\u003Cp>\u003Cstrong>Usefulness\u003C/strong>: A good Eyes-off system should operate in an ODD that enables prolonged and continuous periods of use, such that going in and out of the ODD does not happen frequently. 
As depicted in the figure below, if we look at the average person&rsquo;s commute, going in and out of the ODD should happen only once. According to this requirement, we set the minimum useful ODD threshold to be freeways up to 80 mph, including the ability to navigate on-ramps and off-ramps. Anything below that is considered not useful. For example, a system with an ODD of freeways up to 40 mph would require the driver to take control every time the lead vehicle exceeds 40 mph. This also poses a safety risk as it increases the friction between the human and the machine.\u003C/p>\n\u003Cp>\u003Cstrong>Safety\u003C/strong>: A safe Eyes-off solution should have no systematic errors, i.e., errors that can be reproduced in a given emergency situation within the system&rsquo;s ODD. Hiding behind the statistical argument that a particular scenario will only happen rarely leads to unacceptable compromises in system design.&nbsp;\u003C/p>\n\u003Cp>To better articulate the point, let&rsquo;s look at the ODD of current Level 3 systems coming to market &ndash; freeways-only up to 40 mph with no lane changes. Because of the low operating speed, the system should allegedly be able to cope with any in-lane emergency braking. However, out-of-lane emergency maneuvers &ndash; responding to dangerous cut-ins, for example &ndash; are considered rare and not supported by the sensor configuration, the sensing state, and driving policy algorithms. For example, a sensor configuration that does not include high-resolution 360-degree coverage cannot support a lane change under all traffic densities. Failing to do so creates a &ldquo;reproducible error.&rdquo;\u003C/p>\n\u003Cp>The result of our requirement for no reproducible errors is that the system ODD should be much larger than the customer ODD. 
In other words, it is possible to limit the customer-facing ODD so that lane changes are not allowed in regular operation, but the system still needs to be able to change lanes in an emergency maneuver. This notion should translate directly to the system&rsquo;s design &ndash; high-resolution surround sensors, driving policy, high-definition maps, controllability, and so forth.\u003C/p>\n\u003Cp>\u003Cstrong>Scalability\u003C/strong>: We believe the operational domain of a full Eyes-off/Hands-off vehicle can best be thought of as a stack of ODDs &ndash; starting from highways, then adding arterial roads, signaled intersections, unprotected turns, and so forth &ndash; that eventually add up to autonomy everywhere. We call those ODDs &ldquo;autonomous blades,&rdquo; and through this, we have built an evolutionary path of incremental steps in our products from Eyes-on/Hands-off systems to full AVs.\u003C/p>\n\u003Cp>Without this approach, AV development simply doesn&rsquo;t scale. Every blade of operation requires a &ldquo;moonshot&rdquo; of validation and engineering, and if that moonshot is successful, it doesn&rsquo;t guarantee success in the next blade. Just because a prototype AV can manage city streets in daylight hours doesn&rsquo;t mean it can easily adapt to multilane highways at 80 mph at night, and vice versa. Instead, we have focused our &ldquo;moonshot&rdquo; efforts on the Eyes-on/Hands-off system with full ODD as a baseline for eyes-off blades.\u003C/p>\n\u003Ch3>The Bridge to Consumer AVs\u003C/h3>\n\u003Cp>Our \u003Ca href=\"https://www.mobileye.com/solutions/super-vision/\">Mobileye SuperVision&trade;\u003C/a> camera-only Eyes-on/Hands-off system already contains the entire technological backbone needed to enable hands-off driving, such that the transition to eyes-off blades only adds active sensors as redundant components to the perception system. 
All the heavy lifting of detailed sensing, the driving policy required to maneuver the car in any traffic scenario, and the requirement for HD maps covering all types of roads are all done in the SuperVision system. The redundancies to the perception system then become the only incremental work needed to make the leap from eyes-on to eyes-off.\u003C/p>\n\u003Cp>Today more than ever, we believe in the potential for autonomous technology to transform the world and how we travel daily. We at Mobileye know many share that belief, and as we put our technologies like SuperVision on the road, we will see those benefits come to life &ndash; especially if our industry can clearly share what that future looks like and disambiguate as many uncertainties as we can.\u003C/p>","2023-02-06T08:00:00.000Z","Industry, Autonomous Driving, Opinion, From our CEO",{"id":1166,"type":24,"url":1167,"title":1168,"description":1169,"primary_tag":1170,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1169,"image":1171,"img_alt":1172,"content":1173,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1174,"tags":1175},196,"mobileye-shares-fourth-quarter-and-full-year-2022-results-and-provides-2023-outlook","Q4 & Full-Year 2022 Financial Results","These latest financial results highlight the growing demand for the state-of-the-art, industry-leading products and solutions in Mobileye's portfolio.",17,"https://static.mobileye.com/website/us/corporate/images/ee8834cb53a833f204a9b06d1ab12bf0_1674731579225.png","Mobileye’s validation lab is shown testing new software against real world data.","\u003Cp>Mobileye today released its financial results for the fourth quarter and full-year of 2022, providing new details on demand for core ADAS and traction for advanced solutions. 
Mobileye’s fourth quarter performance highlights included a revenue increase of 59% year-over-year to $565 million.\u003C/p>\u003Cp>Read the entire update \u003Ca href=\"https://ir.mobileye.com/news-releases/news-release-details/mobileye-discloses-fourth-quarter-and-full-year-2022-results-and\" rel=\"noopener noreferrer\" target=\"_blank\">here\u003C/a>.\u003C/p>\u003Cp>Access the earnings call webcast \u003Ca href=\"https://ir.mobileye.com/events/event-details/q4-2022-mobileye-earnings-call\" rel=\"noopener noreferrer\" target=\"_blank\">here\u003C/a>.&nbsp;&nbsp;\u003C/p>","2023-01-26T15:00:00.000Z","News, Financial",{"id":1177,"type":5,"url":1178,"title":1179,"description":1180,"primary_tag":397,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1180,"image":1181,"img_alt":1182,"content":1183,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1184,"tags":839},195,"ces-2023-recap","Driving Evolution at CES 2023: Everything You Might Have Missed","Mobileye returned to CES this year as a public company, with a lot to show – including a clear path to the future of autonomous driving.","https://static.mobileye.com/website/us/corporate/images/e1156cc4e033b35050360f512d4202c0_1673894548922.jpg","Mobileye displayed its latest solutions and technologies for assisted and autonomous driving at CES 2023 in Las Vegas.","\u003Cp>The global technology industry descended on Las Vegas earlier this month for CES 2023. After participating virtually for the past two years, Mobileye returned to the big show as a newly listed company with a clear vision for the path towards the future of autonomous mobility.\u003C/p>\n\u003Cp>This year, our focus was on \u003Ca href=\"https://www.mobileye.com/solutions/super-vision/\" target=\"_blank\" rel=\"noopener\">Mobileye SuperVision&trade;\u003C/a> and bridging the gap from assisted to autonomous driving. 
Here&rsquo;s a rundown of everything you might have missed from Mobileye at CES 2023, from the auditorium to the show floor.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/KQ1_SqcU2Ak\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Ch3>\u003Cstrong>Mobileye: Now, Next, Beyond &ndash; press conference with Prof. Amnon Shashua\u003C/strong>\u003C/h3>\n\u003Cp>Our CEO&rsquo;s annual address at CES stands among the most eagerly anticipated events in automotive tech, and this year more viewers signed up and tuned in than ever before. \u003Ca href=\"https://www.mobileye.com/amnon-shashua/\" target=\"_blank\" rel=\"noopener\">Professor Amnon Shashua\u003C/a> \u003Cspan style=\"background-color: white;\">presented a new driving automation taxonomy, using three axes to characterize degrees of autonomous capability: eyes on or eyes off, hands on or hands off, driver or no driver.\u003C/span> He gave insight into our most advanced technologies and solutions, outlined our business strategy, discussed the path of validating an eyes-off system, and much more.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/UCBlR4QFQCA\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Ch3>\u003Cstrong>On the Show Floor at CES\u003C/strong>\u003C/h3>\n\u003Cp>At CES 2023, Mobileye greeted tens of thousands of visitors at our booth with an array of informative and engaging demos. 
At its center was an immersive theater experience highlighting \u003Ca href=\"https://www.mobileye.com/solutions/\" target=\"_blank\" rel=\"noopener\">our most advanced solutions\u003C/a>, from \u003Ca href=\"https://www.mobileye.com/blog/cloud-enhanced-driver-assist/\" target=\"_blank\" rel=\"noopener\">Cloud-Enhanced Driver-Assist&trade;\u003C/a> through \u003Cspan style=\"color: #080606;\">Mobileye SuperVision&trade; to Mobileye Chauffeur&trade; and Mobileye Drive&trade;\u003C/span>.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/videoseries?list=PLWCfS_Yhbvs7b4VB3lem-V0z5W57QALP0\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>Two vehicles graced our show stand: the \u003Ca href=\"https://www.mobileye.com/blog/mobileye-supervision-zeekr-ota-update/\" target=\"_blank\" rel=\"noopener\">Zeekr 001\u003C/a> and our \u003Ca href=\"https://www.mobileye.com/blog/robotaxi-night-drive-jerusalem-unedited-video/\" target=\"_blank\" rel=\"noopener\">NIO ES8 robotaxi\u003C/a>. Both were flanked by sliding &ldquo;x-ray&rdquo; screens to give visitors a look inside the Mobileye systems that deliver their respective advanced capabilities. 
We also shined the spotlight on our \u003Ca href=\"https://www.mobileye.com/blog/radar-lidar-next-generation-active-sensors/\" target=\"_blank\" rel=\"noopener\">imaging radar and FMCW lidar\u003C/a>, demonstrated our full product spectrum, and showcased the \u003Ca href=\"https://www.mobileye.com/technology/eyeq-chip/\" target=\"_blank\" rel=\"noopener\">EyeQ&trade; chips\u003C/a> that power everything we do.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/videoseries?list=PLWCfS_Yhbvs4WPIkTLuDmLBQyBPFJh_ia\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Ch3>\u003Cstrong>Beyond the Convention Center\u003C/strong>\u003C/h3>\n\u003Cp>During the show we also announced a \u003Ca href=\"https://www.mobileye.com/news/mobileye-and-wnc-collaborate-on-imaging-radar-production/\" target=\"_blank\" rel=\"noopener\">new partnership for radar production\u003C/a>, revealed that we have secured an \u003Ca href=\"https://www.mobileye.com/news/mobileye-kicks-off-av-pilot-in-germany/\" target=\"_blank\" rel=\"noopener\">AV testing permit\u003C/a> in Germany, detailed our \u003Ca href=\"https://www.mobileye.com/news/mobileye-growth-pipeline-fueled-with-supervision-and-future-av-wins/\" target=\"_blank\" rel=\"noopener\">business pipeline\u003C/a>, and participated in the introduction of the \u003Ca style=\"color: #0070c0;\" href=\"https://www.mobileye.com/news/holon-mover-ces-mobileye-drive/\" target=\"_blank\" rel=\"noopener\">HOLON mover\u003C/a> (featuring Mobileye Drive).\u003C/p>\n\u003Cp>Three of our \u003Ca href=\"https://www.mobileye.com/about/management/\" target=\"_blank\" rel=\"noopener\">senior executives\u003C/a> also provided additional insight in the run-up to the show into the unique ways in which Mobileye is driving the evolution from assisted to autonomous mobility.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe 
class=\"ql-video\" src=\"https://www.youtube.com/embed/videoseries?list=PLWCfS_Yhbvs7kgUMj7mFSEwoJfKH2TNBd\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>For Mobileye, CES 2023 was an exciting and action-packed show, and we&rsquo;re energized to reach new heights &ndash; Now, in what comes Next, and in what lies Beyond.\u003C/p>","2023-01-25T08:00:00.000Z",{"id":1186,"type":24,"url":1187,"title":1188,"description":1189,"primary_tag":40,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1189,"image":1190,"img_alt":1191,"content":1192,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1193,"tags":1194},194,"holon-mover-ces-mobileye-drive","HOLON Mover Featuring Mobileye Drive™ at CES","The latest application for our turnkey self-driving system takes the futuristic form of an urban shuttle designed for autonomous Mobility-as-a-Service.\n\n","https://static.mobileye.com/website/us/corporate/images/ca7685f335a68afb05bb2ab5d1ebaf0a_1672941596222.jpg","The HOLON mover incorporates Mobileye Drive, our turnkey self-driving system, to deliver fully autonomous urban transportation. (Credit: HOLON)","\u003Cp>What will autonomous mobility look like? What form will self-driving vehicles take? 
There&rsquo;s no single answer &ndash; but here&rsquo;s one rather enticing proposition: \u003Ca href=\"https://www.benteler.com/en/press-media/news-and-press-releases/detail/The%20autonomous%20mover%20for%20everyone:%20World%20premiere%20of%20HOLON%20vehicle%20at%20CES%202023/\" target=\"_blank\" rel=\"noopener noreferrer\">the new HOLON mover\u003C/a>, unveiled last week at CES 2023 in Las Vegas.\u003C/p>\n\u003Cp>The latest implementation of \u003Ca href=\"https://www.mobileye.com/blog/mobileye-drive-self-driving-system/\" target=\"_blank\" rel=\"noopener noreferrer\">Mobileye Drive&trade;\u003C/a>, the HOLON mover is a shuttle bus designed for urban transportation. It&rsquo;s all-electric, accessible, and fully autonomous, thanks to technology from Mobileye.\u003C/p>\n\u003Ch3>\u003Cstrong>Meet the HOLON Mover\u003C/strong>\u003C/h3>\n\u003Cp>HOLON is a new mobility brand from \u003Ca href=\"https://www.benteler.com/en/press-media/news-and-press-releases/detail/BENTELER%20establishes%20HOLON,%20a%20new%20brand%20for%20the%20autonomous%20mobility%20of%20the%20future/\" target=\"_blank\" rel=\"noopener noreferrer\">BENTELER\u003C/a>, a leading manufacturer and supplier of metal components to the automobile industry. The new company&rsquo;s first product, the HOLON mover is an autonomous shuttle bus for use in a variety of urban transportation applications &ndash; including ride pooling, ride hailing, and scheduled service.\u003C/p>\n\u003Cp>The HOLON mover has room for up to 15 passengers on board. To optimize inclusivity, it features electric double-wing doors, photo-electric sensors, an accessibility ramp, automatic securing of wheelchairs, information presented in braille, and an audiovisual guide. It can travel autonomously around town at up to 60 km/h (37 mph), with an all-electric range estimated at 290 kilometers (180 miles) between charges.\u003C/p>\n\u003Cp>To make all that happen, HOLON brought together an impressive array of partners. 
BENTELER&rsquo;s extensive experience in automotive component manufacturing means the vehicle will be built to automotive standards of safety and quality. The renowned Italian design house Pininfarina penned its stylish form. Cognizant Mobility furnishes the software. And Mobileye provides the autonomous driving system.\u003C/p>\n\u003Cp>[**]gallery:holon-mover[**]\u003C/p>\n\u003Ch3>\u003Cstrong>Driven by Mobileye&trade;\u003C/strong>\u003C/h3>\n\u003Cp>The HOLON mover is the latest in a string of implementations for Mobileye Drive &ndash; our comprehensive \u003Ca href=\"https://www.mobileye.com/solutions/\" target=\"_blank\" rel=\"noopener noreferrer\">self-driving solution\u003C/a> for commercial vehicles. Mobileye Drive incorporates multiple types of sensors, with its camera subsystem operating independently of its radar/lidar subsystem to deliver \u003Ca href=\"https://www.mobileye.com/technology/true-redundancy/\" target=\"_blank\" rel=\"noopener noreferrer\">True Redundancy&trade;\u003C/a>. It also utilizes Mobileye Roadbook&trade;, our AV map generated and continually updated by our \u003Ca href=\"https://www.mobileye.com/technology/rem/\" target=\"_blank\" rel=\"noopener noreferrer\">Road Experience Management&trade; (REM&trade;)\u003C/a> crowdsourced mapping technology. It operates under our lean driving policy based on RSS&trade;: the \u003Ca href=\"https://www.mobileye.com/technology/responsibility-sensitive-safety/\" target=\"_blank\" rel=\"noopener noreferrer\">Responsibility-Sensitive Safety&trade;\u003C/a> model. 
And the whole system runs on our \u003Ca href=\"https://www.mobileye.com/technology/eyeq-chip/\" target=\"_blank\" rel=\"noopener noreferrer\">EyeQ&trade; System-on-Chip\u003C/a>.\u003C/p>\n\u003Cp>&ldquo;The market needs vehicles that seamlessly work in any traffic environment &ndash; and that is exactly what HOLON and Mobileye plan to deliver,&rdquo; Johann Jungwirth, Mobileye&rsquo;s Senior Vice President of Autonomous Vehicles, said at CES. &ldquo;I am confident that the HOLON mover will be a huge success. There are communities around the world that are searching for a mobility solution just like it.&rdquo;\u003C/p>\n\u003Cp>The HOLON mover joins a list of applications for Mobileye Drive previously unveiled, including \u003Ca href=\"https://www.mobileye.com/blog/self-driving-robotaxi-sixt-germany-iaa/\" target=\"_blank\" rel=\"noopener noreferrer\">our own Robotaxi\u003C/a>, the \u003Ca href=\"https://www.mobileye.com/blog/udelv-transporter-autonomous-delivery-vehicles-powered-by-mobileye/\" target=\"_blank\" rel=\"noopener noreferrer\">Udelv Transporter\u003C/a>, the \u003Ca href=\"https://www.mobileye.com/blog/mobileye-transdev-lohr-maas-i-cristal-shuttles-robotaxis/\" target=\"_blank\" rel=\"noopener noreferrer\">Lohr i-Cristal\u003C/a>, and a self-driving shuttle platform from \u003Ca href=\"https://www.schaeffler.com/en/media/press-releases/press-releases-detail.jsp?id=87723393\" target=\"_blank\" rel=\"noopener noreferrer\">Schaeffler\u003C/a>.\u003C/p>\n\u003Cp>[**]gallery:holon-mover-at-ces-2023[**]\u003C/p>\n\u003Ch3>\u003Cstrong>When will these self-driving vehicles be available? \u003C/strong>\u003C/h3>\n\u003Cp>HOLON aims to begin producing the mover in the United States towards the end of 2025, and plans to expand production to sites in Europe and Asia in the following years. 
And it already has partnerships in place for \u003Ca href=\"https://www.mobileye.com/solutions/drive/\" target=\"_blank\" rel=\"noopener\">self-driving Mobility-as-a-Service\u003C/a> with Beep in the U.S. and with Hamburger Hochbahn in Germany.\u003C/p>\n\u003Cp>We look forward to seeing the HOLON mover hit the road in the near future to reach another milestone in our drive \u003Ca href=\"https://www.mobileye.com/blog/autonomous-vehicle-technology-everywhere-in-every-way-for-everyone/\" target=\"_blank\" rel=\"noopener noreferrer\">to bring the benefits of autonomous mobility to everyone, everywhere, in every way\u003C/a>.\u003C/p>","2023-01-12T08:00:00.000Z","Autonomous Driving, Driverless MaaS, News, Events",{"id":1196,"type":5,"url":1197,"title":1198,"description":1199,"primary_tag":32,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1199,"image":1200,"img_alt":1201,"content":1202,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":1203,"tags":97},225,"now-next-beyond-prof-amnon-shashua-at-ces-2023","Now, Next, Beyond: Prof. Amnon Shashua at CES 2023","At CES this year, Prof. Amnon Shashua, President and CEO of Mobileye, presented our roadmap and progress towards fully autonomous vehicles.","https://static.mobileye.com/website/us/corporate/images/5a27679c9f345c771979c316d73ec7d5_1692801152226.png","Mobileye: Now, Next, Beyond - CES 2023 Press Conference with Prof. 
Amnon Shashua","\u003Cp>The address delivered each year at CES by our CEO stands among the most eagerly anticipated events in automotive tech, and this year more viewers signed up and tuned in than ever before.&nbsp;Professor Amnon Shashua&nbsp;presented a new driving automation taxonomy, using three axes to characterize degrees of autonomous capability: eyes on or eyes off, hands on or hands off, driver or no driver.&nbsp;He gave insight into our most advanced technologies and solutions, outlined our business strategy, discussed the path of validating an eyes-off system, and much more.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe title=\"YouTube video player\" src=\"https://www.youtube.com/embed/UCBlR4QFQCA\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>","2023-01-06T08:00:00.000Z",{"id":1205,"type":24,"url":1206,"title":1207,"description":1208,"primary_tag":28,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1208,"image":1209,"img_alt":1210,"content":1211,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1212,"tags":1213},193,"mobileye-growth-pipeline-fueled-with-supervision-and-future-av-wins","SuperVision and Future AV Wins Fuel Growth","Mobileye booked a record stream of design wins for our ADAS technologies in 2022 which are expected to yield a $17 billion revenue pipeline through 2030.","https://static.mobileye.com/website/us/corporate/images/697750f14adf441a4f4844d92237a124_1672880637316.jpg","The Zeekr 001 electric vehicle. Owners in China will soon get the latest over-the-air update of Mobileye’s SuperVision tech. ","\u003Cp>LAS VEGAS, January 5, 2023 — Today at CES 2023, Mobileye founder and CEO Prof. 
Amnon Shashua will illustrate the company’s global leadership in autonomous vehicle and advanced driver-assistance technology, providing new details on the company’s business for 2023 and beyond. In particular, Shashua will lay out how future consumer AVs will come to market at scale by harnessing Mobileye’s state-of-the-art driver assist system, Mobileye SuperVision™, as the baseline for higher levels of autonomy.&nbsp;&nbsp;&nbsp;\u003C/p>\u003Cp>Mobileye now sees a revenue pipeline\u003Cspan style=\"color: rgb(64, 64, 64);\">[1]\u003C/span> of ADAS business through 2030 of greater than $17 billion – including $3.5 billion of projected revenue from SuperVision alone, a product that was only launched in the fourth quarter of 2021. Thanks to other new products like EyeQ6™ system-on-chip, the pipeline grew impressively over the course of 2022 as Mobileye added $6.7 billion projected revenue in ADAS, across projected future volume of 63.6 million systems.\u003C/p>\u003Cp>In addition to ADAS, Mobileye announced an additional $3.5 billion in expected revenue from Autonomous Mobility-as-a-Service products through 2028, based on deals with three major partners, including a recently secured mobility-as-a-service AV program with a leading EU commercial vehicle builder. Mobileye also has a line of sight for $1.5 billion in revenue from a single consumer AV (Mobileye Chauffeur™) program through 2030.\u003C/p>\u003Cp>“In the short time since we went public in October, Mobileye’s business has accelerated substantially,” said Shashua. 
“We’ve seen strong positive response from our existing customers as well as new automakers who believe in our vision for building cloud-connected, AI-driven driver assist and autonomous technology that can scale globally and provide meaningful benefits to millions of drivers worldwide.”\u003C/p>\u003Ch2>\u003Cstrong>SuperVision leads the market\u003C/strong>\u003C/h2>\u003Cp>Mobileye’s SuperVision “eyes-on, hands-off” system is seeing strong customer demand in China, where more than 70,000 Zeekr 001 EV owners will soon get an additional over-the-air update that unlocks key mapping-based features. SuperVision will also be included in the upcoming Zeekr 009, along with near-term global launches on models from three other brands under the Geely Group umbrella.\u003C/p>\u003Cp>By combining a camera-only sensing system with Mobileye’s key mapping and decision-making technology, SuperVision gives automakers an affordable, flexible platform for eyes-on, hands-off driving across a range of operational design domains.\u003C/p>\u003Cp>The system’s success and speed-to-market in China’s highly competitive automotive landscape have driven new business wins for SuperVision-based systems around the world. Mobileye has kicked off development work with a premium European automaker for programs targeting delivery in 2025, with other customers at advanced stages of development. Overall, Mobileye now expects volume of SuperVision-based vehicles to reach about 1.2 million units in 2026.\u003C/p>\u003Cp>Just as importantly, OEMs are showing strong desire to leverage investments in SuperVision as a bridge to enable eyes-off autonomous functions – comparable to SAE Level 3 and Level 4 – across a variety of operational design domains. 
This can be done by simply adding additional sensing suites and computing power in a modular way to create a high-value, cost-efficient, eyes-off product for consumer-owned vehicles in the medium term.\u003C/p>\u003Ch2>\u003Cstrong>Continued progress on Mobility-as-a-Service technology and validation\u003C/strong>\u003C/h2>\u003Cp>Beyond SuperVision and consumer-owned AVs, Mobileye has continued to develop its Mobileye Drive mobility-as-a-service (MaaS) autonomous vehicle technology in 2022. We recently signed an MOU for several thousand units with a major global producer of light commercial vehicles. In 2023, Mobileye will continue testing its AV tech, with pilots of our latest vehicle technology hitting the road in Germany.\u003C/p>\u003Cp>While sentiment around AVs has swung widely over the past year, Mobileye has remained focused on delivering scalable, modular AV technology. Ensuring public and regulatory trust in autonomous vehicles before they hit the road will require robust, transparent validation. Today, Shashua will detail for the first time Mobileye’s three-layer validation approach to AV technology that leverages our unique assets (such as Road Experience Mapping data and True Redundancy™). This combination of real-world testing, simulation and hardware-in-the-loop validation allows Mobileye to marshal its massive road-test data towards solving AVs at scale.\u003C/p>\u003Cp>Our MaaS deployment partners also continue to make progress. 
At CES 2023, our AV collaborator Holon revealed a new autonomous people mover powered by Mobileye Drive, one of many such collaborations we have worldwide with companies exploring different AV business models.&nbsp;\u003C/p>\u003Cp>By approaching AV tech in the same way we popularized ADAS technology – focusing on solutions that can be built affordably, work globally and adapt flexibly to different types of vehicles – Mobileye sees a clear and financially sustainable path to developing both consumer-owned and fleet-deployed AVs.\u003C/p>\u003Ch2>\u003Cstrong>Additional expansion in ADAS\u003C/strong>\u003C/h2>\u003Cp>All of these advancements build from our ongoing innovation in the ADAS market, where we continue to see strong demand and growth. Last year alone, some 233 models launched globally with Mobileye technology inside.\u003C/p>\u003Cp>Our innovation roadmap for the future includes key new technologies like imaging radar, a sensor that can provide the benefits of lidar at a fraction of the cost. After demonstrating the potential of imaging radar last year, Mobileye has begun collaborating with Wistron NeWeb, an experienced automotive radar supplier, to bring the technology to production two years from now. And our other advanced solutions, from cloud-enhanced services to intelligent speed assist, continue to make strong inroads among key global automakers.\u003C/p>\u003Cp>As a public company once again, Mobileye can point to its history as a pioneer in ADAS to show how we’ve changed the world over the past 20 years, and how we will bring innovation to life at a global scale in the years ahead.\u003C/p>\u003Cp>[1] Mobileye’s revenue for the periods presented represents estimated volumes based on projections of future production volumes that were provided by our current and prospective OEMs at the time of sourcing the design wins for the models related to those design wins. 
See the disclaimer under the heading “Forward–Looking Statements” below for important limitations applicable to these estimates.\u003C/p>\u003Cp>\u003Cbr>\u003C/p>\u003Cp>&nbsp;\u003Cstrong>Media Contact: \u003C/strong>\u003C/p>\u003Cp>Justin Hyde\u003C/p>\u003Cp>\u003Ca href=\"mailto:Justin.Hyde@Mobileye.com\" rel=\"noopener noreferrer\" target=\"_blank\">Justin.Hyde@Mobileye.com\u003C/a>\u003C/p>\u003Cp>+1 202-656-6749&nbsp;\u003C/p>\u003Cp>___________________________________\u003C/p>\u003Cp>Mobileye (Nasdaq: MBLY) leads the mobility revolution with its autonomous driving and driver-assistance technologies, harnessing world-renowned expertise in computer vision, artificial intelligence, mapping, and data analysis. Since its founding in 1999, Mobileye has pioneered such groundbreaking technologies as REM™ crowdsourced mapping, True Redundancy™ sensing, and Responsibility Sensitive Safety (RSS). These technologies are driving the ADAS and AV fields towards the future of mobility – enabling self-driving vehicles and mobility solutions, powering industry-leading advanced driver-assistance systems and delivering valuable intelligence to optimize mobility infrastructure. To date, more than 125 million vehicles worldwide have been built with Mobileye technology inside. In 2022 Mobileye listed as an independent company separate from Intel (Nasdaq: INTC), which retains majority ownership. For more information, visit \u003Ca href=\"https://www.mobileye.com/\" rel=\"noopener noreferrer\" target=\"_blank\">https://www.mobileye.com\u003C/a>.\u003C/p>\u003Cp>&nbsp;“Mobileye,” the Mobileye logo and Mobileye product names are registered trademarks of Mobileye in various jurisdictions. 
All other marks are the property of their respective owners.\u003C/p>\u003Cp>\u003Cstrong style=\"font-family: intelone-display-regular, Inter, sans-serif; color: rgb(64, 64, 64);\">Forward-Looking Statements \u003C/strong>\u003C/p>\u003Cp>\u003Cspan style=\"color: rgb(64, 64, 64);\">Statements in this press release and the presentation referenced herein that are not statements of historical fact, including statements about our beliefs and expectations, are forward-looking statements and should be evaluated as such. Forward-looking statements include descriptions of our business plan and strategies. These statements often include words such as “anticipate,” “expect,” “suggests,” “plan,” “believe,” “intend,” “estimates,” “targets,” “projects,” “should,” “could,” “would,” “may,” “will,” “forecast,” or the negative of these terms, and other similar expressions, although not all forward-looking statements contain these words. We base these forward-looking statements or projections on our current expectations, plans and assumptions that we have made in light of our experience in the industry, as well as our perceptions of historical trends, current conditions, expected future developments and other factors we believe are appropriate under the circumstances and at such time. You should understand that these statements are not guarantees of performance or results. The forward-looking statements and projections are subject to and involve risks, uncertainties and assumptions and you should not place undue reliance on these forward-looking statements or projections. Although we believe that these forward-looking statements and projections are based on reasonable assumptions at the time they are made, you should be aware that many factors could cause actual results to differ materially from those expressed in the forward-looking statements and projections. 
Important factors that may materially affect such forward-looking statements and projections include the following: future business, social and environmental performance, goals and measures; our anticipated growth prospects and trends in markets and industries relevant to our business; business and investment plans; expectations about our ability to maintain or enhance our leadership position in the markets in which we participate; future consumer demand and behavior; future products and technology, and the expected availability and benefits of such products and technology; development of regulatory frameworks for current and future technology; ​projected cost and pricing trends; ​future production capacity and product supply; potential future benefits and competitive advantages associated with our technologies and architecture and the data we have accumulated; the future purchase, use and availability of products, components and services supplied by third parties, including third-party IP and manufacturing services; uncertain events or assumptions, including statements relating to our addressable markets, estimated vehicle production and market opportunity, potential production volumes associated with design wins and other characterizations of future events or circumstances; future responses to and effects of the COVID-19 pandemic; availability, uses, sufficiency and cost of capital and capital resources, including expected returns to stockholders such as dividends, and the expected timing of future dividends; tax- and accounting-related expectations. Detailed information regarding these and other factors that could affect Mobileye’s business and results is included in Mobileye’s&nbsp;SEC&nbsp;filings, including the company’s Registration Statement (No. 333-267685) on Form S-1, particularly in the section entitled the “Risk Factors”. 
Copies of these filings may be obtained by visiting our Investor Relations website at&nbsp;ir.mobileye.com&nbsp;or the SEC’s website at&nbsp;www.sec.gov. \u003C/span>\u003C/p>\u003Cp>\u003Cspan style=\"color: rgb(64, 64, 64);\">The estimates included herein are based on projections of future production volumes that were provided by our current and prospective OEMs at the time of sourcing the design wins for the models related to those design wins. For the purpose of these estimates we estimated sale prices based on our management’s estimates for the applicable product bundles and periods. Achieving design wins is not a guarantee of revenue, and our sales may not correlate with the achievement of additional design wins. Moreover, our pricing estimates are made at the time of a request for quotation by an OEM (in the case of estimates related to contracted customers), so that worsening market or other conditions between the time of a request for quotation and an order for our solutions may require us to sell our solutions for a lower price than we initially expected. These estimates may deviate from actual production volumes and sale prices (which may be higher or lower than the estimates) and the amounts included for prospective but uncontracted production volumes may never be achieved.&nbsp;Accordingly, these estimations are subject to and involve risks, uncertainties and assumptions and you should not place undue reliance on these forward-looking statements or projections. 
\u003C/span>\u003C/p>\u003Cp>\u003Cspan style=\"color: rgb(64, 64, 64);\">Mobileye does not intend to provide any updates to information concerning its actual or anticipated future results of operations, including 2022 results or guidance for fiscal year 2023, in this press release or the presentation referenced herein, and investors should not infer from any statement made in this release or the presentation referenced herein any implications relating to Mobileye’s results of operations or guidance for such periods.&nbsp;The&nbsp;estimates presented are just estimates and are not based on contracted orders.&nbsp;Mobileye’s actual revenue for the periods presented is likely to vary materially from the estimates.&nbsp;\u003C/span>\u003C/p>","2023-01-05T00:00:00.000Z","Press Kit, Industry, News, Autonomous Driving, ADAS, Financial",{"id":1215,"type":24,"url":1216,"title":1217,"description":1218,"primary_tag":28,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1218,"image":1219,"img_alt":1220,"content":1221,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1222,"tags":1223},190,"mobileye-and-wnc-collaborate-on-imaging-radar-production","Mobileye & WNC collaborate on imaging radar","Mobileye is developing automotive-grade, software-defined, four-dimensional digital imaging radars in-house that are targeted for production in two years.","https://static.mobileye.com/website/us/corporate/images/989f7126b675ec5849735fffea8dd911_1672779722464.png","Mobileye’s imaging radar supports other key AV vision sensors, and can detect objects, vehicles and pedestrians at distances of up to 1,000 feet. Illustration: Mobileye. 
","\u003Cp>\u003Cspan style=\"color: rgb(64, 64, 64); background-color: rgb(255, 255, 255);\">LAS VEGAS, January 4, 2023 — Over the past few years, Mobileye has been developing a new technology to help autonomous vehicles sense and understand their environment – regardless of weather, lighting or road types – in addition to the company’s renowned camera-based perception systems. Known as software-defined imaging radar, or 4D radar, the technology will play a key role in bringing autonomous vehicles and the most advanced forms of driver-assistance technology to life.&nbsp;\u003C/span>\u003C/p>\u003Cp>\u003Cspan style=\"color: rgb(64, 64, 64);\">Today, Mobileye announced a collaboration with Wistron NeWeb Corp. (WNC) for production of its software-defined imaging radars. WNC, based in Taiwan, works as a major electronics and radar supplier for automakers worldwide. This collaboration&nbsp;is expected to allow Mobileye and WNC to begin producing automotive-grade imaging radars two years from now, with strong initial interest in the technology from key automaker customers.&nbsp;\u003C/span>\u003C/p>\u003Cp>\u003Cspan style=\"color: rgb(64, 64, 64);\">“The imaging radars we have been developing over the past few years are uniquely designed to be an essential enabler of high autonomy levels in future vehicles, by delivering rich and reliable radar output, upgrading perception-by-radar capabilities, and reducing the need for multiple lidar sensors,” said Yaniv Avital, Mobileye’s Radar Vice President and General Manager. “WNC’s experience and accomplishments as an automotive supplier can help us bring this much-needed innovation to the market by our original targeted timeline and at the expected quality.”&nbsp;&nbsp;\u003C/span>\u003C/p>\u003Cp>\u003Cspan style=\"color: rgb(64, 64, 64);\">The imaging radar developed by Mobileye goes far beyond the simple devices on vehicles today. 
Radars emit radio frequency signals to detect obstacles, and just like cameras, the more data they can process, the more details they can spot. When paired with advanced cameras, radars can provide sensing at longer distances and in certain weather or lighting conditions that can even challenge camera-based imaging.&nbsp;\u003C/span>\u003C/p>\u003Cp>\u003Cspan style=\"color: rgb(64, 64, 64);\">\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/39f65e4a35960603c77ed0cb7babb57c_1672783775834.png\" alt=\"A visualization of Mobileye’s imaging radar at left; a comparison of what camera sensors see on top right versus radar data on the bottom right. The different colors in the radar image represent speed.\">Mobileye’s imaging radars use advanced radar architecture including Massive MIMO (multiple-input, multiple-output) antenna design, a high-end radio frequency design developed in-house, and high-fidelity sampling – all enabling accurate object detection and wider dynamic range. Thanks to an integrated system-on-chip design that maximizes processor efficiency, and world-leading algorithms for interpreting radar data, Mobileye’s imaging radars deliver a detailed, four-dimensional image of surroundings up to 1,000 feet away and beyond. With a 140-degree field-of-view at medium range and 170-degree field of view in close range, the radar enables more accurate detection of pedestrians, vehicles or obstructions that other sensors might miss – even on crowded urban streets.&nbsp;&nbsp;\u003C/span>\u003C/p>\u003Cp>\u003Cspan style=\"color: rgb(64, 64, 64);\">“The new imaging radar technology is a key focus for future high-level autonomous driving,” said Repus Hsiung, Vice President &amp; General Manager of the Automotive &amp; Industrial Solutions BG at WNC. \"We are delighted to collaborate with Mobileye to accelerate the availability of advanced imaging radars in the market. 
Leveraging our expertise in automotive electronics and radar solutions, we look forward to working with Mobileye to further develop exciting new capabilities.”&nbsp;\u003C/span>\u003C/p>\u003Cp>\u003Cspan style=\"color: rgb(64, 64, 64);\">&nbsp;Mobileye’s True Redundancy™ approach to autonomous vehicles envisions using imaging radars to create a 360-degree sensing system that operates in addition to, but independently from, a camera-based system. By having multiple systems that are each capable of navigating a vehicle alone, the “eyes off” autonomous system can deliver reliable rides with low chance of failure and simplified safety validation. Imaging radar can also play a role in more advanced hands-free ADAS solutions as an alternative to LiDAR solutions, which are typically far more expensive.&nbsp;&nbsp;\u003C/span>\u003C/p>\u003Cp>\u003Cspan style=\"color: rgb(64, 64, 64);\">___________________________________&nbsp;\u003C/span>\u003C/p>\u003Cp>\u003Cspan style=\"color: rgb(64, 64, 64);\">Mobileye (Nasdaq: MBLY) leads the mobility revolution with its autonomous driving and driver-assistance technologies, harnessing world-renowned expertise in computer vision, artificial intelligence, mapping, and data analysis. Since its founding in 1999, Mobileye has pioneered such groundbreaking technologies as REM™ crowdsourced mapping, True Redundancy™ sensing, and Responsibility Sensitive Safety (RSS). These technologies are driving the ADAS and AV fields towards the future of mobility – enabling self-driving vehicles and mobility solutions, powering industry-leading advanced driver-assistance systems and delivering valuable intelligence to optimize mobility infrastructure. To date, more than 125 million vehicles worldwide have been built with Mobileye technology inside. In 2022 Mobileye listed as an independent company separate from Intel (Nasdaq: INTC), which retains majority ownership. 
For more information, visit \u003C/span>\u003Ca href=\"https://www.mobileye.com/\" rel=\"noopener noreferrer\" target=\"_blank\" style=\"color: rgb(5, 99, 193);\">https://www.mobileye.com\u003C/a>\u003Cspan style=\"color: rgb(64, 64, 64);\">.&nbsp;\u003C/span>\u003C/p>\u003Cp>\u003Cspan style=\"color: rgb(64, 64, 64);\">&nbsp;“Mobileye,” the Mobileye logo and Mobileye product names are registered trademarks of Mobileye in various jurisdictions. All other marks are the property of their respective owners.&nbsp;\u003C/span>\u003C/p>\u003Cp>\u003Cspan style=\"color: rgb(64, 64, 64);\">&nbsp;\u003C/span>\u003C/p>","2023-01-04T00:00:00.000Z","Press Kit, News, Autonomous Driving, AV Safety",{"id":1225,"type":24,"url":1226,"title":1227,"description":1228,"primary_tag":190,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1228,"image":1229,"img_alt":1230,"content":1231,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1222,"tags":1232},191,"mobileye-kicks-off-av-pilot-in-germany","Mobileye Kicks Off AV Pilot in Germany","In a key step, Mobileye has successfully completed the AV-Permit process outlined by TÜV SÜD to operate NIO ES8 vehicles with AV technology on German streets.","https://static.mobileye.com/website/us/corporate/images/1da7649597f1a18fe101abbb7eeb4f87_1673551896394.jpg","NIO ES8 equipped with cameras, radar and lidar sensors, of the type to be used in an autonomous vehicle test in Germany.","\u003Cp>With the start of the year 2023, Mobileye has taken a key step on the road to the autonomous future of mobility: Mobileye obtained a permit recommendation from TÜV SÜD, an independent third-party for testing, certification, auditing and advisory service in Germany, enabling Mobileye to operate its AV technology on German streets. 
The permit paves the way for Mobileye to expand the pilot phase in Germany and operate \u003Ca href=\"https://www.mobileye.com/blog/self-driving-robotaxi-sixt-germany-iaa/\" rel=\"noopener noreferrer\" target=\"_blank\">Mobileye Drive™-equipped NIO ES8s\u003C/a> with a responsible safety driver on all roads in Germany.\u003C/p>\u003Ch3>\u003Cstrong>Kick-Off for Mobility-as-a-Service (MaaS) Projects in Munich and Darmstadt\u003C/strong>\u003C/h3>\u003Cp>With the official recommendation, Mobileye is taking the next step in realizing new mobility concepts in Germany and beyond. \u003Ca href=\"https://www.nio.com/news/nio-inc-announces-strategic-collaboration-mobileye-bring-level-4-autonomous-driving-vehicles\" rel=\"noopener noreferrer\" target=\"_blank\">NIO’s ES8\u003C/a> was chosen by Mobileye in 2021 as the vehicle platform for Mobility-as-a-Service (MaaS) offerings underway with various partners in Munich and Darmstadt, as well as in other projects around Europe. NIO ES8s equipped with Mobileye’s self-driving hardware and software are planned to be used in a robotaxi service as well as in the integration of on-demand shuttles into local public transport in Germany. Following the regulations adopted by the European Union and German authorities for safe autonomous driving (AV) testing and deployment in 2022, the pilot stage for these services on German roads will accelerate throughout 2023. 
A safety driver will be behind the wheel until all needed approvals and permits are obtained for the vehicle to be entirely unmanned.\u003C/p>\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/8c9ad648354eb06543761a4ea6d73f83_1673551976724.jpg\" alt=\"Mobileye will test its autonomous vehicles on German roadways, like this Autobahn through Munich, where TÜV SÜD is based.\">\u003C/p>\u003Cp>“We are excited to have taken the next steps in bringing our self-driving technology onto German streets,” says \u003Ca href=\"https://www.mobileye.com/blog/robotaxi-night-drive-jerusalem-unedited-video/\" rel=\"noopener noreferrer\" target=\"_blank\">Johann Jungwirth\u003C/a>, Senior Vice President, Autonomous Vehicles at Mobileye. “We are thankful for TÜV Süd’s trusted support in expanding our autonomous vehicle technology testing in Germany. This allows us to show our capabilities to consumers, automakers and transportation agencies.”\u003C/p>\u003Ch3>\u003Cstrong>Mobileye Technology Onboard\u003C/strong>\u003C/h3>\u003Cp>The NIO ES8 is retrofitted with a broad range of sensors and Mobileye’s own autonomous self-driving system Mobileye Drive™ for a defined operational design domain (“ODD”)\u003Cspan style=\"color: rgb(64, 64, 64);\">. A set of 13 cameras, plus an independent secondary perception system consisting of six surround radars and three long-range and six short-range surround lidars gives the self-driving vehicle redundant sensing capabilities (we call this approach \u003C/span>\u003Ca href=\"https://www.mobileye.com/technology/true-redundancy/\" rel=\"noopener noreferrer\" target=\"_blank\" style=\"color: rgb(64, 64, 64);\">True Redundancy™\u003C/a>\u003Cspan style=\"color: rgb(64, 64, 64);\">). 
The \u003C/span>\u003Ca href=\"https://www.mobileye.com/technology/eyeq-chip/\" rel=\"noopener noreferrer\" target=\"_blank\" style=\"color: rgb(64, 64, 64);\">EyeQ™ System-on-Chip \u003C/a>\u003Cspan style=\"color: rgb(64, 64, 64);\">(SoC) provides the necessary computing power to not only process the real-world data but also to make use of Mobileye’s \u003C/span>\u003Ca href=\"https://www.mobileye.com/technology/rem/\" rel=\"noopener noreferrer\" target=\"_blank\" style=\"color: rgb(64, 64, 64);\">Road Experience Management™ (REM™)\u003C/a>\u003Cspan style=\"color: rgb(64, 64, 64);\"> – AV maps – and lean driving policy with the mathematical model for automated driving we call \u003C/span>\u003Ca href=\"https://www.mobileye.com/technology/responsibility-sensitive-safety/\" rel=\"noopener noreferrer\" target=\"_blank\" style=\"color: rgb(64, 64, 64);\">Responsibility-Sensitive Safety™ (RSS™)\u003C/a>\u003Cspan style=\"color: rgb(64, 64, 64);\">. \u003C/span>\u003C/p>\u003Cp>\u003Cspan style=\"color: rgb(64, 64, 64);\">This system allows for scalability and different MaaS services like autonomous robotaxis, public-transit shuttles or last-mile goods delivery for defined ODD.\u003C/span>\u003C/p>\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/bcd5346c56d70331fe329e7beb6e2f0a_1673551997991.jpg\" alt=\"A fleet of NIO ES8 robotaxis powered by Mobileye Drive™ outside our headquarters in Jerusalem, Israel.\">\u003C/p>\u003Ch3>\u003Cstrong>Ensuring AV Standards with TÜV SÜD\u003C/strong>\u003C/h3>\u003Cp>\u003Cspan style=\"color: black; background-color: white;\">As in 2020, when Mobileye was one of the first companies outside of automakers to receive a permit to&nbsp;\u003C/span>\u003Ca href=\"https://www.mobileye.com/news/mobileye-testing-self-driving-vehicles-germany/\" rel=\"noopener noreferrer\" target=\"_blank\" style=\"color: rgb(81, 94, 255); background-color: rgb(255, 255, 255);\">test autonomous vehicles (AVs) on open roads in 
Germany\u003C/a>\u003Cspan style=\"color: black; background-color: white;\">, TÜV SÜD and Mobileye have worked closely together with the regulatory authorities to ensure safe operation of the self-driving vehicle powered by Mobileye Drive.\u003C/span>\u003C/p>\u003Cp>Mobileye provided comprehensive technical documentation and underwent various safety tests with the robotaxi in recent months to obtain the TÜV SÜD approval recommendation for operating self-driving vehicles on public roads in Germany. TÜV SÜD has developed a rigorous assessment framework and test procedure in recent years that works as a blueprint for providers of AV technologies while building consumer trust in the reliability and safety of the technology.\u003C/p>","Autonomous Driving, Driverless MaaS, AV Safety, News",{"id":1234,"type":69,"url":1235,"title":1236,"description":1237,"primary_tag":73,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1237,"image":1238,"img_alt":1239,"content":1240,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1241,"tags":1242},192,"mobileye-at-ces-2023","Mobileye at CES 2023 Complete Press Kit","All the information you're looking for about Mobileye at CES 2023 can be found here in our online press kit, including Prof. Amnon Shashua's press conference.","https://static.mobileye.com/website/us/corporate/images/dcbdebad96fd1149f00e9a93eee90eb9_1672863216143.jpg","Mobileye at CES 2023 ","\u003Cp>\u003Cspan style=\"color: #000000;\">At CES 2023, Mobileye showcased innovation and progress on the road to autonomy. 
Through a series of events and presentations, including the Mobileye: Now, Next, Beyond Press Conference with Professor Amnon Shashua; demonstrations at the Mobileye booth; and exciting partner initiatives, Mobileye brought to CES 2023 an insider&rsquo;s view into the autonomous vehicle revolution it is driving.\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong>Mobileye: Now, Next, Beyond - CES 2023 Press Conference with CEO Prof. Amnon Shashua (Replay)\u003C/strong>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/UCBlR4QFQCA\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Ch2>\u003Cstrong>Mobileye CES 2023 news:\u003C/strong>\u003C/h2>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/news/mobileye-growth-pipeline-fueled-with-supervision-and-future-av-wins/\" target=\"_blank\" rel=\"noopener noreferrer\">Mobileye Growth Pipeline Fueled with SuperVision&trade; and Future AV Wins\u003C/a>\u003C/p>\n\u003Cp>\u003Ca style=\"color: #000000;\" href=\"https://www.mobileye.com/news/mobileye-and-wnc-collaborate-on-imaging-radar-production/\" target=\"_blank\" rel=\"noopener noreferrer\">Mobileye and WNC collaborate on imaging radar production\u003C/a>\u003C/p>\n\u003Cp>\u003Ca style=\"color: #000000;\" href=\"https://www.mobileye.com/news/mobileye-kicks-off-av-pilot-in-germany/\" target=\"_blank\" rel=\"noopener\">Mobileye kicks off AV pilot in Germany | Mobileye Blog\u003C/a>\u003Cspan style=\"color: #000000;\"> \u003C/span>\u003C/p>\n\u003Cp>\u003Ca style=\"color: #000000;\" href=\"https://www.mobileye.com/news/mobileye-announces-ces-press-conference-with-prof-amnon-shashua/\" target=\"_blank\" rel=\"noopener noreferrer\">Mobileye Announces CES 2023 Press Conference with Prof. 
Amnon Shashua\u003C/a>\u003C/p>\n\u003Ch3>\u003Cstrong style=\"color: #000000;\">Mobileye \u003C/strong>\u003Cstrong style=\"color: #404040; background-color: white;\">SuperVision&trade;:\u003C/strong>\u003C/h3>\n\u003Cp>\u003Ca style=\"background-color: #ffffff; color: #242424;\" href=\"https://static.mobileye.com/website/common/files/SuperVision%20one%20pager.pdf\" target=\"_blank\" rel=\"noopener noreferrer\">SuperVision Backgrounder\u003C/a> \u003Cspan style=\"color: #242424; background-color: #ffffff;\">(download)\u003C/span>\u003C/p>\n\u003Cp>[**]gallery:mobileye-supervision[**]\u003C/p>\n\u003Ch3>\u003Cstrong>CES 2023 Photos:\u003C/strong>\u003C/h3>\n\u003Cp>[**]gallery:ces-2023[**]\u003C/p>\n\u003Ch3>\u003Cstrong style=\"color: #000000;\">Mobileye Visual Assets:\u003C/strong>\u003C/h3>\n\u003Cp>[**]gallery:mobileye-at-ces-2023[**]\u003C/p>\n\u003Cp>\u003Ca href=\"https://vimeo.com/763958794\" target=\"_blank\" rel=\"noopener noreferrer\">Mobileye: Autonomous Driving and Technology Development \u003C/a>(Broll)\u003C/p>","2023-01-03T08:00:00.000Z","Events, Amnon Shashua, Press Kit",{"id":1244,"type":24,"url":1245,"title":1246,"description":1247,"primary_tag":190,"author_name":10,"is_hidden":11,"lang":12,"meta_description":1247,"image":1248,"img_alt":1249,"content":1250,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1251,"tags":1252},188,"mobileye-announces-ces-press-conference-with-prof-amnon-shashua","Mobileye Announces CES 2023 Press Conference with Prof. Amnon Shashua  ","Mobileye: Now, Next, Beyond to be presented live from Las Vegas on January 5 at 11 a.m. PST ","https://static.mobileye.com/website/us/corporate/images/07bae5284d54540c19d53a26fece6615_1671703079240.png","Mobileye: Now, Next, Beyond -  CES 2023 Press Conference with Prof. 
Amnon Shashua","\u003Cp>\u003Cstrong style=\"color: #080606;\">JERUSALEM, December 21, 2022:\u003C/strong>\u003Cspan style=\"color: #080606;\"> Mobileye (Nasdaq: MBLY) will present Mobileye: Now, Next, Beyond; its CES 2023 Press Conference with CEO Professor Amnon Shashua on January 5, 2023 at 11 a.m. PST.&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #080606;\">Prof. Shashua will provide an&nbsp;update on the state of Mobileye&rsquo;s market leadership&nbsp;and how Mobileye will bring its near- and long-term vision to life. The presentation will illustrate the industry&rsquo;s most advanced ADAS-to-AV technology, detailing the spectrum of solutions Mobileye is pioneering to redefine the driving experience now and in the future.&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #080606;\">CES visitors are invited to learn more about how Mobileye is driving the autonomous vehicle evolution by visiting Mobileye in the Las Vegas Convention Center (LVCC) West Hall booth #4601, Thursday,&nbsp;Jan.&nbsp;5 through Sunday, Jan.&nbsp;8, 2023, during show hours.&nbsp;\u003C/span>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong style=\"color: #080606;\">Mobileye: Now, Next, Beyond\u003C/strong>\u003Cspan style=\"color: #080606;\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong style=\"color: #080606;\">Press Conference with Professor Amnon Shashua&nbsp;\u003C/strong>\u003Cspan style=\"color: #080606;\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong style=\"color: #080606;\">When:\u003C/strong>\u003Cspan style=\"color: #080606;\"> Thursday, January 5, 11 a.m. PST (2 p.m. EST | 7 p.m. GMT | 9 p.m. 
IST)&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong style=\"color: #080606;\">Where:\u003C/strong>\u003Cspan style=\"color: #080606;\"> LVCC West Hall | Room W327&nbsp;&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong style=\"color: #080606;\">How:\u003C/strong>\u003Cspan style=\"color: #080606;\"> Register today to reserve your seat: \u003C/span>\u003Ca style=\"color: #0563c1;\" href=\"https://www.mobileye.com/ces-2023/\" rel=\"noopener noreferrer\">\u003Cu>https://www.mobileye.com/ces-2023/\u003C/u>\u003C/a>\u003C/p>\n\u003Cp>\u003Cstrong style=\"color: #080606;\">Livestream:\u003C/strong>\u003Cspan style=\"color: #080606;\"> Watch live at: \u003C/span>\u003Ca style=\"color: #0563c1;\" href=\"https://www.mobileye.com/ces-2023/\" rel=\"noopener noreferrer\">\u003Cu>https://www.mobileye.com/ces-2023/\u003C/u>\u003C/a>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #080606;\">Add the livestream to your calendar by registering in advance.&nbsp;&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #080606;\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong style=\"color: #101820;\">Contacts\u003C/strong>\u003Cspan style=\"color: #101820;\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #101820;\">Dan Galves&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #101820;\">Investor Relations&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Ca style=\"color: #0563c1;\" href=\"mailto:investors@mobileye.com\" target=\"_blank\" rel=\"noopener noreferrer\">investors@mobileye.com\u003C/a>&nbsp;\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #101820;\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #101820;\">Justin Hyde&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #101820;\">Media Relations&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Ca style=\"color: #0563c1;\" 
href=\"mailto:justin.hyde@mobileye.com\" target=\"_blank\" rel=\"noopener noreferrer\">justin.hyde@mobileye.com\u003C/a>\u003Cspan style=\"color: #101820;\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #080606;\">&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #404040;\">___________________________________&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #404040;\">Mobileye (Nasdaq: MBLY) leads the mobility revolution with its autonomous driving and driver-assistance technologies, harnessing world-renowned expertise in computer vision, artificial intelligence, mapping, and data analysis. Since its founding in 1999, Mobileye has pioneered such groundbreaking technologies as REM&trade; crowdsourced mapping, True Redundancy&trade; sensing, and Responsibility Sensitive Safety (RSS). These technologies are driving the ADAS and AV fields towards the future of mobility &ndash; enabling self-driving vehicles and mobility solutions, powering industry-leading advanced driver-assistance systems and delivering valuable intelligence to optimize mobility infrastructure. To date, more than 125 million vehicles worldwide have been built with Mobileye technology inside. In 2022, Mobileye was listed as an independent company, separate from Intel (Nasdaq: INTC), which retains majority ownership. 
For more information, visit \u003C/span>\u003Ca style=\"color: #0563c1;\" href=\"https://www.mobileye.com/\" target=\"_blank\" rel=\"noopener noreferrer\">https://www.mobileye.com\u003C/a>\u003Cspan style=\"color: #404040;\">.&nbsp;\u003C/span>\u003C/p>","2022-12-21T08:00:00.000Z","Financial, ADAS, Autonomous Driving, News, Press Kit, Events",{"id":1254,"type":24,"url":1255,"title":1256,"description":1257,"primary_tag":1170,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1257,"image":1258,"img_alt":1259,"content":1260,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1261,"tags":1175},186,"mobileye-shares-its-third-quarter-financial-update-and-fourth-quarter--forecast","Q3 2022 Financial Results & Q4 Forecast","Today, Mobileye shared the following update on its third-quarter 2022  financial results and provided a forecast for the upcoming fourth quarter of 2022.","https://static.mobileye.com/website/us/corporate/images/e55949fed0f97df477631ca0c50344dc_1670360841678.jpg","Mobileye's fully autonomous robotaxi, pictured outside our headquarters in Jerusalem.","\u003Cp>Today, Mobileye shared a financial update on its third-quarter 2022 results and provided a forecast for the fourth quarter of 2022. “Our excellent third-quarter performance is an early indication of the success of our strategy,” said Prof. 
Amnon Shashua, president and CEO of Mobileye Global.\u003C/p>\u003Cp>Read the entire update \u003Ca href=\"https://ir.mobileye.com/news-releases/news-release-details/mobileye-discloses-third-quarter-2022-results-and-business\" rel=\"noopener noreferrer\" target=\"_blank\">here\u003C/a>.\u003C/p>","2022-12-07T00:00:00.000Z",{"id":1263,"type":5,"url":1264,"title":1265,"description":1266,"primary_tag":954,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1266,"image":1267,"img_alt":1268,"content":1269,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1270,"tags":1026},185,"international-day-of-persons-with-disabilities-shekel-perfects-data-team","International Day of Persons with Disabilities: the Perfect People for the Job","A team of employees on the autism spectrum make a significant impact at Mobileye. Today we’re proud to showcase their ongoing contribution.","https://static.mobileye.com/website/us/corporate/images/65519c68e60535934af7bcbdf8c88cbe_1669894495911.png","Mobileye's Perfects data-labeling team includes a dozen employees on the autism spectrum who help to hone our computer-vision algorithms.","\u003Cp>&ldquo;My work saves lives from car and road collisions all over the world. Mobileye has now become an innovative and groundbreaking safe-driving company globally, and I consider it the greatest blessing and honor to know I am saving lives.&rdquo;\u003C/p>\n\u003Cp>These words could have been spoken by any of the 3,000 people who work at Mobileye. But they hold a certain special significance coming from Jonathan Trauner.\u003C/p>\n\u003Cp>Jonathan is one of a dozen employees at Mobileye on the autism spectrum. Their work is invaluable in &ldquo;perfecting&rdquo; our algorithms and in pursuing our ultimate goal of saving lives lost to automobile accidents. 
To mark this \u003Ca href=\"https://www.un.org/en/observances/day-of-persons-with-disabilities\" target=\"_blank\" rel=\"noopener noreferrer\">International Day of Persons with Disabilities\u003C/a>, we&rsquo;re proud to showcase the contribution made by these valued employees.\u003C/p>\n\u003Cp>\u003Cstrong>Meaningful Contribution\u003C/strong>\u003C/p>\n\u003Cp>For more than five years now, Mobileye has been collaborating with \u003Ca href=\"https://www.shekel.org.il/en/\" target=\"_blank\" rel=\"noopener noreferrer\">SHEKEL\u003C/a>, a non-profit organization that works towards the inclusion of people with disabilities in the community. Among SHEKEL&rsquo;s goals is to secure employment for those it represents and thereby empower them to make a meaningful and positive impact on the world around them, despite their physical and/or cognitive differences &ndash; or sometimes, as we&rsquo;ve found, even turning those disabilities into advantages.\u003C/p>\n\u003Cp>Thanks to our collaboration with SHEKEL, individuals like Jonathan on the autism spectrum play an instrumental role as part of our &ldquo;Perfects&rdquo; team, where they&rsquo;re responsible for labeling and classifying \u003Ca href=\"https://www.mobileye.com/blog/mobileye-ces-2022-self-driving-secret-data/\" target=\"_blank\" rel=\"noopener\">the data from video clips on which our algorithms are trained\u003C/a> and tested.\u003C/p>\n\u003Cp>It&rsquo;s work that they&rsquo;re not only able to do, but which leverages the strengths often found in high-functioning individuals on the spectrum. 
Our Perfects team members with autism spectrum disorder (ASD) have proven themselves particularly well-suited to analyzing videos frame by frame, paying close attention to often minute details, following set rules and procedures, and doing it all consistently and reliably.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/a92ccf8f4f6d69546d23eac78831a706_1669894537096.jpg\" alt=\"Mobileye's Perfects data labeling team includes a dozen employees on the autism spectrum.\" />\u003C/p>\n\u003Cp>&ldquo;&lsquo;Cleaning&rsquo; data is priceless to Mobileye,&rdquo; notes Perfects team operations manager Avi Hershkovitz. &ldquo;It&rsquo;s what makes the difference between accuracy of, say, 99.001% and 99.01%.&rdquo; That may not seem like a significant difference, but even a fraction of a percent can be consequential when you&rsquo;re dealing with safety-critical technology like \u003Ca href=\"https://www.mobileye.com/solutions/\" target=\"_blank\" rel=\"noopener noreferrer\">driver-assistance and self-driving systems\u003C/a>.\u003C/p>\n\u003Cp>\u003Cstrong>Why Diversify?\u003C/strong>\u003C/p>\n\u003Cp>The reasons for diversifying the workforce to include individuals with disabilities extend far beyond charity or corporate social responsibility.\u003C/p>\n\u003Cp>&ldquo;Inclusion, when done right, is beneficial financially for the company by identifying and matching company needs to employee abilities,&rdquo; explains Mollie Goldstein, who managed the program for SHEKEL before coming onboard full-time at Mobileye. 
&ldquo;A much wider range of tasks can be accomplished when employees bring different strengths and abilities.&rdquo;\u003C/p>\n\u003Cp>The participation of employees with disabilities in the workplace can also raise morale and motivation &ndash; not only for those employees themselves, but also &ldquo;for neurotypical employees who see the hurdles people have overcome to be given equal opportunities, and see how important their job is to them,&rdquo; Goldstein points out. &ldquo;Examples of successful people with disabilities just go to show the brilliant minds we may miss if we overlook this segment of the population.&rdquo;\u003C/p>\n\u003Cp>\u003Cstrong>Beyond Employment\u003C/strong>\u003C/p>\n\u003Cp>Aside from financial compensation, benefits, and technical training, the program also helps these employees develop the social and interpersonal skills they need to operate and succeed as productive members of the workforce. They learn, for example, how to make themselves presentable for the workplace environment, the importance of showing up to work on time, and to notify their managers of any absences in advance. They're also given opportunities to socialize in a welcoming environment when they feel comfortable doing so.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cdiv style=\"padding: 56.25% 0 0 0; position: relative;\">\u003Ciframe style=\"position: absolute; top: 0; left: 0; width: 100%; height: 100%;\" title=\"Mobileye &amp; SHEKEL\" src=\"https://player.vimeo.com/video/777082531?h=165c4c1e9a&amp;badge=0&amp;autopause=0&amp;player_id=0&amp;app_id=58479\" width=\"640\" height=\"360\" frameborder=\"0\" data-mce-fragment=\"1\">\u003C/iframe>\u003C/div>\n\u003Cp>&ldquo;Before I never really did anything real. And now I feel like I finally am doing something with my life,&rdquo; says Eli Schreiber, one of our Perfects QA technicians. 
&ldquo;Thankfully we&rsquo;ve learned, which is a pretty good skill I think, that we can now have the discipline and the capacity to work without having someone standing over us.&rdquo;\u003C/p>\n\u003Cp>\u003Cstrong>A Perfect Match\u003C/strong>\u003C/p>\n\u003Cp>&ldquo;There are so many problems in the world and there's not much we can do,&rdquo; says Pini Segal, Mobileye&rsquo;s veteran VP of Payroll and an instrumental early internal champion of the SHEKEL collaboration. &ldquo;But when we have the opportunity to do something good and to do it right, there's nothing better than that. We simply love this project.&rdquo;\u003C/p>\n\u003Cp>With the SHEKEL program, we&rsquo;re able not only to integrate individuals of differing abilities into our company, but to recognize their unique capabilities and the significant contribution they can make towards our goal of making roads safer.\u003C/p>","2022-12-01T08:00:00.000Z",{"id":1272,"type":24,"url":1273,"title":1274,"description":1275,"primary_tag":190,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1275,"image":1276,"img_alt":1277,"content":1278,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1279,"tags":1016},184,"goggo-network-and-mobileye-plan-to-collaborate-to-introduce-level-autonomous-vehicles-for-logistics-in-europe","Goggo Network & Mobileye collaborate on autonomous logistics vehicles","Goggo Network to incorporate up to 40 autonomous vehicles with the Level 4 self-driving system Mobileye Drive™ for the delivery of goods in Spain and France.\n","https://static.mobileye.com/website/us/corporate/images/7159351a49067075480f91fa4ddc9f9d_1670427843381.png","Goggo Network and Mobileye team up for autonomous logistics.","\u003Cp class=\"ql-align-justify\">\u003Cspan style=\"color: rgb(34, 34, 34);\">Goggo Network, \u003C/span>the autonomous mobility company founded by Martin Varsavsky and Yasmine Fage and with 
operations in Spain and France, has signed a Memorandum of Understanding (MoU) with Mobileye, a global leader in the development of computer vision and machine learning, data analysis, localization and mapping for Advanced Driver Assistance Systems and autonomous driving, as part of its strategy to drive the future of autonomous mobility and logistics in Spain and Europe. With this agreement, Goggo plans to introduce Level 4 autonomous vehicles in Spain for the first time.\u003C/p>\u003Cp class=\"ql-align-justify\">Specifically, up to 40 autonomous vehicles with SAE Level 4 features, equipped with the Mobileye Drive&nbsp;technology, are expected to be deployed and operated by Goggo in cities around Spain and France.\u003C/p>\u003Cp class=\"ql-align-justify\">This system matches Goggo’s plans to roll out logistics services across Europe as a next step, since Mobileye Drive is capable of adapting to various driving environments and driver behavior.\u003C/p>\u003Cp class=\"ql-align-justify\">Importantly, both Goggo Network and Mobileye agree that a transformation to a smarter, safer and more environmentally efficient driverless future is needed. For this reason, both companies have decided to join forces to achieve the autonomous delivery of goods using pioneering technology from Mobileye.\u003C/p>\u003Cp class=\"ql-align-justify\">Mobileye Drive is a part of Mobileye’s full stack of self-driving Mobility-as-a-Service (MaaS) solutions. Powered by the EyeQ® system-on-a-chip and fed by multiple input sensors like camera, radar and lidar, the self-driving system can be implemented in various use cases like robotaxis, consumer passenger vehicles or commercial delivery vehicles.\u003C/p>\u003Cp class=\"ql-align-justify\">For the implementation of the deployment, Goggo will assess the coverage of the road network and provide backend interfaces (APIs) for mission control and fleet management. 
In turn, Goggo will be responsible for obtaining all necessary permits and licences to test and operate the service in the relevant municipalities.\u003C/p>\u003Cp class=\"ql-align-justify\">Goggo plans to start operating with safety drivers to oversee the entire service in 2023, and once the respective safety validations have been completed and regulatory clearance obtained, to eliminate the driver altogether to make the service autonomous Level 4.\u003C/p>\u003Cp class=\"ql-align-justify\">\"At Goggo Network we are looking to collaborate with leading companies in their sector, such as Mobileye, which will allow us to add value to the autonomous logistics sector. With this agreement, we will be able to go a step further in our strategy and introduce Level 4 autonomous vehicles for the first time in our country, making the most of a pioneering technology that stands out for its safety”, explains Yasmine Fage, co-founder and COO of Goggo Network.\u003C/p>\u003Cp class=\"ql-align-justify\">Johann Jungwirth, Senior Vice President, Autonomous Vehicles of Mobileye, adds: “The collaboration with Goggo Network shows how our vision of delivering self-driving systems for various use cases will come to life. Reshaping mobility for the cities of the future will bring great benefits for everybody, and Goggo’s logistics solutions help bring us closer to this reality.”\u003C/p>\u003Cp class=\"ql-align-justify\">\u003Cbr>\u003C/p>\u003Cp class=\"ql-align-justify\">\u003Cstrong>About Goggo Network\u003C/strong>\u003C/p>\u003Cp class=\"ql-align-justify\">Founded in 2018 by Martin Varsavsky and Yasmine Fage, \u003Ca href=\"https://goggo.network/\" rel=\"noopener noreferrer\" target=\"_blank\" style=\"color: windowtext;\">Goggo Network\u003C/a> operates fleets of autonomous vehicles and robots for last mile transport, with a vision to provide autonomous, electric and shared mobility solutions by creating Autonomous Mobility Networks (AMN). 
Currently based in Madrid, Berlin and Paris, Goggo Network operates in Spain, France and Germany.\u003C/p>\u003Cp class=\"ql-align-justify\">\u003Cstrong>About Mobileye\u003C/strong>\u003C/p>\u003Cp class=\"ql-align-justify\">Mobileye (Nasdaq: MBLY) is driving the autonomous vehicle evolution with its autonomous driving and driver-assistance technologies, harnessing world-renowned expertise in computer vision, artificial intelligence, mapping, and data analysis. Since its founding in 1999, Mobileye has pioneered such groundbreaking technologies as REM™ crowdsourced mapping, True Redundancy™ sensing, and Responsibility Sensitive Safety (RSS). These technologies are driving the ADAS and AV fields towards the future of mobility – enabling self-driving vehicles and mobility solutions, powering industry-leading advanced driver-assistance systems and delivering valuable intelligence to optimize mobility infrastructure. To date, more than 125 million vehicles worldwide have Mobileye technology inside. 
In 2022, Mobileye was listed as an independent company, separate from Intel (Nasdaq: INTC) which retains majority ownership of Mobileye.\u003C/p>","2022-11-28T00:00:00.000Z",{"id":1281,"type":5,"url":1282,"title":1283,"description":1284,"primary_tag":9,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1284,"image":1285,"img_alt":1286,"content":1287,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1288,"tags":563},183,"enhanced-computer-vision-driver-assistance","Enhanced Computer Vision for More-Advanced Driver Assistance","With higher resolution, a wider field of view, and more powerful processors, Mobileye has once again raised the bar for advanced driver-assistance technology.","https://static.mobileye.com/website/us/corporate/images/e4e0ecd1d011a549ca1f1d05e7189ced_1669025316668.png","Through continuous development of computer-vision technology, Mobileye aims to enhance road safety, and save lives as a result.","\u003Cp>Remember what it was like to upgrade your home-theater setup from VHS to DVDs, or from high definition to 4K UHD? Or maybe you recently replaced your laptop or smartphone? If you&rsquo;ve experienced any of these &ndash; or anything similar &ndash; you&rsquo;ll likely appreciate the significance of the latest advancements in our driver-assistance technologies.\u003C/p>\n\u003Cp>Powered by cameras and processors even more capable than their predecessors, our latest \u003Ca href=\"https://www.mobileye.com/blog/computer-vision-eccv-2022/\" target=\"_blank\" rel=\"noopener noreferrer\">computer-vision technologies\u003C/a> for advanced driver-assistance systems (ADAS) can &ldquo;see&rdquo; more, better, farther, and wider. 
But even more important than the raw specs and capabilities of the hardware itself is what we can do with its augmented performance.\u003C/p>\n\u003Cp>The enhanced capabilities detailed below afford greater awareness of traffic signals, potential hazards, road conditions, other road users, and more. And that in turn translates not only to enhanced comfort for drivers and passengers, but increased safety for them and those with whom they share the road.\u003C/p>\n\u003Cp>\u003Cstrong>New Applications Enabled by Advanced Technology\u003C/strong>\u003C/p>\n\u003Cp>If you&rsquo;ve visited these pages before, you&rsquo;ll likely have read about the advanced technologies which we take great pride in developing and putting out onto the road. But before we dive into the new hardware, this time we&rsquo;d like to tell you first about the applications they enable &ndash; the &ldquo;real-world&rdquo; benefits of our latest tech.\u003C/p>\n\u003Cp>Take traffic lights, for example. Due to its dramatically higher resolution, our enhanced computer-vision system can now identify a red light from farther away than before, and assist the driver in decelerating towards a stop more smoothly and gradually. The same applies to all manner of objects and hazards that a driver might encounter out on the road, which we can now detect from a significantly greater distance than before. 
Not only that, but with a much wider field of view, the single sensor can see more of the vehicle&rsquo;s surroundings (beyond what&rsquo;s right in front of the vehicle) &ndash; enabling it to \u003Cspan style=\"color: #030303;\">better monitor fast-moving cross-traffic, for example, and help avoid side-swipe collisions at intersections.\u003C/span>\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/97c4f59bba60fce35f45fbbe545d129b_1669198161014.png\" alt=\"Mobileye&rsquo;s cutting-edge computer vision technology identifies open car doors, helping to avoid collision with the obstacle and the occupants likely to appear alongside.\" />\u003C/p>\n\u003Cp>Or picture a car, stopped on the side of the road up ahead, opening its door into your lane. Our latest tech can better detect that open door, \u003Ca href=\"https://vimeo.com/733570009\" target=\"_blank\" rel=\"noopener noreferrer\">helping you to slow down and give it a wider berth\u003C/a> to avoid hitting it... not to mention the driver or passenger likely to exit from that door.\u003C/p>\n\u003Cp>Our proven computer-vision technology already excels, of course, at correctly \u003Ca href=\"https://www.mobileye.com/blog/intelligent-speed-assist-general-safety-regulation/\" target=\"_blank\" rel=\"noopener noreferrer\">identifying road signs\u003C/a> by their shape, color, and icons. But with Optical Character Recognition (OCR) capabilities enabled by the higher resolution and our ongoing development, our latest tech can also read the fine print and understand its context while on the move. 
So it can tell if you&rsquo;re entering a school zone, for example, and whether the speed limit is lower only at a particular time of day.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/7cac424e015057098ed9fbfedc82f6f6_1669198200517.jpg\" alt=\"With Optical Character Recognition (OCR) capabilities, Mobileye cameras can not only identify road signs by their shape, color, and icons, but read the text as well.\" />\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #030303;\">In addition, the system employs Neural Network Semantic Segmentation (NSS) &ndash; a highly advanced type of artificial intelligence that automatically labels objects in an image. NSS works on an individual-pixel level, so with more pixels comes ever-higher precision. This enables the system to better identify an array of features &ndash; including environmental elements (such as snow, rain, and mud) &ndash; to inform the driver and vehicle about the condition of the road surface\u003C/span>.\u003C/p>\n\u003Cp>These are just a few examples in an extensive array of practical benefits afforded by these new technological advancements. \u003Cspan style=\"color: #030303;\">The system can also \u003C/span>identify emergency vehicles by their flashing lights\u003Cspan style=\"color: #030303;\">. It can recognize gestures and postures to discern whether a \u003C/span>person standing by the road is hailing a taxi, for example, or just talking on the phone. 
It can find the way through intersections without lane markings\u003Cspan style=\"color: #030303;\">, determine if a road&rsquo;s shoulder is safe to use in case of emergency, identify obstacles (such as oversized cargo and collapsible cranes) protruding from commercial vehicles, and \u003C/span>even &ldquo;remember&rdquo; how you park in your driveway, enabling the vehicle to repeat the procedure automatically.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/96616133cf21ec455bcdd9f39e5cc25c_1669198226410.png\" alt=\"Mobileye&rsquo;s latest computer-vision technology for driver assistance can identify emergency vehicles by their strobe lights.\" />\u003C/p>\n\u003Cp>\u003Cstrong>The Latest Hardware That Makes It Possible\u003C/strong>\u003C/p>\n\u003Cp>Now that you have a picture of what our tech can do, let&rsquo;s take a look at the new cameras and processors that make it all possible.\u003C/p>\n\u003Cp>Our latest computer-vision systems for driver assistance employ 8-megapixel optical sensors, representing a nearly fivefold increase in resolution over the 1.7-megapixel cameras used in earlier systems. The new cameras also offer a 120-degree field of view, covering a full third of the vehicle&rsquo;s horizontal surroundings with a single camera. That&rsquo;s more than twice the scope of earlier 52-degree cameras, and significantly wider than even the \u003Ca href=\"https://www.mobileye.com/news/nissan-rogue-to-showcase-mobileyezf-100-degree-adas-camera/\" target=\"_blank\" rel=\"noopener\">100-degree sensors\u003C/a> that were considered the cutting edge just two years ago. 
And we&rsquo;re able to support that wider field of view without compromising on image quality.\u003C/p>\n\u003Cp>At least as important are the constantly improving \u003Ca href=\"https://www.mobileye.com/technology/eyeq-chip/\" target=\"_blank\" rel=\"noopener noreferrer\">EyeQ&trade; Systems-on-Chip\u003C/a> (SoCs) tasked with processing what the upgraded cameras pick up. EyeQ5 builds on the commercial success of its predecessors by adding support for the new 8-mp/120-degree cameras detailed here. And \u003Ca href=\"https://www.mobileye.com/blog/eyeq6-system-on-chip/\" target=\"_blank\" rel=\"noopener noreferrer\">the new EyeQ6\u003C/a> raises the bar even higher: our latest one-box, windshield-mountable SoC, EyeQ6 Lite, boasts 450% more processing power than EyeQ4 Mid &ndash; with similar power consumption and in a 45% smaller package.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/98558b1236cb325d80ec235a3ae7864c_1669198891050.png\" alt=\"EyeQ6 Lite is Mobileye&rsquo;s latest System-on-Chip for core driver-assistance systems.\" />\u003C/p>\n\u003Cp>The result of all this cutting-edge equipment and the enhanced features they enable is a smoother, more comfortable, more advanced, and &ndash; most importantly &ndash; safer driving experience. That&rsquo;s the very essence of why Mobileye pioneered the use of computer-vision technology for driver assistance in the first place, nearly a quarter-century ago. 
And it&rsquo;s why we continue to develop newer, better technologies: to enhance road safety, reduce the incidence and severity of collisions, and save lives as a result.\u003C/p>","2022-11-23T08:00:00.000Z",{"id":1290,"type":24,"url":1291,"title":1292,"description":1293,"primary_tag":28,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1293,"image":1294,"img_alt":1295,"content":1296,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1297,"tags":1298},182,"valeo-has-produced-its-millionth-front-camera-system-integrating-mobileye-eyeq-technology-at-its-wemding-site-in-germany","Valeo 10 millionth EyeQ front camera system","Valeo and Mobileye celebrate the production of Valeo’s 10 millionth front camera system containing Mobileye technology, at Valeo's site in Wemding, Germany.\n","https://static.mobileye.com/website/us/corporate/images/104d1a8413a6c8f2cce2bb36f0887278_1668610120296.jpg","Valeo and Mobileye marked the production of their 10 millionth ADAS camera unit at Valeo's factory in Germany. (Credit: Valeo)","\u003Cp class=\"ql-align-justify\">\u003Cspan style=\"color: rgb(78, 107, 124); background-color: white;\">Valeo, a world leader in advanced driving assistance systems (ADAS), started its collaboration with Mobileye in 2015, choosing to integrate into its front camera system the Mobileye system-on-chip (SoC), named “EyeQ®”. Mobileye, with partners like Valeo, revolutionized driver assist systems by taking computer vision technology to the next level in the automotive industry. Together, Valeo and Mobileye have combined their best-in-class technology and have developed and manufactured multiple generations of front camera systems. The strong collaboration between Valeo and Mobileye is now focused on the integration of the latest generation of the Mobileye SoC into the Valeo front camera system and into Valeo’s cutting-edge centralized computer. 
\u003C/span>\u003C/p>\u003Cp class=\"ql-align-justify\">\u003Cbr>\u003C/p>\u003Cp class=\"ql-align-justify\">\u003Cspan style=\"color: rgb(78, 107, 124); background-color: white;\">At the heart of ADAS, the Valeo front camera using the Mobileye SoC is making roads safer by supporting key features such as\u003C/span>\u003Cspan style=\"color: rgb(78, 107, 124);\"> autonomous emergency braking, adaptive cruise control and lane keeping assist. The front camera system itself has become the key enabler in reaching the safety requirements defined by the authorities and is set to equip 100% of new cars. To date, since starting this activity, Valeo has produced nearly 13 million front cameras.\u003C/span>\u003C/p>\u003Cp class=\"ql-align-justify\">\u003Cbr>\u003C/p>\u003Cp class=\"ql-align-justify\">\u003Cspan style=\"color: rgb(78, 107, 124); background-color: white;\">Marc Vrecko, President of Valeo’s Comfort and Driving Assistance Systems Business Group said: \u003C/span>\u003Cem style=\"color: rgb(78, 107, 124); background-color: white;\">“We are very pleased to announce that we have reached this milestone, which clearly shows that Valeo is accelerating in ADAS, as set out in our Move Up strategic plan\u003C/em>\u003Cem style=\"color: rgb(78, 107, 124);\">. In 2023, Valeo will produce an additional 9 million front cameras worldwide. By 2030, almost 90% o\u003C/em>\u003Cem style=\"color: rgb(78, 107, 124); background-color: white;\">f new vehicles will be equipped with this technology, and the ADAS content per vehicle will double. 
“\u003C/em>\u003C/p>\u003Cp class=\"ql-align-justify\">&nbsp;\u003C/p>\u003Cp class=\"ql-align-justify\">\u003Cem style=\"color: rgb(78, 107, 124);\">“Thanks to collaborations with strong partners like Valeo, we have been able to deliver our computer vision technology into more than 125 million vehicles worldwide, helping reduce vehicle crashes and injuries globally”,\u003C/em>\u003Cspan style=\"color: rgb(78, 107, 124);\"> said Nimrod Nehushtan, Senior Vice President of Strategy and Development at Mobileye. \u003C/span>\u003Cem style=\"color: rgb(78, 107, 124);\">“Together, we see great opportunities in the next few years to expand the capabilities of these technologies, providing automakers and end customers with affordable and reliable safety and convenience features.\u003C/em>\u003Cspan style=\"color: rgb(78, 107, 124);\">” \u003C/span>\u003C/p>\u003Cp class=\"ql-align-justify\">\u003Cbr>\u003C/p>\u003Cp class=\"ql-align-justify\">\u003Cspan style=\"color: rgb(78, 107, 124); background-color: white;\">Today, more than 90% of road accidents are caused by human error. ADAS are at the heart of the transformation of mobility, driven by enhancements in safety and comfort. The ADAS market is expected to grow by 17% per year to reach 60 billion euros in 2030. Valeo has the market’s most comprehensive portfolio of sensors (ultrasonic sensors, cameras, radars and LiDARs), software and associated intelligence.\u003C/span>\u003C/p>\u003Cp class=\"ql-align-justify\">\u003Cbr>\u003C/p>\u003Cp class=\"ql-align-justify\">\u003Cbr>\u003C/p>\u003Cp>\u003Cstrong style=\"color: rgb(78, 107, 124);\">About Valeo:\u003C/strong>\u003C/p>\u003Cp class=\"ql-align-justify\">\u003Cspan style=\"color: rgb(78, 107, 124);\">As a technology company and partner to all automakers and new mobility players, Valeo is innovating to make mobility cleaner, safer and smarter. 
Valeo enjoys technological and industrial leadership in electrification, driving assistance systems, reinvention of the interior experience and lighting. These four areas are vital to the transformation of mobility and will drive the Group’s growth in the coming years. Valeo is listed on the Paris Stock Exchange. Valeo in figures: In 2021, the Group generated sales of 17.3 billion euros and invested 12% of sales in R&amp;D. At December 31, 2021, Valeo had 184 plants, 21 research centers, 43 development centers and 16 distribution platforms, and employed 103,300 people in 31 countries worldwide.\u003C/span>\u003C/p>\u003Cp>\u003Cspan style=\"color: rgb(78, 107, 124);\">&nbsp;\u003C/span>\u003C/p>\u003Cp>\u003Cstrong style=\"color: rgb(78, 107, 124);\">About Mobileye:\u003C/strong>\u003C/p>\u003Cp class=\"ql-align-justify\">\u003Cspan style=\"color: rgb(78, 107, 124);\">Mobileye (Nasdaq: MBLY) is driving the autonomous vehicle evolution with its autonomous driving and driver-assistance technologies, harnessing world-renowned expertise in computer vision, artificial intelligence, mapping, and data analysis. Since its founding in 1999, Mobileye has pioneered such groundbreaking technologies as REM™ crowdsourced mapping, True Redundancy™ sensing, and Responsibility Sensitive Safety (RSS). These technologies are driving the ADAS and AV fields towards the future of mobility – enabling self-driving vehicles and mobility solutions, powering industry-leading advanced driver-assistance systems and delivering valuable intelligence to optimize mobility infrastructure. To date, more than 125 million vehicles worldwide have Mobileye technology inside. In 2022, Mobileye was listed as an independent company, separate from Intel (Nasdaq: INTC) which retains majority ownership of Mobileye. 
\u003C/span>\u003C/p>\u003Cp class=\"ql-align-justify\">\u003Cspan style=\"color: rgb(78, 107, 124);\">For more information, visit \u003C/span>\u003Ca href=\"https://www.mobileye.com/\" rel=\"noopener noreferrer\" target=\"_blank\" style=\"color: blue;\">https://www.mobileye.com\u003C/a>\u003Cspan style=\"color: rgb(78, 107, 124);\">.\u003C/span>\u003C/p>\u003Cp class=\"ql-align-justify\">\u003Cbr>\u003C/p>","2022-11-16T00:00:00.000Z","News, ADAS",{"id":1300,"type":24,"url":1301,"title":1302,"description":1303,"primary_tag":9,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1303,"image":1304,"img_alt":1305,"content":1306,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1307,"tags":1298},181,"zeekr-mobileye-supervision","ZEEKR 009 is Next for Mobileye SuperVision™","Luxury MPV follows up on the success of the ZEEKR 001, showcasing the differentiated capabilities of Mobileye’s most advanced driver-assistance system.","https://static.mobileye.com/website/us/corporate/images/e70e1e00dfd22e86b4ee1e6e19c95db3_1668422306215.jpg","The new ZEEKR 009 is packed with technology, including Mobileye SuperVision™. (Credit: ZEEKR)","\u003Cp>There&rsquo;s no such thing as a one-size-fits-all solution. (If there were, we might all be driving the same car.) But one solution can take many forms to suit different needs and tastes. Case in point: \u003Ca href=\"https://www.mobileye.com/solutions/super-vision/\" target=\"_blank\" rel=\"noopener noreferrer\">Mobileye SuperVision&trade;\u003C/a> and its latest application in the ZEEKR 009.\u003C/p>\n\u003Cp>The second model from Geely&rsquo;s premium electric mobility technology brand (following the ZEEKR 001), the new 009 takes the form of an upscale, all-electric multi-purpose vehicle (MPV). It has ample room for six in its high-tech, luxurious cabin. 
Its dual electric motors and surprisingly low drag coefficient of 0.27 help the ZEEKR 009 accelerate from 0 to 100 km/h (62 mph) in 4.5 seconds. And its available 140-kWh battery pack delivers a claimed 822 kilometers (511 miles) of driving range between charges.\u003C/p>\n\u003Cp>It \u003Cspan style=\"color: black;\">utilizes a large, single-piece die-cast aluminum rear body, which increases torsional stiffness and limits deformation in the event of an impact. \u003C/span>And, with Mobileye SuperVision, it offers \u003Ca href=\"https://www.mobileye.com/blog/mobileye-supervision-test-demo-road-trip/\" target=\"_blank\" rel=\"noopener noreferrer\">cutting-edge driver-assistance technology\u003C/a>.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/b4b52df47f45bb60f4a254b0e559542c_1668422457134.jpg\" alt=\"The ZEEKR 009 features Mobileye SuperVision&trade; to power its highly advanced assisted driving capabilities.\" />\u003C/p>\n\u003Cp>\u003Cstrong>Cutting-Edge Highway Assist\u003C/strong>\u003C/p>\n\u003Cp>Like the \u003Ca href=\"https://youtu.be/R8qTOPpQ2-I\" target=\"_blank\" rel=\"noopener noreferrer\">ZEEKR 001\u003C/a>, the new 009 incorporates our most capable and advanced driver-assistance system, derived directly from our autonomous-vehicle development program.\u003C/p>\n\u003Cp>Mobileye SuperVision includes such features as autonomous lane changing, adaptive cruise control, comprehensive surround emergency assist, advanced traffic sign and light recognition, front and rear collision avoidance, and evasive maneuver assist. 
This highly advanced feature set is supported in the ZEEKR 009 by a high-resolution surround-view camera array, two of our \u003Ca href=\"https://www.mobileye.com/technology/eyeq-chip/\" target=\"_blank\" rel=\"noopener noreferrer\">EyeQ&trade;\u003C/a>5 High chips, and our \u003Ca href=\"https://www.mobileye.com/technology/responsibility-sensitive-safety/\" target=\"_blank\" rel=\"noopener noreferrer\">RSS\u003C/a>-based driving policy.\u003C/p>\n\u003Cp>With over-the-air software updates, the already advanced capabilities of Mobileye SuperVision can be further upgraded as development progresses. In fact, \u003Ca href=\"https://www.mobileye.com/blog/mobileye-supervision-zeekr-ota-update/\" target=\"_blank\" rel=\"noopener noreferrer\">Mobileye and ZEEKR recently rolled out just such an update\u003C/a> to over 50,000 ZEEKR 001s already in the hands of customers.\u003C/p>\n\u003Cp>Reinforcing the value of our technology, ZEEKR is highlighting its SuperVision-based Navigate-on-Pilot point-to-point assisted-driving system in the ZEEKR 009 with a special pre-order promotion.\u003C/p>\n\u003Cp>[**]gallery:zeekr-009[**]\u003C/p>\n\u003Cp>\u003Cstrong>Expanding Vision\u003C/strong>\u003C/p>\n\u003Cp>ZEEKR aims to commence delivery of the new 009 in China starting in January 2023. Once it does, the ZEEKR 009 will become both the second model in the premium brand&rsquo;s lineup, and the second vehicle to bring Mobileye SuperVision to market. 
But we&rsquo;re only beginning to tap the potential of this highly advanced solution.&nbsp;\u003C/p>\n\u003Cp>Less than two months ago, \u003Ca href=\"https://www.mobileye.com/news/geely-holding-group-expands-mobileye-collaboration/\" target=\"_blank\" rel=\"noopener noreferrer\">Mobileye and Geely announced an expanded collaboration\u003C/a> that will see Mobileye SuperVision incorporated into additional models from ZEEKR and into three further brands under the \u003Ca href=\"http://zgh.com/our-business/?lang=en\" target=\"_blank\" rel=\"noopener noreferrer\">Geely Group umbrella\u003C/a>. And Geely, of course, is just \u003Ca href=\"https://www.mobileye.com/opinion/our-new-deal-with-geely-is-a-game-changer-says-shashua/\" target=\"_blank\" rel=\"noopener\">the first major automaker to adopt Mobileye SuperVision\u003C/a> for its most advanced models.\u003C/p>","2022-11-14T08:00:00.000Z",{"id":1309,"type":5,"url":1310,"title":1311,"description":1312,"primary_tag":397,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1312,"image":1313,"img_alt":1314,"content":1315,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1316,"tags":1317},180,"computer-vision-eccv-2022","Showcasing our Computer Vision at ECCV 2022","Mobileye demonstrated how our computer-vision technology is transforming human mobility at the 2022 European Conference on Computer Vision in Tel Aviv.","https://static.mobileye.com/website/us/corporate/images/4eb3667d9f5a4bde6875e08c4e3a50ed_1667463850408.jpg","Mobileye's booth at ECCV Tel Aviv 2022 featured our robotaxi, interactive tabletop display, and unedited AV video.","\u003Cp>Mobileye has a broad range of specializations, from machine learning to data analysis and mapping. But arguably more than anything else, we&rsquo;re a computer-vision company at our core. 
That&rsquo;s the technological discipline on which we were founded, and which has now been integrated into more than 125 million vehicles. So, when one of the world&rsquo;s premier computer-vision conferences came to Tel Aviv for the first time in its history, we naturally came out in force.\u003C/p>\n\u003Cp>Held last week in Tel Aviv, the \u003Ca href=\"https://eccv2022.ecva.net/\" target=\"_blank\" rel=\"noopener noreferrer\">2022 European Conference on Computer Vision\u003C/a> (ECCV) featured an array of speakers and exhibitors from across the industry and academia. And Mobileye &ndash; which is \u003Ca href=\"https://careers.mobileye.com/\" target=\"_blank\" rel=\"noopener noreferrer\">always looking to bring in top talent in the field\u003C/a> &ndash; was present with what proved to be one of the most popular exhibits at the conference.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cdiv style=\"padding: 56.25% 0 0 0; position: relative;\">\u003Ciframe style=\"position: absolute; top: 0; left: 0; width: 100%; height: 100%;\" title=\"Mobileye at ECCV 2022\" src=\"https://player.vimeo.com/video/768272699?h=5dbed10f73&amp;badge=0&amp;autopause=0&amp;player_id=0&amp;app_id=58479\" frameborder=\"0\" data-mce-fragment=\"1\">\u003C/iframe>\u003C/div>\n\u003Cp>&nbsp;Our booth on the show floor revolved around three engaging displays:\u003C/p>\n\u003Cul>\n\u003Cli>Our \u003Ca href=\"https://www.mobileye.com/blog/self-driving-robotaxi-sixt-germany-iaa/\" target=\"_blank\" rel=\"noopener noreferrer\">robotaxi\u003C/a> occupied center-stage, with interactive tablets mounted on the dashboard and seatbacks inside showing footage of our autonomous vehicles testing in four locations around the world;\u003C/li>\n\u003Cli>A larger screen played our \u003Ca href=\"https://www.mobileye.com/press-kit/press-kit-mobileye-new-york-city/\" target=\"_blank\" rel=\"noopener\">unedited AV-drive video from New York\u003C/a>; and\u003C/li>\n\u003Cli>Our immersive 
interactive tabletop display showcased the various technologies we&rsquo;re putting into five of \u003Ca href=\"https://www.mobileye.com/blog/ces-2022-videos-demos/\" target=\"_blank\" rel=\"noopener noreferrer\">our most advanced applications\u003C/a>, each with our computer-vision technology at its core &ndash; the robotaxi, \u003Ca href=\"https://www.mobileye.com/blog/udelv-transporter-autonomous-delivery-vehicles-powered-by-mobileye/\" target=\"_blank\" rel=\"noopener noreferrer\">Udelv Transporter\u003C/a>, and \u003Ca href=\"https://www.mobileye.com/blog/mobileye-transdev-lohr-maas-i-cristal-shuttles-robotaxis/\" target=\"_blank\" rel=\"noopener noreferrer\">Lohr i-Cristal\u003C/a> (all powered by Mobileye Drive&trade;), the \u003Ca href=\"https://www.mobileye.com/blog/mobileye-supervision-zeekr-ota-update/\" target=\"_blank\" rel=\"noopener noreferrer\">Zeekr 001\u003C/a> (featuring Mobileye SuperVision&trade;), and the \u003Ca href=\"https://www.mobileye.com/news/zeekr-mobileye-working-together/\" target=\"_blank\" rel=\"noopener noreferrer\">consumer AV we&rsquo;re also developing with Zeekr\u003C/a> (employing Mobileye Chauffeur&trade;).\u003C/li>\n\u003C/ul>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cdiv style=\"padding: 56.25% 0 0 0; position: relative;\">\u003Ciframe style=\"position: absolute; top: 0; left: 0; width: 100%; height: 100%;\" title=\"Five applications for Mobileye's most advanced solutions\" src=\"https://player.vimeo.com/video/768270894?h=134cf394c5&amp;badge=0&amp;autopause=0&amp;player_id=0&amp;app_id=58479\" frameborder=\"0\" data-mce-fragment=\"1\">\u003C/iframe>\u003C/div>\n\u003Cp>More than 70 guests joined us for autonomous drives on surrounding roadways in one of three robotaxis we had stationed outside the main hall. 
With 11 cameras on board, our robotaxi (based on the NIO ES8 electric crossover) drives primarily on computer vision, with radar and lidar offering redundant sensing capabilities.\u003C/p>\n\u003Cp>Our chief executive \u003Ca href=\"https://www.mobileye.com/amnon-shashua/\" target=\"_blank\" rel=\"noopener noreferrer\">Prof. Amnon Shashua\u003C/a> served among the general chairs of the conference, where some of our top engineers also delivered presentations on Knowledge Distillation for Tasks Consolidation.\u003C/p>\n\u003Cp>With so many leading experts in the field in town for the conference, it was a unique privilege to demonstrate how we&rsquo;re applying computer-vision technology to make our roads safer today and change the face of human mobility for the future.\u003C/p>","2022-11-03T07:00:00.000Z","Video, Events, Industry",{"id":1319,"type":24,"url":1320,"title":1321,"description":1322,"primary_tag":16,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1322,"image":1323,"img_alt":1321,"content":1324,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1325,"tags":1175},179,"mobileye-announces-full-exercise-of-underwriters-option-to-purchase-additional-shares","Mobileye Announces Full Exercise of Underwriters’ Option to Purchase Additional Shares","Read about Mobileye's latest announcement: the full exercise of underwriters' option to purchase additional shares. Stay updated on Mobileye's developments.","https://static.mobileye.com/website/us/corporate/images/6fbfeec60ad73a664446c913cf8704d7_1667311782190.jpg","\u003Cp>\u003Cstrong>Jerusalem, \u003C/strong>November 1, 2022 &ndash; Mobileye Global Inc. 
(&ldquo;Mobileye&rdquo;) today announced, in connection with its previously announced initial public offering of 41,000,000 shares of its Class A common stock, the full exercise by the underwriters of their option to purchase 6,150,000 additional shares of Class A common stock at the public offering price of $21.00 per share less underwriting discounts and commissions. The issuance and sale of these additional shares closed today.\u003C/p>\n\u003Cp>As previously announced, in addition to the shares of Class A common stock sold in the public offering, General Atlantic purchased 4,761,905 shares of Class A common stock in a private placement at a price per share equal to the initial public offering price, for gross proceeds of $100 million. The sale of these shares will not be registered under the Securities Act of 1933, as amended.\u003C/p>\n\u003Cp>The net proceeds to Mobileye, after deducting underwriting discounts and commissions and estimated offering expenses payable by Mobileye from the initial public offering, including the exercise of the underwriters&rsquo; option to purchase additional shares, and the previously announced private placement is approximately $1.0 billion. A significant portion of the net proceeds is being used for repayment on a note owed to Mobileye&rsquo;s parent company, Intel Corporation, and Mobileye intends to use the remaining net proceeds for working capital and general corporate purposes.\u003C/p>\n\u003Cp>Goldman Sachs &amp; Co. LLC and Morgan Stanley acted as joint lead book-running managers for the offering. Evercore ISI, Barclays, Citigroup, BofA Securities, RBC Capital Markets, Mizuho, Wolfe | Nomura Alliance and BNP PARIBAS acted as book-running managers for the offering. 
Cowen, Siebert Williams Shank, PJT Partners, MUFG, Needham &amp; Company, Raymond James, Loop Capital Markets, Blaylock Van, LLC, Academy Securities, Drexel Hamilton, Independence Point Securities LLC, CICC, Cabrera Capital Markets LLC and Guzman &amp; Company acted as co-managers for the offering.\u003C/p>\n\u003Cp>A registration statement relating to the shares being sold in this offering was filed with the Securities and Exchange Commission and became effective on October 25, 2022. The offering was made only by means of a prospectus, copies of which may be obtained from: the SEC at www.sec.gov, and from: Goldman Sachs &amp; Co. LLC, Prospectus Department, 200 West Street, New York, NY 10282, telephone: 1-866-471-2526, or by emailing prospectus-ny@ny.email.gs.com; or Morgan Stanley &amp; Co. LLC, Attn: Prospectus Department, 180 Varick Street, Second Floor, New York, NY 10014.\u003C/p>\n\u003Cp>This press release shall not constitute an offer to sell or the solicitation of an offer to buy these securities, nor shall there be any sale of these securities in any state or jurisdiction in which such offer, solicitation or sale would be unlawful prior to registration or qualification under the securities laws of any such state or jurisdiction.\u003C/p>\n\u003Cp>\u003Cstrong>About Mobileye\u003C/strong>\u003C/p>\n\u003Cp>Mobileye is a leader in the development and deployment of advanced driver-assistance systems (ADAS) and autonomous driving technologies and solutions. Mobileye pioneered ADAS technology more than 20 years ago and has continuously expanded the scope of its ADAS offerings, while leading the evolution to autonomous driving solutions.&nbsp;Mobileye&rsquo;s portfolio of solutions is built upon a comprehensive suite of purpose-built software and hardware technologies designed to provide the capabilities to make the future of ADAS and autonomous driving a reality. 
These technologies can be harnessed to deliver mission-critical capabilities at the edge and in the cloud, advancing the safety of road users, and revolutionizing the driving experience and the movement of people and goods globally.\u003C/p>\n\u003Cp>\u003Cstrong>Contacts\u003C/strong>\u003C/p>\n\u003Cp>Danielle Mann\u003C/p>\n\u003Cp>Intel Media Relations\u003C/p>\n\u003Cp>danielle.mann@intel.com\u003C/p>\n\u003Cp>John Pitzer\u003C/p>\n\u003Cp>Intel Investor Relations\u003C/p>\n\u003Cp>\u003Ca href=\"mailto:John.pitzer@intel.com\" target=\"_blank\" rel=\"noopener noreferrer\">john.pitzer@intel.com\u003C/a>\u003C/p>\n\u003Cp>Dan Galves\u003C/p>\n\u003Cp>Mobileye Investor Relations\u003C/p>\n\u003Cp>\u003Ca href=\"mailto:dan.galves2@mobileye.com\" target=\"_blank\" rel=\"noopener noreferrer\">dan.galves2@mobileye.com\u003C/a>\u003C/p>\n\u003Cp>Justin Hyde\u003C/p>\n\u003Cp>Corporate Communications and Media Relations\u003C/p>\n\u003Cp>\u003Ca href=\"mailto:justin.hyde@mobileye.com\" target=\"_blank\" rel=\"noopener noreferrer\">justin.hyde@mobileye.com\u003C/a>\u003C/p>\n\u003Cp>\u003Cstrong>&nbsp;\u003C/strong>\u003C/p>\n\u003Cp>&ldquo;Wolfe | Nomura Alliance&rdquo; is the marketing name used by Wolfe Research Securities and Nomura Securities International, Inc. in connection with certain equity capital markets activities conducted jointly by the firms. Both Nomura Securities International, Inc. and WR Securities, LLC are serving as underwriters in the offering described herein. 
In addition, WR Securities, LLC and certain of its affiliates may provide sales support services, investor feedback, investor education, and/or other independent equity research services in connection with this offering.\u003C/p>","2022-11-01T07:00:00.000Z",{"id":1327,"type":69,"url":1328,"title":1329,"description":1330,"primary_tag":73,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1330,"image":1331,"img_alt":1332,"content":1333,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1334,"tags":1335},174,"mobileye-is-once-again-a-public-company","Mobileye Is Once Again a Public Company","Following our successful IPO launched by Intel Corporation, Mobileye is now publicly listed on the Nasdaq stock exchange under the ticker symbol MBLY.","https://static.mobileye.com/website/us/corporate/images/f51fc37ccb00d9b8c66ecd11c7e0746b_1666800723757.jpg","Mobileye team rings the Nasdaq opening bell celebrating the company’s return to the U.S. public markets, trading under the ticker “MBLY.” Credit: Photography courtesy of Nasdaq, Inc.","\u003Cp>Mobileye today began trading on the Nasdaq Stock Exchange under the ticker &ldquo;MBLY&rdquo; in connection with its Initial Public Offering by parent company, Intel.&nbsp;Building on Mobileye&rsquo;s track record of innovation and success as an Intel company, the IPO unlocks the value of Mobileye for shareholders. Today marks Mobileye&rsquo;s return to the public markets for the first time since it was acquired by Intel in 2017. 
Intel and Mobileye commemorated the milestone from Nasdaq in Times Square and virtually from Mobileye&rsquo;s Jerusalem headquarters and across its offices around the world.&nbsp;\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>Mobileye logo\u003C/p>\n\u003Cp>[**]gallery:mobileye's-logo[**]\u003C/p>\n\u003Cp>\u003Cstrong style=\"background-color: #ffffff; color: #000000;\">IPO Photo Gallery\u003C/strong>\u003C/p>\n\u003Cp>[**]gallery:ipo-press-kit[**]\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/opinion/to-our-customers-a-letter-from-our-ceo/\" target=\"_blank\" rel=\"noopener\">Customer letter\u003C/a>\u003C/p>\n\u003Cp>\u003Ca href=\"https://vimeo.com/763958794/101f6ecfb4\" target=\"_blank\" rel=\"noopener\">IPO B-roll\u003C/a>\u003C/p>\n\u003Cp>\u003Ca href=\"https://vimeo.com/764201125/852c1cc028\" target=\"_blank\" rel=\"noopener\">Bell-Ringing Ceremony\u003C/a>\u003C/p>","2022-10-26T07:00:00.000Z","Press Kit, Events",{"id":1337,"type":69,"url":1338,"title":1339,"description":1340,"primary_tag":73,"author_name":16,"is_hidden":11,"lang":1341,"meta_description":1340,"image":1342,"img_alt":1332,"content":1343,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1334,"tags":1344},176,"mobileye-is-once-again-a-public-company-he","מובילאיי הופכת שוב לחברה ציבורית","רשומה כעת בנאסד\"ק תחת הסימול MBLY, בעקבות הנפקתה ע\"י אינטל.","he","https://static.mobileye.com/website/us/corporate/images/dde58c5b41115ff4f92798f859ac25bb_1666801405326.jpg","\u003Cp>מובילאיי החלה להיסחר היום בנאסד\"ק תחת הסימול MBLY בעקבות הנפקתה ע\"י חברת האם, אינטל. בהתבסס על הרזומה החדשני והמוצלח של מובילאיי כחברה של אינטל, ההנפקה פותחת את ערכה של מובילאיי עבור בעלי המניות. היום מצוינת חזרתה של מובילאיי לשווקים הציבוריים מאז שנרכשה על ידי אינטל בשנת 2017. 
אינטל ומובילאיי ציינו את אבן הדרך מנאסד\"ק בטיימס סקוור, ניו יורק; ממטה מובילאיי בירושלים; ומשאר משרדיה של החברה ברחבי העולם.&nbsp;\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>הלוגו של מובילאיי\u003C/p>\n\u003Cp>[**]gallery:mobileye's-logo[**]\u003C/p>\n\u003Cp>תמונות מההנפקה\u003C/p>\n\u003Cp>[**]gallery:ipo-press-kit[**]\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/news/to-our-customers-a-letter-from-our-ceo-he/\" target=\"_blank\" rel=\"noopener noreferrer\">מכתב ללקוחות\u003C/a>\u003C/p>\n\u003Cp>\u003Ca href=\"https://vimeo.com/763958794/101f6ecfb4\" target=\"_blank\" rel=\"noopener\">לינק לבירול\u003C/a>\u003C/p>\n\u003Cp>\u003Ca href=\"https://vimeo.com/764201125/852c1cc028\" target=\"_blank\" rel=\"noopener\">סרטון צלצול פעמון הפתיחה בנסדא\"ק\u003C/a>\u003C/p>","Press Kit",{"id":1346,"type":69,"url":1347,"title":1348,"description":1349,"primary_tag":73,"author_name":16,"is_hidden":11,"lang":338,"meta_description":1349,"image":1350,"img_alt":1351,"content":1352,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1334,"tags":1344},177,"mobileye-is-once-again-a-public-company-de","Erneuter Börsengang von Mobileye","Nach dem Börsengang durch Intel jetzt im Nasdaq unter dem Kürzel MBLY geführt","https://static.mobileye.com/website/us/corporate/images/3f4e1447194b6002191d0c471b582e22_1666800871579.jpg","Mobileye CEO Professor Amnon Shashua und Mitarbeiter:innen läuten die Nasdaq-Eröffnungsglocke, um die Rückkehr des Unternehmens an die US-Börsen unter dem Kürzel „MBLY\" zu feiern. Credit: Photography courtesy of Nasdaq, Inc.","\u003Cp>Mobileye er&ouml;ffnete heute seinen Wertpapierhandel an der Technologieb&ouml;rse Nasdaq unter dem K&uuml;rzel &bdquo;MBLY&ldquo; nach dem B&ouml;rsengang durch die Muttergesellschaft Intel. 
Der B&ouml;rsengang baut auf der Innovations- und Erfolgsbilanz von Mobileye als Intel-Unternehmen auf und erschlie&szlig;t nun den Wert von Mobileye f&uuml;r Aktion&auml;r:innen. Mit dem heutigen Tag kehrt Mobileye wieder an die B&ouml;rse zur&uuml;ck, nachdem das Unternehmen 2017 von Intel erworben wurde. Intel und Mobileye feierten den Meilenstein in den Nasdaq-R&auml;umlichkeiten am New Yorker Times Square sowie virtuell am Mobileye Stammsitz in Jerusalem und den weiteren Niederlassungen weltweit.\u003C/p>\n\u003Cp>\u003Cstrong>Mobileye logo:\u003C/strong>\u003C/p>\n\u003Cp>[**]gallery:mobileye's-logo[**]\u003C/p>\n\u003Cp>\u003Cstrong style=\"background-color: #ffffff; color: #000000;\">IPO Photo Gallery\u003C/strong>\u003C/p>\n\u003Cp>[**]gallery:ipo-press-kit[**]\u003C/p>\n\u003Cp>\u003Ca style=\"background-color: #ffffff; color: #242424;\" href=\"https://www.mobileye.com/opinion/to-our-customers-a-letter-from-our-ceo/\" target=\"_blank\" rel=\"noopener\">An unsere Kunden: Ein Brief unseres CEOs\u003C/a>\u003C/p>\n\u003Cp>\u003Ca style=\"background-color: #ffffff; color: #242424;\" href=\"https://vimeo.com/764201125\" target=\"_blank\" rel=\"noopener noreferrer\">Feierliches L&auml;uten der Er&ouml;ffnungsglocke B-Roll\u003C/a>\u003C/p>\n\u003Cp>\u003Ca href=\"https://vimeo.com/763958794/101f6ecfb4\" target=\"_blank\" rel=\"noopener\">IPO 
\u003Cstrong>B-roll\u003C/strong>\u003C/a>\u003C/p>",{"id":1354,"type":69,"url":1355,"title":1356,"description":1357,"primary_tag":73,"author_name":16,"is_hidden":11,"lang":1358,"meta_description":1357,"image":1359,"img_alt":1360,"content":1361,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1334,"tags":1344},178,"mobileye-is-once-again-a-public-company-cn","Mobileye再次成功上市","继英特尔推动其首次公开募股后，Mobileye现已在纳斯达克成功上市，股票代码为“MBLY”","cn","https://static.mobileye.com/website/us/corporate/images/8153b11720a89c4465f5671556cc2257_1666801480990.jpg","Mobileye 首席执行官Amnon Shashua 教授与员工敲响纳斯达克开市钟，庆祝Mobileye 重返美国股市，交易代码为“MBLY”. Credit: Photography courtesy of Nasdaq, Inc.","\u003Cp>Mobileye今日开始在纳斯达克股票交易市场挂牌上市，股票代码为&ldquo;MBLY&rdquo;，此次首次公开募股（IPO）由母公司英特尔牵头达成。凭借持续不断的创新，Mobileye作为英特尔旗下子公司在过去数年里取得了丰硕成果，此次IPO也将为股东释放Mobileye的更多价值。自2017年被英特尔收购，此次IPO意味着Mobileye再次重返股市。英特尔和Mobileye在位于时代广场的纳斯达克庆祝了这一里程碑事件，与此同时，Mobileye耶路撒冷总部和全球各地分公司也在线上共同庆祝这一历史性时刻。\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #242424; background-color: #ffffff;\">Mobileye Logo\u003C/span>:\u003C/p>\n\u003Cp>[**]gallery:mobileye's-logo[**]\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #242424; background-color: #ffffff;\">图片下载链接\u003C/span>\u003Cstrong>\u003Cem> \u003C/em>\u003C/strong>\u003C/p>\n\u003Cp>[**]gallery:ipo-press-kit[**]\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/opinion/to-our-customers-a-letter-from-our-ceo/\" target=\"_blank\" rel=\"noopener\">\u003Cstrong>\u003Cem>致客户的一封信\u003C/em>\u003C/strong>\u003C/a>\u003C/p>\n\u003Cp>\u003Ca style=\"color: #242424; background-color: #ffffff;\" href=\"https://vimeo.com/763958794\" target=\"_blank\" rel=\"noopener noreferrer\">\u003Cstrong>视频下载链接\u003C/strong>\u003C/a>\u003Ca href=\"https://vimeo.com/763958794\" target=\"_blank\" rel=\"noopener noreferrer\">\u003Cstrong>\u003Cem> \u003C/em>\u003C/strong>\u003C/a>\u003C/p>\n\u003Cp>\u003Ca style=\"color: #242424; background-color: 
#ffffff;\" href=\"https://vimeo.com/764201125\" target=\"_blank\" rel=\"noopener noreferrer\">敲钟仪式\u003C/a>\u003C/p>",{"id":1363,"type":654,"url":1364,"title":1365,"description":1366,"primary_tag":32,"author_name":1367,"is_hidden":11,"lang":12,"meta_description":1366,"image":1368,"img_alt":1369,"content":1370,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1371,"tags":1372},173,"to-our-customers-a-letter-from-our-ceo","To Our Customers: A Letter From Our CEO","Our dream is for every vehicle to have life-saving driver-assistance technology.","Prof. Amnon Shashua","https://static.mobileye.com/website/us/corporate/images/337a502dd06c87f3378bf2c3d4804e12_1670503513049.jpg","Mobileye Founder and CEO, Amnon Shashua, speaks about the listing of Mobileye shares on the Nasdaq Stock Exchange","\u003Cp>To our valued customers,\u003C/p>\u003Cp>Today we opened an exciting new chapter in Mobileye’s history as we once again became a public company. Shares of “MBLY” stock began trading on the Nasdaq exchange just a short time ago. I couldn’t be more thrilled.\u003C/p>\u003Cp>Mobileye steps back into the public eye more visible and valuable than we were before. Under Intel’s ownership over the past five years, we have grown and prospered and are emerging from Intel bigger and stronger, and ready to deliver what you need from driver-assistance technology – everything from today’s popular safety applications to the fully autonomous driving solutions of the future.\u003C/p>\u003Cp>Our time as a wholly owned Intel subsidiary helped us accelerate our business and get ready for this next chapter. 
These past five years saw us successfully develop and deliver highly advanced versions of the core platform technologies needed for the most advanced driver-assistance systems (ADAS): \u003Ca href=\"https://www.mobileye.com/technology/rem/\" rel=\"noopener noreferrer\" target=\"_blank\">Road Experience Management™ (REM™)\u003C/a> mapping technology, \u003Ca href=\"https://www.mobileye.com/technology/responsibility-sensitive-safety/\" rel=\"noopener noreferrer\" target=\"_blank\">Responsibility-Sensitive Safety (RSS)\u003C/a> driving policy, \u003Ca href=\"https://www.mobileye.com/technology/true-redundancy/\" rel=\"noopener noreferrer\" target=\"_blank\">True Redundancy™\u003C/a> sensing and the most advanced generations of our defining silicon-plus-software system-on-chips known as \u003Ca href=\"https://www.mobileye.com/technology/eyeq-chip/\" rel=\"noopener noreferrer\" target=\"_blank\">EyeQ®\u003C/a>. Every one of these technologies enhances the capabilities of basic ADAS and enables a broader value chain of premium solutions that generate new consumer demand.\u003C/p>\u003Cp>Since we began shipping our first EyeQ, more than 800 vehicle models have shipped with Mobileye inside, totaling more than 125 million vehicles equipped with our technology. Our dream is for every vehicle to have life-saving driver-assistance technology. Together, we can make that vision a reality.\u003C/p>\u003Cp>Mobileye’s journey began more than two decades ago on my belief that computer vision technology could help prevent automobile crashes and save lives. We started small and grew quickly into a global company supplying more than 50 automakers. I see so much promise ahead and am as committed as ever to continuing our work to bring about a safer future. 
I have made these challenges my purpose and look forward to leading Mobileye as we work to fulfill the opportunity ahead.\u003C/p>\u003Cp>I want to personally thank all the Intel and Mobileye team members, along with our outside partners, who worked tirelessly for many months to make this day possible.\u003C/p>\u003Cp>This is not new territory for us. We operated independently before and during our time with Intel – self-funding our R&amp;D and transformation from technology supplier to mobility enabler. And I am confident in our ability to continue growing and thriving as a publicly traded company. We look forward to building on our successful relationships with you – our valued customers – to deliver the future of transportation.\u003C/p>\u003Cp>Sincerely,\u003C/p>\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/common/images/amnon_sig_200.png\" alt=\"the signature of Prof. Amnon Shashua, founder and CEO of Mobileye\">\u003C/p>\u003Cp>Prof. Amnon Shashua\u003C/p>\u003Cp>CEO of Mobileye Global Inc.\u003C/p>\u003Cp>&nbsp;\u003C/p>\u003Cp>\u003Ca href=\"https://www.mobileye.com/press-kit/mobileye-is-once-again-a-public-company/\" rel=\"noopener noreferrer\" target=\"_blank\">Press kit: Mobileye Is Once Again a Public Company\u003C/a>\u003C/p>","2022-10-26T00:00:00.000Z","News, From our CEO, Opinion",{"id":1374,"type":24,"url":1375,"title":1376,"description":1377,"primary_tag":28,"author_name":1378,"is_hidden":32,"lang":1341,"meta_description":1377,"image":1379,"img_alt":1369,"content":1380,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1371,"tags":444},175,"to-our-customers-a-letter-from-our-ceo-he","A Letter to Our Customers","\"Our dream is for every vehicle to have life-saving driver-assistance technology\"","Prof. Amnon Shashua","https://static.mobileye.com/website/us/corporate/images/cc5ff76441d8158913308a7f82c6a1fe_1666800692934.jpg","\u003Cp>To our valued customers,\u003C/p>\u003Cp>Today we opened an exciting new chapter in Mobileye's history, as we once again became a public company. \"MBLY\" shares began trading on the Nasdaq exchange just a short time ago, and this is tremendously exciting.\u003C/p>\u003Cp>Mobileye returns to the public eye more visible and significant than it was before. Over the past five years under Intel we have grown and prospered, and today we emerge bigger and stronger, ready to meet the market's needs, from today's popular safety systems all the way to fully autonomous driving solutions.\u003C/p>\u003Cp>This period as a wholly owned subsidiary of Intel helped accelerate our business and allowed us to emerge ready for this new chapter. Over the past five years we successfully developed and delivered the most advanced versions of the core technologies required for the most advanced driver-assistance systems (ADAS): Road Experience Management™ (REM™) mapping technology; the Responsibility-Sensitive Safety (RSS) driving policy; the True Redundancy™ sensing system; and the most advanced generations of our systems-on-chip (SoC), called EyeQ®, which combine silicon and software. Each of these technologies enhances the capabilities of basic driver-assistance systems (ADAS) and enables a broad chain of premium solutions that generate new consumer demand.\u003C/p>\u003Cp>Since our first EyeQ, more than 800 vehicle models have been produced with Mobileye systems inside, totaling more than 125 million vehicles equipped with our technology. Our dream is for every vehicle to have life-saving driver-assistance technology. Together, we can make that vision a reality.\u003C/p>\u003Cp>Mobileye's journey began more than two decades ago, out of my belief that computer vision technology could help prevent road accidents and save lives. We started small and quickly grew into a global company supplying assistance systems to more than 50 automakers. I see a long road ahead of us, and I am as committed as ever to continuing our shared work to create a safer future together. I have made these challenges my purpose, and I look forward to leading Mobileye to fulfill the opportunity before us.\u003C/p>\u003Cp>I want to personally thank all the Intel and Mobileye team members, along with our outside partners, who worked tirelessly for many months to make this day possible.\u003C/p>\u003Cp>This is not new territory for us. We operated independently before and during our time with Intel, self-funding our R&amp;D and our transformation from a technology supplier to a mobility enabler. I am confident in our ability to continue growing and thriving as a publicly traded company. We look forward to creating the future of transportation through our successful relationships with you, our valued customers.\u003C/p>\u003Cp>&nbsp;\u003C/p>\u003Cp>Sincerely,\u003C/p>\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/common/images/amnon_sig_200.png\" alt=\"the signature of Prof. Amnon Shashua, founder and CEO of Mobileye\">\u003C/p>\u003Cp>Prof. Amnon Shashua\u003C/p>\u003Cp>CEO of Mobileye.\u003C/p>",{"id":1382,"type":24,"url":1383,"title":1384,"description":1385,"primary_tag":1170,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1385,"image":1386,"img_alt":1384,"content":1387,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1388,"tags":1175},170,"mobileye-ipo-pricing","Mobileye Announces Pricing of Initial Public Offering","Explore Mobileye's recent IPO details, including the offering price and trading start date. Learn about the new \"MBLY\" stock symbol and where we're headed.","https://static.mobileye.com/website/us/corporate/images/27e5fdcc1bdd9baddddf558d7f5f099b_1666606099770.jpg","\u003Cp>\u003Cstrong>Jerusalem, \u003C/strong>October 26, 2022 &ndash; Mobileye Global Inc. (&ldquo;Mobileye&rdquo;) today announced the pricing of its initial public offering of 41,000,000 shares of its Class A common stock at an initial public offering price of $21.00 per share.&nbsp;The shares are expected to begin trading on the Nasdaq Global Select Market on October 26, 2022, under the symbol &ldquo;MBLY,&rdquo; and the offering is expected to close on October 28, 2022, subject to customary closing conditions. 
In addition, Mobileye has granted the underwriters a 30-day option to purchase up to an additional 6,150,000 shares of Class A common stock at the initial public offering price, less underwriting discounts and commissions. The net proceeds from the offering to Mobileye, after deducting underwriting discounts and commissions and estimated offering expenses payable by Mobileye, are expected to be approximately $0.8 billion, excluding any exercise of the underwriters&rsquo; option to purchase additional shares.&nbsp;A significant portion of the net proceeds from this offering will be used for repayment on a note owed to Mobileye&rsquo;s parent company, Intel Corporation, and Mobileye intends to use the remaining net proceeds for working capital and general corporate purposes.\u003C/p>\n\u003Cp>In addition to the shares of Class A common stock sold in the public offering, Mobileye announced that General Atlantic will purchase 4,761,905 shares of Class A common stock in a private placement at a price per share equal to the initial public offering price, for gross proceeds of $100 million, subject to customary closing conditions. The sale of these shares will not be registered under the Securities Act of 1933, as amended. The closing of the initial public offering is not conditioned upon the closing of the private placement.\u003C/p>\n\u003Cp>Goldman Sachs &amp; Co. LLC and Morgan Stanley are acting as joint lead book-running managers for the proposed offering. Evercore ISI, Barclays, Citigroup, BofA Securities, RBC Capital Markets, Mizuho, Wolfe | Nomura Alliance and BNP PARIBAS are acting as book-running managers for the offering. 
Cowen, Siebert Williams Shank, PJT Partners, MUFG, Needham &amp; Company, Raymond James, Loop Capital Markets, Blaylock Van, LLC, Academy Securities, Drexel Hamilton, Independence Point Securities LLC, CICC, Cabrera Capital Markets LLC and Guzman &amp; Company are acting as co-managers for the offering.\u003C/p>\n\u003Cp>A registration statement relating to the shares being sold in this offering was filed with the Securities and Exchange Commission and became effective on October 25, 2022. The offering is being made only by means of a prospectus, copies of which may be obtained, when available, from: the SEC at www.sec.gov, and from: Goldman Sachs &amp; Co. LLC, Prospectus Department, 200 West Street, New York, NY 10282, telephone: 1-866-471-2526, or by emailing prospectus-ny@ny.email.gs.com; or Morgan Stanley &amp; Co. LLC, Attn: Prospectus Department, 180 Varick Street, Second Floor, New York, NY 10014.\u003C/p>\n\u003Cp>This press release shall not constitute an offer to sell or the solicitation of an offer to buy these securities, nor shall there be any sale of these securities in any state or jurisdiction in which such offer, solicitation or sale would be unlawful prior to registration or qualification under the securities laws of any such state or jurisdiction.\u003C/p>\n\u003Cp>\u003Cstrong>About Mobileye\u003C/strong>\u003C/p>\n\u003Cp>Mobileye is a leader in the development and deployment of advanced driver-assistance systems (ADAS) and autonomous driving technologies and solutions. Mobileye pioneered ADAS technology more than 20 years ago and has continuously expanded the scope of its ADAS offerings, while leading the evolution to autonomous driving solutions.&nbsp;Mobileye&rsquo;s portfolio of solutions is built upon a comprehensive suite of purpose-built software and hardware technologies designed to provide the capabilities to make the future of ADAS and autonomous driving a reality. 
These technologies can be harnessed to deliver mission-critical capabilities at the edge and in the cloud, advancing the safety of road users, and revolutionizing the driving experience and the movement of people and goods globally.\u003C/p>\n\u003Cp>\u003Cstrong>Contacts\u003C/strong>\u003C/p>\n\u003Cp>Danielle Mann\u003C/p>\n\u003Cp>Intel Media Relations\u003C/p>\n\u003Cp>\u003Ca href=\"mailto:danielle.mann@intel.com\" target=\"_blank\" rel=\"noopener noreferrer\">danielle.mann@intel.com\u003C/a>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>John Pitzer\u003C/p>\n\u003Cp>Intel Investor Relations\u003C/p>\n\u003Cp>\u003Ca href=\"mailto:John.pitzer@intel.com\" target=\"_blank\" rel=\"noopener noreferrer\">John.pitzer@intel.com\u003C/a>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>Dan Galves\u003C/p>\n\u003Cp>Mobileye Investor Relations\u003C/p>\n\u003Cp>\u003Ca href=\"mailto:dan.galves2@mobileye.com\" target=\"_blank\" rel=\"noopener noreferrer\">dan.galves2@mobileye.com\u003C/a>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>Justin Hyde\u003C/p>\n\u003Cp>Corporate Communications and Media Relations\u003C/p>\n\u003Cp>\u003Ca href=\"mailto:justin.hyde@mobileye.com\" target=\"_blank\" rel=\"noopener noreferrer\">justin.hyde@mobileye.com\u003C/a>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>[**]gallery:mobileye's-logo[**]\u003C/p>\n\u003Cp>\u003Cstrong>&nbsp;\u003C/strong>\u003C/p>\n\u003Cp>&ldquo;Wolfe | Nomura Alliance&rdquo; is the marketing name used by Wolfe Research Securities and Nomura Securities International, Inc. in connection with certain equity capital markets activities conducted jointly by the firms. Both Nomura Securities International, Inc. and WR Securities, LLC are serving as underwriters in the offering described herein. 
In addition, WR Securities, LLC and certain of its affiliates may provide sales support services, investor feedback, investor education, and/or other independent equity research services in connection with this offering.\u003C/p>","2022-10-25T07:00:00.000Z",{"id":1390,"type":5,"url":1391,"title":1392,"description":1393,"primary_tag":9,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1393,"image":1394,"img_alt":1395,"content":1396,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1397,"tags":563},169,"pedestrian-safety-month-protection-detection","Pedestrian Safety Month: Protection Begins with Detection","Discover how Mobileye’s industry-leading computer-vision technology safeguards the most vulnerable of road users.","https://static.mobileye.com/website/us/corporate/images/d90b8822f888f9510de756ebd9d46040_1666603963340.jpg","Mobileye's pedestrian safety technology is based on the principle that to protect pedestrians, you first have to be able to detect them. ","\u003Cp>You&rsquo;re crossing the street when you see a vehicle headed towards you, and you hesitate: is the driver going to stop? Does the driver even see me?\u003C/p>\n\u003Cp>This is the operative question. Because, as a \u003Ca href=\"https://www.mobileye.com/blog/how-adas-and-data-can-lead-the-way-in-pedestrian-safety/\" target=\"_blank\" rel=\"noopener noreferrer\">pedestrian\u003C/a>, you can seldom tell whether or not a driver sees you. His or her vision could be limited by any number of factors &ndash; and to see you, of course, the driver needs to be paying attention in the first place. And that, unfortunately, is not always the case.\u003C/p>\n\u003Cp>Computer vision, on the other hand, doesn&rsquo;t get distracted or lose focus. It doesn&rsquo;t get drowsy or intoxicated. 
It&rsquo;s always on, and always paying attention to the road and everything (and everyone) on it.\u003C/p>\n\u003Cp>Here&rsquo;s how our computer-vision technology works to protect pedestrians.\u003C/p>\n\u003Cp>\u003Cstrong>Detection for Protection\u003C/strong>\u003C/p>\n\u003Cp>In order \u003Ca href=\"https://www.mobileye.com/blog/avs-and-the-drive-for-pedestrian-safety/\" target=\"_blank\" rel=\"noopener noreferrer\">to \u003Cem>protect\u003C/em> pedestrians\u003C/a>, you first have to be able to \u003Cem>detect\u003C/em> them. This is the fundamental principle behind the computer-vision driver-assistance technology we develop to safeguard the most vulnerable of road users.\u003C/p>\n\u003Cp>Our technology enables an array of driver-assistance features designed to keep pedestrians safe from being hit by moving motor vehicles. These include passive features &ndash; such as Forward Collision Warning and Blind-Spot Monitoring systems &ndash; which alert the driver to the proximity of pedestrians (among other road users, obstacles, and hazards). And it includes active features &ndash; like Automatic Emergency Braking (AEB) &ndash; designed to intervene if an imminent collision with a pedestrian (or other obstacle) is detected.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/7ef630a6737d24854b8184460bcc7d5f_1666604222228.jpg\" alt=\"Mobileye's computer-vision technology employs an array of algorithms engineered to identify pedestrians and other vulnerable road users.\" />\u003C/p>\n\u003Cp>Fortunately, Advanced Driver-Assistance Systems (ADAS) are becoming more commonplace in new cars, and many jurisdictions around the world are \u003Ca href=\"https://www.mobileye.com/blog/intelligent-speed-assist-general-safety-regulation/\" target=\"_blank\" rel=\"noopener noreferrer\">mandating them as standard equipment\u003C/a> on new models. 
A significant number of vehicles on the road and on the market \u003Ca href=\"https://www.mobileye.com/news/mobileye-tech-makes-the-grade-under-euro-ncaps-new-assisted-driving-standard/\" target=\"_blank\" rel=\"noopener\">employ our technology\u003C/a> to enable such features, and the capabilities of our technology are constantly advancing. And the more prolific and advanced these systems grow, the safer our roads become for pedestrians, cyclists, and other vulnerable road users.\u003C/p>\n\u003Cp>The \u003Ca href=\"https://www.iihs.org/topics/bibliography/ref/2243\" target=\"_blank\" rel=\"noopener noreferrer\">Insurance Institute for Highway Safety (IIHS) found\u003C/a> that AEB can reduce the rate of pedestrian collisions by over a quarter. And \u003Ca href=\"https://www.iihs.org/topics/fatality-statistics/detail/pedestrians\" target=\"_blank\" rel=\"noopener noreferrer\">it reports\u003C/a> that 6,516 pedestrians were killed in motor vehicle crashes in the United States in 2020, with some 55,000 more injured. Had the vehicles involved in those collisions been equipped with AEB, some 1,700 lives might have been saved and approximately 14,000 injuries prevented in that one year in the United States alone. 
And that&rsquo;s just one of the many ADAS features enabled by our solutions, in one of the many markets in which vehicles equipped with our technology are sold.\u003C/p>\n\u003Cp>\u003Cstrong>Safety Through Redundancy\u003C/strong>\u003C/p>\n\u003Cp>To increase the accuracy of detection and enhance the safety of such vulnerable road users, Mobileye&rsquo;s computer-vision ADAS technology employs not just one method of detection, but several operating in parallel &ndash; each processing the same camera feeds:\u003C/p>\n\u003Cul>\n\u003Cli class=\"ql-indent-1\">\u003Cstrong>Classic Pattern Recognition\u003C/strong> enables the system to automatically identify and classify objects and other road users.\u003C/li>\n\u003Cli class=\"ql-indent-1\">\u003Cstrong>Full Image Detection\u003C/strong> does the same for larger objects in close proximity to the vehicle.\u003C/li>\n\u003Cli class=\"ql-indent-1\">The\u003Cstrong> Segmentation method\u003C/strong> labels individual and groups of pixels to better identify smaller elements in the driving environment (such as pedestrians and cyclists).\u003C/li>\n\u003Cli class=\"ql-indent-1\">The\u003Cstrong> Top-View Free Space method\u003C/strong> identifies other objects and road users as distinct from the road surface.\u003C/li>\n\u003Cli class=\"ql-indent-1\">\u003Cstrong>Wheel Detection\u003C/strong> classifies other vehicles by identifying their wheels.\u003C/li>\n\u003Cli class=\"ql-indent-1\">\u003Cstrong>Vidar\u003C/strong> employs cutting-edge deep learning to create a lidar-like 3D model of the driving environment for increased situational awareness.\u003C/li>\n\u003C/ul>\n\u003Cp>On top of these, we implement specific algorithms dedicated to detecting baby strollers, wheelchairs, and open car doors &ndash; particularly important elements that vehicles are likely to encounter in their driving environment. 
Further algorithms identify and monitor the orientation, posture, and gestures of pedestrians to better recognize their situation and predict what they might do next.&nbsp;\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cdiv style=\"padding: 56.25% 0 0 0; position: relative;\">\u003Ciframe style=\"position: absolute; top: 0; left: 0; width: 100%; height: 100%;\" title=\"Reacting to a Pedestrian on the Road &ndash; Mobileye SuperVision&trade;\" src=\"https://player.vimeo.com/video/733569961?h=4b0656d56a&amp;badge=0&amp;autopause=0&amp;player_id=0&amp;app_id=58479\" width=\"640\" height=\"360\" frameborder=\"0\" data-mce-fragment=\"1\">\u003C/iframe>\u003C/div>\n\u003Cp>By employing these varied methods and dedicated algorithms in parallel, we aim to significantly increase the probability of detection of pedestrians and other vulnerable road users.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cdiv style=\"padding: 56.25% 0 0 0; position: relative;\">\u003Ciframe style=\"position: absolute; top: 0; left: 0; width: 100%; height: 100%;\" title=\"Reacting to a Vehicle on the Shoulder with an Open Door &ndash; Mobileye SuperVision&trade;\" src=\"https://player.vimeo.com/video/733570009?h=6fd03549f1&amp;badge=0&amp;autopause=0&amp;player_id=0&amp;app_id=58479\" width=\"640\" height=\"360\" frameborder=\"0\" data-mce-fragment=\"1\">\u003C/iframe>\u003C/div>\n\u003Cp>Of course, technological solutions form just one part of the equation in the effort to reduce the incidence of pedestrian collisions. We&rsquo;ve all witnessed distracted pedestrians stepping out into the street without looking, and we can hardly over-stress the importance of observing basic \u003Ca href=\"https://www.nhtsa.gov/pedestrian-safety/how-pedestrians-can-walk-safely\" target=\"_blank\" rel=\"noopener noreferrer\">pedestrian safety tips\u003C/a> &ndash; like sticking to sidewalks, crossing at designated crosswalks, observing signals, staying alert, and watching out for moving vehicles. 
But the next time you&rsquo;re at the crosswalk, you can rest assured that if the car or truck approaching is one of the 125 million on the road equipped with our technology, the vehicle will see you... even if the driver doesn&rsquo;t.\u003C/p>","2022-10-24T07:00:00.000Z",{"id":1399,"type":69,"url":1400,"title":1401,"description":1402,"primary_tag":73,"author_name":16,"is_hidden":11,"lang":1341,"meta_description":1402,"image":1403,"img_alt":1404,"content":1405,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1397,"tags":1344},172,"mobileyeautomotive-visual-assets-he","Mobileye / Automotive Visual Assets","Images and videos from Mobileye","https://static.mobileye.com/website/us/corporate/images/f4cc2e3c27cedcdb5391e40c1f7c0846_1666597543645.png","(The image shows Mobileye's autonomous vehicle fleet in Israel. Credit: Mobileye)","\u003Cp>Mobileye's business model includes products suited to every stage of the automated driving spectrum: from the front-facing camera, on which most of today's driver-assistance systems (ADAS) are based; to limited autonomous driving, known as Level 2+; to a full self-driving system (SDS) for robotaxis and consumer autonomous vehicles (AVs). Mobileye leads in each of these categories through the industry's most advanced vision-sensing technology, crowdsourced mapping capability, and the Responsibility-Sensitive Safety (RSS) driving policy.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>Mobileye's Cars\u003C/strong>\u003C/p>\n\u003Cp>&nbsp;[**]gallery:press-kits-autonomous-driving-gallery-1[**]\u003C/p>\n\u003Cp>\u003Cstrong>Mobileye's Labs\u003C/strong>\u003C/p>\n\u003Cp>&nbsp;[**]gallery:press-kits-autonomous-driving-gallery-2[**]\u003C/p>\n\u003Cp>\u003Cstrong>Test Drives\u003C/strong>\u003C/p>\n\u003Cp>[**]gallery:press-kits-autonomous-driving-gallery-3[**]\u003C/p>\n\u003Cp>\u003Cstrong>Moovit\u003C/strong>\u003C/p>\n\u003Cp>[**]gallery:press-kits-autonomous-driving-gallery-4[**]\u003C/p>\n\u003Cp>\u003Cstrong>Partners\u003C/strong>\u003C/p>\n\u003Cp>[**]gallery:press-kits-autonomous-driving-gallery-5[**]\u003C/p>\n\u003Cp>\u003Cstrong>CES 2021\u003C/strong>\u003C/p>\n\u003Cp>[**]gallery:press-kits-autonomous-driving-gallery-6[**]\u003C/p>\n\u003Cp>\u003Cstrong>Mobileye Events\u003C/strong>\u003C/p>\n\u003Cp>&nbsp;[**]gallery:press-kits-autonomous-driving-gallery-7[**]\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>Videos\u003C/strong>\u003C/p>\n\u003Cp>\u003Cstrong>CES 2021\u003C/strong>\u003C/p>\n\u003Cp>&middot;&nbsp;\u003Ca href=\"https://www.youtube.com/watch?v=fDiivbomPHA&amp;ab_channel=IntelNewsroom\" target=\"_blank\" rel=\"noopener noreferrer\">CES 2021: Thomas Friedman and Prof. Amnon Shashua discuss artificial intelligence\u003C/a>\u003C/p>\n\u003Cp>&middot;&nbsp;\u003Ca href=\"https://www.youtube.com/watch?v=B7YNj66GxRA&amp;ab_channel=IntelNewsroom\" target=\"_blank\" rel=\"noopener noreferrer\">CES 2021: \"Under the Hood\" with Prof. Amnon Shashua\u003C/a>\u003C/p>\n\u003Cp>&middot;&nbsp;\u003Ca href=\"https://www.intel.com/content/www/us/en/newsroom/news/ces-2021-news-conference-prof-amnon-shashua.html#gs.gkze56\" target=\"_blank\" rel=\"noopener noreferrer\">CES 2021: Intel news conference &ndash; Amnon Shashua: \"It's time to go\"\u003C/a>\u003C/p>\n\u003Cp>&middot;&nbsp;\u003Ca href=\"https://player.vimeo.com/video/498540549\" target=\"_blank\" rel=\"noopener noreferrer\">Video tour of a Mobileye autonomous vehicle\u003C/a>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>CES 2020 &ndash; Mobileye Events\u003C/strong>\u003C/p>\n\u003Cp>&middot;&nbsp;\u003Ca href=\"https://vimeo.com/711276379/f0a859a45d\" target=\"_blank\" rel=\"noopener\">CES 2020: Mobileye computer vision (replay)\u003C/a>\u003C/p>\n\u003Cp>&middot;&nbsp;\u003Ca href=\"https://vimeo.com/709278475/705ee30fbe\" target=\"_blank\" rel=\"noopener\">CES 2020: Unedited drive in a camera-driven Mobileye autonomous vehicle (B-Roll)\u003C/a>\u003C/p>\n\u003Cp>&middot;&nbsp;\u003Ca href=\"https://vimeo.com/709552074/24a1c1f458\" target=\"_blank\" rel=\"noopener\">CES 2020: Mobileye maps Las Vegas (B-Roll)\u003C/a>\u003C/p>\n\u003Cp>&middot;&nbsp;\u003Ca href=\"https://vimeo.com/763339766/c093d52c16\" target=\"_blank\" rel=\"noopener\">CES 2020: Mobileye booth (B-Roll)\u003C/a>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>Investor Day 2019 &ndash; Mobileye Events\u003C/strong>\u003C/p>\n\u003Cp>&middot;&nbsp;\u003Ca href=\"https://vimeo.com/763344638/93c804fef1\" target=\"_blank\" rel=\"noopener\">Mobileye Investor Summit 2019 (event replay)\u003C/a>\u003C/p>\n\u003Cp>&middot;&nbsp;\u003Ca href=\"https://www.youtube.com/watch?v=KNh6i-35wK0&amp;ab_channel=IntelNewsroom\" target=\"_blank\" rel=\"noopener noreferrer\">Mobileye Investor Day 2019\u003C/a>\u003C/p>\n\u003Cp>&middot;&nbsp;\u003Ca href=\"https://www.youtube.com/watch?v=V55ZhZJ2FFM&amp;ab_channel=IntelNewsroom\" target=\"_blank\" rel=\"noopener noreferrer\">Intel goes \"all in\" on the robotaxi opportunity\u003C/a>\u003C/p>\n\u003Cp>&middot;&nbsp;\u003Ca href=\"https://vimeo.com/763342706/e7e6981aca\" target=\"_blank\" rel=\"noopener\">Investor event &ndash; RSS (presentation by Jack Weast)\u003C/a>\u003C/p>\n\u003Cp>&middot;&nbsp;\u003Ca href=\"https://vimeo.com/709569284/14af8027cb\" target=\"_blank\" rel=\"noopener\">Interior footage of a Mobileye-powered autonomous car (B-Roll)\u003C/a>\u003C/p>\n\u003Cp>&middot;&nbsp;\u003Ca href=\"https://vimeo.com/709625649/293e50947d\" target=\"_blank\" rel=\"noopener\">A Mobileye-powered autonomous car maneuvers on Israeli roads (B-Roll)\u003C/a>\u003C/p>\n\u003Cp>&middot;&nbsp;\u003Ca href=\"https://player.vimeo.com/video/371142023?dnt=1&amp;app_id=122963\" target=\"_blank\" rel=\"noopener noreferrer\">Mobileye Investor Summit 2019: Gaby Hayon\u003C/a>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>Testing in Detroit &ndash; Test Drives\u003C/strong>\u003C/p>\n\u003Cp>&middot;&nbsp;\u003Ca href=\"https://vimeo.com/709635888/0af49bb591\" target=\"_blank\" rel=\"noopener\">Mobileye autonomous test vehicle in Detroit (B-Roll)\u003C/a>\u003C/p>\n\u003Cp>&middot;&nbsp;\u003Ca href=\"https://vimeo.com/709648885/f7cd374f85\" target=\"_blank\" rel=\"noopener\">Mobileye autonomous test vehicle maneuvers through the streets of Detroit (B-Roll)\u003C/a>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>Testing in Munich &ndash; Test Drives\u003C/strong>\u003C/p>\n\u003Cp>&middot;&nbsp;\u003Ca href=\"https://vimeo.com/709653794/724c9381b9\" target=\"_blank\" rel=\"noopener\">Mobileye autonomous test vehicle maneuvers through the streets of Munich (B-Roll)\u003C/a>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>Testing in Jerusalem &ndash; Test Drives\u003C/strong>\u003C/p>\n\u003Cp>&middot;&nbsp;\u003Ca href=\"https://vimeo.com/709664539/b097c46580\" target=\"_blank\" rel=\"noopener\">Intel and Mobileye test autonomous vehicles on the streets of Jerusalem (B-Roll)\u003C/a>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>Siman Tov &ndash; Partners\u003C/strong>\u003C/p>\n\u003Cp>&middot;&nbsp;\u003Ca href=\"https://www.youtube.com/watch?v=yV30JmGIf84&amp;ab_channel=MobileyeanIntelCompany\" target=\"_blank\" rel=\"noopener noreferrer\">Success story: the Siman Tov bus company reduced its accident rate\u003C/a>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>Mobileye's Labs\u003C/strong>\u003C/p>\n\u003Cp>&middot;&nbsp;\u003Ca href=\"https://www.youtube.com/watch?v=yV30JmGIf84&amp;ab_channel=MobileyeanIntelCompany\" target=\"_blank\" rel=\"noopener noreferrer\">Behind the scenes at Mobileye (B-Roll)\u003C/a>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>Ordnance Survey &ndash; Partners\u003C/strong>\u003C/p>\n\u003Cp>&middot;&nbsp;\u003Ca href=\"https://vimeo.com/710454028/44ab44d56f\" target=\"_blank\" rel=\"noopener\">Mobileye and Ordnance Survey offer detailed infrastructure data\u003C/a>\u003C/p>\n\u003Cp>&middot;&nbsp;\u003Ca href=\"https://vimeo.com/710487628/2d9186de47\" target=\"_blank\" rel=\"noopener\">Mobileye and Ordnance Survey begin mapping trials in Britain (B-Roll)\u003C/a>\u003C/p>\n\u003Cp>&middot;&nbsp;\u003Ca href=\"https://vimeo.com/710962365/5b52499b85\" target=\"_blank\" rel=\"noopener\">Mobileye and Ordnance Survey provide location data to UK agencies and businesses (B-Roll)\u003C/a>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>",{"id":1407,"type":24,"url":1408,"title":1409,"description":1410,"primary_tag":28,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1410,"image":1411,"img_alt":1412,"content":1413,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":11,"featured":11,"publish_date":1414,"tags":444},171,"geely-holding-group-expands-mobileye-collaboration","Geely Holding Group Expands Mobileye Collaboration","Three additional Geely brands to leverage Mobileye SuperVision for advanced ADAS, building off the successful launch of the ZEEKR 001.","https://static.mobileye.com/website/us/corporate/images/f2fac34c25b0e0edf42e296664323488_1666776487045.jpg","Mobileye and Geely Holding Group are expanding their collaboration to bring Mobileye SuperVision to three additional Geely brands, with additional plans to bring the technology to other models in the future. The expansion follows the successful launch of Geely Group’s ZEEKR 001, a premium electric vehicle also equipped with the Mobileye full-stack advanced driver-assistance system. 
(Credit: Mobileye, an Intel Company)","\u003Cp>\u003Cem>This news content was originally published on the Intel Corporation Newsroom.\u003C/em>\u003C/p>\u003Cp>NEWS HIGHLIGHTS\u003C/p>\u003Cul>\u003Cli>Mobileye and Geely Holding Group aim to expand their collaboration to include three additional Geely brands.\u003C/li>\u003Cli>Collaboration builds off the successful launch of the ZEEKR 001 electric vehicle with Mobileye SuperVision.\u003C/li>\u003Cli>ZEEKR to expand Mobileye SuperVision to additional models, as well.\u003C/li>\u003C/ul>\u003Cp>JERUSALEM and SHANGHAI, Sept. 26, 2022 — Mobileye and Geely Holding Group announced today the expansion of their ongoing collaboration for advanced driver-assistance systems (ADAS) and autonomous vehicle technology. The announcement follows the successful launch of the ZEEKR 001 premium electric vehicle (EV) with Mobileye SuperVision™ technology, with more than 40,000 vehicles already on the road and ahead of an over-the-air (OTA) update that will unlock SuperVision’s full capabilities.\u003C/p>\u003Cp>Building on the success with the premium electric mobility technology brand ZEEKR, three additional brands under the Geely Holding Group umbrella are set to globally launch electric vehicle models with Mobileye SuperVision technology beginning next year. ZEEKR will also introduce Mobileye SuperVision on two new EV models, as well as develop new lidar-based features with Mobileye.\u003C/p>\u003Cp>“We have proudly worked with ZEEKR, our strategic partner, on the first consumer deployment of SuperVision technology, demonstrating both its on-road capabilities and its ability to evolve through over-the-air updates,” said Prof. Amnon Shashua, Mobileye president and chief executive officer. 
“This is only the beginning of potential applications for this technology, and with these new projects, we will demonstrate how SuperVision can be adapted to any brand’s specific needs.”\u003C/p>\u003Cp>An&nbsp;Conghui, president of Geely Holding Group and CEO of ZEEKR Intelligent Technology, said: \"Mobileye is an important strategic partner for ZEEKR. As the cooperation between ZEEKR and Mobileye continues to deepen, Mobileye's globally leading intelligent driving technologies will be used in more ZEEKR models in the future. ZEEKR is committed to openness and integrating future technologies in various fields to provide our users with better intelligent drive experiences.\"\u003C/p>\u003Cp>ZEEKR 001 customers already benefit from&nbsp;continually upgraded surround-vision-based highway-assist capabilities with special safety features.&nbsp;The system is expected to receive full SuperVision capabilities through over-the-air updates by the end of this year that will bring ZEEKR customers’ driving experience to the next level.&nbsp;&nbsp;\u003C/p>\u003Cp>Mobileye SuperVision is powered by two 7-nanometer EyeQ®5 systems-on-chip. It supports point-to-point assisted driving across a wide range of road types – from highway, arterial and rural to urban. Mobileye SuperVision enables the vehicle to change lanes autonomously, navigate intersections and manage key driving priorities, as well as powering automated parking and preventive steering and braking. The Mobileye SuperVision system uses 11 high-resolution cameras – seven long-range and four parking cameras – to provide full visual coverage surrounding the vehicle.\u003C/p>\u003Cp>\u003Cstrong>About Mobileye\u003C/strong>\u003C/p>\u003Cp>Mobileye is a global leader in the development of computer vision and machine learning, data analysis, localization and mapping for advanced driver-assistance systems and autonomous driving. 
Mobileye’s technology helps keep passengers safer on the roads, reduces the risks of traffic accidents, saves lives and has the potential to revolutionize the driving experience by enabling autonomous driving. Mobileye’s proprietary software algorithms and EyeQ® chips perform detailed interpretations of the visual field to anticipate possible collisions with other vehicles, pedestrians, cyclists, animals, debris and other obstacles.&nbsp;Mobileye is a wholly owned subsidiary of&nbsp;Intel (Nasdaq: INTC). For more information, visit&nbsp;\u003Ca href=\"https://www.mobileye.com/\" rel=\"noopener noreferrer\" target=\"_blank\" style=\"color: rgb(0, 104, 181); background-color: transparent;\">https://www.mobileye.com\u003C/a>.\u003C/p>\u003Cp>\u003Cstrong>Forward-Looking Statements\u003C/strong>\u003C/p>\u003Cp>Statements in this press release that refer to future plans and expectations are forward-looking statements that involve a number of risks and uncertainties. Words such as “anticipates,” “expects,” “intends,” “goals,” “plans,” “believes,” “seeks,” “estimates,” “continues,” “may,” “will,” “would,” “should,” “could,” and variations of such words and similar expressions are intended to identify such forward-looking statements. Statements that refer to or are based on estimates, forecasts, projections, uncertain events or assumptions, including statements relating to future products and technology and the availability and benefits of such products and technology, expectations regarding customers, market opportunity, and anticipated trends in our businesses or the markets relevant to them, also identify forward-looking statements. Such statements are based on current expectations and involve many risks and uncertainties that could cause actual results to differ materially from those expressed or implied in these forward-looking statements. 
Important factors that could cause actual results to differ materially are set forth in Intel Corporation’s (“Intel”) SEC filings, including Intel’s most recent reports on Forms 10-K and 10-Q, which may be obtained by visiting Intel’s Investor Relations website at www.intc.com or the SEC’s website at www.sec.gov. Mobileye and Intel do not undertake, and expressly disclaim any duty, to update any statement made in this press release, whether as a result of new information, new developments or otherwise, except to the extent that disclosure may be required by law.\u003C/p>","2022-09-26T00:00:00.000Z",{"id":1416,"type":24,"url":1417,"title":1418,"description":1419,"primary_tag":16,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1419,"image":1420,"img_alt":1421,"content":1422,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1423,"tags":928},136,"autonomous-vehicle-detroit-united-states","Level 4 Autonomy Now Testing in Detroit with Mobileye Drive™","Our fully autonomous vehicle reaches American roads for the first time with True Redundancy™ sensing subsystems: one based on cameras and another on radar and lidar.","https://static.mobileye.com/website/us/corporate/images/2f49505e45882ee08906fe494b3f7d20_1664775234337.png","NIO ES8 autonomous vehicle equipped with Mobileye Drive arrives in Detroit, Michigan, for testing in the United States.","\u003Cp>Over the past few years, Mobileye has run one of the most ambitious testing programs in the world for autonomous vehicles. To date, our development AVs have operated in about 20 cities across ten countries on three continents. 
Now, this program is entering its next phase as \u003Ca href=\"https://www.mobileye.com/blog/robotaxi-night-drive-jerusalem-unedited-video/\" target=\"_blank\" rel=\"noopener noreferrer\">our fully autonomous vehicle equipped with Mobileye Drive&trade;\u003C/a> has begun testing in Detroit \u003Cspan style=\"background-color: #ffffff; color: #000000;\">&ndash;&nbsp;\u003C/span>the first time our future Level 4 self-driving solution has hit U.S. roads.\u003C/p>\n\u003Cp>Integrated into the all-electric NIO ES8 sport-utility vehicle, \u003Ca href=\"https://www.mobileye.com/blog/mobileye-drive-self-driving-system/\" target=\"_blank\" rel=\"noopener noreferrer\">Mobileye Drive\u003C/a> combines the power of our unique approach to autonomous mobility with our experience in delivering driver-assistance systems in nearly 120 million vehicles. The system benefits from unique Mobileye innovations, including \u003Ca href=\"https://www.mobileye.com/technology/true-redundancy/\" target=\"_blank\" rel=\"noopener noreferrer\">True Redundancy&trade;\u003C/a> sensing, \u003Ca href=\"https://www.mobileye.com/technology/rem/\" target=\"_blank\" rel=\"noopener noreferrer\">Road Experience Management&trade;\u003C/a> (REM&trade;) crowdsourced mapping, and \u003Ca href=\"https://www.mobileye.com/technology/responsibility-sensitive-safety/\" target=\"_blank\" rel=\"noopener noreferrer\">Responsibility-Sensitive Safety\u003C/a> (RSS) driving policy.\u003C/p>\n\u003Cp>Where our \u003Ca href=\"https://youtu.be/vL_QNy25n74\" target=\"_blank\" rel=\"noopener noreferrer\">prior test cars\u003C/a> used either cameras alone or a combination of lidars and radars alone, the commercial version of Mobileye Drive employs both &ndash; a 360-degree suite of advanced sensing technology containing 11 cameras, 6 radars, 3 long-range lidars and 6 short-range lidars &ndash; all powered by our industry-leading \u003Ca href=\"https://www.mobileye.com/technology/eyeq-chip/\" target=\"_blank\" rel=\"noopener 
noreferrer\">EyeQ&reg; chips\u003C/a>. Under our True Redundancy approach, the camera subsystem operates independently of the radar/lidar subsystem, providing more robust sensing of road conditions and other traffic, while also providing safety-critical redundancy as each subsystem complements its fellow subsystem.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/7972ba7a570cd5ecf164a006930f3ada_1662541681368.jpg\" alt=\"A fleet of NIO ES8 autonomous vehicles equipped with Mobileye Drive at our headquarters in Jerusalem, Israel.\" />\u003C/p>\n\u003Cp>\u003Cem>Pictured above: a fleet of NIO ES8 electric crossover utility vehicles outside Mobileye headquarters in Jerusalem (for illustration).\u003C/em>\u003C/p>\n\u003Cp>\u003Cstrong>Testing Globally, Driving Locally\u003C/strong>\u003C/p>\n\u003Cp>Rather than deploying autonomous mobility in limited geographic areas, we believe that self-driving technology will need to be widely usable across many different types of roads and situations. This means that our technology should be adaptable not just to different locations, but to different climates and driving cultures as well.\u003C/p>\n\u003Cp>Anyone who&rsquo;s ever driven in a foreign country knows you need not just a map, but also an understanding of the local &ldquo;rules of the road.&rdquo; The REM-powered Mobileye Roadbook&trade; helps us gather data on the general behavior of traffic in different places, and RSS adapts the Mobileye Drive system to local behavior in those places. 
By testing Mobileye Drive in Detroit, we&rsquo;ll expose Mobileye Drive to the everyday challenges of American driving, and some unique local roadway characteristics (like &ldquo;\u003Ca href=\"https://www.michigan.gov/mdot/Travel/safety/Road-Users/michigan-lefts\" target=\"_blank\" rel=\"noopener noreferrer\">Michigan lefts\u003C/a>&rdquo;) to further verify its capabilities.\u003C/p>\n\u003Cp>&ldquo;Our Detroit testing of Mobileye Drive is helping us ensure that the system can bring forward the global commercialization of autonomous driving technology and deliver on its promise to vastly improve road safety,&rdquo; said Johann &ldquo;JJ&rdquo; Jungwirth, Senior Vice President of Autonomous Vehicles at Mobileye. &ldquo;We take the challenge of proving the capabilities of our technology seriously. By testing in the birthplace of the American automotive industry, we expect to make major progress toward our goals.&rdquo;\u003C/p>\n\u003Cp>\u003Cstrong>Safety for Everyone\u003C/strong>\u003C/p>\n\u003Cp>The \u003Ca href=\"https://www.mobileye.com/blog/self-driving-robotaxi-sixt-germany-iaa/\" target=\"_blank\" rel=\"noopener noreferrer\">NIO ES8 equipped with Mobileye Drive\u003C/a> serves as a test platform in the development of our turnkey self-driving solutions for both commercial and future consumer AVs. Fleets of this vehicle will also form the basis of the robotaxi services we&rsquo;re preparing to roll out in the coming months with our partners in Germany and Jerusalem.\u003C/p>\n\u003Cp>Mobileye has worked closely with the U.S. National Highway Traffic Safety Administration to ensure safe operation of these vehicles on U.S. roads. A highly trained safety driver is behind the wheel of the Mobileye vehicles while testing in Detroit. 
Our current test plan does not include giving rides to members of the public, or testing without safety drivers.\u003C/p>\n\u003Cp>In the months ahead, we&rsquo;ll have more milestones to announce around the progress of Mobileye Drive.\u003C/p>","2022-09-07T07:00:00.000Z",{"id":1425,"type":5,"url":1426,"title":1427,"description":1428,"primary_tag":16,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1428,"image":1429,"img_alt":1430,"content":1431,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1432,"tags":1433},135,"intelligent-speed-assist-general-safety-regulation","General Safety Regulation Mandates Intelligent Speed Assist, Mobileye Stands Ready","The latest European Union (EU) General Safety Regulation mandates intelligent speed assist on all new vehicles. Mobileye stands ready to support this with industry-leading technologies, including a camera-only solution as well as additional redundancy through Road Experience Management™ HD maps.","https://static.mobileye.com/website/us/corporate/images/646e3e88de2fe5fc191a94ae0ebdbcec_1660547033071.png","Intelligent Speed Assist is now required on all new vehicles being sold in the European Union, and in many other countries.","\u003Cp>Around the world, speeding ranks among the leading causes of road crashes and deaths. In Europe alone, research shows that speeding contributes to one third of all fatal collisions. In response, European Union safety regulators have developed rules to help drivers stick to speed limits &ndash; by ensuring that even if drivers don&rsquo;t know what the speed limit is on a given road, their vehicles will.\u003C/p>\n\u003Cp>Earlier this summer, the EU updated its automotive \u003Ca href=\"https://ec.europa.eu/commission/presscorner/detail/en/IP_22_4312\" target=\"_blank\" rel=\"noopener noreferrer\">General Safety Regulation (GSR)\u003C/a> to mandate intelligent speed assist, or ISA. 
As of July 2022, all vehicle models in new production lines are required to be equipped with ISA, and as of July 2024, vehicle models in running production lines will also need to be equipped with ISA.\u003C/p>\n\u003Cp>ISA works either passively or actively. In a passive system, ISA simply alerts drivers when they exceed posted speed limits. In an active system, ISA intervenes to gently slow a vehicle down towards the posted speed limit. The EU expects ISA systems to reduce collisions by as much as 30 percent, and fatalities by up to 20 percent. ISA also has the potential to help drivers avoid speeding tickets, to lower insurance premiums, and to reduce carbon emissions (although drivers can override the ISA system or even turn it off completely).\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/cc6a27f94fd363634459927880ef5f64_1660547538645.png\" alt=\"Automatically recognizing traffic signs is the essential first step to helping drivers stay within the legal speed limit.\" />\u003C/p>\n\u003Cp>Saying that a vehicle should know what the speed limit is on any given stretch of road sounds simple &ndash; but to meet the new standards, ISA systems will have to recognize permanent and temporary signs (both their presence and what they say), across dozens of countries, in all weather conditions, day and night. To meet these challenges, Mobileye has developed two approaches to ISA, building on its decades of experience and technology in the field of advanced driver-assistance systems.\u003C/p>\n\u003Cp>\u003Cstrong class=\"ql-size-large\">Driven by Vision\u003C/strong>\u003C/p>\n\u003Cp>Mobileye&rsquo;s first route to supporting ISA builds on our industry-leading computer-vision technology. 
Our \u003Ca href=\"https://www.mobileye.com/technology/eyeq-chip/\" target=\"_blank\" rel=\"noopener noreferrer\">EyeQ&reg; systems-on-chip\u003C/a> offer state-of-the-art traffic sign recognition (TSR) features to identify, in real-time, all manner of road signage required to support ISA &ndash; including both explicit speed-limit signs as well as other information that implicitly indicates the legal speed limit, such as road type.\u003C/p>\n\u003Cp>The significant diversity in speed-limit and other signage across the EU poses a major challenge for automakers. Mobileye tackles this problem with cutting-edge computer vision technology and a \u003Ca href=\"https://www.intc.com/news-events/press-releases/detail/1518/mobileyes-self-driving-secret-200pb-of-data\" target=\"_blank\" rel=\"noopener noreferrer\">database of over 200 petabytes of video clips\u003C/a> that have been collected over the course of 15 years. While the appearance of these signs can vary from one country to the next, Mobileye&rsquo;s computer vision algorithms are programmed to recognize all types of signs used in the constituent member states of the EU (and other parts of the world).\u003C/p>\n\u003Cp>&ldquo;Our development of a robust ISA solution leverages our 200-pb database, which includes 23 million video clips collected over years of driving on urban, highway, and arterial roads in over 80 countries,&rdquo; said \u003Ca href=\"https://www.mobileye.com/news/gaby-hayon-business-insider-av-power-player/\" target=\"_blank\" rel=\"noopener\">Dr. 
Gaby Hayon\u003C/a>, Executive Vice President of Research &amp; Development at Mobileye.\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/55756820bec3af32e02787f403895a94_1660547680389.png\" alt=\"Mobileye provides industry-leading computer vision technology to many of the world&rsquo;s leading automakers to support advanced driver-assistance systems.\" />\u003C/p>\n\u003Cp>To verify compliance with the new standards, the GSR includes stringent testing requirements. Vehicles will be required to undergo a reliability test, covering 400 kilometers of real-world driving on a mix of urban, non-urban, and highway roads, including at least 15 percent in \u003Ca href=\"https://www.mobileye.com/blog/robotaxi-night-drive-jerusalem-unedited-video/\" target=\"_blank\" rel=\"noopener noreferrer\">nighttime conditions\u003C/a>. To pass the test, vehicles will need to correctly determine the speed limit on at least 90 percent of the total distance, and no less than 80 percent on each road type. Vehicles will also have to undergo additional testing to determine the capabilities of their sign-recognition systems in identifying both explicit speed-limit signs and implicit signs (such as those signaling construction areas, school zones, and highway on-ramps) that indicate a change in the speed limit.\u003C/p>\n\u003Cp>These requirements are difficult to meet, but due to the proven capabilities of Mobileye's computer vision technology, vehicles incorporating our solution will be able to pass the EU tests using on-board cameras alone. 
Nevertheless, safety-critical applications require robust redundancy, which is why we&rsquo;re also supporting ISA through our innovative crowdsourced mapping technology.\u003C/p>\n\u003Cp>\u003Cstrong class=\"ql-size-large\">The Power of the Crowd\u003C/strong>\u003C/p>\n\u003Cp>Mobileye has developed \u003Ca href=\"https://www.mobileye.com/technology/rem/\" target=\"_blank\" rel=\"noopener noreferrer\">Road Experience Management&trade;\u003C/a> (REM&trade;), which pulls snippets of data about details of the road from millions of vehicles around the world equipped with our technology. We then process this data to create an incredibly rich supplemental layer of information on the driving environment which we feed back to vehicles.\u003C/p>\n\u003Cp>As REM automatically crowdsources data from vehicles equipped with our technology, it delivers far more reliable and up-to-date information about current road conditions than map data gathered by traditional means (such as data-collection teams). 
So when humans&rsquo; and cameras&rsquo; line of sight is obstructed &ndash; for example, if a traffic sign is out of the field of view, blocked by an overhanging tree, or worn down by weather &ndash; an ISA system using REM data would still be able to inform the vehicle of the applicable speed limit, thus supplementing real-time computer vision.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/a0f7cb6f3e769dddd343969226f5a8a3_1660547715419.png\" alt=\"Mobileye&rsquo;s REM crowdsourced mapping technology gathers a wealth of important information on the driving environment, including traffic signs and signals and their relevance to each lane.\" />\u003C/p>\n\u003Cp>This \u003Ca href=\"https://www.mobileye.com/technology/true-redundancy/\" target=\"_blank\" rel=\"noopener noreferrer\">redundant approach\u003C/a> (vision, backed by crowdsourced \"memory\") can deliver a more reliable and more effective intelligent speed assist system &ndash; one that lives up to the promise of reducing crashes and fatalities and improving safety for everyone.\u003C/p>","2022-08-24T07:00:00.000Z","Industry, ADAS, Mapping & REM",{"id":1435,"type":5,"url":1436,"title":1437,"description":1438,"primary_tag":16,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1438,"image":1439,"img_alt":1440,"content":1441,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1442,"tags":563},134,"mobileye-supervision-test-demo-road-trip","Mobileye SuperVision™ Put to the Test in European Road Trip","A recent long-distance, real-world test, covering some 2,000 kilometers from Spain to Germany, demonstrated the capabilities of our next-generation driver-assist system.","https://static.mobileye.com/website/us/corporate/images/e2248b8c88e1389714353d2db9303f0d_1660125523459.jpg","Out on the open road in one of our development vehicles running Mobileye SuperVision™ – our Level 2++ 
driver-assist solution.","\u003Cp>Suppose that you were developing some new technological innovation. You&rsquo;d need to test its capabilities, but just how hard would you want to push it? And how tightly would you want to control the variables and parameters of your test?\u003C/p>\n\u003Cp>Your answer would likely depend largely on the level of confidence you have in your technology and how far along you are in its development. With \u003Ca href=\"https://www.mobileye.com/solutions/super-vision/\" target=\"_blank\" rel=\"noopener noreferrer\">Mobileye SuperVision&trade;\u003C/a>, our confidence has been bolstered by the results of ongoing development over the course of \u003Ca href=\"https://www.mobileye.com/news/mobileye-av-stack/\" target=\"_blank\" rel=\"noopener noreferrer\">the past two years\u003C/a>. And our conviction in its capabilities was only augmented further after completing this latest test.\u003C/p>\n\u003Cp>\u003Cstrong>2,000 Kilometers, Six Countries, Four Days\u003C/strong>\u003C/p>\n\u003Cp>Just weeks ago, as part of a demonstration for customers, we completed a multi-day, transcontinental road trip that put Mobileye SuperVision&trade; &ndash; our next-generation driver-assist system &ndash; to the test. 
In the span of four days, we covered nearly 2,000 kilometers, passing through six countries in southern and central Europe &ndash; eschewing the confines of a controlled environment to venture out on roads that our technology had only mapped (but our test vehicles had never driven on) before.&nbsp;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/796c2660f5de90165246a2a8ff07f69e_1660219156895.png\" alt=\"The 2,000-kilometer road trip passed through Spain, France, Monaco, Italy, Austria, and Germany.\" />\u003C/p>\n\u003Cp>From our starting point in Barcelona, we traversed the Spanish and French Rivieras, then drove through Monaco, northern Italy, Austria, and much of \u003Ca href=\"https://www.mobileye.com/news/germany-level-4-autonomous-vehicle-law-regulations/\" target=\"_blank\" rel=\"noopener\">Germany\u003C/a> (where the journey concluded). All told, the trip encompassed nearly 40 hours of driving, including some 300 kilometers at night, in heat as high as 40 degrees Celsius, on a combination of packed city streets, twisting country roads, and high-speed interurban highways.\u003C/p>\n\u003Cp>We performed the test in one of our hybrid-sedan development vehicles &ndash; not unlike those we&rsquo;ve tested&nbsp;in Jerusalem, Tel Aviv, \u003Ca href=\"https://www.mobileye.com/blog/paris-ratp-autonomous-vehicle-testing-pilot/\" target=\"_blank\" rel=\"noopener noreferrer\">Paris\u003C/a>, Munich, \u003Ca href=\"https://www.mobileye.com/news/autonomous-vehicle-testing-miami-stuttgart/\" target=\"_blank\" rel=\"noopener\">Stuttgart\u003C/a>, Detroit, New York, Miami, Tokyo, and Shanghai. Only this one was upgraded with our latest 8-megapixel camera system, providing high-resolution, 360-degree computer-vision coverage. 
And far from conducting the test in secret, we invited representatives from some of our closest OEM partners to join us for the journey.\u003C/p>\n\u003Cp>\u003Cstrong>Spectacular Performance\u003C/strong>\u003C/p>\n\u003Cp>The test made extensive use of the \u003Ca href=\"https://www.mobileye.com/technology/rem/\" target=\"_blank\" rel=\"noopener noreferrer\">Mobileye Roadbook&trade;, our AV map fueled by REM&trade;\u003C/a>, which provided the vehicle with a wealth of information about what to expect in its driving environment. To transparently demonstrate REM&rsquo;s adaptability and capability, we even let our guests choose waypoints along the route &ndash; so the route could be set or reset with minimal notice. Also, to show just how well the computer-vision system alone works, we performed a significant portion of the driving in &ldquo;mapless mode&rdquo; (without the benefit of the Mobileye Roadbook), relying strictly on the vehicle&rsquo;s onboard cameras instead. And we&rsquo;re proud to report that the system performed impressively throughout &ndash; requiring only occasional and minimal human intervention, even after dark on non-illuminated roads and on pavement with worn-away lane markings.\u003C/p>\n\u003Cp>In the Italian city of Genoa, for example, our vehicle spent hours crawling through heavy urban traffic during a record heatwave, without any need for human intervention. 
On another night, the car drove itself out of Monaco, even after our team got lost with no cellular reception.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/2cd1696ee61e011ccf4c13a062f94862_1660125737870.jpg\" alt=\"A Mobileye SuperVision development vehicle in Monte Carlo, Monaco, in the middle of a 2,000-kilometer road trip.\" />\u003C/p>\n\u003Cp>&ldquo;This four-day road trip across Europe was a real showcase of the capabilities of Mobileye SuperVision,&rdquo; said Nimrod Nehushtan, Senior Vice President for Business Development &amp; Strategy and co-manager of REM at Mobileye. &ldquo;A test this rigorous would have uncovered serious faults in inferior technology. But the performance of our system met or exceeded all expectations and proved itself in very real-world conditions.\u003C/p>\n\u003Cp>&ldquo;A driver-assist system &ndash; especially one this advanced &ndash; needs to be able to work on any road, in a wide variety of conditions, and handle even highly unlikely driving scenarios. This test aptly demonstrated just how capable Mobileye SuperVision really is.&rdquo;\u003C/p>\n\u003Cp>\u003Cstrong>We&rsquo;re Just Getting Started\u003C/strong>\u003C/p>\n\u003Cp>This promising and significant development comes shortly after the over-the-air update issued just two weeks ago, which \u003Ca href=\"https://www.mobileye.com/blog/mobileye-supervision-zeekr-ota-update/\" target=\"_blank\" rel=\"noopener noreferrer\">beamed new highway-assist capabilities to tens of thousands of Zeekr 001 EVs\u003C/a> already in the hands of customers. In the Zeekr, Mobileye SuperVision incorporates 11 cameras, dual EyeQ&reg;5 High chips on an integrated Mobileye SuperVision ECU, and an evolving suite of software to enable one of the most advanced (and continually advancing) feature sets on the market.\u003C/p>\n\u003Cp>The success of this latest test paves the road for the adoption of Mobileye SuperVision in additional vehicles. 
And we have more demonstrations like this one scheduled for the coming weeks in other parts of the world, opening the door for even further implementations of this compelling and comprehensive new driver-assist solution in the near future.\u003C/p>","2022-08-11T07:00:00.000Z",{"id":1444,"type":24,"url":1445,"title":1446,"description":1447,"primary_tag":16,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1447,"image":1448,"img_alt":1449,"content":1450,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1451,"tags":1175},133,"q2-2022-financial-results","Mobileye Reports Q2 2022 Financial Results","Revenue of $460 million represents a 41% increase year-over-year as business continues to grow, as disclosed by our parent company Intel last night.","https://static.mobileye.com/website/us/corporate/images/7ee45e3331c108f6ec192a88f7bdec42_1659045140080.png","Mobileye shipped 16 million units in the first half of 2022, and won new business projected at more than twice that amount.","\u003Cp>Mobileye had a very successful second quarter, as disclosed by Intel last night. Revenue of $460 million was up 41% year-over-year (YOY), outperforming the rate of increase in global automotive production which was relatively flat YOY. Profitability was also robust with $190 million of Operating Income (up 43% YOY) and representing a 41% operating income margin.\u003C/p>\n\u003Cp>Mobileye&rsquo;s future business backlog continues to grow as well, with first half 2022 design wins generating nearly 37 million units of projected future business, compared to about 16 million units actually shipped in the first half. 
\u003Ca href=\"https://www.mobileye.com/solutions/\" target=\"_blank\" rel=\"noopener noreferrer\">Mobileye&rsquo;s advanced product portfolio\u003C/a>, which stretches across the entire ADAS and AV spectrum, continues to gain traction.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/04f073bded81aaa72afdfe34f1bbe604_1659045183998.png\" alt=\"Mobileye's revenue over Q2 2022 was up 41 percent year-over-year, with a 43-percent increase in operating income.\" />\u003C/p>\n\u003Cp>Amid late-stage discussions with several major OEMs, proof points on our \u003Ca href=\"https://www.mobileye.com/solutions/super-vision/\" target=\"_blank\" rel=\"noopener noreferrer\">Mobileye SuperVision&trade;\u003C/a> product included a \u003Ca href=\"https://www.mobileye.com/blog/mobileye-supervision-zeekr-ota-update/\" target=\"_blank\" rel=\"noopener noreferrer\">major feature upgrade to Zeekr delivered over-the-air last week\u003C/a>. We also successfully completed a 2,000+ kilometer proof-of-concept expedition for Mobileye SuperVision with a major European OEM on all road types, including night-time driving in several urban centers, utilizing only our 360-degree camera system, \u003Ca href=\"https://www.mobileye.com/technology/responsibility-sensitive-safety/\" target=\"_blank\" rel=\"noopener noreferrer\">RSS\u003C/a>-supported driving policy, and the existing \u003Ca href=\"https://www.mobileye.com/technology/rem/\" target=\"_blank\" rel=\"noopener noreferrer\">REM&trade;-based high-definition map\u003C/a>.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/d6b95e55aad1f7344bc3cb38e81fccca_1659045240962.png\" alt=\"Mobileye's Road Experience Management continues to map the world's roadways for autonomous vehicles and advanced driver-assistance applications.\" />\u003C/p>\n\u003Cp>Finally, we&rsquo;d like to highlight the growth in REM, our crowd-sourced high-definition mapping product, since 
our last update at CES in early January. As of the end of last quarter, we had collected 8.6 billion miles of road data from an estimated 1.5 million REM-enabled vehicles worldwide, and were analyzing up to 43 million miles of road data per day, with the size of the REM-enabled fleet increasing daily.&nbsp;The scale of REM becomes clear when compared to manually generated high-definition road maps from competitors that currently cover 1-5% of US roads. After years of development, REM is now a fully operational product, becoming a key source of information, and a differentiator, across our entire portfolio from \u003Ca href=\"https://www.mobileye.com/blog/cloud-enhanced-driver-assist/\" target=\"_blank\" rel=\"noopener noreferrer\">Cloud-Enhanced Driver-Assist\u003C/a> to Mobileye SuperVision to autonomous vehicles both for consumers and future \u003Ca href=\"https://www.mobileye.com/solutions/drive/\" target=\"_blank\" rel=\"noopener\">Mobility-as-a-Service\u003C/a> networks.\u003C/p>","2022-07-29T07:00:00.000Z",{"id":1453,"type":5,"url":1454,"title":1455,"description":1456,"primary_tag":9,"author_name":1367,"is_hidden":11,"lang":12,"meta_description":1456,"image":1457,"img_alt":1458,"content":1459,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1460,"tags":1461},132,"mobileye-supervision-zeekr-ota-update","Mobileye and Zeekr OTA Update Opens a New Chapter in Advanced Driver Assist","Unparalleled safety and comfort features delivered over-the-air, at the push of a button, to tens of thousands of Zeekr 001 electric vehicles.","https://static.mobileye.com/website/us/corporate/images/43bdc97a923d0d2e8093677cf4c5b485_1658840446609.png","The Zeekr 001 features Mobileye SuperVision, our next-generation premium driver-assist system.","\u003Cp>Last week, Mobileye and Zeekr delivered one of the world's most advanced highway assist packages over-the-air to tens of 
thousands of Zeekr 001 EV owners. With seven 8-megapixel cameras (on top of four parking cameras) offering 360-degree surround perception, and two 7-nm EyeQ&reg;5 High Systems-on-Chip, the hardware was already in place to enable a massive feature update at the push of a button.\u003C/p>\n\u003Cp>It&rsquo;s not just the future of safety, but a demonstration of how vehicles will gain new capabilities through software advancements. All this comes less than two years after we \u003Ca href=\"https://www.mobileye.com/opinion/our-new-deal-with-geely-is-a-game-changer-says-shashua/\" target=\"_blank\" rel=\"noopener\">announced\u003C/a> the Geely Group&rsquo;s choice of \u003Ca href=\"https://www.mobileye.com/solutions/super-vision/\" target=\"_blank\" rel=\"noopener noreferrer\">Mobileye&rsquo;s SuperVision&trade; advanced driver-assistance system (ADAS)\u003C/a> for its Zeekr electric vehicles.\u003C/p>\n\u003Cp>\u003Cstrong>A More Human-Like Highway Assist \u003C/strong>\u003C/p>\n\u003Cp>Those who follow the development of ADAS software will know that today, most adaptive cruise control, highway assist, and similar systems meant to manage routine highway driving are designed based on one main input: the vehicle ahead of you. However, in many everyday driving scenarios, taking just the car ahead of you into account is not enough. Human drivers think about the broader scene around a vehicle before making a decision. 
A very common example is coming up on a traffic jam &ndash; the car immediately in front of you may not have decelerated yet, but a vigilant driver would notice that the cars up ahead have begun to slow or stop, and will come to a gradual and smooth stop.\u003C/p>\n\u003Cp>The 360-degree high-resolution surround vision provided by seven 8-megapixel cameras and four parking cameras enables the Mobileye algorithm to consider the changing driving environment and intelligently react to numerous inputs and objects (in addition to the behavior of the lead vehicle). For example, the decision-making can account for cars ahead of the lead vehicle, or a vehicle on the shoulder with an open door (which requires a slight lateral maneuver), or even a \u003Ca href=\"https://www.mobileye.com/blog/how-adas-and-data-can-lead-the-way-in-pedestrian-safety/\" target=\"_blank\" rel=\"noopener noreferrer\">pedestrian\u003C/a> on the side of the road (which requires slowing down while passing by).\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cdiv style=\"padding: 56.25% 0 0 0; position: relative;\">\u003Ciframe style=\"position: absolute; top: 0; left: 0; width: 100%; height: 100%;\" title=\"Reacting to Slowed or Stopped Vehicles Up Ahead &ndash; Mobileye SuperVision&trade;\" src=\"https://player.vimeo.com/video/733569869?h=2ead40c2e3&amp;badge=0&amp;autopause=0&amp;player_id=0&amp;app_id=58479\" width=\"640\" height=\"360\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\" data-mce-fragment=\"1\">\u003C/iframe>\u003C/div>\n\u003Ch3>&nbsp;\u003C/h3>\n\u003Cdiv style=\"padding: 56.25% 0 0 0; position: relative;\">\u003Ciframe style=\"position: absolute; top: 0; left: 0; width: 100%; height: 100%;\" title=\"Reacting to a Lateral Cut-In &ndash; Mobileye SuperVision&trade;\" src=\"https://player.vimeo.com/video/733569922?h=9a34469535&amp;badge=0&amp;autopause=0&amp;player_id=0&amp;app_id=58479\" width=\"640\" height=\"360\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\" 
data-mce-fragment=\"1\">\u003C/iframe>\u003C/div>\n\u003Ch3>&nbsp;\u003C/h3>\n\u003Cdiv style=\"padding: 56.25% 0 0 0; position: relative;\">\u003Ciframe style=\"position: absolute; top: 0; left: 0; width: 100%; height: 100%;\" title=\"Reacting to a Pedestrian on the Road &ndash; Mobileye SuperVision&trade;\" src=\"https://player.vimeo.com/video/733569961?h=4b0656d56a&amp;badge=0&amp;autopause=0&amp;player_id=0&amp;app_id=58479\" width=\"640\" height=\"360\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\" data-mce-fragment=\"1\">\u003C/iframe>\u003C/div>\n\u003Ch3>&nbsp;\u003C/h3>\n\u003Cdiv style=\"padding: 56.25% 0 0 0; position: relative;\">\u003Ciframe style=\"position: absolute; top: 0; left: 0; width: 100%; height: 100%;\" title=\"Reacting to a Vehicle on the Shoulder with an Open Door &ndash; Mobileye SuperVision&trade;\" src=\"https://player.vimeo.com/video/733570009?h=6fd03549f1&amp;badge=0&amp;autopause=0&amp;player_id=0&amp;app_id=58479\" width=\"640\" height=\"360\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\" data-mce-fragment=\"1\">\u003C/iframe>\u003C/div>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>Combined with the ability to operate at up to 130 kph (81 mph) on any road with clear lane markings &ndash; while also taking road curvature into account, and offering the latest in camera-radar fusion capabilities &ndash; this update has provided \u003Ca href=\"http://zgh.com/media-center/news/2021-04-15/?lang=en\" target=\"_blank\" rel=\"noopener noreferrer\">Zeekr 001\u003C/a> owners, overnight, with one of the most advanced highway-assist feature-sets available on the market today... and it's only the beginning.\u003C/p>\n\u003Cp>\u003Cstrong>The Next Era of ADAS \u003C/strong>\u003C/p>\n\u003Cp>Silicon and software are the next frontiers at the heart of car design, architecture, and utility. Car companies will differentiate themselves not through horsepower, but through the quality of software and the architectural design around it. 
What started as a handful of driving assistance features has evolved into the essence of what will make a modern car; and with OTA updates becoming a routine reality, the speed of innovation for vehicles will only increase. Consumers will come to expect the latest features via regular software updates, just as they do with their smartphones.\u003C/p>\n\u003Cp>Thanks to our deeply vertically integrated system with Zeekr &ndash; from the \u003Ca href=\"https://www.mobileye.com/technology/eyeq-chip/\" target=\"_blank\" rel=\"noopener noreferrer\">SoC\u003C/a> and ECU hardware to the computer vision and fusion algorithms and integration of our driving policy applications &ndash; delivering new content over the air will become second nature over time. Upon the addition of \u003Ca href=\"https://www.mobileye.com/technology/rem/\" target=\"_blank\" rel=\"noopener noreferrer\">high-definition maps\u003C/a>, set for our future update, even more \u003Ca href=\"https://www.mobileye.com/blog/cloud-enhanced-driver-assist/\" target=\"_blank\" rel=\"noopener noreferrer\">cloud-enhanced\u003C/a> features become possible &ndash; including full navigate-on-pilot, not to mention additional future rollouts of driving assist on urban roads and parking features.\u003C/p>\n\u003Cp>To meet our goals with Zeekr, we have invested in extensive R&amp;D, engineering, and testing capabilities in China. And we have moved quickly, delivering hardware, software, and services in less than two years &ndash; a feat possible only because we integrate the entire technology chain, from chip design to sensor integration, along with the smart content and applications that tie it all together. 
This is the beginning of a new era for ADAS, and we are proud to \u003Ca href=\"https://www.mobileye.com/news/mobileye-zeekr-expand-future-cars-partnership/\" target=\"_blank\" rel=\"noopener noreferrer\">partner with Zeekr\u003C/a> in blazing this trail together.\u003C/p>","2022-07-27T07:00:00.000Z","News, From our CEO, ADAS, Opinion",{"id":1463,"type":24,"url":1464,"title":1465,"description":1466,"primary_tag":32,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1466,"image":1467,"img_alt":1468,"content":1469,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1470,"tags":236},130,"shashua-mobility-innovator-award-automotive-hall-of-fame","Automotive Hall of Fame Awards Prof. Shashua for Innovation","“Entering the Automotive Hall of Fame is an incredible honor,” said Mobileye CEO Prof. Amnon Shashua at induction ceremony in Detroit.","https://static.mobileye.com/website/us/corporate/images/bd9ade080d4e388cd6434dc69ed9dce1_1658827956505.jpg","Mobileye CEO Prof. 
Amnon Shashua receives the 2022 Mobility Innovator Award during the Automotive Hall of Fame Induction and Awards Ceremony in Detroit.","\u003Cp>As mobility technology plays an increasingly important role in the advancement of the automobile, industry organizations have sought new ways to identify the most significant new developments and the people behind them.\u003C/p>\n\u003Cp>The Mobility Innovator Award, introduced last year by the \u003Ca href=\"https://www.automotivehalloffame.org/\" target=\"_blank\" rel=\"noopener noreferrer\">Automotive Hall of Fame\u003C/a>, seeks to recognize &ldquo;the outstanding work individuals have accomplished introducing new technologies and services that are redefining mobility.&rdquo; This year, the esteemed honor went to our chief executive, \u003Ca href=\"https://www.mobileye.com/amnon-shashua/\" target=\"_blank\" rel=\"noopener noreferrer\">Professor Amnon Shashua\u003C/a>.\u003C/p>\n\u003Cp>&ldquo;\u003Cspan style=\"color: black;\">I am truly overwhelmed.&nbsp;\u003C/span>\u003Ca href=\"https://www.automotivehalloffame.org/honoree/amnon-shashua/\" target=\"_blank\" rel=\"noopener noreferrer\">Entering the Automotive Hall of Fame\u003C/a> and being awarded the Mobility Innovator of the Year is an incredible honor,&rdquo; Prof. Shashua said during the induction and awards ceremony in Detroit on Thursday. &ldquo;I am privileged to have the chance to work in and influence such an exciting industry.&rdquo;\u003C/p>\n\u003Cp>Shashua founded&nbsp;Mobileye in 1999. At the time, he said, &ldquo;no one in the automotive industry believed that a single front-facing camera could achieve the level of performance and robustness needed for a system that would prevent or mitigate collisions and other safety functions. 
I must say that it gave me great pleasure to do something that everybody said could not be done.&rdquo;\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/o1FmAg1OTGA\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>&ldquo;Amnon&rsquo;s biggest contribution to the automotive industry is actually laying the groundwork for a safer, greener, and data-driven future &ndash; decades before such a vision would be deemed anything but a fantasy,&rdquo; Mobileye&rsquo;s chief legal officer and general counsel Liz Cohen-Yerushalmi attested in the tribute video above. &ldquo;I don&rsquo;t think it would be an exaggeration to say that he has revolutionized the automotive industry forever.&rdquo;\u003C/p>\n\u003Cp>\u003Cstrong>An Honorable History\u003C/strong>\u003C/p>\n\u003Cp>Since its establishment in 1939, the Automotive Hall of Fame has bestowed more than 750 awards upon industry leaders and innovators. Its long list of inductees is a veritable \u003Cem>Who's Who\u003C/em> of notable names, identifiable with their eponymous brands to this day, such as W.O. Bentley, Ettore Bugatti, David D. Buick, Louis Chevrolet, Walter P. Chrysler, Andr&eacute; Citro&euml;n, Enzo Ferrari, Soichiro Honda, Armand Peugeot, Ferdinand Porsche, Louis Renault, and Ratan Tata &ndash; as well as brothers Horace E. and John F. Dodge, three generations of Fords, three Toyodas [\u003Cem>sic\u003C/em>], and five Opels.\u003C/p>\n\u003Cp>This year&rsquo;s new inductees include the late sportscar magnate Ferruccio Lamborghini, Chinese entrepreneur Lu Guanqiu, pioneering production engineer Taiichi Ohno, trailblazing female racing driver Lyn St. James, and Alma and Victor Green (authors of \u003Cem>The Green Book\u003C/em>). Alongside Prof. 
Shashua, the organization also honored \u003Ca href=\"https://youtu.be/J0SVWiDienk\" target=\"_blank\" rel=\"noopener noreferrer\">Ford CEO Jim Farley\u003C/a> as Industry Leader of the Year.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/d113ad90ae299d3079c796724eb0c864_1658474431123.png\" alt=\"Mobileye CEO Prof. Amnon Shashua with Ford CEO Jim Farley at the 2022 Automotive Hall of Fame induction and awards ceremony.\" />\u003C/p>\n\u003Cp>&ldquo;The automotive industry is experiencing revolutionary change driven by innovators who are shaping the future of mobility. The Mobility Innovator Award celebrates individuals and their impact,&rdquo; noted Automotive Hall of Fame president Sarah Cook. &ldquo;We are thrilled to recognize Amnon Shashua for his industry-leading contributions to advanced driving assist systems and other autonomous driving solutions.&rdquo;\u003C/p>\n\u003Cp>This award is the latest in a string of citations bestowed upon our founder and chief executive. Late last year, Shashua was named \u003Ca href=\"https://www.mobileye.com/news/amnon-shashua-automated-driving-executive-of-the-year-automotive-news/\" target=\"_blank\" rel=\"noopener\">Automated Driving Executive of the Year by \u003Cem>Automotive News\u003C/em>\u003C/a>. He received the \u003Ca href=\"https://www.mobileye.com/news/prof-amnon-shashua-wins-the-dan-david-prize/\" target=\"_blank\" rel=\"noopener\">Dan David Prize\u003C/a> in 2020, and was named \u003Ca href=\"https://www.imaging.org/site/IST/IST/Conferences/EI/EI_Scientist_of_the_Year.aspx\" target=\"_blank\" rel=\"noopener noreferrer\">Electronic Imaging Scientist of the Year\u003C/a> in 2019 by the Society for Imaging Science and Technology.\u003C/p>\n\u003Cp>In addition to his role as president and CEO of Mobileye, Prof. Shashua is a senior vice president at Intel. 
He holds the Sachs Chair in Computer Science at the Hebrew University of Jerusalem, has published more than 160 scientific papers, and holds over 90 patents.&nbsp;\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: white; color: #242424;\">A pioneering innovator in the field of artificial intelligence, Prof. Shashua leads multiple ventures to apply AI to tackle real-world challenges. In parallel&nbsp;to Mobileye,&nbsp;\u003C/span>\u003Ca style=\"background-color: white; color: #4f52b2;\" href=\"https://www.orcam.com/\" target=\"_blank\" rel=\"noopener noreferrer\">OrCam Technologies\u003C/a>\u003Cspan style=\"background-color: white; color: #242424;\">&nbsp;develops wearable devices to assist the visually and hearing impaired;&nbsp;the&nbsp;\u003C/span>\u003Ca style=\"background-color: white; color: #4f52b2;\" href=\"https://www.onezerobank.com/\" target=\"_blank\" rel=\"noopener noreferrer\">One Zero Digital Bank\u003C/a>\u003Cspan style=\"background-color: white; color: #242424;\">&nbsp;harnesses AI to enable smarter management of our finances; and&nbsp;\u003C/span>\u003Ca style=\"background-color: white; color: #4f52b2;\" href=\"https://www.ai21.com/\" target=\"_blank\" rel=\"noopener noreferrer\">AI21 Labs\u003C/a>\u003Cspan style=\"background-color: white; color: #242424;\">&nbsp;is working to revolutionize how we read and write, unlocking new forms of communication and expression. We look forward to seeing Shashua&rsquo;s groundbreaking work yield tangible breakthroughs in wearables, fintech, natural language processing, and more in the years to come &ndash; much as he has in the automotive industry.\u003C/span>\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/6bbb4abd3c4dab2e911ded29725e62b2_1658828020200.jpg\" alt=\"Mobileye CEO Prof. 
Amnon Shashua with his fellow award-recipients and inductees to the Automotive Hall of Fame.\" />\u003C/p>","2022-07-22T07:00:00.000Z",{"id":1472,"type":5,"url":1473,"title":1474,"description":1475,"primary_tag":9,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1475,"image":1476,"img_alt":1477,"content":1478,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1479,"tags":732},129,"eyeq-kit-sdk","EyeQ Kit™ Unlocks the Power of Our System-on-Chip","Our new software development kit (SDK) empowers automakers to develop and deploy their own differentiated market offerings on top of Mobileye’s efficient and scalable EyeQ® SoC.","https://static.mobileye.com/website/us/corporate/images/17635ed8e5c1a3721e5239f2d18b3405_1669106212111.jpg","EyeQ Kit™ opens up new possibilities for automobile manufacturers using Mobileye's hardware.","\u003Cp>Over the course of the past two decades, the world’s leading automakers have come to count on the efficiency, scalability, and performance of EyeQ®. This automotive-grade family of Systems-on-Chip serves as the brain behind everything Mobileye does. And now we’re unlocking the power of EyeQ further with the introduction of the \u003Ca href=\"https://www.mobileye.com/solutions/eyeq-kit/\" rel=\"noopener noreferrer\" target=\"_blank\">EyeQ Kit™\u003C/a>.\u003C/p>\u003Cp>This new software development kit (SDK) is designed to \u003Cspan style=\"color: black;\">enable our OEM customers to realize the full potential of \u003C/span>EyeQ\u003Cem> \u003C/em>to power their own applications. 
\u003Cspan style=\"color: black;\">Based on the highly efficient and heterogeneous compute architecture of our \u003C/span>\u003Ca href=\"https://www.mobileye.com/blog/eyeq6-system-on-chip/\" rel=\"noopener noreferrer\" target=\"_blank\" style=\"color: black;\">powerful new EyeQ6\u003C/a>\u003Cspan style=\"color: black;\"> and \u003C/span>\u003Ca href=\"https://www.mobileye.com/news/mobileye-ces-2022-tech-news/\" rel=\"noopener noreferrer\" target=\"_blank\" style=\"color: black;\">EyeQ\u003C/a>\u003Ca href=\"https://www.mobileye.com/news/mobileye-ces-2022-tech-news/\" rel=\"noopener noreferrer\" target=\"_blank\">\u003Cem> \u003C/em>\u003C/a>\u003Ca href=\"https://www.mobileye.com/news/mobileye-ces-2022-tech-news/\" rel=\"noopener noreferrer\" target=\"_blank\" style=\"color: black;\">Ultra\u003C/a>\u003Cspan style=\"color: black;\"> processors, EyeQ Kit allows automotive manufacturers to build their own applications on top of \u003C/span>\u003Ca href=\"https://www.mobileye.com/technology/\" rel=\"noopener noreferrer\" target=\"_blank\" style=\"color: black;\">Mobileye's cutting-edge core technologies\u003C/a>\u003Cspan style=\"color: black;\">.\u003C/span>\u003C/p>\u003Cp>\u003Cstrong style=\"color: black;\">Proven Technology, Greater Flexibility\u003C/strong>\u003C/p>\u003Cp>\u003Cspan style=\"color: black;\">From general-purpose CPU cores to high-compute-density accelerators – including deep-learning neural networks – \u003C/span>\u003Ca href=\"https://www.mobileye.com/technology/eyeq-chip/\" rel=\"noopener noreferrer\" target=\"_blank\" style=\"color: black;\">EyeQ\u003C/a>\u003Cspan style=\"color: black;\"> has the scalable and modular architecture to deliver high performance, while \u003C/span>\u003Ca href=\"https://www.mobileye.com/blog/why-tops-arent-tops-when-it-comes-to-av-processors/\" rel=\"noopener noreferrer\" target=\"_blank\" style=\"color: black;\">requiring modest TOPS (Trillions of Operations per Second)\u003C/a>, all within a low power 
envelope\u003Cspan style=\"color: black;\">. These are the advantages that make EyeQ\u003C/span>\u003Cem> \u003C/em>\u003Cspan style=\"color: black;\">the System-on-Chip that’s trusted by dozens of the world’s leading automakers to power the ADAS features in hundreds of models – amounting to more than \u003C/span>\u003Ca href=\"https://www.mobileye.com/blog/100-million-eyeq-chips/\" rel=\"noopener noreferrer\" target=\"_blank\" style=\"color: black;\">100 million vehicles\u003C/a>\u003Cspan style=\"color: black;\"> (as of the end of 2021). And now, as more \u003C/span>\u003Ca href=\"https://www.mobileye.com/solutions/\" rel=\"noopener noreferrer\" target=\"_blank\" style=\"color: black;\">advanced driver-assistance features evolve into autonomous driving\u003C/a>\u003Cspan style=\"color: black;\">, EyeQ Kit provides automakers with a platform for brand expression on top of Mobileye’s proven technology stack.\u003C/span>\u003C/p>\u003Cp>\u003Cspan style=\"color: black;\">EyeQ Kit allows automakers to benefit from our \u003C/span>EyeQ \u003Cspan style=\"color: black;\">architecture and the technologies that we have built upon it – including \u003C/span>industry-leading computer vision capabilities\u003Cspan style=\"color: black;\">, REM™ crowdsourced mapping, and RSS-based driving policy. Using EyeQ Kit, OEMs can further leverage the power of our System-on-Chip for what \u003C/span>\u003Cem style=\"color: black;\">they \u003C/em>\u003Cspan style=\"color: black;\">need so that they can concentrate on what matters most: implementing the latest technological functions to enhance the driving experience, with a look and feel unique to their products. \u003C/span>\u003C/p>\u003Cp>\u003Cspan style=\"color: black;\">These functions include advanced driver-assistance features such as Adaptive Cruise Control, Traffic Jam Assist/Pilot, Highway Assist/Pilot, Full ODD In-Path Assist, and much more. 
Beyond driver-assistance, EyeQ Kit also \u003C/span>enables co-hosting of a broad range of visualization and driver-monitoring applications directly on the EyeQ SoC. Integrating these increasingly sought-after functions directly onto EyeQ removes the need for \u003Cspan style=\"color: black;\">additional safety-critical ECUs and collateral integration, which unlocks the potential for cost-savings and reduced complexity. \u003C/span>EyeQ Kit also enables additional functions including automated parking, augmented and virtual reality displays, and human-machine interfaces.\u003C/p>\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/b6006d861e6908b90f7f221bf2ff3474_1657016205850.png\" alt=\"EyeQ Kit uses standardized APIs, development platforms, and operating systems to enable smooth access by Mobileye's OEM customers.\">\u003C/p>\u003Cp>\u003Cstrong style=\"color: black;\">EyeQ Kit Speaks Your Language\u003C/strong>\u003C/p>\u003Cp>\u003Cspan style=\"color: black;\">EyeQ Kit is based on standard APIs – such as OpenCL and TensorFlow – and X86 development platforms. By \"speaking\" these common languages, EyeQ Kit allows OEMs to develop their own applications conveniently and efficiently, without the need for specialized skills or specific hardware vendors. This, of course, carries the potential to reduce development costs, accelerate time to market, and enable hardware-vendor flexibility for the full development cycle – from functional bring-up to deployment and performance-tuning.\u003C/span>\u003C/p>\u003Cp>\u003Cbr>\u003C/p>\u003Cp>In short, EyeQ Kit opens up a whole new world of possibilities for the automakers whose vehicles we’re committed to enhancing with our cutting-edge technologies. 
Visit the \u003Ca href=\"https://www.mobileye.com/solutions/eyeq-kit/\" rel=\"noopener noreferrer\" target=\"_blank\">EyeQ Kit\u003C/a> and \u003Ca href=\"https://www.mobileye.com/technology/eyeq-chip/\" rel=\"noopener noreferrer\" target=\"_blank\">EyeQ SoC\u003C/a> pages on the Mobileye website to learn more.\u003C/p>","2022-07-05T00:00:00.000Z",{"id":1481,"type":5,"url":1482,"title":1483,"description":1484,"primary_tag":16,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1484,"image":1485,"img_alt":1486,"content":1487,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":32,"publish_date":1488,"tags":1489},128,"cloud-enhanced-driver-assist","Cloud-Enhanced Driver-Assist™ Takes ADAS to the Next Level","Mobileye’s latest solution combines our expertise in advanced driver-assistance systems with the power of our cloud-based, crowdsourced mapping technology.","https://static.mobileye.com/website/us/corporate/images/0b09cc249fce9cd71ff576f58e79e03f_1655801167544.png","Cloud-Enhanced Driver-Assist™ brings the power of REM to driver assistance.","\u003Cp>As our world grows more connected, the challenge that arises is not merely collecting information, but getting the right information to the right place at the right time. And that challenge grows even more pressing when information has the potential to enhance road safety. 
Our answer to that call is encapsulated in Cloud-Enhanced Driver-Assist&trade;.\u003C/p>\n\u003Cp>The latest addition to \u003Ca href=\"https://www.mobileye.com/solutions/\" target=\"_blank\" rel=\"noopener noreferrer\">our portfolio of mobility solutions\u003C/a>, Cloud-Enhanced Driver-Assist brings together two core areas of our expertise &ndash; ADAS and REM&trade; &ndash; and packages them in an innovative new solution, taking driver-assistance technology to the next level.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/d91146d8379525f1bfd9894bada6720a_1655803522487.png\" alt=\"Cloud-Enhanced Driver-Assist networks between vehicles, the cloud, and the Mobileye Roadbook.\" />\u003C/p>\n\u003Cp>\u003Cstrong>The Next Leap in Driver Assistance \u003C/strong>\u003C/p>\n\u003Cp>Imagine you&rsquo;re driving along a multi-lane road. Maybe it&rsquo;s snowing, or raining, or foggy, or dark. Maybe the lane markers have worn away with time and haven&rsquo;t been repainted in a while. You can&rsquo;t see them with your own eyes, and the vehicle&rsquo;s onboard sensors may not be able to, either. But Mobileye&rsquo;s system knows where they are, and our Cloud-Enhanced Driver-Assist solution helps the vehicle to stay centered in the lane &ndash; keeping you on the proverbial straight and narrow.\u003C/p>\n\u003Cp>Or picture arriving at a busy intersection. There&rsquo;s a whole mess of traffic lights &ndash; one or more sets for each lane in each direction of traffic. Cloud-Enhanced Driver-Assist knows exactly which lights are relevant for the lane you&rsquo;re in, enabling the vehicle to alert you (or even apply the brakes itself) if you&rsquo;re about to \u003Ca href=\"https://www.iihs.org/topics/red-light-running/\" target=\"_blank\" rel=\"noopener noreferrer\">roll through a red light\u003C/a>.\u003C/p>\n\u003Cp>That construction zone around the next bend or over the next rise? 
A toll gate coming up fast on that highway exit ramp? The speed at which traffic actually travels (independent of the posted limit) on a given stretch of road? Or the line drivers typically take around a corner? Cloud-Enhanced Driver-Assist can access all that information &ndash; thanks to innovations from our autonomous-vehicle program &ndash; to augment the ADAS performance in today&rsquo;s human-driven vehicles.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/ec9d916d6f0c48c4e95908c83eaf4851_1655803556219.png\" alt=\"Road Experience Management (REM) gathers information about the driving environment to create and update the Mobileye Roadbook.\" />\u003C/p>\n\u003Cp>\u003Cstrong>The Power of the Crowd, Delivered Through the Cloud\u003C/strong>\u003C/p>\n\u003Cp>The key to the enhanced capabilities of Cloud-Enhanced Driver-Assist is our \u003Ca href=\"https://www.mobileye.com/news/mobileye-wins-prestigious-2020-pace-award-for-rem-mapping-tech/\" target=\"_blank\" rel=\"noopener\">award-winning\u003C/a> Road Experience Management&trade; (REM) technology. Originally developed for autonomous vehicles, REM is applied here to augment driver assistance in a manifestation of the mutually reinforcing relationship between our technologies.\u003C/p>\n\u003Cp>REM unlocks the power latent in the proliferation of our advanced computer-vision driver-assistance technologies. A rapidly growing crowd of more than 1.5 million vehicles equipped with \u003Ca href=\"https://www.mobileye.com/technology/eyeq-chip/\" target=\"_blank\" rel=\"noopener noreferrer\">our latest chips\u003C/a> continuously feeds REM with a stream of data on the roads they travel. 
REM uploads that anonymous data in small packets to the cloud, where it&rsquo;s automatically compiled into the \u003Ca href=\"https://www.mobileye.com/blog/av-maps-vs-hd-maps/\" target=\"_blank\" rel=\"noopener noreferrer\">Mobileye Roadbook&trade;\u003C/a> &ndash; our highly precise, continuously refreshing map of the worldwide driving environment.\u003C/p>\n\u003Cp>We&rsquo;ve developed this method to be more efficient than the typical industry practice of dispatching dedicated LiDAR mapping vehicles. And, of course, it provides more up-to-date information that is&nbsp;relevant both \u003Ca href=\"https://www.mobileye.com/blog/rem-mapping-avs/\" target=\"_blank\" rel=\"noopener noreferrer\">for autonomous vehicles\u003C/a>, and \u003Ca href=\"https://www.mobileye.com/blog/rem-adas-data/\" target=\"_blank\" rel=\"noopener noreferrer\">for enhanced driver-assistance\u003C/a> as well.\u003C/p>\n\u003Cp>Think of it, in essence, as another sensor &ndash; millions of sensors, really &ndash; on top of the vehicle&rsquo;s onboard cameras. 
Only it&rsquo;s delivered purely by software, without the need for any additional hardware.\u003C/p>\n\u003Cp>Mobileye&rsquo;s Nimrod Nehushtan speaks about \u003Ca href=\"https://www.mobileye.com/technology/rem/\" target=\"_blank\" rel=\"noopener noreferrer\">Road Experience Management\u003C/a> and its application to ADAS in the video below.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/dnPbTwM4UD4\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>In our Cloud-Enhanced Driver-Assist solution, REM and the Mobileye Roadbook combine with more than two decades of leadership in the realm of advanced driver-assistance systems and the experience of \u003Ca href=\"https://www.mobileye.com/blog/100-million-eyeq-chips/\" target=\"_blank\" rel=\"noopener noreferrer\">more than 100 million vehicles\u003C/a> equipped with our ADAS technologies. Dozens of \u003Ca href=\"https://www.mobileye.com/blog/toyota-zf-adas/\" target=\"_blank\" rel=\"noopener noreferrer\">the world&rsquo;s leading automakers count on our tech\u003C/a> to enable the ADAS features in hundreds of models they sell in markets around the world. 
And several of those automakers have already begun to adopt our new Cloud-Enhanced Driver-Assist solution.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/e3587fccad43f54452ec103ce3b2a495_1658131382047.png\" alt=\"The elements that make up Mobileye's Cloud-Enhanced Driver-Assist solution.\" />\u003C/p>\n\u003Cp>\u003Cstrong>Coming to a Showroom Near You\u003C/strong>\u003C/p>\n\u003Cp>At CES earlier this year, we \u003Ca href=\"https://www.mobileye.com/news/mobileye-ces-2022-partner-news/\" target=\"_blank\" rel=\"noopener noreferrer\">announced deals with two major automakers\u003C/a> to enhance their most advanced driver-assistance systems with our mapping technology.\u003C/p>\n\u003Cp>The Volkswagen Group is employing the Mobileye Roadbook to enable its Travel Assist with Swarm Data. The system is already offered across Europe in electric vehicles (based on VW&rsquo;s MEB platform) from the Volkswagen, Seat, and Skoda brands. This technology helped the \u003Ca href=\"https://www.volkswagen-newsroom.com/en/press-releases/the-new-id5-achieves-highest-score-in-the-euro-ncap-driver-assistance-test-7957\" target=\"_blank\" rel=\"noopener noreferrer\">Volkswagen ID.5 achieve the highest possible score\u003C/a> in Euro NCAP&rsquo;s \u003Ca href=\"https://www.euroncap.com/en/press-media/press-releases/nissan-qashqai-and-vw-id5-excel-in-highway-assist/\" target=\"_blank\" rel=\"noopener noreferrer\">recent evaluation of highway assistance systems\u003C/a>.\u003C/p>\n\u003Cp>Volkswagen Group chief executive Dr. Herbert Diess and Mobileye CEO Prof. 
Amnon Shashua discussed the partnership (and went for a drive in a VW ID.4 equipped with the system) in the video below.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/SnJXdfrybLI\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/blog/ford-bets-big-on-mobileye-tech/\" target=\"_blank\" rel=\"noopener noreferrer\">Ford\u003C/a> is similarly utilizing the power of our REM technology to expand the scope of its BlueCruise hands-free highway pilot system. &ldquo;REM mapping technology in our future versions of BlueCruise are really important for our hands-free driving solutions,&rdquo; Ford CEO Jim Farley told Prof. Shashua in the video below. &ldquo;We just couldn&rsquo;t offer the systems we do at Ford without you, and we&rsquo;re betting on Mobileye for our future.&rdquo;\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/J0SVWiDienk\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>These are just the first two out of nine automakers currently working to implement our Cloud-Enhanced Driver-Assist solution into their vehicles.\u003C/p>\n\u003Cp>\u003Cstrong>The Full Spectrum of Mobileye Solutions\u003C/strong>\u003C/p>\n\u003Cp>With the arrival of Cloud-Enhanced Driver-Assist, our range grows to five distinct (yet mutually reinforcing) solutions for driver assistance and autonomous driving.\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/solutions/\" target=\"_blank\" rel=\"noopener noreferrer\">\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/58715a93c0009733151f005d0ca7dbab_1658131337587.png\" alt=\"The full spectrum of mobility solutions from Mobileye, from driver assistance to autonomous driving.\" />\u003C/a>\u003C/p>\n\u003Cp>Our 
spectrum now encompasses base driver-assist, Cloud-Enhanced Driver-Assist&trade;, Mobileye SuperVision&trade; (our next-generation premium driver-assist system), Mobileye Chauffeur&trade; (our turnkey solution for consumer autonomous vehicles), and \u003Ca href=\"https://www.mobileye.com/blog/mobileye-drive-self-driving-system/\" target=\"_blank\" rel=\"noopener noreferrer\">Mobileye Drive&trade;\u003C/a> (for commercial autonomous vehicles).\u003C/p>\n\u003Cp>Visit \u003Ca href=\"https://www.mobileye.com/solutions/\" target=\"_blank\" rel=\"noopener noreferrer\">our new solutions page\u003C/a> to discover the full spectrum, and watch this space for more to come.\u003C/p>","2022-06-22T07:00:00.000Z","Mapping & REM, ADAS",{"id":1491,"type":5,"url":1492,"title":1493,"description":1494,"primary_tag":190,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1494,"image":1495,"img_alt":1496,"content":1497,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1498,"tags":997},125,"autonomous-vehicle-technology-everywhere-in-every-way-for-everyone","Autonomous Vehicle Technology: Everywhere, in Every Way, for Everyone","To celebrate National Autonomous Vehicle Day, we delve into the scalable approach behind the various aspects of our self-driving technologies.","https://static.mobileye.com/website/us/corporate/images/d664f6c35a83753fee650c4fc90cb7af_1653991677612.png","Mobileye is working to bring the benefits of self-driving technology everywhere, in every way, for everyone.","\u003Cp>At Mobileye, \u003Ca href=\"https://www.mobileye.com/blog/what-drives-us/\" target=\"_blank\" rel=\"noopener noreferrer\">we&rsquo;re on a mission\u003C/a> to bring the benefits of self-driving technology everywhere, in every way, for everyone.\u003C/p>\n\u003Cp>This is not just a marketing slogan for us here at Mobileye. 
It&rsquo;s the central philosophy that guides us in developing our technologies to scale: to locations around the world, in a variety of applications, and for the mass market.\u003C/p>\n\u003Cp>To celebrate National Autonomous Vehicle Day in the United States, we&rsquo;re pleased to expand on what we mean by &ldquo;everywhere, in every way, for everyone.&rdquo; Join us for a glimpse behind the curtain at our scalable-by-design approach to self-driving technologies.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/17965c01ad1c5facf03199b36fec2eb0_1653991518574.png\" alt=\"Mobileye is working to bring the benefits of self-driving technology everywhere\" />\u003C/p>\n\u003Cp>\u003Cstrong>Testing Around the World\u003C/strong>\u003C/p>\n\u003Cp>To pave the way towards the rapidly approaching future of autonomous vehicles, Mobileye has spent the past several years testing our AVs not in a single, geo-fenced environment, but in real-world conditions, across a variety of locations around the globe.\u003C/p>\n\u003Cp>To date, our AV test fleet has tackled roads in Jerusalem, Germany, France, the United States, Japan, and China. Each \u003Ca href=\"https://www.mobileye.com/news/autonomous-vehicle-testing-miami-stuttgart/\" target=\"_blank\" rel=\"noopener\">new location\u003C/a> poses fresh challenges for our technology to overcome and new environments to put our tech to the test, whether on rural, urban, suburban, or interurban roadways. 
And we don&rsquo;t steer clear of the toughest city-center conditions in places like \u003Ca href=\"https://youtu.be/lWExNC25Pd4\" target=\"_blank\" rel=\"noopener noreferrer\">Manhattan\u003C/a>, \u003Ca href=\"https://youtu.be/Q69tBNCVJa0\" target=\"_blank\" rel=\"noopener noreferrer\">Paris\u003C/a>, and \u003Ca href=\"https://youtu.be/HLczErXDujI\" target=\"_blank\" rel=\"noopener\">Jerusalem\u003C/a>.\u003C/p>\n\u003Cp>\u003Cstrong>Road Experience Management&trade;\u003C/strong>\u003C/p>\n\u003Cp>The breadth and scope of these locations are enabled by underlying technologies designed for global scalability and adaptability. Our \u003Ca href=\"https://www.mobileye.com/blog/rem-mapping-avs/\" target=\"_blank\" rel=\"noopener noreferrer\">Road Experience Management (REM&trade;)\u003C/a> mapping system, for example, crowdsources data from the cameras in some 1.5 million REM-enabled vehicles already on the road equipped with our computer-vision technology. We&rsquo;ve found this combination of cost-effective cameras and the power of the crowd to be far more efficient and scalable than the typical method of scanning by dedicated LiDAR mapping vehicles.\u003C/p>\n\u003Cp>REM compiles this data into the Mobileye Roadbook&trade;, our highly precise AV map of the driving environment worldwide. REM has already mapped billions of kilometers of roadway around the world, and is currently mapping new roads (and continuously updating the existing map) at a rate of millions of kilometers every day.\u003C/p>\n\u003Cp>When our AVs reach a new location that hasn&rsquo;t been added to the Mobileye Roadbook yet, all we need to do is push a proverbial button to compile the maps from data we have already collected, and our autonomous vehicles are good to go.\u003C/p>\n\u003Cp>\u003Cstrong>Responsibility-Sensitive Safety\u003C/strong>\u003C/p>\n\u003Cp>Our Responsibility-Sensitive Safety model (RSS) is similarly developed with scalability at its core. 
RSS is our open mathematical model for AV safety, designed to engender public trust in self-driving technologies.\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/technology/responsibility-sensitive-safety/\" target=\"_blank\" rel=\"noopener noreferrer\">RSS consists of five universal &ldquo;rules of the road&rdquo;\u003C/a> through which all of the AV&rsquo;s decisions are filtered. Because these rules are transparent and independently verifiable, RSS is designed to be integrated into a broad array of standards and regulations across industry and government. And its parameters can be adjusted to fit local driving cultures in different parts of the world.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/5e3c56eaf008c40187ac5b60a20c0ca5_1653991540577.png\" alt=\"Mobileye is working to unlock the benefits of self-driving technology in every way\" />\u003C/strong>\u003C/p>\n\u003Cp>\u003Cstrong>Different Solutions for Different Applications\u003C/strong>\u003C/p>\n\u003Cp>The product of all our research and development of self-driving technologies takes \u003Ca href=\"https://www.mobileye.com/solutions/\" target=\"_blank\" rel=\"noopener noreferrer\">many forms\u003C/a>.\u003C/p>\n\u003Cp>Mobileye Chauffeur&trade;, for example, is designed for \u003Cem>consumer\u003C/em> autonomous vehicles &ndash; like the one we&rsquo;re \u003Ca href=\"http://zgh.com/media-center/news/2022-01-05/?lang=en\" target=\"_blank\" rel=\"noopener noreferrer\">currently developing with Zeekr\u003C/a>.\u003C/p>\n\u003Cp>Meanwhile, Mobileye Drive&trade; is designed to enable autonomous \u003Cem>commercial\u003C/em> vehicles &ndash; such as robotaxis, self-driving shuttles, and autonomous delivery platforms. 
It&rsquo;s already being integrated by a range of customers and partners including Udelv, Transdev, Lohr, Benteler, Beep, Schaeffler, Moovit, and Sixt.\u003C/p>\n\u003Cp>Both of these turnkey self-driving solutions incorporate passive (cameras) and active (radar and LiDAR) sensors, developed and operating independently of each other, under an approach we call \u003Ca href=\"https://www.mobileye.com/blog/av-safety-demands-true-redundancy/\" target=\"_blank\" rel=\"noopener noreferrer\">True Redundancy&trade;\u003C/a>. This method tasks each of the two parallel subsystems with creating complete and independent models of the environment on which the driving policy can then base its decisions. Compared to the typical industry approach of sensor fusion (which relies on one combined model of the driving environment), this approach results in a more robust sensing system and creates an additional failsafe.\u003C/p>\n\u003Cp>An added benefit of the True Redundancy approach is that we&rsquo;re able to take the camera-only subsystem from our developmental AVs and adapt it into Mobileye SuperVision&trade;. This \u003Cspan style=\"color: black;\">hands-free/eyes-on\u003C/span> premium driver-assistance system is already in production in the Zeekr 001.\u003C/p>\n\u003Cp>\u003Cstrong>The EyeQ&reg; Family of SoCs\u003C/strong>\u003C/p>\n\u003Cp>All of these solutions and more are built around \u003Ca href=\"https://www.mobileye.com/technology/eyeq-chip/\" target=\"_blank\" rel=\"noopener noreferrer\">EyeQ\u003C/a>, our family of automotive-grade Systems-on-Chip.\u003C/p>\n\u003Cp>Now in its \u003Ca href=\"https://www.mobileye.com/blog/eyeq6-system-on-chip/\" target=\"_blank\" rel=\"noopener noreferrer\">sixth generation\u003C/a>, EyeQ chips have been incorporated into \u003Ca href=\"https://www.mobileye.com/blog/100-million-eyeq-chips/\" target=\"_blank\" rel=\"noopener noreferrer\">more than 100 million\u003C/a> vehicles to date. 
Each successive iteration is based on the same core expertise and architecture, and builds upon the accumulated experience of the generations that have come before.\u003C/p>\n\u003Cp>EyeQ is a highly efficient, extensively proven, and broadly scalable family of SoCs that&rsquo;s trusted by customers around the world to handle a broad spectrum of advanced mobility applications. It&rsquo;s the brain behind everything Mobileye does, from driver-assistance and autonomous-driving systems to retrofit collision-avoidance devices and data services for transportation infrastructure and smart city planning.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/48c9fef3861d2e08777605816c66d399_1653991563018.png\" alt=\"Mobileye is working to bring the benefits of self-driving technology to everyone\" />\u003C/p>\n\u003Cp>\u003Cstrong>Self-Driving Mobility for the Masses\u003C/strong>\u003C/p>\n\u003Cp>From the dawn of the automobile more than a century ago through to the present day, the use of a private automobile has generally been available only to those with the means to buy one, and the ability (and license) to drive one. By removing the driver from the equation, however, self-driving Mobility-as-a-Service stands to open up private mobility, at a more attainable cost, to a far wider userbase &ndash; including children, the elderly, and individuals with physical and mental disabilities.\u003C/p>\n\u003Cp>Mobileye is working with an array of partners to bring self-driving mobility services to locations around the world. 
We&rsquo;ve already run our first \u003Ca href=\"https://www.mobileye.com/blog/paris-ratp-autonomous-vehicle-testing-pilot/\" target=\"_blank\" rel=\"noopener noreferrer\">self-driving MaaS pilot project in France\u003C/a>, and have broader \u003Ca href=\"https://www.mobileye.com/blog/self-driving-robotaxi-sixt-germany-iaa/\" target=\"_blank\" rel=\"noopener noreferrer\">services due to commence in both Germany\u003C/a> and Jerusalem before the end of this year &ndash; with additional projects already in varying states of progress with partners worldwide.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>After years of development and decades of leadership, we&rsquo;ve come to embrace the power of technology and the promise of the autonomous future to transform the way people and goods get around. And we&rsquo;re working to make that long-held dream a reality &ndash; not just somewhere, in some ways, for some people, but everywhere, in every way, for everyone.\u003Cbr />\u003Cbr />\u003C/p>","2022-05-31T07:00:00.000Z",{"id":1500,"type":5,"url":1501,"title":1502,"description":1503,"primary_tag":9,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1503,"image":1504,"img_alt":1505,"content":1506,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1507,"tags":563},124,"eyeq6-system-on-chip","Meet EyeQ®6: Our Most Advanced Driver-Assistance Chips Yet","The latest additions to our family of Systems-on-Chip, EyeQ®6L and EyeQ®6H bring new levels of performance and efficiency to core and premium ADAS.","https://static.mobileye.com/website/us/corporate/images/3a88e49ba0fc0d78fce7bda449debb97_1652799157545.png","EyeQ6 High System-on-Chip for premium advanced driver-assistance systems","\u003Cp>Driving the road to the future of automobility demands multi-focal vision &ndash; \u003Ca href=\"https://www.mobileye.com/blog/what-drives-us/\" target=\"_blank\" rel=\"noopener noreferrer\">the kind of 
vision that Mobileye is built upon\u003C/a>. So, while we motor on ahead towards the rapidly approaching horizon of self-driving vehicles, we&rsquo;re also constantly looking back in our mirrors at the road we&rsquo;ve traveled to get here, watching the road we&rsquo;re on right now, and preparing for what&rsquo;s just around the next bend.\u003C/p>\n\u003Cp>What we see surrounding us on that road are cars and trucks enhanced by our driver-assist technology &ndash; and we&rsquo;re out to make them even better. Enter: EyeQ6, our latest generation of Systems-on-Chip for advanced driver-assistance systems.\u003C/p>\n\u003Cp>\u003Cstrong>Built on Decades of Experience\u003C/strong>\u003C/p>\n\u003Cp>As the newest member of \u003Ca href=\"https://www.mobileye.com/technology/eyeq-chip/\" target=\"_blank\" rel=\"noopener noreferrer\">the EyeQ&reg; family\u003C/a>, EyeQ6 builds upon the five generations of our SoCs that have come before. Since the first iteration began production in 2007, \u003Ca href=\"https://www.mobileye.com/blog/100-million-eyeq-chips/\" target=\"_blank\" rel=\"noopener noreferrer\">we have shipped over 100 million EyeQs to date\u003C/a> (and counting). 
The bulk of those chips have gone into consumer vehicles produced by dozens of \u003Ca href=\"https://www.mobileye.com/blog/toyota-zf-adas/\" target=\"_blank\" rel=\"noopener noreferrer\">the world&rsquo;s leading automakers\u003C/a> to power the driver-assistance features in hundreds of models currently on sale around the world.\u003C/p>\n\u003Cp>By employing highly efficient hardware accelerator architecture, EyeQ achieves state-of-the-art computer-vision performance, while requiring relatively \u003Ca href=\"https://www.mobileye.com/blog/why-tops-arent-tops-when-it-comes-to-av-processors/\" target=\"_blank\" rel=\"noopener noreferrer\">modest TOPS numbers\u003C/a> and consuming low levels of power &ndash; delivering an extraordinary cost/performance ratio.\u003C/p>\n\u003Cp>EyeQ6 benefits from the expertise we&rsquo;ve accrued over the course of the past couple of decades. That experience and know-how are encapsulated in this new chip, which comes in two versions, each designed for a different type of ADAS application. And each delivers even greater performance and efficiency than any previous iteration of EyeQ.\u003C/p>\n\u003Cp>\u003Cstrong>EyeQ6 Lite: high efficiency for core ADAS\u003C/strong>\u003C/p>\n\u003Cp>Engineered to support Level 1-2 driver-assistance, EyeQ6L (or &ldquo;Lite&rdquo;) boasts the best combination of high performance, low power consumption, and optimal cost efficiency of any SoC we have ever made. 
It&rsquo;s a one-box windshield solution capable of supporting all core ADAS applications.\u003C/p>\n\u003Cp>EyeQ6 Lite offers four-and-a-half times more compute power than EyeQ4 Mid, yet consumes a similar amount of power &ndash; all in a package roughly half the size.\u003C/p>\n\u003Cp>As our newest solution for core ADAS applications, EyeQ6L stands to become the most prolific new member of the EyeQ family.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/f9d4fd7293dcda0f7f35ae19cdd0e4b7_1652799310152.jpg\" alt=\"EyeQ6 Lite System-on-Chip for core ADAS applications\" />\u003C/p>\n\u003Cp>\u003Cstrong>EyeQ6 High: centralized chip for premium ADAS\u003C/strong>\u003C/p>\n\u003Cp>To power Level 2+ systems and above, we present EyeQ6H (or &ldquo;High&rdquo;) &ndash; the ultimate compute platform for premium driver-assistance and partial autonomous driving.\u003C/p>\n\u003Cp>EyeQ6H boasts three times the compute power of the preceding EyeQ5H chip, yet consumes just 25% more power. With such a dramatic increase in its performance-to-consumption ratio, our newest premium ADAS chip is capable of supporting even more advanced driver-assistance features than its predecessor. But EyeQ6H doesn&rsquo;t just do \u003Cem>better\u003C/em> than previous iterations of EyeQ &ndash; it also does \u003Cem>more\u003C/em>.\u003C/p>\n\u003Cp>We&rsquo;ve built a dedicated image signal processor (ISP), graphics processing unit (GPU), and video encoder into EyeQ6H. And we&rsquo;ve opened up our internal development tools to allow our customers to host third-party applications directly on the SoC. So EyeQ6H can support full-surround cameras not only for driver assistance features, but also for visualization for the human driver.\u003C/p>\n\u003Cp>Such functions might include a bird&rsquo;s-eye-view display and video recording, as well as driver monitoring and automated parking &ndash; all hosted directly on the SoC. 
That means the vehicle requires fewer additional electronic control units (ECUs), which in turn means less engineering and lower cost. It also means that, once the camera feeds are channeled into EyeQ6H, we can run computer-vision algorithms on them for human-machine interface, augmented reality, and virtual reality display &ndash; just as we do with our surround cameras for driver assistance.\u003C/p>\n\u003Cp>This powerful new feature set elevates EyeQ6H into an all-encompassing, centralized, single-chip solution for all premium ADAS applications. The pairing of two EyeQ6H chips is slated to power the next generation of \u003Ca href=\"https://www.mobileye.com/solutions/super-vision/\" target=\"_blank\" rel=\"noopener noreferrer\">Mobileye SuperVision&trade;\u003C/a> &ndash; our hands-free/eyes-on Level 2++ system &ndash; and can even support Level 3 and Level 4 systems.\u003C/p>\n\u003Cp>\u003Cstrong>Eyes on the Road Ahead\u003C/strong>\u003C/p>\n\u003Cp>Engineering samples of EyeQ6L have already been delivered to customers, with production anticipated to commence around this time next year. 
As to EyeQ6H, we expect the first samples later this year, with volume production scheduled to begin by the end of 2024.\u003C/p>\n\u003Cp>With greater performance, efficiency, and cost-effectiveness than ever before, EyeQ6 stands to further enhance the driver-assistance technology market that Mobileye has been leading for the past two decades &ndash; and help drive the evolution towards autonomous mobility.\u003C/p>","2022-05-26T00:00:00.000Z",{"id":1509,"type":24,"url":1510,"title":1511,"description":1512,"primary_tag":954,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1512,"image":1513,"img_alt":1514,"content":1515,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1516,"tags":1517},122,"autonomous-vehicle-testing-miami-stuttgart","Autonomous on the Streets of Miami and Stuttgart","With the addition of two more locations in Florida and Germany, we have tested AVs in ten cities in six countries across three continents around the world.","https://static.mobileye.com/website/us/corporate/images/2e1f3ee26f2da03071bbd121f032c0fb_1651055587131.jpg","Mobileye autonomous vehicle testing in Miami, Florida","\u003Cp>An autonomous vehicle capable of operating in one place may be of benefit to those in that particular location &ndash; but Mobileye&rsquo;s stated goal is to develop self-driving technology to be easily deployable anywhere and everywhere.\u003C/p>\n\u003Cp>That&rsquo;s why we&rsquo;ve been spending the past several years \u003Ca href=\"https://www.mobileye.com/blog/robotaxi-night-drive-jerusalem-unedited-video/\" target=\"_blank\" rel=\"noopener noreferrer\">testing our self-driving vehicles\u003C/a> not in carefully selected and geofenced environments, but rather alongside actual traffic, in real-life conditions, in locations around the world. 
And now we&rsquo;ve expanded our global footprint yet again by commencing testing of our autonomous vehicles in both Miami and Stuttgart.\u003C/p>\n\u003Cp>\u003Cstrong>AV Everywhere\u003C/strong>\u003C/p>\n\u003Cp>Beyond its image of exotic cars cruising sun-soaked streets, downtown Miami is characterized by heavy congestion, with significant \u003Ca href=\"https://www.mobileye.com/blog/avs-and-the-drive-for-pedestrian-safety/\" target=\"_blank\" rel=\"noopener noreferrer\">pedestrian\u003C/a>, cyclist, and even skateboard and inline-skating traffic.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/ipc1o5M0lRQ\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>Meanwhile, Stuttgart &ndash; aside from being widely known as the birthplace of the automobile and a major hub of the global automotive industry &ndash; presents a complex mix of steep landscapes and driving on urban, rural, and high-speed (often even derestricted) Autobahn highways.\u003C/p>\n\u003Cp>These latest locations join our AV testing programs previously launched in \u003Ca href=\"https://youtu.be/vL_QNy25n74\" target=\"_blank\" rel=\"noopener noreferrer\">Detroit\u003C/a>, \u003Ca href=\"https://www.mobileye.com/press-kit/press-kit-mobileye-new-york-city/\" target=\"_blank\" rel=\"noopener\">New York\u003C/a>, \u003Ca href=\"https://www.mobileye.com/news/mobileye-releases-unedited-footage-of-a-40-minute-av-ride-through-jerusalem/\" target=\"_blank\" rel=\"noopener\">Jerusalem\u003C/a>, \u003Ca href=\"https://youtu.be/ZSihbQDg2HA\" target=\"_blank\" rel=\"noopener noreferrer\">Tel Aviv\u003C/a>, \u003Ca href=\"https://www.mobileye.com/blog/munich-av-video/\" target=\"_blank\" rel=\"noopener noreferrer\">Munich\u003C/a>, \u003Ca href=\"https://www.mobileye.com/blog/paris-ratp-autonomous-vehicle-testing-pilot/\" target=\"_blank\" rel=\"noopener 
Paris">
noreferrer\">Paris\u003C/a>, Tokyo, and Shanghai. With the addition of Miami and Stuttgart, we now have testing programs in ten cities in six countries across three continents around the world.\u003C/p>\n\u003Cp>&ldquo;Mobileye is scaling its AV testing through the addition of new locations,&rdquo; notes Johann &ldquo;JJ&rdquo; Jungwirth, Vice President of Mobility-as-a-Service (MaaS) at Mobileye. &ldquo;Adding Miami and Stuttgart to our testing program represents a commitment to ensuring, first and foremost, safety, and, second, a &lsquo;quality&rsquo; rider experience, and at the same time strengthens the foundation for the large-scale commercialization of autonomous mobility services worldwide by confronting and coping with broad and complex driving scenarios, traffic rules, and unique conditions in various cities.&rdquo;\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/lVQuUPvy9GU\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>\u003Cstrong>Scalable by Design\u003C/strong>\u003C/p>\n\u003Cp>As with each city our autonomous vehicles have reached, the unique parameters presented by these latest locations pose a novel set of challenges and conditions for our vehicles to adapt to. And they&rsquo;ve adapted rather quickly: in fact, our AVs were cruising the streets of Miami within just two weeks of arrival.\u003C/p>\n\u003Cp>Central to that quick turnaround and Mobileye&rsquo;s &ldquo;AV everywhere&rdquo; strategy is our \u003Ca href=\"https://www.mobileye.com/technology/rem/\" target=\"_blank\" rel=\"noopener noreferrer\">Road Experience Management&trade;\u003C/a> mapping technology. By crowdsourcing data from millions of vehicles already on the road equipped with our technology, REM&trade; can map new roadways quickly, efficiently, and exactingly &ndash; at the push of a button and in near-real time. 
The resulting Mobileye Roadbook&trade; provides our autonomous vehicles with \u003Ca href=\"https://www.mobileye.com/blog/rem-mapping-avs/\" target=\"_blank\" rel=\"noopener noreferrer\">highly detailed, precise, and up-to-date information\u003C/a> about the driving environment, supplementing the vehicle&rsquo;s onboard sensors.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/73b5342ec7a22fb9e21bc13e48a058e1_1651055665297.jpg\" alt=\"Mobileye autonomous vehicle testing in Miami, Florida\" />\u003C/p>\n\u003Cp>The scalability of our approach to self-driving technology is further reinforced by our \u003Ca href=\"https://www.mobileye.com/technology/responsibility-sensitive-safety/\" target=\"_blank\" rel=\"noopener noreferrer\">Responsibility-Sensitive Safety\u003C/a> model. RSS yields both a lean driving policy that accounts for differences in local driving culture and an open framework for regulators to pave the way for \u003Ca href=\"https://www.mobileye.com/blog/av-scale-erez-dagan-jack-weast-reuters/\" target=\"_blank\" rel=\"noopener noreferrer\">the deployment of autonomous vehicles at scale\u003C/a>.\u003C/p>\n\u003Cp>From mapping and driving policy to sensing and processing, everything Mobileye builds, we build to scale. 
That&rsquo;s what enables us to bring the benefits of self-driving technology to everywhere, in every way, for everyone &ndash; from Miami to Stuttgart and beyond.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/991d3b4affc461445e64b3ab374b2ae7_1651055633626.jpg\" alt=\"Mobileye autonomous vehicle testing in Miami, Florida\" />\u003C/p>","2022-04-27T07:00:00.000Z","Autonomous Driving, Video, News",{"id":1519,"type":5,"url":1520,"title":1521,"description":1522,"primary_tag":954,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1522,"image":1523,"img_alt":1524,"content":1525,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":32,"publish_date":1526,"tags":1527},121,"robotaxi-night-drive-jerusalem-unedited-video","Robotaxi Night Drive Shows Full Sensing Suite in Action","Our fifth unedited autonomous-driving video follows our robotaxi on a nighttime cruise through Jerusalem, demonstrating the capabilities of True Redundancy™ sensing.","https://static.mobileye.com/website/us/corporate/images/2a917a2dd79ed4a885d1f7eb123d0239_1649762415256.png","Mobileye robotaxi on an autonomous night drive through Jerusalem","\u003Cp>Robotaxi deployment doesn&rsquo;t require a single technological innovation; it demands several &ndash; including processors, driving policy, maps, and multiple types of sensors &ndash; all working in unison.\u003C/p>\n\u003Cp>You may have seen some of these technologies demonstrated in our \u003Ca href=\"https://www.eetimes.com/is-av-software-driver-detecting-what-we-are-seeing/\" target=\"_blank\" rel=\"noopener noreferrer\">previous unedited autonomous-driving videos\u003C/a>. Now we&rsquo;ve released another. Only this time, it doesn&rsquo;t just show \u003Cem>some\u003C/em> of our self-driving technologies in action. 
It shows \u003Cem>all\u003C/em> of them, demonstrating how our robotaxi fleet will function in the real world.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/98ce77f9b83f78c22352e157a03f8bb5_1649754353821.jpg\" alt=\"Robotaxi fleet outside Mobileye headquarters in Jerusalem\" />\u003C/p>\n\u003Cp>\u003Cstrong>True Redundancy&trade; in Action\u003C/strong>\u003C/p>\n\u003Cp>The previous unedited AV videos we&rsquo;ve released to date have all been filmed in our camera-only developmental autonomous vehicles. This latest video, however, showcases our fully configured robotaxi (\u003Ca href=\"https://www.mobileye.com/blog/self-driving-robotaxi-sixt-germany-iaa/\" target=\"_blank\" rel=\"noopener noreferrer\">revealed at IAA last September\u003C/a>) in action.\u003C/p>\n\u003Cp>This vehicle incorporates an array of radar and LiDAR sensors, operating in tandem with our camera-based computer-vision subsystem, and demonstrating the capabilities of our \u003Ca href=\"https://www.mobileye.com/blog/av-safety-demands-true-redundancy/\" target=\"_blank\" rel=\"noopener noreferrer\">True Redundancy\u003C/a> approach to autonomous-vehicle sensing. 
Instead of &ldquo;fusing&rdquo; the feeds from all the sensors into a single model of the driving environment, True Redundancy separates them into two parallel subsystems to create two separate models of the driving environment &ndash; complementing each other and creating an additional safety net.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/6454f2f687acbaddf1d19d0c276cecc7_1649762433652.png\" alt=\"Johann &quot;JJ&quot; Jungwirth next to a Mobileye robotaxi at night in Jerusalem\" />\u003C/p>\n\u003Cp>\u003Cstrong>Nighttime Driving in a Challenging Environment\u003C/strong>\u003C/p>\n\u003Cp>The integration of the \u003Ca href=\"https://www.mobileye.com/blog/radar-lidar-next-generation-active-sensors/\" target=\"_blank\" rel=\"noopener noreferrer\">active sensors\u003C/a> isn&rsquo;t the only element that makes this drive stand out. It was also shot at night, forcing the sensing suite to cope with a difficult combination of low visibility and glaring lights. Like \u003Ca href=\"https://www.mobileye.com/news/mobileye-releases-unedited-footage-of-a-40-minute-av-ride-through-jerusalem/\" target=\"_blank\" rel=\"noopener\">our first unedited drive video\u003C/a>, it was also shot in Jerusalem &ndash; a city known for its difficult combination of challenging driving culture and winding, undulating roads. But our robotaxi handled it all smoothly.\u003C/p>\n\u003Cp>The multi-stop drive also simulated how our robotaxis are designed to operate in the real world. 
Commercial operations are slated to commence in both \u003Ca href=\"https://www.mobileye.com/blog/munich-av-video/\" target=\"_blank\" rel=\"noopener noreferrer\">Germany\u003C/a> and Jerusalem later this year, picking up and dropping off passengers as they make their way across town.\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/news/mobileye-avs-now-driving-with-full-sensing-suite/\" target=\"_blank\" rel=\"noopener noreferrer\">Read the news release\u003C/a> and watch the full, unedited 40-minute video below to see how our robotaxi is driving us that much closer to our goal of delivering self-driving mobility everywhere, in every way, for everyone.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/pDyMzz8HMIc\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>","2022-04-12T07:00:00.000Z","Video, Autonomous Driving, Driverless MaaS",{"id":1529,"type":24,"url":1530,"title":1531,"description":1532,"primary_tag":28,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1532,"image":1533,"img_alt":1534,"content":1535,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1526,"tags":1536},167,"mobileye-avs-now-driving-with-full-sensing-suite","Mobileye AVs Now Driving with Full Sensing Suite","Now equipped with True Redundancy sensing, Mobileye AVs display human-like driving skill in a new, unedited driving video.","https://static.mobileye.com/dev/website/us/corporate/images/3d8be41c9e388e164080ec03ce233f26_1663241886991.jpg","​Mobileye Vice President Johann Jungwirth takes viewers on a virtual drive through the hectic streets of Jerusalem in a robotaxi.","\u003Cp>\u003Cem>This news content was originally published on the Intel Corporation Newsroom.\u003C/em>\u003C/p>\n\u003Cp>\u003Cstrong style=\"background-color: #ffffff; color: 
What&rsquo;s New:">
#262626;\">What&rsquo;s New:\u003C/strong>\u003Cspan style=\"background-color: #ffffff; color: #262626;\"> Mobileye, an Intel company, today showed its True Redundancy&trade; sensing system operating hands-free &ndash; a major milestone in preparation for the debut of its planned robotaxi services in Jerusalem and Germany. A new, \u003C/span>\u003Ca style=\"background-color: #ffffff; color: #0068b5;\" href=\"https://youtu.be/pDyMzz8HMIc\" target=\"_blank\" rel=\"noopener noreferrer\">unedited video\u003C/a>\u003Cspan style=\"background-color: #ffffff; color: #262626;\">&nbsp;shows the vehicle operating in autonomous mode while mimicking the multi-stop behavior of a ride-hailing service with humanlike skill.\u003C/span>\u003C/p>\n\u003Cp>\u003Cem>&ldquo;Mobileye Drive&trade; with True Redundancy defies industry norms with separate sensing subsystems that act as backups to one another. The very normal way in which the vehicle navigates very complex scenarios proves the value in this approach.&rdquo;\u003C/em>\u003C/p>\n\u003Cp>&ndash;Johann Jungwirth, vice president of mobility-as-a-service at Mobileye\u003C/p>\n\u003Cp>\u003Cstrong style=\"background-color: #ffffff; color: #262626;\">What It Means:&nbsp;\u003C/strong>\u003Cspan style=\"background-color: #ffffff; color: #262626;\">The video shows the Mobileye AV going through the motions of a robotaxi service, driving to multiple destinations and pausing where it might pick up and drop off passengers. 
In this fifth installment of the&nbsp;\u003C/span>\u003Ca style=\"background-color: #ffffff; color: #0068b5;\" href=\"https://www.youtube.com/playlist?list=PLWCfS_Yhbvs5MtQIjNfN-xLU30mc1qjEc\" target=\"_blank\" rel=\"noopener noreferrer\">unedited drive series\u003C/a>\u003Cspan style=\"background-color: #ffffff; color: #262626;\">, the capabilities of&nbsp;\u003C/span>\u003Ca style=\"background-color: #ffffff; color: #0068b5;\" href=\"https://www.mobileye.com/technology/true-redundancy/\" target=\"_blank\" rel=\"noopener\">True Redundancy\u003C/a>\u003Cspan style=\"background-color: #ffffff; color: #262626;\">, Mobileye&rsquo;s alternative approach to autonomous vehicle (AV) sensor fusion, are on full display as the Mobileye AV robotaxi navigates the complex streets of Jerusalem at night. While previous unedited videos have shown the AV driving only with the camera subsystem, this new installment comes from the fully configured&nbsp;\u003C/span>\u003Ca style=\"background-color: #ffffff; color: #0068b5;\" href=\"https://www.mobileye.com/news/mobileye-moves-garage-streets/\" target=\"_blank\" rel=\"noopener noreferrer\">AV\u003C/a>\u003Cspan style=\"background-color: #ffffff; color: #262626;\">&nbsp;that Mobileye is planning to use in commercial robotaxi deployments.&nbsp;&nbsp;\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong style=\"background-color: #ffffff; color: #262626;\">How It Works:\u003C/strong>\u003Cspan style=\"background-color: #ffffff; color: #262626;\"> True Redundancy is Mobileye&rsquo;s unique approach to environmental sensing whereby two independent subsystems &ndash; one camera-only and the other a lidar-radar combination &ndash; each serve as backups to each other instead of as complementary systems. The result is a sensing solution believed to deliver a higher mean time between failures. 
Prototype AVs now in operation are Mobileye&rsquo;s first to combine the two systems in a single vehicle, demonstrating how the robotaxi is expected to perform in real-world operations.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: #ffffff; color: #262626;\">In the 40-minute unedited video, the Mobileye AV is seen completing complex, real-world driving maneuvers despite harsh nighttime roadway lighting and complicated road signs. The very humanlike driving behavior of the AV comes across as remarkably unremarkable in that it handles very challenging maneuvers smoothly.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: #ffffff; color: #262626;\">Making it look easy, the AV negotiates with human drivers when executing a left turn at an unprotected junction (4:04; 06:48); safely and successfully navigates around jaywalking pedestrians (08:28; 10:42); seamlessly handles illegal maneuvers by other drivers (04:34); completes a 180-degree turn in an intersection with multiple traffic signals (25:18); navigates around vehicles blocking proper lane usage (25:39); rides through a roundabout with pedestrians (26:44); and completes other more regular driving maneuvers.\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong style=\"background-color: #ffffff; color: #262626;\">Why It Matters:\u003C/strong>\u003Cspan style=\"background-color: #ffffff; color: #262626;\">&nbsp;The demonstration of True Redundancy on real roads helps to dispel past industry skepticism that doubted whether Mobileye&rsquo;s cutting-edge approach to environmental sensing could work. More remarkable is the almost mundane quality to the video. The AV handles the drive more or less as a human would (and in some cases better), showing its near-readiness for planned robotaxi operations. 
Building on the already vast capabilities of Mobileye&rsquo;s camera-first AV development fleet, the addition of radar-lidar to its sensor suite is the final piece to achieving what the company set out to do with its differentiated AV technology.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: #ffffff; color: #262626;\">Operationalizing the True Redundancy system is a crucial milestone toward Mobileye&rsquo;s planned robotaxi service scheduled for later this year in&nbsp;\u003C/span>\u003Ca style=\"background-color: #ffffff; color: #0068b5;\" href=\"https://www.mobileye.com/news/mobileye-sixt-plan-new-robotaxi-service/\" target=\"_blank\" rel=\"noopener noreferrer\">Germany\u003C/a>\u003Cspan style=\"background-color: #ffffff; color: #262626;\">. Mobileye has started the permit and regulatory approval process in both countries to enable the company to begin removing safety drivers on public roads.\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong style=\"background-color: #ffffff; color: #262626;\">More Context:&nbsp;\u003C/strong>\u003Cspan style=\"background-color: #ffffff; color: #262626;\">Mobileye Drive &ndash; Mobileye&rsquo;s&nbsp;self-driving system &ndash; combines Mobileye&rsquo;s industry-leading technologies, including&nbsp;\u003C/span>\u003Ca style=\"background-color: #ffffff; color: #0068b5;\" href=\"https://www.mobileye.com/technology/rem/\" target=\"_blank\" rel=\"noopener\">Road Experience Management&trade;\u003C/a>\u003Cspan style=\"background-color: #ffffff; color: #262626;\">, the company&rsquo;s proprietary approach to mapping that leverages crowdsourced data from mass-market advanced driver-assistance systems to build AV maps on short notice; the&nbsp;\u003C/span>\u003Ca style=\"background-color: #ffffff; color: #0068b5;\" href=\"https://www.mobileye.com/technology/responsibility-sensitive-safety/\" target=\"_blank\" rel=\"noopener\">Responsibility-Sensitive Safety&nbsp;(RSS)\u003C/a>\u003Cspan style=\"background-color: #ffffff; 
">
color: #262626;\">&nbsp;driving policy that implements a mathematical model to enhance safety through improved adaptation to unique driving environments; and&nbsp;\u003C/span>\u003Ca style=\"background-color: #ffffff; color: #0068b5;\" href=\"https://www.mobileye.com/technology/true-redundancy/\" target=\"_blank\" rel=\"noopener\">True Redundancy,\u003C/a>\u003Cspan style=\"background-color: #ffffff; color: #262626;\">&nbsp;which combines two independent perception sub-systems powered by cameras and radar-lidar, with each alone capable of developing full models and ultimately supporting full end-to-end autonomous capabilities.\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong>About Mobileye\u003C/strong>\u003C/p>\n\u003Cp>Mobileye, an Intel Company, is leading the mobility revolution with its autonomous driving and driver-assist technologies, harnessing world-renowned expertise in computer vision, machine learning, mapping and data analysis. Our technology enables self-driving vehicles and mobility solutions, powers industry-leading advanced driver-assistance systems and delivers valuable intelligence to optimize mobility infrastructure. 
Mobileye pioneered such groundbreaking technologies as True Redundancy&trade; sensing, REM&trade; crowdsourced mapping, and Responsibility Sensitive Safety (RSS) technologies that are driving the ADAS and AV fields toward the future of mobility.&nbsp;For more information:&nbsp;\u003Ca style=\"color: #0068b5; background-color: transparent;\" href=\"https://www.mobileye.com/\" target=\"_blank\" rel=\"noopener\">www.mobileye.com\u003C/a>.\u003C/p>","News, Autonomous Driving",{"id":1538,"type":5,"url":1539,"title":1540,"description":1541,"primary_tag":397,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1541,"image":1542,"img_alt":1543,"content":1544,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1545,"tags":1546},120,"racing-for-innovation","Racing for Innovation","Virtual racing and automotive safety technology have more in common than you might think. Presenting the Mobileye GT World Challenge Esports Championships.","https://static.mobileye.com/website/us/corporate/images/9ddf96471319d14cbfb0dd1141996566_1648459161522.png","Mobileye-branded BMW M4 GT3 in the Mobileye GT World Challenge Esports Championship","\u003Cp>If you had to choose two of the most pivotal innovations in modern history, what would they be? For us here at Mobileye, they&rsquo;d surely be the automobile and the computer. It&rsquo;s at the confluence of these two that we continue to innovate &ndash; and we find ourselves in good company.\u003C/p>\n\u003Cp>Today the \u003Ca href=\"https://www.gt-world-challenge-europe.com/news/2272\" target=\"_blank\" rel=\"noopener noreferrer\">SRO Motorsports Group\u003C/a> announced our title sponsorship of the \u003Ca href=\"https://lp2.mobileye.com/esports\" rel=\"noopener noreferrer\">\u003Cu>Mobileye GT World Challenge Esports Championships\u003C/u>\u003C/a>. This marks our first foray into the exciting realms of motor racing and esports. 
But just what, exactly, does virtual racing have to do with the automotive safety tech for which Mobileye is known? A lot more than you might think....\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/855a189bed9234cded12065e1f9857d4_1648459208548.png\" alt=\"emblems for the Mobileye GT World Challenge eSports Championships and Mobileye Intercontinental GT Challenge eSports Championship\" />\u003C/p>\n\u003Cp>\u003Cstrong>Autonomous Vehicles and Virtual Racing\u003C/strong>\u003C/p>\n\u003Cp>Autonomous vehicles represent an emerging development that leverages new \u003Ca href=\"https://www.mobileye.com/blog/ceo-amnon-shashua-on-the-technological-megashifts-impacting-our-world/\" target=\"_blank\" rel=\"noopener noreferrer\">advancements in computational technology\u003C/a> to significantly disrupt the way we interact with motor vehicles. The same can be said of virtual racing... but that&rsquo;s not where the similarities end.\u003C/p>\n\u003Cp>Both AV tech and sim racing revolve around the digitization of the automobile and driving environment. We&rsquo;re transforming cars into computers on wheels; virtual racing puts the car onto the computer. We&rsquo;re \u003Ca href=\"https://www.mobileye.com/blog/rem-mapping-avs/\" target=\"_blank\" rel=\"noopener noreferrer\">digitally mapping the world&rsquo;s roadways\u003C/a>, while racing simulators digitally replicate the racetrack. The real commonality, however, lies in the fundamental change being unlocked by both developments.\u003C/p>\n\u003Cp>Competing in motorsports has traditionally been open only to a select few. In addition to skill, drivers need to be in the right age bracket, physical condition, and geographic location to compete seriously &ndash; not to mention the substantial financial resources required to engage in such an inherently expensive activity. 
By contrast, moving into the virtual domain opens competition to a much wider talent pool, where the cost of entry is limited principally to a decent computer (or console) and wheel/pedal setup.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/e3d36607a2387e614d0be993aa739059_1648460233059.jpg\" alt=\"a virtual racing driver in the Fanatec Esports GT Pro Series\" />\u003C/p>\n\u003Cp>That paradigm shift closely parallels the promise that self-driving vehicles hold, to \u003Ca href=\"https://www.mobileye.com/blog/national-autonomous-vehicle-day-how-avs-will-change-your-life/\" target=\"_blank\" rel=\"noopener noreferrer\">democratize independent automobility\u003C/a> for the masses. Because individuals don&rsquo;t need a driver&rsquo;s license to be transported via AV, their use will be open to everyone &ndash; regardless of age or level of physical or mental capability. Furthermore, the advent of \u003Ca href=\"https://www.mobileye.com/blog/self-driving-robotaxi-sixt-germany-iaa/\" target=\"_blank\" rel=\"noopener noreferrer\">self-driving Mobility-as-a-Service\u003C/a> will remove the considerable cost of car ownership from the user, too.\u003C/p>\n\u003Cp>\u003Cstrong>Safety in Racing\u003C/strong>\u003C/p>\n\u003Cp>Motor-racing might seem like the polar opposite of the automotive safety that Mobileye seeks to enhance. But in modern racing, safety is paramount.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/f4ebbfc5f296253b7ce151286c5c5577_1648460346563.png\" alt=\"Racing cars in a dark and rainy pitlane in the Fanatec GT World Challenge Powered by AWS\" />\u003C/p>\n\u003Cp>Professional racing drivers know that &ldquo;to finish first, you first have to finish.&rdquo; In other words, reckless driving (whether on road or track) is not only dangerous, but probably counterproductive, too. 
What&rsquo;s more, \u003Cem>virtual\u003C/em> racing eliminates the risk to life and limb entirely, leaving only the competitive spirit and enthusiasm for automotive technological advancement, which \u003Ca href=\"https://www.mobileye.com/blog/what-drives-us/\" target=\"_blank\" rel=\"noopener noreferrer\">we wholeheartedly share\u003C/a>.\u003C/p>\n\u003Cp>Much like Mobileye, motor racing has become a substantial incubator for advancements in automotive safety. \u003Ca href=\"https://www.mobileye.com/blog/toyota-zf-adas/\" target=\"_blank\" rel=\"noopener noreferrer\">Manufacturers\u003C/a> and organizers involved in motorsports are also constantly innovating ways to make racing safer. And \u003Ca href=\"https://www.freep.com/story/money/cars/mark-phelan/2018/05/23/racing-innovations-cars-safety/471396002/\" target=\"_blank\" rel=\"noopener noreferrer\">many of those advancements\u003C/a> then make their way into standard production cars.\u003C/p>\n\u003Cp>\u003Cstrong>Ladies and Gentlemen, Start Your Engines\u003C/strong>\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/1da0184a0eac24f5a5e02e7a84fec401_1648460425108.png\" alt=\"Mobileye-branded BMW M4 GT3 in the Mobileye GT World Challenge eSports Championship\" />\u003C/p>\n\u003Cp>With so much in common, we&rsquo;re excited to team up with SRO and to engage with the growing legions of virtual-racing fans around the world.\u003C/p>\n\u003Cp>The partnership encompasses the three individual \u003Ca href=\"https://sro-esports.com/\" target=\"_blank\" rel=\"noopener noreferrer\">Mobileye GT World Challenge Esports Championships\u003C/a> in Europe, Asia, and America. 
It also includes the exciting new \u003Ca href=\"https://intercontinentalgt.sro-esports.com/\" target=\"_blank\" rel=\"noopener noreferrer\">Intercontinental GT Challenge Esports Championship Powered by Mobileye\u003C/a>, which brings teams of competitors from around the world together for virtual endurance races. Guest racers will compete in our virtual branded car. Fans will be able to vote for the most impressive racing maneuvers with the new Mobileye Innovation Award. And we&rsquo;ll be present as well in the real-life Fanatec GT World Challenge Europe Powered by AWS and its virtual counterpart, the Fanatec Esports GT Pro Series.\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.gt-world-challenge-europe.com/news/2272\" target=\"_blank\" rel=\"noopener noreferrer\">Read the full press release\u003C/a> from the SRO Motorsports Group and visit \u003Ca href=\"https://lp2.mobileye.com/esports\" rel=\"noopener noreferrer\">\u003Cu>our esports webpage\u003C/u>\u003C/a> for more. Check out this year's updated calendar of events at bottom, and tune in to watch the virtual races in the playlist below.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/videoseries?list=PLnnFwnhryIkQ6Iu0oBdWVwEZ33iCQlhei\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/a817642bfa07310445ad38073c0c1667_1654092673402.jpg\" alt=\"2022 online calendar for Mobileye GT World Challenge Esports Championships\" width=\"1650\" height=\"928\" 
/>\u003C/p>","2022-03-28T07:00:00.000Z","Events",{"id":1548,"type":5,"url":1549,"title":1550,"description":1551,"primary_tag":954,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1551,"image":1552,"img_alt":1553,"content":1554,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1555,"tags":1556},118,"ces-2022-videos-demos","Mobileye Tech Takes Center Stage at CES 2022","Take a look at everything we put together for this year’s big tech expo to encounter the groundbreaking technologies propelling the self-driving revolution.","https://static.mobileye.com/website/us/corporate/images/be944ec32aafee7ad7db56dcb4da1f15_1641886826570.jpg","AV Everywhere, in Every Way, for Everyone: Mobileye's main stage show for CES 2022.","\u003Cp>The worldwide technology industry kicked off the new year, as it does each year, with CES last week. Mobileye&rsquo;s presence was strictly virtual, and featured a full array of video demonstrations and insights from our senior leadership. And you can watch them all right here.\u003C/p>\n\u003Cp>\u003Cstrong>AV Everywhere, in Every Way, for Everyone\u003C/strong>\u003C/p>\n\u003Cp>Our main stage show encapsulated the diverse ways in which Mobileye is delivering on the promise of self-driving technology everywhere, in every way, for everyone. If you have time to watch just one video to see where Mobileye is at and where we&rsquo;re going (without getting too technical), this is it.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/1dJJXscXJqI\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>Go Deeper with Mobileye Leadership\u003C/strong>\u003C/p>\n\u003Cp>Our chief executive Prof. 
Amnon Shashua updated viewers on the latest from Mobileye &ndash; both in his annual &ldquo;Under the Hood&rdquo; address and at the Intel press conference the previous day. His remarks included some big news announcements, including the \u003Ca href=\"https://www.mobileye.com/news/mobileye-ces-2022-tech-news/\" target=\"_blank\" rel=\"noopener\">latest iterations of our EyeQ chip\u003C/a>, a new \u003Ca href=\"https://www.mobileye.com/news/udelv-unveils-autonomous-cab-less-transporter/\" target=\"_blank\" rel=\"noopener noreferrer\">autonomous delivery vehicle\u003C/a>, a \u003Ca href=\"https://www.mobileye.com/news/zeekr-mobileye-working-together/\" target=\"_blank\" rel=\"noopener noreferrer\">forthcoming consumer AV\u003C/a>, and \u003Ca href=\"https://www.mobileye.com/news/mobileye-ces-2022-partner-news/\" target=\"_blank\" rel=\"noopener noreferrer\">new projects with major automakers\u003C/a>.\u003C/p>\n\u003Cp>You can still watch both in our earlier blog post. But Shashua wasn&rsquo;t the only member of our leadership who had valuable insights to share at CES 2022. We also recorded videos with our heads of product and strategy, Mobility-as-a-Service, digital mapping, and data services &ndash; each of whom detailed Mobileye&rsquo;s differentiated approaches to advanced mobility technology.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/videoseries?list=PLWCfS_Yhbvs6kl2NhPnvdPxi44_E6miA0\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>\u003Cstrong>Dive into the Tech\u003C/strong>\u003C/p>\n\u003Cp>You can also check out the hardware that powers our solutions in this video...\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/aYfRBRmHwBQ\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>... 
explore different applications for our \u003Ca href=\"https://www.autonews.com/awards/2020-mobileye-rem-road-experience-management\" target=\"_blank\" rel=\"noopener noreferrer\">award-winning REM&trade; mapping technology\u003C/a>...\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/hoC40qS4p20\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>... and take a deeper dive into our &ldquo;lean&rdquo; driving policy with our CTO, Prof. Shai Shalev-Shwartz.\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/ViGL0z1BULs\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>","2022-01-11T08:00:00.000Z","Video, Events",{"id":1558,"type":5,"url":1559,"title":1560,"description":1561,"primary_tag":954,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1561,"image":1562,"img_alt":1563,"content":1564,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1565,"tags":1566},116,"ces-2022-under-the-hood-prof-amnon-shashua","Prof. Shashua Takes Us 'Under the Hood' at CES 2022","Our CEO covered everything from business and data to our latest chips and next-generation sensors during his hour-long deep dive and Intel's press conference.","https://static.mobileye.com/website/us/corporate/images/78a72893f2bd5f7d8607e2c657b13c26_1641477197265.jpg","Prof. Amnon Shashua delivering his Under the Hood address during CES 2022.","\u003Cp>Every year during \u003Ca href=\"https://www.mobileye.com/news/ces-2022-livestream-press-conference-schedule/\" target=\"_blank\" rel=\"noopener\">CES\u003C/a>, our chief executive updates inquiring minds on the state of automotive technology in general and the tremendous progress being made at Mobileye in particular. 
And conditions notwithstanding, this year was no exception &ndash; covering everything from business and data to next-generation sensors and our latest chips.\u003C/p>\n\u003Cp>In the 2022 rendition of his annual &ldquo;Under the Hood&rdquo; address, \u003Ca href=\"https://www.mobileye.com/blog/mobileye-ces-2021-recap/\" target=\"_blank\" rel=\"noopener noreferrer\">Professor Shashua\u003C/a> looked back on a successful 2021 for Mobileye, with over 28 million of our \u003Ca href=\"https://www.mobileye.com/blog/100-million-eyeq-chips/\" target=\"_blank\" rel=\"noopener noreferrer\">EyeQ&reg; chips shipped\u003C/a> to our customers around the world and design wins for 50 million more. And he provided a glimpse into \u003Ca href=\"https://www.mobileye.com/blog/mobileye-ces-2022-self-driving-secret-data/\" target=\"_blank\" rel=\"noopener\">the wealth of data we&rsquo;re working with\u003C/a>, at over 200 petabytes. (You'd need over 400,000 of your typical 512-gigabyte smartphones to store all that data!)\u003C/p>\n\u003Cp>Shashua also revealed our newest Systems-on-Chip: EyeQ6 and \u003Ca href=\"https://www.mobileye.com/news/mobileye-ces-2022-tech-news/\" target=\"_blank\" rel=\"noopener noreferrer\">EyeQ Ultra\u003C/a>. He pointed to some of the latest vehicles coming out with our technology, highlighted the revolutionary innovations integral to our strategy, including our inherently lean driving policy, updated on our development of next-generation sensors, and much more.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/1mXy0oi8d60\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>Watch the full hour-long recording above or the highlights below to go Under the Hood with Prof. 
Amnon Shashua &ndash; president and CEO of Mobileye, senior vice president at Intel Corporation, \u003Ca href=\"https://www.mobileye.com/news/prof-amnon-shashua-wins-the-dan-david-prize/\" target=\"_blank\" rel=\"noopener\">prize-winning scientist\u003C/a>, and leading mind of the mobility tech industry.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/4EPUQaU72Ao\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>Prof. Shashua also presented during the One Intel press conference. There he \u003Ca href=\"https://www.mobileye.com/news/mobileye-ces-2022-partner-news/\" target=\"_blank\" rel=\"noopener noreferrer\">announced new developments\u003C/a> with the CEOs of Ford and Volkswagen, dove into our mapping tech, explored our global AV footprint, revealed details on the implementation of Mobileye SuperVision&trade; and \u003Ca href=\"https://www.mobileye.com/news/zeekr-mobileye-working-together/\" target=\"_blank\" rel=\"noopener noreferrer\">plans for a new Level 4 consumer AV\u003C/a> with Zeekr, and spoke of the past and future of our EyeQ chips. 
Watch the 19-minute segment in the video below.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/oCF7V6blenM\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>","2022-01-06T08:00:00.000Z","Video, Events, From our CEO",{"id":1568,"type":24,"url":1569,"title":1570,"description":1571,"primary_tag":397,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1571,"image":1572,"img_alt":1571,"content":1573,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":16,"publish_date":1574,"tags":1575},157,"2022-ces-mobileye-event-livestream-replay","Mobileye’s ‘Under the Hood’ (Replay)","During CES 2022, Mobileye CEO Prof. Amnon Shashua offers an update on how the company will deliver economically viable consumer autonomous vehicles. ","https://static.mobileye.com/website/us/corporate/images/3c34f1076fd9a3f382b71669c77785dc_1666086402268.png","\u003Cp>\u003Cem>This news content was originally published on the Intel Corporation Newsroom.\u003C/em>\u003C/p>\u003Cp>&nbsp;\u003C/p>\u003Ciframe class=\"ql-video\" frameborder=\"0\" allowfullscreen=\"true\" src=\"https://www.youtube.com/embed/1mXy0oi8d60\" height=\"315\" width=\"560\">\u003C/iframe>\u003Ch6>\u003Cbr>\u003C/h6>\u003Cp>\u003Cbr>\u003C/p>\u003Cp>On Jan. 5,&nbsp;Mobileye CEO Prof. Amnon Shashua explained&nbsp;how the company will deliver economically viable consumer autonomous vehicles (AV) to the world. He unveiled new chip technology, shared progress on radar and lidar technology, and, for the first time, disclosed details about Mobileye’s approach to enabling fully autonomous solutions across vehicle types and use cases around the globe. 
During the 2022 “Under the Hood” session, Shashua showed how Mobileye is rewriting the AV playbook.\u003C/p>\u003Cp>\u003Cstrong>Presentation Deck:&nbsp;\u003C/strong> \u003Ca href=\"https://static.mobileye.com/website/common/files/Mobileye-CES-2022-UnderTheHood+%28compressed%291.pdf\" rel=\"noopener noreferrer\" target=\"_blank\">Mobileye’s ‘Under the Hood’\u003C/a>\u003C/p>","2022-01-05T19:00:00.000Z","Events, Video, News, From our CEO",{"id":1577,"type":5,"url":1578,"title":1579,"description":1580,"primary_tag":28,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1580,"image":1581,"img_alt":1580,"content":1582,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":16,"publish_date":1583,"tags":928},156,"mobileye-ces-2022-self-driving-secret-data","Mobileye’s Self-Driving Secret? 200PB of Data","Powerful computer vision tech and natural language models turn industry’s leading dataset into AV training gold mine.","https://static.mobileye.com/website/us/corporate/images/0af0d2849fa453af84ac0782cb49f203_1666085520056.png","\u003Cp>\u003Cem>This news content was originally published on the Intel Corporation Newsroom.\u003C/em>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>What&rsquo;s New:\u003C/strong>&nbsp;Mobileye is sitting on a virtual treasure trove of driving data &ndash; some 200 petabytes worth. When combined with Mobileye&rsquo;s state-of-the-art computer vision technology and extremely capable natural language understanding (NLU) models, the dataset can deliver thousands of results within seconds, even for incidents that fall into the &ldquo;long tail&rdquo; of rare conditions and scenarios. 
This helps the AV&rsquo;s state-of-the-art computer vision system handle edge cases and thereby achieve the very high mean time between failure (MTBF) rate targeted for self-driving vehicles.\u003C/p>\n\u003Cp>&ldquo;Data and the infrastructure in place to harness it is the hidden complexity of autonomous driving. Mobileye has spent 25 years collecting and analyzing what we believe to be the industry&rsquo;s leading database of real-world and simulated driving experience, setting Mobileye apart by enabling highly capable AV solutions that meet the high bar for mean time between failure.&rdquo; &ndash;Prof. Amnon Shashua, Mobileye president and chief executive officer\u003C/p>\n\u003Cp>\u003Cstrong>How It Works:\u003C/strong>&nbsp;Mobileye&rsquo;s&nbsp;database &ndash; believed to be the world&rsquo;s largest automotive dataset &ndash; comprises more than 200 petabytes of driving footage, equivalent to 16 million 1-minute driving clips from 25 years of real-world driving. Those 200 petabytes are stored across Amazon Web Services (AWS) and on-premises systems. The sheer size of Mobileye&rsquo;s dataset makes the company one of AWS&rsquo;s largest customers by volume stored globally.&nbsp;\u003C/p>\n\u003Cp>Large-scale data labeling is at the heart of building powerful computer vision engines needed for autonomous driving. Mobileye&rsquo;s rich and relevant dataset is annotated both automatically and manually by a team of more than 2,500 specialized annotators. The compute engine relies on 500,000 peak CPU cores in the AWS cloud to crunch 50 million datasets monthly &ndash; the equivalent of 100 petabytes processed every month, representing 500,000 hours of driving.\u003C/p>\n\u003Cp>\u003Cstrong>Why It Matters:\u003C/strong>&nbsp;Data is only valuable if you can make sense of it and put it to use. 
This requires deep comprehension of natural language along with state-of-the-art computer vision, Mobileye&rsquo;s long-standing strength.\u003C/p>\n\u003Cp>Every AV player faces the &ldquo;long tail&rdquo; problem in which a self-driving vehicle encounters something it has not seen or experienced before. This long tail contains large datasets, but many companies do not have the tools to effectively make sense of it. Mobileye&rsquo;s state-of-the-art computer vision technology, combined with extremely capable NLU models, enables Mobileye to query the dataset and return thousands of results from the long tail within seconds. Mobileye can then use this to train its computer vision system and make it even more capable. Mobileye&rsquo;s approach dramatically accelerates the development cycle.\u003C/p>\n\u003Cp>\u003Cstrong>What Is Included:&nbsp;\u003C/strong>Mobileye&rsquo;s team uses an in-house search engine database with millions of images, video clips and scenarios. They include anything from &ldquo;tractor covered in snow&rdquo; to &ldquo;traffic light in low sun,&rdquo; all collected by Mobileye and feeding its algorithms. (See sample images).\u003C/p>\n\u003Cp>\u003Cstrong>More Context:&nbsp;\u003C/strong>With access to the industry&rsquo;s highest-quality data and the talent required to put it to use, Mobileye&rsquo;s driving policy can make sound, informed decisions deterministically, an approach that removes the uncertainty of artificial intelligence-based decisions and yields a statistically high mean time between failure rate. At the same time, the dataset hastens the development cycle to bring the lifesaving promise of AV technology to reality more quickly.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>About Mobileye\u003C/strong>\u003C/p>\n\u003Cp>Mobileye is leading the mobility revolution with its autonomous driving and driver-assist technologies, harnessing world-renowned expertise in computer vision, machine learning, mapping and data analysis. 
Our technology enables self-driving vehicles and mobility solutions, powers industry-leading advanced driver-assistance systems and delivers valuable intelligence to optimize mobility infrastructure. Mobileye pioneered such groundbreaking technologies as True Redundancy&trade; sensing, REM&trade; crowdsourced mapping, and Responsibility Sensitive Safety (RSS) technologies that are driving the ADAS and AV fields toward the future of mobility.&nbsp;For more information:&nbsp;\u003Ca href=\"https://www.mobileye.com/\" target=\"_blank\" rel=\"noopener\">www.mobileye.com\u003C/a>.\u003C/p>\n\u003Cp>\u003Cstrong>Forward-Looking Statements\u003C/strong>\u003C/p>\n\u003Cp>Statements in this media release that refer to future plans and expectations are forward-looking statements that involve a number of risks and uncertainties. Words such as &ldquo;anticipates,&rdquo; &ldquo;expects,&rdquo; &ldquo;intends,&rdquo; &ldquo;goals,&rdquo; &ldquo;plans,&rdquo; &ldquo;believes,&rdquo; &ldquo;seeks,&rdquo; &ldquo;estimates,&rdquo; &ldquo;continues,&rdquo; &ldquo;may,&rdquo; &ldquo;will,&rdquo; &ldquo;would,&rdquo; &ldquo;should,&rdquo; &ldquo;could,&rdquo; and variations of such words and similar expressions are intended to identify such forward-looking statements. Statements that refer to or are based on estimates, forecasts, projections, uncertain events or assumptions, including statements relating to future products and technology and the availability and benefits of such products and technology, expectations regarding customers, market opportunity, and anticipated trends in our businesses or the markets relevant to them, also identify forward-looking statements. Such statements are based on current expectations and involve many risks and uncertainties that could cause actual results to differ materially from those expressed or implied in these forward-looking statements. 
Important factors that could cause actual results to differ materially are set forth in Intel&rsquo;s SEC filings, including the company&rsquo;s most recent reports on Forms 10-K and 10-Q, which may be obtained by visiting our Investor Relations website at www.intc.com or the SEC&rsquo;s website at www.sec.gov. Intel does not undertake, and expressly disclaims any duty, to update any statement made in this news release, whether as a result of new information, new developments or otherwise, except to the extent that disclosure may be required by law.&nbsp;\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>[**]gallery:mobileye-ces-2022-self-driving-secret-data-gallery-1[**]\u003C/p>","2022-01-05T08:00:00.000Z",{"id":1585,"type":24,"url":1586,"title":1587,"description":1588,"primary_tag":28,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1588,"image":1589,"img_alt":1588,"content":1590,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":16,"publish_date":1591,"tags":928},138,"zeekr-mobileye-working-together","CES: Mobileye, Zeekr Collaborate on Consumer AV","Together, Mobileye and Zeekr aim to deliver the world’s first consumer autonomous vehicle with L4 capabilities by 2024.","https://static.mobileye.com/website/us/corporate/images/47789c8048ca30177d73a102a43886c2_1666086631415.png","\u003Cp>\u003Cem>This news content was originally published on the Intel Corporation Newsroom.\u003C/em>\u003C/p>\u003Cp>LAS VEGAS, Jan. 
4, 2022 — Mobileye, an Intel company, and Zeekr, the global premium electric mobility technology brand from Geely Holding Group, today announced plans to expand their strategic technology partnership with the goal of developing a new all-electric consumer vehicle with&nbsp;\u003Ca href=\"https://static.mobileye.com/website/common/files/Mobileye-Drive-Fact-Sheet-675839.pdf\" rel=\"noopener noreferrer\" target=\"_blank\">Level 4 (L4) capabilities\u003C/a>.\u003C/p>\u003Cp>The planned autonomous vehicle (AV) will be powered by six EyeQ®5 system-on-chips to process Mobileye True Redundancy™ sensing, Responsibility-Sensitive Safety (RSS)-based driving policy and a new open collaboration model on&nbsp;Road Experience Management™&nbsp;(REM™) mapping&nbsp;technology. The vehicle will utilize Geely SEA architecture’s redundant braking, steering and power, under an “open EyeQ” concept that allows efficient integration with Zeekr software technologies. In addition, Mobileye will enhance its China-related research and development capabilities, establishing a local data center and enhancing its local teams to support its rapidly growing China activities.\u003C/p>\u003Cp>\u003Cbr>\u003C/p>\u003Cp>“The broadening scope of our partnership reflects just how closely Mobileye and Zeekr are aligned on the vision for future mobility,” said Prof. Amnon Shashua, Mobileye president and chief executive officer. “Zeekr’s confidence in Mobileye as a technology partner demonstrates our ability to execute toward joint goals and further solidify our industry leadership.”&nbsp;&nbsp;\u003C/p>\u003Cp>With an anticipated debut in China in 2024, the new all-electric Zeekr vehicle is expected to be the world’s first consumer AV with L4 autonomous capability. 
Built using&nbsp;\u003Ca href=\"https://static.mobileye.com/website/common/files/Mobileye-Drive-Fact-Sheet-675839.pdf\" rel=\"noopener noreferrer\" target=\"_blank\">Mobileye Drive™\u003C/a>&nbsp;technology, together with the open SEA architecture, and expanding on Zeekr’s existing models, this new vehicle is expected to eventually roll out across global markets.\u003C/p>\u003Cp>Mobileye and Zeekr’s new collaboration expands on the companies’ long-term strategic technology partnership, which includes the development of advanced ADAS (advanced driver-assistance systems) with enhanced capabilities for a variety of Zeekr models. The joint efforts are supported by open collaboration with technologies including REM, as Mobileye and Zeekr work toward achieving a safer and more sophisticated future on the roads.&nbsp;&nbsp;\u003C/p>\u003Cp>“Mobileye has been a&nbsp;strategic partner&nbsp;to our mission of delivering lifestyle vehicles fit for a more sustainable and autonomous future of transportation,” said Andy An, CEO of Zeekr Intelligent Technology. “Our partnership supports Zeekr and Mobileye’s shared ambitions for leading the global ADAS and AV industry. Zeekr welcomes open collaboration that enables the integration of technological expertise to create a more sophisticated autonomous mobility experience for our customers.”\u003C/p>\u003Cp>\u003Cstrong>About Mobileye\u003C/strong>\u003C/p>\u003Cp>Mobileye is a global leader in the development of computer vision and machine learning, data analysis, localization and mapping for Advanced Driver Assistance Systems and autonomous driving. Mobileye’s technology helps keep passengers safer on the road, reduces the risks of traffic accidents, saves lives and has the potential to revolutionize the driving experience by enabling autonomous driving. 
Mobileye’s proprietary software algorithms and EyeQ® chips perform detailed interpretations of the visual field in order to anticipate possible collisions with other vehicles, pedestrians, cyclists, animals, debris and other obstacles. Mobileye’s products are also able to detect roadway markings such as lanes, road boundaries, barriers and similar items; identify and read traffic signs, directional signs and traffic lights; create a Mobileye Roadbook™ of localized drivable paths and visual landmarks using REM™; and provide mapping for autonomous driving.\u003C/p>\u003Cp>\u003Cstrong>About ZEEKR\u003C/strong>\u003C/p>\u003Cp>Zeekr is the global premium electric mobility technology brand from Geely Holding Group. Zeekr aims to create a fully integrated user ecosystem with innovation as standard. The brand utilizes Sustainable Experience Architecture (SEA) and includes its own battery technologies, battery management systems, electric motor technologies and electric vehicle supply chain. Zeekr’s values are equality, diversity, and sustainability. Its ambition is to become a true mobility solutions provider. Zeekr began delivery of its first product, Zeekr 001, in October 2021. It will launch at least two new model offerings each year to satisfy the rapidly expanding global EV industry. For more information regarding Zeekr, please refer to the official website at&nbsp;\u003Ca href=\"http://www.zgh.com/\" rel=\"noopener noreferrer\" target=\"_blank\">https://zeekrlife.com/\u003C/a>.\u003C/p>\u003Cp>\u003Cstrong>Forward-looking statements:\u003C/strong>\u003C/p>\u003Cp>Statements in this press release that refer to future plans and expectations are forward-looking statements that involve a number of risks and uncertainties. 
Words such as “anticipates,” “expects,” “intends,” “goals,” “plans,” “believes,” “seeks,” “estimates,” “continues,” “may,” “will,” “would,” “should,” “could,” and variations of such words and similar expressions are intended to identify such forward-looking statements. Statements that refer to or are based on estimates, forecasts, projections, uncertain events or assumptions, including statements relating to future products and technology and the availability and benefits of such products and technology, expectations regarding customers, market opportunity, and anticipated trends in our businesses or the markets relevant to them, also identify forward-looking statements. Such statements are based on current expectations and involve many risks and uncertainties that could cause actual results to differ materially from those expressed or implied in these forward-looking statements. Important factors that could cause actual results to differ materially are set forth in Intel’s SEC filings, including the company’s most recent reports on Forms 10-K and 10-Q, which may be obtained by visiting our Investor Relations website at www.intc.com or the SEC’s website at www.sec.gov. 
Intel does not undertake, and expressly disclaims any duty to update any statement made in this press release, whether as a result of new information, new developments or otherwise, except to the extent that disclosure may be required by law.\u003C/p>","2022-01-04T15:00:00.000Z",{"id":1593,"type":24,"url":1594,"title":1595,"description":1596,"primary_tag":28,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1596,"image":1597,"img_alt":1596,"content":1598,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":16,"publish_date":1599,"tags":1600},137,"mobileye-ces-2022-partner-news","CES: Top Automakers Double-Down on Mobileye","Volkswagen Group applies mapping tech to ADAS, while Ford and Zeekr announce new Mobileye-based production programs.","https://static.mobileye.com/website/us/corporate/images/a2c08f07a26bc294caf4eb0e7a29870b_1666087124855.png","\u003Cp>\u003Cem>This news content was originally published on the Intel Corporation Newsroom.\u003C/em>\u003C/p>\n\u003Cp>\u003Cstrong>News Highlights\u003C/strong>\u003C/p>\n\u003Cul>\n\u003Cli>Mobileye is joined by VW, Ford and Zeekr at CES 2022 to announce new programs based on Mobileye&rsquo;s mapping, advanced driver-assistance systems (ADAS) and autonomous vehicle (AV) tech.\u003C/li>\n\u003Cli>With Travel Assist 2.5, Volkswagen Group leverages Mobileye&rsquo;s crowd-sourced mapping technology to enhance ADAS comfort features like lane-keeping/centering in VW, &Scaron;koda and Seat brand vehicles.\u003C/li>\n\u003Cli>Mobileye&rsquo;s long-standing relationship with Ford deepens into strategic collaboration to add Road Experience Management&trade; (REM&trade;) mapping technology to a future version of Ford BlueCruise and bring Level 2-plus (L2+) hands-free ADAS solutions across multiple makes and models.\u003C/li>\n\u003Cli>Mobileye and Zeekr, the global premium electric mobility technology brand from Geely Holding Group, announce plans to 
offer consumers an all-electric self-driving vehicle powered by Mobileye Drive&trade; by 2024.\u003C/li>\n\u003C/ul>\n\u003Cp>LAS VEGAS, Jan. 4, 2022 &mdash; During an Intel news conference today, Intel subsidiary Mobileye revealed multiple new strategic collaborations designed to transform driver and passenger experiences globally. Deals with Volkswagen Group, Ford and Zeekr were brought to light to illustrate the breadth and innovation of Mobileye&rsquo;s ADAS-to-AV technology.&nbsp;Mobileye also revealed its&nbsp;\u003Ca href=\"https://www.mobileye.com/news/mobileye-ces-2022-tech-news/\" target=\"_blank\" rel=\"noopener noreferrer\">EyeQ&reg; Ultra &ndash; Mobileye's first AV-on-chip\u003C/a>&nbsp;(AVoC) purpose-built for&nbsp;\u003Ca href=\"https://static.mobileye.com/dev/website/us/corporate/images/be013ffc23e3d75babbda0ed4a5019ea_1663241548611.jpg\" target=\"_blank\" rel=\"noopener noreferrer\">Level 4\u003C/a>&nbsp;self-driving vehicles.\u003C/p>\n\u003Cp>&ldquo;Our customers are demonstrating that innovation is at the center of their future strategies and leaning on Mobileye to help execute their visions,&rdquo; said Prof. Amnon Shashua, Mobileye president and chief executive officer. &ldquo;As a trusted collaborator, Mobileye is firing on all cylinders to deliver scalable ADAS-to-AV solutions that exceed the expectations of our customers and, at the same time, push the industry forward. We&rsquo;re grateful for our ongoing collaborations and look forward to setting more new industry standards together.&rdquo;\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>The new deals and programs revealed include:\u003C/p>\n\u003Cul>\n\u003Cli>\u003Cstrong>First global application of &ldquo;swarm data&rdquo; for ADAS:&nbsp;&nbsp;\u003C/strong>Volkswagen Group is the first original equipment manufacturer (OEM) to apply Mobileye&rsquo;s mapping data to enhance the comfort and safety of ADAS features globally. 
Mobileye Roadbook&trade; is a crowd-sourced, cloud-generated database of highly precise, high-definition maps. Swarm data is collected via Mobileye-equipped vehicles globally, and VW is now using that data to greatly enhance the driver experience via Travel Assist 2.5. For example, where available, lane-keeping assistance will be provided in many areas without visible lane markings. Mobileye&rsquo;s proprietary&nbsp;\u003Ca href=\"https://www.mobileye.com/technology/rem/\" target=\"_blank\" rel=\"noopener\">Road Experience Management\u003C/a>&nbsp;technology automatically aggregates and generates AV maps in the cloud, delivering a truly global and scalable mapping solution for automated vehicles. The Roadbook-enhanced Travel Assist feature will be available soon in Volkswagen, &Scaron;koda and Seat electric vehicle (EV) models based on Volkswagen&rsquo;s MEB platform.\u003C/li>\n\u003Cli>Prof. Shashua and Dr. Herbert Diess, chairman of the board for Volkswagen Group, recently tested Volkswagen&rsquo;s new Mobileye-enabled Travel Assist features in an ID.4 vehicle in Munich. During the test drive, Diess observed several advantages to using REM technology for the advanced features. &ldquo;It's a clear advantage of using real driving data over maps &hellip; everything works well, and the car is basically following this car without any intervention from my side,&rdquo; he said.\u003C/li>\n\u003Cli>\u003Cstrong>Ford and Mobileye deepen long-standing relationship:&nbsp;\u003C/strong>&nbsp;Ford and Mobileye have announced plans to expand their strategic partnership. For example, Ford will begin using Mobileye&rsquo;s REM &ndash; or Road Experience Management technology &ndash; in future versions of the Ford BlueCruise system, which allows customers to operate their vehicles hands-free while monitored by a driver-facing camera that makes sure customers are keeping their eyes on the road. 
The additional collaboration uses Mobileye&rsquo;s REM to expand true hands-free driving to include qualified divided highways and areas without visible lane markings, thanks to even better lane-centering and lane-keeping technology. The companies also are working together on an open platform from Mobileye that will allow Ford to build and integrate Ford&rsquo;s own solutions to make driving in the future safer and easier.\u003C/li>\n\u003Cli>&ldquo;Ford has been delivering new vehicle technologies that make driving safer and easier for more than a century,&rdquo; Ford president and CEO Jim Farley said. &ldquo;We are excited to work with Mobileye on a platform that supports our development of next-generation autonomy technologies. Our investment in these capabilities will allow us to transform our customers&rsquo; transportation experiences.&rdquo;&nbsp;\u003C/li>\n\u003Cli>\u003Cstrong>First consumer AV built on Mobileye Drive tech:&nbsp;\u003C/strong>Mobileye and Zeekr, the global premium electric mobility technology brand from Geely Holding Group, will further&nbsp;\u003Ca href=\"https://www.mobileye.com/news/mobileye-zeekr-expand-future-cars-partnership/\" target=\"_blank\" rel=\"noopener noreferrer\">expand their partnership\u003C/a>&nbsp;by building a new all-electric vehicle with L4 capabilities enabled by Mobileye True Redundancy&trade; sensing, REM mapping technology and Responsibility-Sensitive Safety (RSS)-based driving policy along with Geely SEA architecture&rsquo;s true-redundant braking, steering and power under an Open EyeQ concept that allows seamless integration between Mobileye and Zeekr technologies. It is believed to be the world&rsquo;s first L4 vehicle for consumers. 
The vehicle&rsquo;s consumer debut is expected by 2024 in China, with a global rollout to follow.\u003C/li>\n\u003Cli>&ldquo;Mobileye has been a strategic partner to our mission of delivering lifestyle vehicles fit for a more sustainable and autonomous future of transportation,&rdquo; said Andy An, CEO of Zeekr Intelligent Technology. &ldquo;Our partnership supports Zeekr and Mobileye&rsquo;s shared ambitions for leading the global ADAS and AV industry. Zeekr welcomes open collaboration that enables the integration of technological expertise to create a more sophisticated autonomous mobility experience for our customers.&rdquo;&nbsp;&nbsp;\u003C/li>\n\u003C/ul>\n\u003Cp>Also unveiled at the news conference was the&nbsp;\u003Ca href=\"https://www.mobileye.com/news/udelv-unveils-autonomous-cab-less-transporter/\" target=\"_blank\" rel=\"noopener noreferrer\">Udelv&nbsp;Transporter, a purpose-built autonomous delivery vehicle\u003C/a>&nbsp;powered by Mobileye Drive. As&nbsp;\u003Ca href=\"https://www.mobileye.com/news/mobileye-udelv-deal-autonomous-delivery/\" target=\"_blank\" rel=\"noopener noreferrer\">announced\u003C/a>&nbsp;last year, Udelv plans to produce more than 35,000 Mobileye-driven Transporters by 2028, with commercial operations beginning in 2023.\u003C/p>\n\u003Cp>\u003Cstrong>Find out More\u003C/strong>\u003C/p>\n\u003Cp>Interested in learning more about Mobileye&rsquo;s proprietary technology and partnerships? Join Mobileye CEO Amnon Shashua at 11:30 a.m. PST, Wednesday, Jan. 5, for an in-depth technology update, featuring more information about Mobileye&rsquo;s novel AV-on-chip technology, next steps for the company&rsquo;s key OEM partnerships and more. 
Find the full session on the&nbsp;\u003Ca href=\"https://www.intel.com/content/www/us/en/newsroom/news/2022-ces-mobileye-event-livestream-replay.html\" target=\"_blank\" rel=\"noopener noreferrer\">Intel Newsroom\u003C/a>.\u003C/p>\n\u003Cp>\u003Cstrong>About Mobileye\u003C/strong>\u003C/p>\n\u003Cp>Mobileye is the global leader in the development of computer vision and machine learning, data analysis, localization and mapping for Advanced Driver Assistance Systems and autonomous driving. Mobileye&rsquo;s technology helps keep passengers safer on the roads, reduces the risks of traffic accidents, saves lives and has the potential to revolutionize the driving experience by enabling autonomous driving. Mobileye&rsquo;s proprietary software algorithms and EyeQ&reg; chips perform detailed interpretations of the visual field in order to anticipate possible collisions with other vehicles, pedestrians, cyclists, animals, debris and other obstacles. Mobileye&rsquo;s products are also able to detect roadway markings such as lanes, road boundaries, barriers and similar items; identify and read traffic signs, directional signs and traffic lights; create a Mobileye Roadbook&trade; of localized drivable paths and visual landmarks using REM&trade;; and provide mapping for autonomous driving.\u003C/p>\n\u003Cp>\u003Cstrong>Forward-Looking Statements\u003C/strong>\u003C/p>\n\u003Cp>Statements in this press release that refer to future plans and expectations are forward-looking statements that involve a number of risks and uncertainties. Words such as &ldquo;anticipates,&rdquo; &ldquo;expects,&rdquo; &ldquo;intends,&rdquo; &ldquo;goals,&rdquo; &ldquo;plans,&rdquo; &ldquo;believes,&rdquo; &ldquo;seeks,&rdquo; &ldquo;estimates,&rdquo; &ldquo;continues,&rdquo; &ldquo;may,&rdquo; &ldquo;will,&rdquo; &ldquo;would,&rdquo; &ldquo;should,&rdquo; &ldquo;could,&rdquo; and variations of such words and similar expressions are intended to identify such forward-looking statements. 
Statements that refer to or are based on estimates, forecasts, projections, uncertain events or assumptions, including statements relating to future products and technology and the availability and benefits of such products and technology, expectations regarding customers, market opportunity, and anticipated trends in our businesses or the markets relevant to them, also identify forward-looking statements. Such statements are based on current expectations and involve many risks and uncertainties that could cause actual results to differ materially from those expressed or implied in these forward-looking statements. Important factors that could cause actual results to differ materially are set forth in Intel&rsquo;s SEC filings, including the company&rsquo;s most recent reports on Forms 10-K and 10-Q, which may be obtained by visiting our Investor Relations website at www.intc.com or the SEC&rsquo;s website at www.sec.gov. Intel does not undertake, and expressly disclaims any duty, to update any statement made in this press release, whether as a result of new information, new developments or otherwise, except to the extent that disclosure may be required by law.\u003C/p>","2022-01-04T08:00:00.000Z","News, ADAS, Events",{"id":1602,"type":24,"url":1603,"title":1604,"description":1605,"primary_tag":190,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1605,"image":1606,"img_alt":1605,"content":1607,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":16,"publish_date":1599,"tags":928},158,"mobileye-ces-2022-tech-news","New Mobileye EyeQ Ultra will Enable Consumer AVs","Single-package design will deliver industry’s leanest, most performance-power efficient SoC for fully autonomous vehicles.","https://static.mobileye.com/website/us/corporate/images/7c1fa9c5ea5f33d9ada1c5a32c390ea4_1666087048027.png","\u003Cp>\u003Cem>This news content was originally published on the Intel Corporation 
Newsroom.\u003C/em>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>NEWS HIGHLIGHTS\u003C/strong>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cul>\n\u003Cli>Mobileye introduces EyeQ&reg; Ultra&trade; &ndash; a single package AV-on-chip super-computer that is purpose-built for end-to-end autonomous driving.\u003C/li>\n\u003Cli>Proven Mobileye EyeQ&reg; architecture underpins EyeQ Ultra, maximizing performance and efficiency at 176 TOPS (tera operations per second).\u003C/li>\n\u003Cli>First silicon for the EyeQ Ultra is expected in late 2023 with full automotive-grade production in 2025.\u003C/li>\n\u003Cli>Mobileye also introduces the next-generation EyeQ system-on-chip (SoC) for advanced driver-assistance systems (ADAS): EyeQ6L and EyeQ6H.\u003C/li>\n\u003C/ul>\n\u003Cp>LAS VEGAS, Jan. 4, 2022 &ndash;&nbsp;Mobileye today introduced the EyeQ&reg; Ultra&trade;, the company&rsquo;s most advanced, highest performing system-on-chip (SoC) purpose-built for autonomous driving. As unveiled during CES 2022, EyeQ Ultra maximizes both effectiveness and efficiency at only 176 TOPS, making it the industry&rsquo;s leanest autonomous vehicle (AV) chip. This efficiently designed SoC builds on seven generations of proven EyeQ architecture to deliver exactly the power and performance needed for AVs, which are all but certain to be all-electric vehicles.\u003C/p>\n\u003Cp>First silicon for the EyeQ Ultra SoC is expected at the end of 2023, with full automotive-grade production in 2025.&nbsp;&nbsp;\u003C/p>\n\u003Cp>&ldquo;Consumer AV is the end game for the industry,&rdquo; said Prof. Amnon Shashua, Mobileye president and chief executive officer. 
&ldquo;By developing the entire self-driving solution &ndash; from hardware and software to mapping and service models &ndash; Mobileye has a unique perspective into the exact requirements for the self-driving system that enables us to reach the performance-and-cost optimization that will make consumer AVs a reality.&rdquo;&nbsp;\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>Mobileye designed the EyeQ Ultra after having first built an AV to understand exactly what a self-driving vehicle needs to operate at a very high meantime between failures. This approach enables the optimum balance of performance across different accelerators and general-purpose processors in an extremely efficient power-performance envelope.\u003C/p>\n\u003Cp>\u003Cstrong>Delivering Unmatched Cost-to-Performance\u003C/strong>\u003C/p>\n\u003Cp>Marking a leap in the evolution of the EyeQ family of SoCs, EyeQ Ultra packs the performance of 10 EyeQ5s in a single package. Leveraging 5 nanometer process technology, EyeQ Ultra can handle all the needs and applications of Level 4 (L4) autonomous driving without the power consumption and costs related to integrating multiple SoCs together. Like its EyeQ predecessors, EyeQ Ultra has been engineered in tandem with Mobileye software, enabling extreme power efficiency with zero performance sacrifices.\u003C/p>\n\u003Cp>EyeQ Ultra utilizes an array of four classes of proprietary accelerators, each built for a specific task. These accelerators are paired with additional CPU cores, ISPs and GPUs in a highly efficient solution capable of processing input from two sensing subsystems &ndash; one camera-only system and the other radar and lidar combined &ndash; as well as the vehicle&rsquo;s central computing system, the high-definition map and driving policy software. 
At a mere 176 TOPS, the EyeQ Ultra is much more efficient than other AV solutions, delivering the necessary performance and price-point required for consumer-level AVs.\u003C/p>\n\u003Cp>By optimizing for efficiency, EyeQ Ultra unlocks the AV potential for safer roads and reduced congestion for consumers.\u003C/p>\n\u003Cp>\u003Cstrong>Evolution of the EyeQ Architecture\u003C/strong>\u003C/p>\n\u003Cp>The introduction of EyeQ Ultra comes at the same time as two new EyeQ SoCs for ADAS &ndash; the EyeQ6L and EyeQ6H &ndash; and follows the shipment of Mobileye&rsquo;s&nbsp;\u003Ca href=\"https://www.mobileye.com/blog/100-million-eyeq-chips/\" target=\"_blank\" rel=\"noopener noreferrer\">100 millionth EyeQ\u003C/a>&nbsp;SoC late last year. First introduced in 2004, Mobileye&rsquo;s EyeQ transformed the ADAS market by proving that cost-effective camera sensors processed by Mobileye&rsquo;s purpose-built technology were capable of preventing and mitigating collisions. The innovation of EyeQ helped make roadway safety technology more accessible, bringing features including forward-collision warning, lane departure warning and blind spot detection to millions of drivers around the world.\u003C/p>\n\u003Cp>Mobileye&rsquo;s proven EyeQ architecture lays the foundation for EyeQ Ultra. Designed to make consumer AVs accessible, EyeQ Ultra fills a void in the automotive market as the EyeQ family of SoCs has done before it. As an extension of the EyeQ family, EyeQ Ultra will also be informed by Mobileye&rsquo;s Road Experience Management&trade; (REM) mapping technology. 
Gathered via millions of vehicles on the road already equipped with&nbsp;Mobileye, REM captures packages of road data to create the Mobileye Roadbook&trade;, which is accessed via the cloud to provide, in real time, up-to-date information on the drivable paths ahead.&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>Two New EyeQ SoCs for Next-Generation ADAS\u003C/strong>\u003C/p>\n\u003Cp>The EyeQ6L will be the successor to the EyeQ4 SoC in a package that is just 55 percent the size of the EyeQ4. This one-box windshield solution delivers more deep-learning TOPS at ultra-low power for highly efficient entry and premium (L2) ADAS. It began sampling last year and is due to reach start of production by the middle of 2023.\u003C/p>\n\u003Cp>The EyeQ6H will support premium ADAS or partial AV capabilities with full surround. It is equivalent to two EyeQ5 SoCs in terms of computing power but more importantly supports visualization and performs better under heavy artificial intelligence workloads. This centralized solution will provide all ADAS L2+ functionalities, multi-camera processing (including parking cameras), and will host third-party apps such as parking visualization and driver monitoring. This most advanced ADAS SoC in the EyeQ family will begin sampling this year and is due to begin production by the end of 2024.\u003C/p>\n\u003Cp>Both EyeQ6 SoCs will be manufactured on 7nm process technology.\u003C/p>\n\u003Cp>\u003Cstrong>More Context\u003C/strong>\u003C/p>\n\u003Cp>As Mobileye continues to execute its plan to enable autonomous driving, the versatility and scalability of the company&rsquo;s portfolio comes into view. 
Mobileye recently shipped its&nbsp;\u003Ca href=\"https://www.mobileye.com/blog/100-million-eyeq-chips/\" target=\"_blank\" rel=\"noopener noreferrer\">100 millionth EyeQ SoC\u003C/a>, unveiled its production robotaxi, and&nbsp;\u003Ca href=\"https://www.mobileye.com/press-kit/press-kit-mobileye-new-york-city/\" target=\"_blank\" rel=\"noopener noreferrer\">scaled its autonomous vehicle testing\u003C/a>&nbsp;across multiple cities around the world including in the U.S., Europe and Asia.\u003C/p>\n\u003Cp>\u003Cstrong>About Mobileye\u003C/strong>\u003C/p>\n\u003Cp>Mobileye is leading the mobility revolution with its autonomous driving and driver-assist technologies, harnessing world-renowned expertise in computer vision, machine learning, mapping and data analysis. Our technology enables self-driving vehicles and mobility solutions, powers industry-leading advanced driver-assistance systems and delivers valuable intelligence to optimize mobility infrastructure. Mobileye pioneered such groundbreaking technologies as True Redundancy&trade; sensing, REM&trade; crowdsourced mapping, and Responsibility Sensitive Safety (RSS) technologies that are driving the ADAS and AV fields toward the future of mobility.\u003C/p>\n\u003Cp>For more information:&nbsp;\u003Ca href=\"https://www.mobileye.com/\" target=\"_blank\" rel=\"noopener\">www.mobileye.com\u003C/a>.\u003C/p>\n\u003Cp>\u003Cstrong>Forward-Looking Statements\u003C/strong>\u003C/p>\n\u003Cp>Statements in this press release that refer to future plans and expectations are forward-looking statements that involve a number of risks and uncertainties. 
Words such as &ldquo;anticipates,&rdquo; &ldquo;expects,&rdquo; &ldquo;intends,&rdquo; &ldquo;goals,&rdquo; &ldquo;plans,&rdquo; &ldquo;believes,&rdquo; &ldquo;seeks,&rdquo; &ldquo;estimates,&rdquo; &ldquo;continues,&rdquo; &ldquo;may,&rdquo; &ldquo;will,&rdquo; &ldquo;would,&rdquo; &ldquo;should,&rdquo; &ldquo;could,&rdquo; and variations of such words and similar expressions are intended to identify such forward-looking statements. Statements that refer to or are based on estimates, forecasts, projections, uncertain events or assumptions, including statements relating to future products and technology and the availability and benefits of such products and technology, expectations regarding customers, market opportunity, and anticipated trends in our businesses or the markets relevant to them, also identify forward-looking statements. Such statements are based on current expectations and involve many risks and uncertainties that could cause actual results to differ materially from those expressed or implied in these forward-looking statements. Important factors that could cause actual results to differ materially are set forth in Intel&rsquo;s SEC filings, including the company&rsquo;s most recent reports on Forms 10-K and 10-Q, which may be obtained by visiting our Investor Relations website at www.intc.com or the SEC&rsquo;s website at www.sec.gov. 
Intel does not undertake, and expressly disclaims any duty, to update any statement made in this press release, whether as a result of new information, new developments or otherwise, except to the extent that disclosure may be required by law.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>[**]gallery:mobileye-ces-2022-tech-news-gallery-1[**]\u003C/p>",{"id":1609,"type":24,"url":1610,"title":1611,"description":1612,"primary_tag":28,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1612,"image":1613,"img_alt":1614,"content":1615,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":16,"publish_date":1616,"tags":928},139,"udelv-unveils-autonomous-cab-less-transporter","CES: Udelv Unveils Mobileye-Powered AV Transporter","The autonomous, electric delivery vehicle will help solve two challenges: a shortage of drivers and fleet electrification.","https://static.mobileye.com/dev/website/us/corporate/images/99b4ba3c8b8edab1262fc170ae397979_1663142139761.jpg","Udelv  aims to have 50,000 units of the Transporter, driven by Mobileye, on public roads by 2028.","\u003Cp>\u003Cem>This news content was originally published on the Intel Corporation Newsroom.\u003C/em>\u003C/p>\u003Cp>\u003Cbr>\u003C/p>\u003Cp>​Today, Udelv unveiled its Transporter – the company’s next-generation autonomous delivery vehicle driven by Mobileye Drive™&nbsp;– in advance of this week’s International Consumer Electronics Show 2022 in Las Vegas. The cab-less Transporter is designed for commercial delivery fleets and is expected to help solve two of the most pressing challenges facing commercial fleets: the current shortage of drivers and the coming electrification of fleets.\u003C/p>\u003Cp>The Mobileye Drive self-driving system is powered by the Mobileye EyeQ® 5 system-on-chip for automotive applications and a robust suite of cameras, lidars and radars. 
To enable Udelv to rapidly deploy the Transporter at scale, the company has also integrated Mobileye’s Road Experience Management™,&nbsp;a crowdsourced, continuously updated map of the world that digitizes what autonomous vehicles need to navigate.\u003C/p>\u003Cp>The Transporter features a patented cargo space that is secure, automated, hot-swappable and modular. It is specifically designed for autonomous delivery and can carry up to 2,000 pounds of cargo and make up to 80 stops per run. It is made to deliver nearly anything from convenience goods, e-commerce packages and groceries to auto parts, electronics and medical supplies for B2B and B2C applications.\u003C/p>\u003Cp>\u003Cbr>\u003C/p>\u003Cp>[**]gallery:udelv-unveils-autonomous-cab-less-transporter-gallery-1[**]\u003C/p>","2022-01-03T13:00:00.000Z",{"id":1618,"type":24,"url":1619,"title":1620,"description":1621,"primary_tag":397,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1621,"image":1622,"img_alt":1623,"content":1624,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1625,"tags":402},114,"ces-2022-livestream-press-conference-schedule","Join Us Online at CES 2022","Tune in to discover our latest technologies and the vehicles they’re driving during this year’s big tech expo.","https://static.mobileye.com/website/us/corporate/images/a080dcf90f7e1d02e265ca22ae8a27b9_1641241664118.png","Mobileye at CES 2022","\u003Cp>For all its ups and downs, \u003Ca href=\"https://www.mobileye.com/blog/100-million-eyeq-chips/\" target=\"_blank\" rel=\"noopener noreferrer\">2021 was a big year for Mobileye\u003C/a>. Now 2022 is upon us, and we&rsquo;re starting it out in full force at CES.\u003C/p>\n\u003Cp>We&rsquo;ve had to switch to an exclusively virtual presence for this year&rsquo;s big tech expo. 
But that won&rsquo;t stop us from bringing you all the latest from Mobileye and all that we&rsquo;re doing to proliferate our self-driving technology everywhere, in every way, for everyone. Here&rsquo;s where you can tune in for our live sessions.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/162c82846c38a36933c165f66a256779_1641222790699.png\" alt=\"Intel Executive Vice President Gregory Bryant and Mobileye CEO Prof. Amnon Shashua\" />\u003C/p>\n\u003Cp>\u003Cstrong>&ldquo;On the Road to the Future&rdquo;\u003C/strong>\u003C/p>\n\u003Cp>\u003Cstrong> Intel Press Conference with Gregory Bryant and Prof. Amnon Shashua\u003C/strong>\u003C/p>\n\u003Cp>Tuesday, January 4, 2022, 10 a.m. PST | Watch live at \u003Ca href=\"https://www.intel.com/ces/\" target=\"_blank\" rel=\"noopener noreferrer\">intel.com/ces\u003C/a>\u003C/p>\n\u003Cp>Join Intel senior leaders as they share new developments in two of Intel&rsquo;s most important businesses: personal computing and automated driving. Gregory Bryant, executive vice president and general manager of Intel&rsquo;s Client Computing Group, and Prof. Amnon Shashua, Mobileye CEO, will share the stage with several guest executives whose companies, working with Intel, are leading digital transformation and computing growth across two of today&rsquo;s fastest-growing sectors.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/96cf0297aa42d3b5a6d0b73a6bc0162a_1641222844022.jpg\" alt=\"Mobileye CEO Prof. Amnon Shashua speaks at CES\" />\u003C/p>\n\u003Cp>\u003Cstrong>&ldquo;Under the Hood&rdquo; with Prof. Amnon Shashua\u003C/strong>\u003C/p>\n\u003Cp>Wednesday, January 5, 2022, 11:30 a.m. PST\u003C/p>\n\u003Cp>Join \u003Ca href=\"https://www.mobileye.com/blog/mobileye-ces-2021-recap/\" target=\"_blank\" rel=\"noopener noreferrer\">Mobileye CEO Prof. 
Amnon Shashua\u003C/a> as he explains how the company will deliver economically viable consumer autonomous vehicles (AV) to the world. He will unveil new chip technology, share progress on radar and lidar technology, and, for the first time, disclose details about Mobileye&rsquo;s approach to enabling fully autonomous solutions across vehicle types and use cases around the globe. The 2022 &ldquo;Under the Hood&rdquo; session is not to be missed, as Shashua shows how Mobileye is rewriting the AV game.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/5ad566a659a71e6568a0822a05ee2cd1_1641222918906.jpg\" alt=\"Mobileye Robotaxi in Jerusalem\" />\u003C/p>\n\u003Cp>\u003Cstrong>But that&rsquo;s not all...\u003C/strong>\u003C/p>\n\u003Cp>During CES, we&rsquo;ll be demonstrating a broad spectrum of Mobileye technologies and how they&rsquo;re being implemented across the industry.\u003C/p>","2022-01-03T08:00:00.000Z",{"id":1627,"type":5,"url":1628,"title":1629,"description":1630,"primary_tag":190,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1630,"image":1631,"img_alt":1632,"content":1633,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1634,"tags":1635},113,"paris-ratp-autonomous-vehicle-testing-pilot","Mobileye Self-Driving Vehicles Hit the Boulevards of Paris","Joint pilot project with RATP Group is already shuttling passengers around the French capital in our autonomous vehicles.","https://static.mobileye.com/website/us/corporate/images/18fc9380ff93de6548794c6454d09691_1639661141996.jpg","Mobileye autonomous vehicle on the Champs Elysees by the Arc de Triomphe in Paris","\u003Cp>Getting a self-driving vehicle to work in a controlled environment is one thing. Getting it to work in the real-world traffic of a densely populated city is quite another. 
But we&rsquo;re not just testing our autonomous vehicles in one city. We&rsquo;re testing them in several of the most challenging urban environments around the world.\u003C/p>\n\u003Cp>Now we&rsquo;re pleased to add Paris to the growing network of locations where we&rsquo;re running our AVs. And this time, it&rsquo;s not just for testing purposes.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/Q69tBNCVJa0\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>\u003Cstrong>Why Paris?\u003C/strong>\u003C/p>\n\u003Cp>The unique driving culture, roadway infrastructure, and other parameters particular to each of the cities in which we&rsquo;re testing give us a distinct opportunity to further hone our technology in different driving environments. That&rsquo;s as true of the French capital as it is of \u003Ca href=\"https://www.mobileye.com/news/mobileye-releases-unedited-footage-of-a-40-minute-av-ride-through-jerusalem/\" target=\"_blank\" rel=\"noopener\">Jerusalem\u003C/a>, \u003Ca href=\"https://www.mobileye.com/blog/munich-av-video/\" target=\"_blank\" rel=\"noopener noreferrer\">Munich\u003C/a>, \u003Ca href=\"https://www.mobileye.com/press-kit/press-kit-mobileye-new-york-city/\" target=\"_blank\" rel=\"noopener\">New York\u003C/a>, Detroit, Tokyo, and Shanghai, where our test fleets are also operating.\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.france24.com/en/live-news/20210830-it-s-not-easy-slower-era-dawns-for-paris-drivers\" target=\"_blank\" rel=\"noopener noreferrer\">Driving in Paris\u003C/a> is characterized by its iconic boulevards, labyrinthine side streets, multi-lane roundabouts, and its own \u003Ca href=\"https://www.mobileye.com/opinion/digitizing-the-social-contract-for-safer-roads/\" target=\"_blank\" rel=\"noopener\">unwritten rules of the road\u003C/a>. 
The permit we&rsquo;ve now secured to run our AVs in the City of Light opens a new chapter under the \u003Ca href=\"https://www.mobileye.com/blog/mobileye-hosts-its-first-investor-summit-since-the-intel-acquisition/\" target=\"_blank\" rel=\"noopener noreferrer\">collaboration we entered into two years ago with RATP Group\u003C/a>, the principal public transit operator in Paris and one of the largest in the world.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/c02b4dfbd42053f7ce4161c21def1341_1639661160921.png\" alt=\"Mobileye autonomous vehicle by the Eiffel Tower in Paris\" />\u003C/p>\n\u003Cp>\u003Cstrong>Beyond Testing\u003C/strong>\u003C/p>\n\u003Cp>With RATP in Paris, we&rsquo;ve launched our first on-demand self-driving mobility service, taking our autonomous vehicle program beyond the testing phase.\u003C/p>\n\u003Cp>Under the pilot project, we&rsquo;re honored to be entrusted with transporting employees of the famous \u003Ca href=\"https://www.galerieslafayette.com/\" target=\"_blank\" rel=\"noopener noreferrer\">Galeries Lafayette\u003C/a> Paris Haussmann department store to and from work in our completely self-driving vehicles. Rides can be ordered on demand or booked in advance through the Moovit app for up to two passengers at a time, together with a Mobileye safety driver and an RATP co-pilot. 
(Read the \u003Ca href=\"https://www.mobileye.com/news/mobileye-autonomous-cars-piloting-paris/\" target=\"_blank\" rel=\"noopener noreferrer\">news release \u003C/a>for more details.)\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/184c6f6ca2f47398ed4a779c5c2d143d_1639661189203.jpg\" alt=\"Mobileye autonomous vehicle outside a street corner cafe in Paris\" />\u003C/p>\n\u003Cp>This pilot project sets the stage for much \u003Ca href=\"https://www.forbes.com/sites/bradtempleton/2020/05/21/intelmobileye-promises-self-driving-robotaxi-service-in-2022-while-others-back-off\" target=\"_blank\" rel=\"noopener noreferrer\">more to come\u003C/a>. Next year we&rsquo;re scheduled to launch \u003Ca href=\"https://www.mobileye.com/blog/self-driving-robotaxi-sixt-germany-iaa/\" target=\"_blank\" rel=\"noopener noreferrer\">robotaxi services\u003C/a> in both Tel Aviv and Munich. And our self-driving technology is set to enable autonomous mobility services in even more locations around the world &ndash; including \u003Ca href=\"https://www.mobileye.com/blog/mobileye-to-deploy-robotaxis-in-japan-with-willer/\" target=\"_blank\" rel=\"noopener noreferrer\">Japan\u003C/a>, South Korea, the \u003Ca href=\"https://www.mobileye.com/news/mobileye-is-bringing-driverless-maas-to-the-uae/\" target=\"_blank\" rel=\"noopener\">United Arab Emirates\u003C/a>, a \u003Ca href=\"https://www.mobileye.com/blog/mobileye-transdev-lohr-maas-i-cristal-shuttles-robotaxis/\" target=\"_blank\" rel=\"noopener noreferrer\">second initiative in France\u003C/a>, and \u003Ca href=\"https://www.mobileye.com/blog/udelv-transporter-autonomous-delivery-vehicles-powered-by-mobileye/\" target=\"_blank\" rel=\"noopener noreferrer\">autonomous delivery services in the United States\u003C/a>.\u003C/p>\n\u003Cp>Watch this space for more as the self-driving mobility revolution is just getting under way.\u003C/p>\n\u003Cp>\u003Cimg 
src=\"https://static.mobileye.com/website/us/corporate/images/53e84e6cb20679c65697f4d83ae325ca_1639661229304.jpg\" alt=\"Mobileye autonomous vehicle outside the Galeries Lafayette Paris Haussmann department store\" />\u003C/p>","2021-12-16T08:00:00.000Z","Autonomous Driving, News, Video, Driverless MaaS",{"id":1637,"type":24,"url":1638,"title":1639,"description":1640,"primary_tag":28,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1640,"image":1641,"img_alt":1639,"content":1642,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":16,"publish_date":1634,"tags":1643},146,"mobileye-autonomous-cars-piloting-paris","Mobileye Autonomous Cars Piloting in Paris","Employees from world-famous Galeries Lafayette in Paris can experience first autonomous rides by Mobileye and RATP Group.","https://static.mobileye.com/dev/website/us/corporate/images/a5120267e1f6070a4150870340ca877b_1663141218558.jpg","\u003Cp>\u003Cem>This news content was originally published on the Intel Corporation Newsroom.\u003C/em>\u003C/p>\n\u003Cp>\u003Cstrong>What&rsquo;s New:\u003C/strong>&nbsp;Mobileye, an Intel company, is adding Paris to its rapidly expanding global autonomous vehicle (AV) testing program and announcing its first autonomous on-demand service in the city of Paris, in collaboration with&nbsp;\u003Ca href=\"https://www.ratp.fr/groupe-ratp\" target=\"_blank\" rel=\"noopener noreferrer\">RATP Group\u003C/a>, the world's third-largest public transportation operator.\u003C/p>\n\u003Cp>\"As a leading operator of autonomous mobility, we are very happy to offer to our client Galeries Lafayette a new mobility service by associating our know-how with Mobileye. 
This is an opportunity for the RATP Group to test a new use case, an autonomous car service for companies, but also to test the vehicle's autonomous technology for possible integration on other transport modes such as a bus or minibus.\"\u003C/p>\n\u003Cp>&ndash;C&ocirc;me Berbain, director of Innovation for RATP Group\u003C/p>\n\u003Cp>\u003Cstrong>How It Works:\u003C/strong>&nbsp;Mobileye has obtained an AV testing permit to allow the company to drive its autonomous robotaxis on the streets of Paris.\u003C/p>\n\u003Cp>Passengers who are part of the pilot will be among the first consumers to ride in a Mobileye AV, and the first to hail a robotaxi using the Moovit app that will be at the center of Mobileye&rsquo;s MoovitAV consumer ridesharing service.\u003C/p>\n\u003Cp>In collaboration with RATP, Mobileye organized an AV on-demand service that will provide employees from&nbsp;\u003Ca href=\"https://haussmann.galerieslafayette.com/en/\" target=\"_blank\" rel=\"noopener noreferrer\">Galeries Lafayette Paris Haussmann\u003C/a>&nbsp;the ability to request or schedule a ride to work using the Moovit app four days a week. Each Mobileye AV test vehicle is able to transport two passengers at a time, plus a Mobileye safety driver and an RATP co-pilot.\u003C/p>\n\u003Cp>\"Urban mobility is inseparable from the issues of our time and must enable us to meet the challenges of the sustainable city,&rdquo; said Alexandre Liot, managing director of Galeries Lafayette Haussmann. &ldquo;Today it is essential to think about improving and transforming mobility, to prepare the future of city centers and allow Paris to continue its attractiveness. 
Therefore, as the French leader of department stores, we are proud to join forces with Mobileye and the RATP Group to participate in this innovative project which reflects on the transport of tomorrow.&rdquo;\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cdiv style=\"padding: 56.25% 0 0 0; position: relative;\">\u003Ciframe style=\"position: absolute; top: 0; left: 0; width: 100%; height: 100%;\" title=\"Mobileye Autonomous Vehicle Maneuvers the Streets of Paris (B-Roll)\" src=\"https://player.vimeo.com/video/710524664?h=4b76167428&amp;badge=0&amp;autopause=0&amp;player_id=0&amp;app_id=58479\" width=\"3840\" height=\"2160\" frameborder=\"0\" data-mce-fragment=\"1\">\u003C/iframe>\u003C/div>\n\u003Cp>\u003Cstrong>B-Roll Video:\u003C/strong> Footage shows a self-driving vehicle from Mobileye&rsquo;s autonomous fleet driving through the streets of Paris. In December 2021, Mobileye announced it is adding Paris to its rapidly expanding global autonomous vehicle testing program. (Credit: Mobileye, an Intel Company)\u003C/p>\n\u003Cp>\u003Ca href=\"https://vimeo.com/intelpr/review/657181109/fd1c462bc8\" target=\"_blank\" rel=\"noopener noreferrer\">Download video: \"Mobileye Autonomous Vehicle Maneuvers the Streets of Paris (B-Roll)\"\u003C/a>\u003C/p>\n\u003Cp>\u003Cstrong>Why It Matters:\u003C/strong>&nbsp;With complex road systems and high volumes of pedestrians and traffic, cities like Paris pose a major challenge to human drivers and AVs alike. They represent essential testing grounds for developing safe and comfortable driverless services on a global scale. Adding the French capital to Mobileye&rsquo;s rapidly expanding portfolio of daunting AV testing environments marks another important step in moving the industry toward commercial readiness. 
Allowing customers to order their AV ride on-demand via the Moovit app lays an important foundation for Mobileye&rsquo;s planned MoovitAV robotaxi service by gaining consumer awareness and confidence while gathering valuable insights and feedback from riders.\u003C/p>\n\u003Cp>&ldquo;Autonomously driving the roads of Paris is yet another milestone on the way to realizing our vision of self-driving inclusive mobility. We are happy to have not only gained the testing permit, but also strong partners in Paris,&rdquo; said Johann Jungwirth, vice president of mobility-as-a-service at Mobileye.\u003C/p>\n\u003Cp>In addition to testing in&nbsp;\u003Ca href=\"https://www.mobileye.com/press-kit/press-kit-mobileye-new-york-city/\" target=\"_blank\" rel=\"noopener noreferrer\">New York City this year\u003C/a>, Mobileye is also operating autonomous vehicle&nbsp;\u003Ca href=\"https://www.mobileye.com/news/ces-2021-mobileye-avs-on-move/\" target=\"_blank\" rel=\"noopener noreferrer\">test fleets\u003C/a>&nbsp;in&nbsp;\u003Ca href=\"https://www.mobileye.com/news/mobileye-avs-go-anywhere-germany/\" target=\"_blank\" rel=\"noopener noreferrer\">Munich\u003C/a>, Detroit, Tokyo,&nbsp;Jerusalem,&nbsp;\u003Ca href=\"https://www.mobileye.com/news/mobileye-moves-garage-streets/\" target=\"_blank\" rel=\"noopener noreferrer\">Tel Aviv\u003C/a>&nbsp;and China. The company is expected to launch&nbsp;\u003Ca href=\"https://www.mobileye.com/news/mobileye-sixt-plan-new-robotaxi-service/\" target=\"_blank\" rel=\"noopener noreferrer\">commercial robotaxi service\u003C/a>s under the MoovitAV brand in Munich and Tel Aviv in 2022 after obtaining regulatory approval.\u003C/p>\n\u003Cp>\u003Cstrong>More Context:\u003C/strong>&nbsp;As Mobileye continues to execute its plan to enable autonomous driving, the versatility and scalability of the company&rsquo;s portfolio comes into view. 
Mobileye recently shipped its&nbsp;\u003Ca href=\"https://www.mobileye.com/blog/100-million-eyeq-chips/\" target=\"_blank\" rel=\"noopener noreferrer\">100 millionth EyeQ\u003Csup>&reg;\u003C/sup> system-on-chip\u003C/a>, unveiled its production robotaxi, and&nbsp;\u003Ca href=\"https://www.mobileye.com/press-kit/press-kit-mobileye-new-york-city/\" target=\"_blank\" rel=\"noopener noreferrer\">scaled its autonomous vehicle testing\u003C/a>&nbsp;across multiple cities around the world, including the U.S., Europe and Asia.\u003C/p>\n\u003Cp>Customers from across the mobility-as-a-service landscape are able to use Mobileye products and solutions to transition to driverless capabilities.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/ffb8e17fd0465107f7ec5527531a9436_1663141194419.jpg\" alt=\"Mobileye in Paris\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>About Mobileye\u003C/strong>\u003C/p>\n\u003Cp>Mobileye is leading the mobility revolution with its autonomous driving and driver-assist technologies, harnessing world-renowned expertise in computer vision, machine learning, mapping and data analysis. Our technology enables self-driving vehicles and mobility solutions, powers industry-leading advanced driver-assistance systems and delivers valuable intelligence to optimize mobility infrastructure. 
Mobileye pioneered such groundbreaking technologies as True Redundancy&trade; sensing, REM&trade; crowdsourced mapping, and Responsibility Sensitive Safety (RSS) technologies that are driving the ADAS and AV fields towards the future of mobility.&nbsp;For more information:&nbsp;\u003Ca href=\"https://www.mobileye.com/\" target=\"_blank\" rel=\"noopener\">www.mobileye.com\u003C/a>.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>About Moovit\u003C/strong>\u003C/p>\n\u003Cp>Intel subsidiary Moovit,&nbsp;acquired in 2020, is a leading Mobility-as-a-Service (MaaS) provider. The company is guided by the&nbsp;\u003Ca href=\"https://moovit.com/about-us/\" target=\"_blank\" rel=\"noopener noreferrer\">idea\u003C/a>&nbsp;that mobility should be a basic human right. For this purpose, Moovit developed an urban mobility app enabling people to get around cities quickly and easily using any mode of transportation. Today, Moovit has served more than one billion passengers in more than 112 countries. It offers a global consumer and transportation network that represents the ideal platform to put forth Mobileye&rsquo;s AVs for commercial driverless ride-hailing.&nbsp;For more information:&nbsp;\u003Ca href=\"http://www.moovit.com/\" target=\"_blank\" rel=\"noopener noreferrer\">www.moovit.com\u003C/a>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>About Galeries Lafayette\u003C/strong>\u003C/p>\n\u003Cp>Leading French department store player and famous throughout the world, Galeries Lafayette has been an unrivalled specialist in fashion and experience marketing for 125 years. It aims to make each visit a unique experience and offer its French and international customers a range of constantly renewed brands, ranging from affordable to premium and luxury. 
The Galeries Lafayette brand, covering a wide range of segments taking in fashion and accessories, decoration, food and catering, promotes its offering via a network of 65 stores in France and abroad, the merchant website&nbsp;\u003Ca href=\"https://www.galerieslafayette.com/\" target=\"_blank\" rel=\"noopener noreferrer\">galerieslafayette.com\u003C/a>&nbsp;and the Galeries Lafayette Outlet discount store. For more information about&nbsp;\u003Ca href=\"https://haussmann.galerieslafayette.com/en/\" target=\"_blank\" rel=\"noopener noreferrer\">galerieslafayette.com\u003C/a>, go to&nbsp;\u003Ca href=\"https://www.facebook.com/GaleriesLafayetteEnglish/\" target=\"_blank\" rel=\"noopener noreferrer\">Facebook\u003C/a>&nbsp;and&nbsp;\u003Ca href=\"https://www.instagram.com/galerieslafayette/\" target=\"_blank\" rel=\"noopener noreferrer\">Instagram\u003C/a>.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>About RATP Group\u003C/strong>\u003C/p>\n\u003Cp>With 16 million trips daily worldwide, RATP Group is one of the largest urban transport operators in the world. Through its subsidiary RATP DEV, the group operates in 14 countries on four continents, where it provides everyday mobility services across eight transport modes: metro, urban and intercity buses; trams; rail; sightseeing services; cable cars; maritime shuttles; and transport on demand. The group also has a strong presence in new forms of mobility, in partnership with other modes (free-floating electric scooters; car-sharing; carpooling and autonomous shuttles; smart car parks). The group leverages its recognised expertise in infrastructure management and engineering to provide, via its subsidiaries, a wide range of urban services from engineering and managing real estate and retail areas, to delivering fibre networks, tailor-made passenger information and innovative ticketing solutions. For over 70 years, RATP has operated one of the world&rsquo;s densest multimodal networks in &Icirc;le-de-France. 
RATP Group&rsquo;s 69,000 employees worldwide design, implement and bring to life mobility solutions and innovative services to make the city more sustainable and more human.\u003C/p>\n\u003Ch6>&nbsp;\u003C/h6>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>[**]gallery:mobileye-autonomous-cars-piloting-paris-gallery-1[**]\u003C/p>","Video, Driverless MaaS, Autonomous Driving, News",{"id":1645,"type":5,"url":1646,"title":1647,"description":1648,"primary_tag":397,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1648,"image":1649,"img_alt":1650,"content":1651,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1652,"tags":1653},111,"100-million-eyeq-chips","Celebrating 100 Million EyeQ®s on the Road","Our highly efficient, scalable, and proven System-on-Chip, EyeQ® is the brain behind everything Mobileye does. And we’ve shipped over 100 million of them to date. ","https://static.mobileye.com/website/us/corporate/images/19d71a6b159b1f9c8d464dbe15c98275_1639040463469.png","100 million Mobileye EyeQ® chips","\u003Cp>100 million is an enormous number to contemplate. It&rsquo;s greater than \u003Ca href=\"https://www.space.com/17081-how-far-is-earth-from-the-sun.html\" target=\"_blank\" rel=\"noopener noreferrer\">the distance, in miles, from Earth to the Sun\u003C/a>. It&rsquo;s more than the combined populations of the three largest cities in the world. And it&rsquo;s also the number of EyeQ&reg; chips we&rsquo;ve shipped to date &ndash; or put another way, the number of cars that have been made with our technology on board. If that&rsquo;s not a cause for celebration, an opportunity for reflection, and grounds for optimism, we don&rsquo;t know what is.\u003C/p>\n\u003Cp>\u003Cstrong>What is EyeQ?\u003C/strong>\u003C/p>\n\u003Cp>EyeQ is a chip unlike any other. 
Our family of Systems-on-Chip is designed not for brute processing power alone, but is built with an application-driven approach to serve one very specific purpose: to deliver advanced mobility solutions.\u003C/p>\n\u003Cp>EyeQ is singularly efficient &ndash; with low power consumption and high utilization of its compute resources, in a compact, cost-effective package. EyeQ is extensively proven &ndash; by the sheer breadth of its adoption that we&rsquo;re now celebrating, by the standards it meets, and by the extreme temperatures at which it&rsquo;s capable of operating. And EyeQ is broadly scalable &ndash; it&rsquo;s the brain behind everything we do, from driver assistance to self-driving technology, and more.\u003C/p>\n\u003Cp>It is, in short, the processor that&rsquo;s driving the mobility revolution.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/c562eeecd2fa8d2d2f531f3c191c803a_1639314921457.jpg\" alt=\"The number of Mobileye EyeQs shipped, 100 million, is greater than the distance (in miles) from Earth to the Sun\" />\u003C/p>\n\u003Cp>\u003Cstrong>Leadership in Driver Assistance\u003C/strong>\u003C/p>\n\u003Cp>Mobileye launched the first-generation EyeQ in 2004. Now, 17 years later, EyeQ is in its fifth generation. More than 30 of the world&rsquo;s leading automakers have come to trust in the capabilities of EyeQ and the technologies it encapsulates to support the \u003Ca href=\"https://www.mobileye.com/blog/buying-a-new-car-here-are-four-adas-features-to-look-for/\" target=\"_blank\" rel=\"noopener noreferrer\">driver-assistance features\u003C/a> in over 300 models currently on the market.\u003C/p>\n\u003Cp>What&rsquo;s more, the number of vehicles using EyeQ is growing at a quickening pace. In 2014, for example, we shipped 2.7 million units. In 2020, we shipped 19.3 million &ndash; ten percent more than the year before, despite the constraints of COVID-19 and the global chip shortage. 
And we fully expect those numbers to further increase as the \u003Ca href=\"https://www.globenewswire.com/fr/news-release/2021/05/06/2225177/0/en/Advanced-Driver-Assistance-System-ADAS-Market-Size-Worth-Around-US-142-bn-by-2027.html\" target=\"_blank\" rel=\"noopener noreferrer\">demand for driver-assistance technology continues to grow\u003C/a>.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/670beb7cdb959ea1a9306c4cccba0861_1639314950068.jpg\" alt=\"100 million Mobileye EyeQs is more than the combined populations of the three largest cities in the world.\" />\u003C/p>\n\u003Cp>\u003Cstrong>Processing the Self-Driving Revolution\u003C/strong>\u003C/p>\n\u003Cp>Driver assistance holds great promise for increasing safety on today&rsquo;s roads. But that&rsquo;s just the starting point for the broad range of scalable mobility solutions supported by EyeQ.\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/solutions/super-vision/\" target=\"_blank\" rel=\"noopener noreferrer\">Mobileye SuperVision&trade;\u003C/a> unlocks the power of EyeQ with L2++ capabilities derived directly from our autonomous-vehicle development program. 
The \u003Ca href=\"https://www.mobileye.com/blog/mobileye-drive-self-driving-system/\" target=\"_blank\" rel=\"noopener noreferrer\">Mobileye Drive&trade;\u003C/a> self-driving system and our \u003Ca href=\"https://www.mobileye.com/blog/self-driving-robotaxi-sixt-germany-iaa/\" target=\"_blank\" rel=\"noopener noreferrer\">recently revealed robotaxi\u003C/a> further harness the promise of EyeQ to propel the self-driving revolution.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/08f656f4619c0e9a2c7e76a75db3daf6_1639314962706.jpg\" alt=\"100 million vehicles equipped with Mobileye EyeQ chips would take up a parking lot bigger than New York City\" />\u003C/p>\n\u003Cp>EyeQ is at the heart of the innovations that distinguish our approach to self-driving technology, including \u003Ca href=\"https://www.mobileye.com/blog/av-safety-demands-true-redundancy/\" target=\"_blank\" rel=\"noopener noreferrer\">True Redundancy&trade;\u003C/a>, \u003Ca href=\"https://www.mobileye.com/blog/rem-mapping-avs/\" target=\"_blank\" rel=\"noopener noreferrer\">Road Experience Management&trade;\u003C/a>, and \u003Ca href=\"https://www.mobileye.com/blog/responsibility-sensitive-safety-gains-traction-worldwide/\" target=\"_blank\" rel=\"noopener noreferrer\">Responsibility-Sensitive Safety\u003C/a>. EyeQ helps keep \u003Ca href=\"https://www.mobileye.com/us/fleets/\" target=\"_blank\" rel=\"noopener noreferrer\">fleets of commercial vehicles\u003C/a> safe and on the road. And it enables our \u003Ca href=\"https://www.mobileye.com/en/data/\" target=\"_blank\" rel=\"noopener noreferrer\">data services\u003C/a> for infrastructure surveying and smart city planning.\u003C/p>\n\u003Cp>Everything we do, in short, runs on EyeQ. 
And it does so efficiently, consistently, and at a scale that serves to underline just how far we&rsquo;ve come over the past two decades.\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/blog/what-drives-us/\" rel=\"noopener noreferrer\">\u003Cu>Read more here\u003C/u>\u003C/a> about the impact that our technology is bringing to bear on road safety.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/1514fb25d5c71155e1c3b6b56ed366b0_1639314978686.jpg\" alt=\"100 million vehicles equipped with Mobileye EyeQ chips would wrap around the world at the equator more than 11 times\" />\u003C/p>","2021-12-13T08:00:00.000Z","Events, Autonomous Driving, ADAS, News",{"id":1655,"type":5,"url":1656,"title":1657,"description":1658,"primary_tag":954,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1658,"image":1659,"img_alt":1660,"content":1661,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":32,"publish_date":1652,"tags":1662},112,"what-drives-us","What Drives Us","The 100 million EyeQ® chips we’ve shipped to date means more to us than just a number. It means safer roads, fewer crashes, and saving lives – and we’re just getting started.","https://static.mobileye.com/website/us/corporate/images/e7ac119ea5c7c6d581414737a72ba315_1639040383767.png","The safety of 100 million Mobileye EyeQ® chips","\u003Cp>Today we&rsquo;re celebrating a major milestone with the shipment of our \u003Ca href=\"https://www.mobileye.com/blog/100-million-eyeq-chips/\" rel=\"noopener noreferrer\">\u003Cu>100 millionth EyeQ&reg; chip\u003C/u>\u003C/a>. But that achievement means far more to us than just a number. For us here at Mobileye, a hundred million chips means a hundred million vehicles out on the road that are that much safer as a result of our technology. 
It means innumerable collisions averted, countless lives saved, and a world inestimably better off because of our innovations than it would be without them. And \u003Cem>that\u003C/em> is what drives us to continue innovating.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/whIHBpJwAOQ\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>\u003Cstrong>Driver Assistance: Technology to the Rescue\u003C/strong>\u003C/p>\n\u003Cp>Advanced Driver-Assistance Systems are proven to reduce the likelihood of road accidents. Forward Collision Warning, for example, has been shown to lower the incidence of front-to-rear crashes by 27%. Turn that passive safety feature into an \u003Cem>active\u003C/em> one like Automatic Emergency Braking and that number jumps to 50%, resulting in 56% fewer injuries. And those are just two of the many features supported by EyeQ, the System-on-Chip with a proven track record of ushering in game-changing innovations in driver-assistance technology.\u003C/p>\n\u003Cp>Based on the industry-leading computer-vision technology encapsulated in EyeQ, Mobileye launched the first cost-effective, camera-only Adaptive Cruise Control, Forward Collision Warning, Automatic Emergency Braking, and Highway Assist systems on the market (to name just a few). The list of \u003Ca href=\"https://www.mobileye.com/blog/buying-a-new-car-here-are-four-adas-features-to-look-for/\" target=\"_blank\" rel=\"noopener noreferrer\">features supported by EyeQ\u003C/a> today is among the longest and most comprehensive in the industry. 
And we can barely fathom the number of accidents avoided and lives saved as a result of those technologies incorporated into so many vehicles driving &ndash; and driving safer &ndash; along the world&rsquo;s roadways.\u003C/p>\n\u003Cp>&ldquo;We&rsquo;re very proud to have reached this milestone, and grateful for the trust our customers have come to place in our technology,&rdquo; Mobileye CEO \u003Ca href=\"https://www.mobileye.com/news/amnon-shashua-automated-driving-executive-of-the-year-automotive-news/\" target=\"_blank\" rel=\"noopener\">Prof. Amnon Shashua\u003C/a> commented on the occasion. &ldquo;To have reached 100 million EyeQ chips is something we barely could have imagined when we started this company two decades ago. But this isn&rsquo;t just a business achievement. For all of us here at Mobileye, this means 100 million vehicles avoiding countless collisions thanks to our technology. And this just sets the stage for so much more to come.&rdquo;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/f5a0d77754024c44193702050044fd98_1639315309704.png\" alt=\"Mobileye EyeQ chip pictured on an integrated circuit board\" />\u003C/p>\n\u003Cp>\u003Cstrong>The Self-Driving Revolution: We&rsquo;ve Only Just Begun\u003C/strong>\u003C/p>\n\u003Cp>As proud as we are of all that we&rsquo;ve accomplished so far, we&rsquo;re not about to sit back on the laurels of our achievements and let progress stop here. Not when road safety still has so much more room for improvement.\u003C/p>\n\u003Cp>Every year, some \u003Ca href=\"https://www.who.int/news-room/fact-sheets/detail/road-traffic-injuries\" target=\"_blank\" rel=\"noopener noreferrer\">1.3 million people die in road accidents\u003C/a>, with tens of millions more injured. Road accidents remain the leading cause of death among children and young adults worldwide.\u003C/p>\n\u003Cp>It doesn&rsquo;t have to be that way. 
A startling \u003Ca href=\"https://crashstats.nhtsa.dot.gov/api/public/viewpublication/812115\" target=\"_blank\" rel=\"noopener noreferrer\">94% of motor vehicle crashes are attributed to human error\u003C/a>. Reducing the reliance on the inherent fallibility of human drivers stands to drastically decrease the number of accidents, injuries, and deaths on the road even further. That&rsquo;s the motivation that drives us here at Mobileye to keep innovating, improving, and finding new applications for our technology.\u003C/p>\n\u003Cp>The latest generations of EyeQ promise to unlock a veritable self-driving revolution. And that in turn stands to save lives on an even greater scale than the driver-assistance technologies that have served as the stepping stones to the arrival of autonomous vehicles. So here&rsquo;s to the first hundred million, to the next hundred million, and the collisions that can be avoided and lives that can be saved on the road to the future &ndash; when \u003Ca href=\"https://www.mobileye.com/blog/national-autonomous-vehicle-day-how-avs-will-change-your-life/\" target=\"_blank\" rel=\"noopener noreferrer\">the benefits of self-driving technology\u003C/a> will be available everywhere, in every way, to everyone.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/092edd1b1edcad129c769644e6e3b6db_1639406168119.png\" alt=\"Infographic: the Road to 100 Million Mobileye EyeQ chips \" />\u003C/p>","Video, Events, Autonomous Driving, ADAS",{"id":1664,"type":24,"url":1665,"title":1666,"description":1667,"primary_tag":190,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1667,"image":1668,"img_alt":1669,"content":1670,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1671,"tags":1672},108,"amnon-shashua-automated-driving-executive-of-the-year-automotive-news","Shashua Named Automated Driving Exec of the Year","Latest award 
from leading industry publication joins a long list of accolades for Mobileye and our All-Star chief executive.","https://static.mobileye.com/website/us/corporate/images/d47560234d3779a5bedf760a67c62788_1638447873385.jpg","Mobileye CEO Prof. Amnon Shashua outside the NASDAQ MarketSite in New York","\u003Cp>Bringing cutting-edge mobility technology to the world is its own reward. But we&rsquo;re that much more honored when our top minds are recognized for their contributions. Like our CEO, who has been \u003Ca href=\"https://www.autonews.com/awards/2021-all-stars-amnon-shashua\" target=\"_blank\" rel=\"noopener noreferrer\">named 2021 Automated Driving Executive of the Year\u003C/a> by \u003Cem>Automotive News.\u003C/em>\u003C/p>\n\u003Cp>&ldquo;Many companies have said they will be a part of the shift to self-driving vehicles,&rdquo; reads the citation. &ldquo;Few have articulated how they plan to get there as well as Mobileye under CEO \u003Ca href=\"https://www.mobileye.com/blog/tag/amnon-shashua/\" target=\"_blank\" rel=\"noopener\">Amnon Shashua\u003C/a>.&rdquo; The award forms part of the publication&rsquo;s 2021 list of All Stars.\u003C/p>\n\u003Cp>The co-founder and CEO of Mobileye, Senior Vice President at Intel, and \u003Ca href=\"https://www.cs.huji.ac.il/~shashua/\" target=\"_blank\" rel=\"noopener noreferrer\">Sachs Professor of Computer Science\u003C/a> at the Hebrew University of Jerusalem, Shashua is both a business leader and a world-renowned expert in the scientific fields of artificial intelligence and computer-vision technology. 
This latest accolade follows the \u003Ca href=\"https://dandavidprize.org/laureates/prof-amnon-shashua/\" target=\"_blank\" rel=\"noopener noreferrer\">Dan David Prize\u003C/a> he was awarded last year for his leadership in AI research, and the Electronic Imaging Scientist of the Year award in 2019 from the Society for Imaging Science and Technology, among many others.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/947e74d9a663e176f82b8b0991594b23_1638447822976.jpg\" alt=\"Mobileye CEO Prof. Amnon Shashua delivers a press conference the NASDAQ MarketSite in New York\" />\u003C/p>\n\u003Cp>Earlier this year, Gaby Hayon (our Executive Vice President of Research and Development) was named a \u003Ca href=\"https://www.mobileye.com/news/gaby-hayon-business-insider-av-power-player/\" target=\"_blank\" rel=\"noopener\">Power Player\u003C/a> by \u003Cem>Business Insider\u003C/em>. Last year, our CTO Prof. Shai Shalev-Schwartz won the \u003Ca href=\"https://iias.huji.ac.il/brunolaureates\" target=\"_blank\" rel=\"noopener noreferrer\">Michael Bruno Prize\u003C/a> for his work on our Responsibility-Sensitive Safety model. 
This is the second straight year in which Mobileye has been honored by \u003Cem>Automotive News\u003C/em>, which gave us a \u003Ca href=\"https://www.autonews.com/awards/2020-mobileye-rem-road-experience-management\" target=\"_blank\" rel=\"noopener noreferrer\">PACE Award\u003C/a> in 2020 for our Road Experience Management&trade; (REM&trade;) technology.\u003C/p>\n\u003Cp>In naming Shashua as Automated Driving Executive of the Year, \u003Cem>Automotive News\u003C/em> specifically points to several of Mobileye&rsquo;s recent achievements in the fields of driver-assistance and self-driving technologies, including our \u003Ca href=\"https://www.mobileye.com/press-kit/press-kit-mobileye-new-york-city/\" target=\"_blank\" rel=\"noopener\">AV testing in New York\u003C/a>, the \u003Ca href=\"https://www.mobileye.com/blog/self-driving-robotaxi-sixt-germany-iaa/\" target=\"_blank\" rel=\"noopener noreferrer\">launch of our robotaxi\u003C/a>, early commercial success with \u003Ca href=\"https://www.mobileye.com/blog/mobileye-drive-self-driving-system/\" target=\"_blank\" rel=\"noopener noreferrer\">Mobileye Drive&trade;\u003C/a> (our self-driving system), and the debut of \u003Ca href=\"https://www.mobileye.com/solutions/super-vision/\" target=\"_blank\" rel=\"noopener noreferrer\">Mobileye SuperVision&trade;\u003C/a> (our next-generation driver-assistance system). 
Watch this space for more groundbreaking advancements to come.\u003C/p>","2021-12-06T08:00:00.000Z","Awards, Autonomous Driving, News, Amnon Shashua",{"id":1674,"type":5,"url":1675,"title":1676,"description":1677,"primary_tag":397,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1677,"image":1678,"img_alt":1679,"content":1680,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1681,"tags":839},107,"iaa-mobility-2021-munich-wrap-up","Here’s What You Might Have Missed from IAA 2021","Mobileye had a lot to say and to show at this year’s big mobility expo in Munich. Here’s a recap with all our sessions, reveals, and announcements.\n","https://static.mobileye.com/website/us/corporate/images/75b58ad59fbe16d115efebc7e31ec360_1631439993136.jpg","Mobileye robotaxi and Transdev i-Cristal shuttle on display at IAA Mobility 2021","\u003Cp>This past week the international automotive and mobility industries descended on the Bavarian capital of Munich for one of the biggest events of the year: the \u003Ca href=\"https://www.mobileye.com/news/iaa-2021-event-session-schedule/\" target=\"_blank\" rel=\"noopener\">2021 IAA Mobility\u003C/a> show. And Mobileye was naturally there on the ground with a substantial presence.\u003C/p>\n\u003Cp>Our participation centered on a booth on the show floor with interactive demonstrations and vehicles on display incorporating our technologies. Our executives also delivered a keynote presentation and took part in several panel discussions, making some major announcements in the process. 
Here&rsquo;s everything you might have missed.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>The Vital Role Technology Plays in the Future of Mobility\u003C/strong>\u003C/p>\n\u003Cp>Keynote by Pat Gelsinger\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/QRY214s1a7U\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>Taking the stage at IAA on Tuesday, Pat Gelsinger delivered his first in-person keynote since taking the helm as CEO of Intel this past February. During the course of his approximately 45-minute presentation, Gelsinger covered a broad range of subjects, starting with the chip shortage in the automotive industry and investments Intel is undertaking to help meet demand. Watch the full replay in the video above, or \u003Ca href=\"https://youtu.be/0xeStmFY-XE\" target=\"_blank\" rel=\"noopener noreferrer\">click here for the condensed highlights\u003C/a>.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/WuWqxqZhYZs\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>Gelsinger was also joined remotely by Mobileye CEO \u003Ca href=\"https://www.mobileye.com/blog/tag/amnon-shashua/\" target=\"_blank\" rel=\"noopener\">Prof. Amnon Shashua\u003C/a> to reveal the Mobileye robotaxi. 
Alexander Sixt (co-CEO of the eponymous car-rental giant) joined Gelsinger live on stage to announce a \u003Ca href=\"https://www.mobileye.com/blog/self-driving-robotaxi-sixt-germany-iaa/\" target=\"_blank\" rel=\"noopener noreferrer\">new autonomous ride-hailing service in Germany\u003C/a>, as did Intel Fellow and Mobileye vice-president Jack Weast for an update on autonomous-vehicle safety standards.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>How Mobility-as-a-Service will foster the Mobility Revolution\u003C/strong>\u003C/p>\n\u003Cp>Panel Discussion with Johann Jungwirth\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/5eabf770107c59317d03386ec907becd_1631442246145.jpg\" alt=\"Johann Jungwirth at IAA 2021\" width=\"2000\" height=\"942\" />\u003C/p>\n\u003Cp>On Wednesday, Johann Jungwirth (our Vice President of Mobility-as-a-Service) joined Prof. Andreas Herrmann (of the University of St. Gallen and the London School of Economics) to discuss the technology, business, and social impact of smart mobility.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>Autonomous Mobility in Cities of the Future\u003C/strong>\u003C/p>\n\u003Cp>Panel Discussion with Erez Dagan\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/5fd79dc9f0d4c5dc96505734c1be1dbf_1631441406080.jpg\" alt=\"Erez Dagan\" width=\"3600\" height=\"1696\" />\u003C/p>\n\u003Cp>The next day, Erez Dagan (our Executive Vice President of Products and Strategy) participated in a panel discussion on the effects of self-driving mobility on urban environments. 
Joining Dagan on the panel were Kristopher Carter from the Boston mayor&rsquo;s office, Christoph Schr&ouml;der from Luminar, Manja Greimeier from Continental, and Michael Wiesinger of Kodiak Robotics, with BCG partner Augustin Wegscheider moderating.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>The Importance of Scenarios &amp; Simulations in AV Safety\u003C/strong>\u003C/p>\n\u003Cp>Panel Discussion with Jack Weast\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/89d0f9a618a0f9dbba827954d0d5084a_1631441345518.jpg\" alt=\"Jack Weast\" width=\"4287\" height=\"2019\" />\u003C/p>\n\u003Cp>Before the conference wrapped up, Jack Weast (our Vice President of Autonomous Vehicle Standards) joined a panel discussion on scenarios and simulations for autonomous-vehicle safety. Weast spoke alongside Siemens&rsquo; Andrea Kollmorgen, Shauna McIntyre of Sense Photonics, and Siddartha Khastgir from WMG, with Michelle Avary of the World Economic Forum as moderator.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>Technology on Display at the Mobileye Booth\u003C/strong>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/sHTMcxMc_YY\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>In addition to the keynote and panels, Mobileye was present on the ground with a booth in the Messe M&uuml;nchen. 
Visitors were able to see two fully autonomous vehicles driven by Mobileye on display &ndash; \u003Ca href=\"https://www.mobileye.com/blog/self-driving-robotaxi-sixt-germany-iaa/\" target=\"_blank\" rel=\"noopener noreferrer\">our newly revealed robotaxi\u003C/a> and the \u003Ca href=\"https://www.mobileye.com/blog/mobileye-transdev-lohr-maas-i-cristal-shuttles-robotaxis/\" target=\"_blank\" rel=\"noopener noreferrer\">i-Cristal shuttle from Transdev ATS and Lohr Group\u003C/a> &ndash; and experience interactive demos showcasing our spectrum of scalable mobility solutions and our REM&trade; mapping technology.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/4e1e06bf6523849dd383eabf650ed10d_1631440359065.jpg\" alt=\"Angela Merkel visits the Mobileye IAA 2021 booth with Pat Gelsinger\" width=\"3845\" height=\"1811\" />\u003C/p>\n\u003Cp>Among the many visitors whom we were pleased to host, German chancellor Angela Merkel stopped by to see what we had in store.\u003C/p>\n\u003Cp>We&rsquo;re suitably proud of the new developments we were able to announce and showcase at IAA &ndash; and we&rsquo;re looking forward to bringing you even more as we drive the road to the future of mobility.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/videoseries?list=PLWCfS_Yhbvs5UqpdTMcZUvQuryKmqmks3\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>","2021-09-12T07:00:00.000Z",{"id":1683,"type":5,"url":1684,"title":1685,"description":1686,"primary_tag":397,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1686,"image":1687,"img_alt":1688,"content":1689,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1690,"tags":1691},105,"self-driving-robotaxi-sixt-germany-iaa","Launching our Self-Driving Robotaxi in 
Germany","Driven by Mobileye and operated by Sixt, our newly revealed robotaxis are slated to hit the streets of Munich next year.\n","https://static.mobileye.com/website/us/corporate/images/0b0b196ce83c71aa31f04895665395b8_1631122507847.jpg","The Mobileye robotaxi in Tel Aviv-Jaffa","\u003Cp>During his keynote presentation at the \u003Ca href=\"https://www.mobileye.com/news/iaa-2021-event-session-schedule/\" target=\"_blank\" rel=\"noopener\">2021 IAA Mobility\u003C/a> show, Intel CEO Pat Gelsinger revealed our new self-driving robotaxi, powered by \u003Ca href=\"https://www.mobileye.com/blog/mobileye-drive-self-driving-system/\" target=\"_blank\" rel=\"noopener noreferrer\">Mobileye Drive&trade;\u003C/a> and ready for commercial deployment. And it&rsquo;s only fitting that it should be hitting the streets of Munich among its first applications.\u003C/p>\n\u003Cp>\u003Cstrong>Why Germany?\u003C/strong>\u003C/p>\n\u003Cp>Germany is, after all, \u003Ca href=\"https://fortune.com/2021/05/28/germany-automobile-legalize-robotaxi-autonomous-vehicle/\" target=\"_blank\" rel=\"noopener noreferrer\">the birthplace of the automobile\u003C/a>, and remains a major global hub of the automotive industry. Germany has a famously extensive roadway network. And it&rsquo;s home to several of the world&rsquo;s leading automakers and suppliers, many of which remain among our best customers and key partners.\u003C/p>\n\u003Cp>The country was also \u003Ca href=\"https://www.mobileye.com/blog/mobileye-hits-the-autobahn-with-german-permit/\" target=\"_blank\" rel=\"noopener noreferrer\">the first\u003C/a> outside of our homebase in which we began \u003Ca href=\"https://www.mobileye.com/blog/munich-av-video/\" target=\"_blank\" rel=\"noopener noreferrer\">testing our developmental autonomous vehicle\u003C/a>. 
And Germany recently became the first country in the world to \u003Ca href=\"https://www.mobileye.com/news/germany-level-4-autonomous-vehicle-law-regulations/\" target=\"_blank\" rel=\"noopener\">authorize the deployment of Level 4 self-driving vehicles\u003C/a> without a safety driver behind the wheel, anchoring its place at the vanguard of mobility innovation.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/ZSihbQDg2HA\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>\u003Cstrong>In Partnership with Sixt\u003C/strong>\u003C/p>\n\u003Cp>Where else, then, would be a more suitable location both to reveal our new \u003Ca href=\"https://www.mobileye.com/blog/how-robotaxis-will-lead-the-way-toward-the-fully-autonomous-future/\" target=\"_blank\" rel=\"noopener noreferrer\">robotaxi\u003C/a> and to deploy it commercially? That&rsquo;s just what we&rsquo;re doing in collaboration with Sixt SE, a leading international mobility services provider based in Germany.\u003C/p>\n\u003Cp>The collaboration was officially announced when Sixt SE co-CEO Alexander Sixt joined Gelsinger on stage during his keynote at IAA. Early-rider testing is slated to start in Munich in 2022 before expanding robotaxi services across Germany and into other European countries later this decade. 
Customers will be able to order rides through their choice of Sixt and \u003Ca href=\"https://www.mobileye.com/opinion/there-is-more-to-our-moovit-acquisition-than-meets-the-eye/\" target=\"_blank\" rel=\"noopener\">Moovit\u003C/a> mobile apps.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/dd056522d6592816857825a93b9502ad_1631172412617.jpg\" alt=\"MoovitAV and Mobileye robotaxi\" width=\"1920\" height=\"1201\" />\u003C/p>\n\u003Cp>\u003Cstrong>The Mobileye Robotaxi\u003C/strong>\u003C/p>\n\u003Cp>Our new six-passenger, all-electric, self-driving robotaxi revealed at IAA incorporates a \u003Ca href=\"https://www.mobileye.com/solutions/drive/\" target=\"_blank\" rel=\"noopener\">full range of technologies and services\u003C/a> to enable fully autonomous operation and deployment.\u003C/p>\n\u003Cp>Central to that package is Mobileye Drive, our self-driving system. The vehicle carries eleven cameras positioned around its body, long- and short-range LiDAR sensors, and a full array of radar sensors. The cameras form one perception subsystem, while the radar and LiDAR sensors form a second, independent one, delivering \u003Ca href=\"https://www.mobileye.com/technology/true-redundancy/\" target=\"_blank\" rel=\"noopener noreferrer\">True Redundancy&trade;\u003C/a> in perception capabilities. The sensors feed the decision-making algorithms in our purpose-built \u003Ca href=\"https://www.mobileye.com/technology/eyeq-chip/\" target=\"_blank\" rel=\"noopener noreferrer\">EyeQ&reg; System-on-a-Chip\u003C/a>, filtered through our \u003Ca href=\"https://www.mobileye.com/technology/responsibility-sensitive-safety/\" target=\"_blank\" rel=\"noopener noreferrer\">Responsibility-Sensitive Safety\u003C/a> envelope. 
The Mobileye Roadbook&trade; digital map provides an \u003Ca href=\"https://www.mobileye.com/blog/rem-mapping-avs/\" target=\"_blank\" rel=\"noopener noreferrer\">additional layer of awareness\u003C/a> of the driving environment, fed by \u003Ca href=\"https://www.mobileye.com/technology/rem/\" target=\"_blank\" rel=\"noopener noreferrer\">crowdsourced data gathered by REM&trade;\u003C/a>.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/368706d79c697a374912f4faa0043551_1631172723465.jpg\" alt=\"Rear view MoovitAV and Mobileye robotaxi\" width=\"1000\" height=\"471\" />\u003C/p>\n\u003Cp>Our sister-company Moovit provides a full suite of robotaxi services, including fleet management tools, tele-operations system, mobility intelligence for optimization and deployment, and rider-experience services.\u003C/p>\n\u003Cp>Our technology is being integrated as well into a new platform for self-driving shuttles from German automotive components supplier \u003Ca href=\"https://www.schaeffler.com/content.schaeffler.com/en/news_media/press_releases/press_releases_detail.jsp?id=87723393\" target=\"_blank\" rel=\"noopener noreferrer\">Schaeffler Group\u003C/a>, also debuted at IAA. 
Mobileye Drive was previously announced for deployment in the \u003Ca href=\"https://www.mobileye.com/blog/udelv-transporter-autonomous-delivery-vehicles-powered-by-mobileye/\" target=\"_blank\" rel=\"noopener noreferrer\">Udelv Transporter\u003C/a> and the \u003Ca href=\"https://www.mobileye.com/blog/mobileye-transdev-lohr-maas-i-cristal-shuttles-robotaxis/\" target=\"_blank\" rel=\"noopener noreferrer\">i-Cristal shuttle\u003C/a> from French partners Transdev ATS and Lohr Group.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/b3433bf89140112224b2c67867a7e936_1630843360897.jpg\" alt=\"MoovitAV and Mobileye robotaxi at night\" width=\"1000\" height=\"471\" />\u003C/p>\n\u003Cp>\u003Cstrong>Munich and Beyond\u003C/strong>\u003C/p>\n\u003Cp>Launching our robotaxi in Munich represents a significant step on \u003Ca href=\"https://www.mckinsey.com/industries/automotive-and-assembly/our-insights/change-vehicles-how-robo-taxis-and-shuttles-will-reinvent-mobility\" target=\"_blank\" rel=\"noopener noreferrer\">the road to the future of mobility\u003C/a>. 
We look forward to rolling out the service in Germany and to expanding to additional locations &ndash; with Sixt in Europe, with Moovit in Tel Aviv, and beyond &ndash; to bring the life-changing promise of self-driving mobility everywhere, in every way, to everyone.\u003C/p>\n\u003Cp>Read more about \u003Ca href=\"https://www.mobileye.com/news/mobileye-moves-garage-streets/\" target=\"_blank\" rel=\"noopener noreferrer\">our robotaxi\u003C/a> and \u003Ca href=\"https://www.mobileye.com/news/mobileye-sixt-plan-new-robotaxi-service/\" target=\"_blank\" rel=\"noopener noreferrer\">our collaboration with Sixt\u003C/a>, and \u003Ca href=\"https://www.mobileye.com/blog/\" target=\"_blank\" rel=\"noopener\">watch this space\u003C/a> for more developments from IAA 2021.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/WuWqxqZhYZs\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>","2021-09-08T07:00:00.000Z","Events, Autonomous Driving, Driverless MaaS, News",{"id":1693,"type":24,"url":1694,"title":1695,"description":1696,"primary_tag":28,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1696,"image":1697,"img_alt":1698,"content":1699,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":16,"publish_date":1700,"tags":1016},148,"mobileye-moves-garage-streets","Mobileye Moves from the Garage to the Streets","Sleek new Mobileye robotaxi unveiled with launch of MoovitAV robotaxi service.","https://static.mobileye.com/website/us/corporate/images/67f8f4be9dccd47545d352221a57ca23_1666085288498.png","The Mobileye Drive™ self-driving system features 8 EyeQ™5 system-on-chips, believed to be the most efficient AV system on the market today. 
","\u003Cp>\u003Cem>This news content was originally published on the Intel Corporation Newsroom.\u003C/em>\u003C/p>\u003Cp>\u003Cstrong>What’s New:&nbsp;\u003C/strong>Mobileye, an Intel company, today unveiled the 6-passenger, road-ready electric autonomous vehicle (AV) that will be used for commercial driverless ride-hailing services in Tel Aviv and Munich starting in 2022. Equipped with the&nbsp;\u003Ca href=\"https://static.mobileye.com/website/common/files/Mobileye-Drive-Fact-Sheet-675839.pdf\" rel=\"noopener noreferrer\" target=\"_blank\">Mobileye Drive™\u003C/a>&nbsp;self-driving system featuring 8 EyeQ™5 SoCs in the AVKIT58, the all-electric Mobileye AVs will operate under the MoovitAV service branding.\u003C/p>\u003Cp>“Mobileye is passionate about bringing autonomous vehicles to consumers. The new Mobileye AV, accessible through the MoovitAV service, is an important milestone on the way to a driverless world.”\u003C/p>\u003Cp>–Prof. Amnon Shashua, Mobileye chief executive officer\u003C/p>\u003Cp>\u003Cstrong>Why It Matters:&nbsp;\u003C/strong>Getting to full autonomy requires solutions that can scale. Mobileye’s AV was designed from the ground up to scale both economically and geographically while addressing the essential attributes of efficiency, accessibility and safety. It is the first AV to employ all features of the Mobility trinity, including the True Redundancy™ sensing solution with cameras, radar and lidar sensors, Mobileye’s crowd-sourced Roadbook™ AV map, and Responsibility-Sensitive Safety (RSS) driving policy.\u003C/p>\u003Cp>\u003Cstrong>About More Uses:&nbsp;\u003C/strong>The same Mobileye Drive self-driving system used in Mobileye‘s AV can be used in a variety of vehicle types for the movement of goods and people, making it perhaps the most versatile self-driving solution available today. 
For example, Mobileye plans to collaborate with \u003Ca href=\"https://www.schaeffler.com/content.schaeffler.com/en/news_media/press_releases/press_releases_detail.jsp?id=87723393\" rel=\"noopener noreferrer\" target=\"_blank\">Schaeffler\u003C/a> to build a self-driving chassis that can be used in building autonomous shuttles. Mobileye also previously announced an agreement with Udelv to supply Mobileye Drive for the autonomous&nbsp;\u003Ca href=\"https://www.mobileye.com/news/mobileye-udelv-deal-autonomous-delivery/\" rel=\"noopener noreferrer\" target=\"_blank\">Udelv Transporter\u003C/a>&nbsp;for last-mile goods delivery. Also previously announced was an agreement with&nbsp;Transdev and Lohr&nbsp;to produce and deploy autonomous shuttles in France and Germany.\u003C/p>\u003Cp>\u003Cstrong>About Moovit AV Services:&nbsp;\u003C/strong>Intel subsidiary Moovit, with its global consumer and transportation network, offers the ideal platform to put forth Mobileye’s AVs for commercial driverless ride-hailing. Mobileye AVs will wear the MoovitAV services branding to help consumers know where to go to hail one of the new AVs. 
The new service is expected to begin operations in Munich in 2022 in cooperation with Sixt SE, as well as in Tel Aviv.\u003C/p>\u003Cp>\u003Cbr>\u003C/p>\u003Cp>[**]gallery:mobileye-moves-garage-streets-gallery-1[**]\u003C/p>","2021-09-07T15:00:00.000Z",{"id":1702,"type":24,"url":1703,"title":1704,"description":1705,"primary_tag":28,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1705,"image":1706,"img_alt":1705,"content":1707,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":16,"publish_date":1708,"tags":1016},147,"mobileye-sixt-plan-new-robotaxi-service","Mobileye and SIXT Plan New Robotaxi Service","Autonomous ride-hailing service is expected to begin driverless pilot in Munich in 2022.","https://static.mobileye.com/website/us/corporate/images/78c82468e9c6fbebcd0eaca1253a5d0a_1666085135730.png","\u003Cp>\u003Cem>This news content was originally published on the Intel Corporation Newsroom.\u003C/em>\u003C/p>\n\u003Cp>\u003Cstrong>NEWS HIGHLIGHTS\u003C/strong>\u003C/p>\n\u003Cul>\n\u003Cli>At IAA Mobility, Intel CEO Pat Gelsinger and Sixt SE Co-CEO Alexander Sixt announced a collaboration to begin offering a driverless robotaxi service in Munich starting next year\u003C/li>\n\u003Cli>Riders can hail one of the robotaxis using either the Moovit app or the integrated SIXT app for ride hailing, vehicle rental, car sharing and car subscriptions\u003C/li>\n\u003C/ul>\n\u003Cp>MUNICH, Germany, Sept. 7, 2021 &ndash; \u003Cstrong>&nbsp;\u003C/strong>During a keynote at IAA Mobility today, Intel CEO Pat Gelsinger and Sixt SE Co-CEO Alexander Sixt announced a collaboration to begin offering autonomous ride-hailing services in Munich in 2022. 
The collaboration between Intel subsidiary Mobileye and SIXT, a leading international provider of mobility services headquartered in Germany, also aspires to scale driverless ride-sharing services across Germany and other European countries later this decade.\u003C/p>\n\u003Cp>Riders will be able to access the service via the Moovit app as well as the SIXT app. The autonomous robotaxi offering will be included in SIXT&rsquo;s holistic mobility platform ONE, which combines ride hailing, car rental, car sharing and car subscription products in a single app. By integrating the services of cooperation partners like Mobileye, the ONE mobility platform gives SIXT customers worldwide access to more than 200,000 vehicles, 1,500 cooperation partners, around 1.5 million drivers and soon even robotaxi services.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>The autonomous robotaxi option will be part of the ride-hailing service SIXT ride and was demonstrated during Alexander Sixt&rsquo;s keynote walk-on. Mobileye also unveiled the vehicles &ndash; branded with MoovitAV and SIXT &ndash; that will be produced in volume and used for the robotaxi service in Germany. It is the first time Mobileye has publicly displayed its fully integrated self-driving system, known as&nbsp;\u003Ca href=\"https://static.mobileye.com/website/common/files/Mobileye-Drive-Fact-Sheet-675839.pdf\" target=\"_blank\" rel=\"noopener noreferrer\">Mobileye Drive\u003C/a>&trade;, in a vehicle that will be used for commercial, driverless ride-hailing services.\u003C/p>\n\u003Cp>A recently enacted autonomous vehicle (AV) law permits driverless vehicles on German roads, allowing Mobileye robotaxis to begin early-rider testing on Munich streets in 2022.&nbsp;The fleet will thereafter move from test to commercial operations upon regulatory approval. &ldquo;Germany has shown global leadership toward a future of autonomous mobility by expediting crucial AV legislation,&rdquo; Gelsinger said.
&ldquo;Our ability to begin robotaxi operations in Munich next year would not be possible without this new law.&rdquo;\u003C/p>\n\u003Cp>The collaboration with SIXT is the first known commercial robotaxi service between a tech supplier and a mobility services provider. &ldquo;With strong logistics and operational partners like SIXT, Mobileye can bring the promise of full autonomy to life in cities around the world,&rdquo; said Mobileye CEO Prof. Amnon Shashua. &ldquo;We&rsquo;re delighted that Germany is a first mover.&rdquo;\u003C/p>\n\u003Cp>Alexander Sixt, Co-CEO of Sixt SE, added:&nbsp;&ldquo;This strategic collaboration is the next step in expanding our integrated mobility platform ONE and underlines our company&rsquo;s evolution towards becoming the industry&rsquo;s leading provider of innovative and digital premium mobility. We are delighted to leverage the remarkable technology leadership of Mobileye to bring driverless mobility to customers in Germany and beyond.&rdquo;\u003C/p>\n\u003Cp>Mobileye will own the robotaxi fleet used in the Munich service, while SIXT will draw upon its established expertise in providing, maintaining and operating the fleet. The vehicles will include the MoovitAV service and SIXT branding once the service launches in Munich, to make it easy for customers to distinguish between traditional ride-hailing and the autonomous fleet vehicles.\u003C/p>\n\u003Cp>\u003Cstrong>About Mobileye\u003C/strong>\u003C/p>\n\u003Cp>Mobileye is leading the mobility revolution with its autonomous driving and driver-assist technologies, harnessing world-renowned expertise in computer vision, machine learning, mapping, and data analysis. Our technology enables self-driving vehicles and mobility solutions, powers industry-leading advanced driver-assistance systems, and delivers valuable intelligence to optimize mobility infrastructure.
Mobileye pioneered such groundbreaking technologies as True Redundancy&trade; sensing, REM&trade; crowdsourced mapping, and Responsibility Sensitive Safety (RSS), which are driving the ADAS and AV fields towards the future of mobility. For more information,&nbsp;\u003Ca href=\"https://www.mobileye.com/\" target=\"_blank\" rel=\"noopener\">www.mobileye.com\u003C/a>.\u003C/p>\n\u003Cp>\u003Cstrong>About SIXT\u003C/strong>\u003C/p>\n\u003Cp>Sixt SE, with its registered office in Pullach near Munich, is a leading international provider of high-quality mobility services. With its products&nbsp;\u003Ca href=\"https://www.sixt.com/#/\" target=\"_blank\" rel=\"noopener noreferrer\">SIXT rent\u003C/a>,&nbsp;\u003Ca href=\"https://www.sixt.com/share/#/\" target=\"_blank\" rel=\"noopener noreferrer\">SIXT share\u003C/a>,&nbsp;\u003Ca href=\"https://www.sixt.com/ride\" target=\"_blank\" rel=\"noopener noreferrer\">SIXT ride\u003C/a>&nbsp;and&nbsp;\u003Ca href=\"https://www.sixt.com/plus/#/\" target=\"_blank\" rel=\"noopener noreferrer\">SIXT+\u003C/a>,&nbsp;the company offers a uniquely integrated premium mobility service across the fields of vehicle and commercial vehicle rental, car sharing, ride hailing and car subscriptions. The products can be booked through one single app, which also integrates the services of its renowned mobility partners. SIXT has a presence in around 110 countries worldwide. The company is characterized by consistent customer orientation and excellent customer experience, a living culture of innovation with strong technological expertise, the high share of premium vehicles in its fleet and an attractive price-performance ratio.
The SIXT Group has doubled its revenue since 2009, generating revenues of EUR 3.31 billion in 2019, and is ranked as one of the most profitable mobility companies in the world.\u003C/p>","2021-09-07T07:00:00.000Z",{"id":1710,"type":24,"url":1711,"title":1712,"description":1713,"primary_tag":28,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1713,"image":1714,"img_alt":1715,"content":1716,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":16,"publish_date":1717,"tags":1718},141,"mobileye-zeekr-expand-future-cars-partnership","Mobileye and ZEEKR Expand Partnership to Enable Future of Cars","New Mobileye agreement brings cutting-edge ADAS to premium global electric vehicle-maker.","https://static.mobileye.com/dev/website/us/corporate/images/79cad786b5b572e19481be2ac77158e1_1663142335136.jpg","Mobileye will work with ZEEKR to create driver-assist systems with increasingly sophisticated capabilities, starting with ZEEKR vehicles featuring Mobileye SuperVision. (Credit: ZEEKR)","\u003Cp>\u003Cem>This news content was originally published on the Intel Corporation Newsroom.\u003C/em>\u003C/p>\u003Cp>\u003Cbr>\u003C/p>\u003Cp>Mobileye, an Intel company, is expanding its global influence in the advanced driver-assistance systems (ADAS) industry with a new partnership with ZEEKR, the global premium electric mobility technology brand from Geely Holding Group. Together, Mobileye and ZEEKR will introduce the world’s most highly advanced safety technology available in the market for advanced, intelligent vehicles.\u003C/p>\u003Cp>As part of the long-term agreement, Mobileye will work with ZEEKR to create advanced ADAS systems with increasingly sophisticated capabilities for a variety of ZEEKR models.
The collaboration will begin with the launch of ZEEKR vehicles in the fourth quarter of 2021 featuring Mobileye® SuperVision™, a full-stack ADAS solution powered by two EyeQ5® system-on-chip (SoC) devices processing data from 11 cameras. The two companies also plan to collaborate further on a next-generation system powered by six EyeQ5 SoCs to deliver a new standard for a comprehensive ADAS experience. It is expected to make its global debut as soon as 2023.\u003C/p>\u003Cp>“ZEEKR’s powerful vision for the future of driving makes them an ideal partner to Mobileye,” said Prof. Amnon Shashua, co-founder and CEO of Mobileye and senior vice president of Intel. “By working closely together, we have an exciting opportunity to reach a new level of excellence in ADAS, bringing to market what will be the industry’s most state-of-the-art, full-feature system.”\u003C/p>\u003Cp>The collaboration follows an&nbsp;\u003Ca href=\"https://www.intelcapital.com/news/\" rel=\"noopener noreferrer\" target=\"_blank\">equity investment\u003C/a>&nbsp;in ZEEKR by Intel Capital.\u003C/p>","2021-08-27T09:00:00.000Z","Autonomous Driving, News, ADAS",{"id":1720,"type":24,"url":1721,"title":1722,"description":1723,"primary_tag":397,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1723,"image":1724,"img_alt":1725,"content":1726,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1727,"tags":402},104,"iaa-2021-event-session-schedule","Join us at the 2021 IAA Mobility Show","Stop by the big event in Munich to discover how Mobileye is taking self-driving technology Out of the Garage and Onto the Streets.\n","https://static.mobileye.com/website/us/corporate/images/011db6a06dd62beb215d95c8aea4f89a_1629805134396.jpg","Mobileye at IAA 2021","\u003Cp>Want to hear about the latest developments from across the international mobility industry?
This year&rsquo;s \u003Ca href=\"https://www.iaa-mobility.com/en\" target=\"_blank\" rel=\"noopener\">IAA Mobility show in Munich\u003C/a> is the place to be. And Mobileye will be there in full force with our latest products, a series of speaking engagements, and some exciting new announcements you won&rsquo;t want to miss.\u003C/p>\n\u003Cp>Short for \u003Cem>Internationale Automobil-Ausstellung\u003C/em> (or International Automobile Show), the IAA has been held in various cities across \u003Ca href=\"https://www.mobileye.com/news/germany-level-4-autonomous-vehicle-law-regulations/\" target=\"_blank\" rel=\"noopener\">Germany\u003C/a> since 1897. This year it&rsquo;s moving to \u003Ca href=\"https://www.mobileye.com/blog/munich-av-video/\" target=\"_blank\" rel=\"noopener noreferrer\">Munich\u003C/a>, and broadening its scope from passenger and commercial vehicles to encompass all manner of mobility solutions.\u003C/p>\n\u003Cp>That makes it the ideal event for Mobileye to showcase our cutting-edge technologies. Here&rsquo;s a schedule of sessions at which you&rsquo;ll be able to hear from our senior leadership on what we have to offer and what we have in the works.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong class=\"ql-size-large\">Keynote: Pat Gelsinger \u003C/strong>\u003C/p>\n\u003Cp>Tuesday, September 7 @5pm (CEST)\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/313862a8a36e08e40ecc4bab83572148_1629792998632.jpg\" alt=\"Pat Gelsinger\" width=\"1000\" height=\"471\" />\u003C/p>\n\u003Cp>Leading our participation at this year&rsquo;s show will be Pat Gelsinger, CEO of our parent company Intel. 
In his keynote address, Gelsinger will discuss the effects of the global chip shortage on the automotive industry and reveal the latest developments from Mobileye.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong class=\"ql-size-large\">How Mobility-as-a-Service will foster the Mobility Revolution\u003C/strong>\u003C/p>\n\u003Cp>Wednesday, September 8 @10 am (CEST)\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/c52ed915349720f429dc3a26afd53f73_1629793012932.jpg\" alt=\"Johann Jungwirth, Executive Vice President, Autonomous Vehicles\" width=\"1000\" height=\"471\" />\u003C/p>\n\u003Cp>Mobileye Vice President of \u003Ca href=\"https://www.mobileye.com/blog/self-driving-maas-suite/\" target=\"_blank\" rel=\"noopener noreferrer\">Mobility-as-a-Service\u003C/a> Johann Jungwirth will join Prof. Andreas Herrmann (of the University of St. Gallen and the London School of Economics) to discuss the technology, business, and social impact of smart mobility.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong class=\"ql-size-large\">Autonomous Mobility in Cities of the Future\u003C/strong>\u003C/p>\n\u003Cp>Thursday, September 9 @5 pm (CEST)\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/6206897cf0e5e9c25fa8bd7f7d93c634_1629793024340.png\" alt=\"Erez Dagan\" width=\"1000\" height=\"471\" />\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/opinion/digitizing-the-social-contract-for-safer-roads/\" target=\"_blank\" rel=\"noopener\">Erez Dagan\u003C/a> &ndash; Executive Vice President of Products and Strategy at Mobileye and Intel Vice President &ndash; will participate in a panel discussion on the effects of self-driving mobility on urban environments. 
Joining Dagan on the panel will be Kristopher Carter from the Boston mayor's office, Christoph Schr&ouml;der from Luminar, Manja Greimeier from Continental, and Michael Wiesinger of Kodiak Robotics, with BCG partner Augustin Wegscheider moderating.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong class=\"ql-size-large\">The Importance of Scenarios &amp; Simulations in AV Safety\u003C/strong>\u003C/p>\n\u003Cp>Friday, September 10 @11 am (CEST)\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/e4bf8d8e1d885694d50bc05007fd77f7_1629793037276.png\" alt=\"Jack Weast\" width=\"1000\" height=\"471\" />\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/blog/the-very-definition-of-safe-driving/\" target=\"_blank\" rel=\"noopener noreferrer\">Jack Weast\u003C/a> &ndash; Vice President of Autonomous Vehicle Standards at Mobileye and a Fellow and CTO of the Corporate Strategy Office at Intel &ndash; will take part in a panel discussion on scenarios and simulations for AV safety. Weast will speak alongside Siemens&rsquo; Andrea Kollmorgen, Shauna McIntyre of Sense Photonics, and Siddartha Khastgir from WMG, with Michelle Avary of the World Economic Forum as moderator.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong class=\"ql-size-large\">Stop by the Mobileye booth for more\u003C/strong>\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/321f0f55d54148a381899afee2f326b7_1629793054309.jpg\" alt=\"IAA 2021 Mobileye booth\" width=\"1000\" height=\"471\" />\u003C/p>\n\u003Cp>In addition to these keynotes and panels, Mobileye will be present on the ground at booth A70 in hall B2 of the Messe M&uuml;nchen. 
There visitors will be able to view interactive displays of our technology and see up close some of the most advanced self-driving vehicles employing \u003Ca href=\"https://www.mobileye.com/blog/mobileye-drive-self-driving-system/\" target=\"_blank\" rel=\"noopener noreferrer\">Mobileye Drive&trade;\u003C/a> &ndash; including one to be revealed to the public for the first time.\u003C/p>\n\u003Cp>Can&rsquo;t make it in person for the live sessions? You&rsquo;ll be able to tune in at \u003Ca href=\"https://www.intel.com/iaamobility\" target=\"_blank\" rel=\"noopener noreferrer\">www.intel.com/iaamobility\u003C/a>. And be sure to \u003Ca href=\"https://www.mobileye.com/blog/\" target=\"_blank\" rel=\"noopener\">watch this space\u003C/a> for more to come.\u003C/p>","2021-08-25T07:00:00.000Z",{"id":1729,"type":69,"url":1730,"title":1731,"description":1732,"primary_tag":73,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1732,"image":1733,"img_alt":1734,"content":1735,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":16,"publish_date":1736,"tags":1737},149,"press-kit-mobileye-new-york-city","Mobileye Now Testing AVs in New York City","State permit in hand, Mobileye AVs are getting advanced driving experience across New York City’s congested streets.","https://static.mobileye.com/website/us/corporate/images/a4bb98356ab00be27f467c5591e685ca_1666084849869.png","A self-driving vehicle from Mobileye’s autonomous test fleet drives by iconic New York City landmarks.","\u003Cp>\u003Cem>This news content was originally published on the Intel Corporation Newsroom.\u003C/em>\u003C/p>\n\u003Cp>Mobileye, an Intel company, has added New York City to its rapidly expanding global autonomous vehicle (AV) testing program. 
The company&rsquo;s entry into New York City &ndash; the largest city in North America and one of the world&rsquo;s most challenging driving environments &ndash; demonstrates the vast capabilities of its AV technology and proves how its unique approach is enabling rapid geographic and economic scalability.\u003C/p>\n\u003Cp>&ldquo;Driving in complex urban areas such as New York City is a crucial step in vetting the capabilities of an autonomous system and moving the industry closer to commercial readiness,&rdquo; said Professor Amnon Shashua, senior vice president of Intel and president and CEO of Mobileye.\u003C/p>\n\u003Cp>As seen in the video below, Mobileye&rsquo;s camera-only subsystem AV is now successfully driving through New York City, on highly congested streets replete with pedestrians, bicyclists, aggressive drivers, double-parked vehicles, construction zones, emergency vehicles, tunnels and bridges, and so forth. Mobileye&rsquo;s True Redundancy&trade; approach first &ldquo;doubles down&rdquo; on the computer-vision subsystem before adding a lidar/radar subsystem for redundancy.\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" title=\"YouTube\" src=\"https://www.youtube.com/embed/50NPqEla0CQ\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Ch3>\u003Ca href=\"https://vimeo.com/intelpr/review/576970433/315328f4f4\" target=\"_blank\" rel=\"noopener noreferrer\">Download \"Mobileye Autonomous Vehicle Maneuvers through New York City\"\u003C/a>\u003C/h3>\n\u003Cp>Recently Mobileye applied for and received a New York AV testing permit to drive AVs on New York City streets. Mobileye is currently the only company holding such a permit. During both day and nighttime driving, Mobileye&rsquo;s AV is getting an advanced AV testing experience.
Seven things stand out in particular:\u003C/p>\n\u003Cp>\u003Cstrong>Pedestrians: \u003C/strong>Jaywalking is common in many cities, but in New York City it is particularly rampant, and is coupled with a high number of pedestrians. An AV must make assumptions about the behavior of those pedestrians and factor those assumptions into its driving policy. Humans do this instinctively; machines must be programmed for it.\u003C/p>\n\u003Cp>\u003Cstrong>Driving behavior: \u003C/strong>When streets are clogged, drivers become impatient and aggressive. New York City drivers &ndash; especially cabbies and other professionals &ndash; are much more assertive than drivers in other cities.\u003C/p>\n\u003Cp>\u003Cstrong>Traffic density and road user diversity: \u003C/strong>Although car ownership in New York City is low compared with other large U.S. cities, the density and variety of road users are especially high. New York City has more than its share of cabs and limousines, buses, trucks, food carts, horse-drawn carriages, emergency vehicles, bicycles, scooters, skateboards &ndash; you name it.&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>Double-parking: \u003C/strong>Who&rsquo;s parked and who&rsquo;s not? The question is easier for a human to answer than for a machine, and New York City&rsquo;s population density contributes to a high number of delivery vehicles that must stop to unload. As a result, double-parking is ubiquitous. AVs struggle with this, although the video shows Mobileye&rsquo;s AVs taking cues from other road users to decide when to maneuver around them.\u003C/p>\n\u003Cp>\u003Cstrong>Construction: \u003C/strong>New York City is one big construction zone, and Mobileye knows this thanks to all the data saved in its always-updating AV maps.
While competitors either rely on their own test cars to build maps or spend millions of dollars driving special mapping vehicles, Mobileye receives data about blocked or closed lanes from cars already on the road (data it can and does license back to municipal services, too).\u003C/p>\n\u003Cp>\u003Cstrong>Tunnels and bridges: \u003C/strong>The island of Manhattan is connected to the surrounding areas via 15 tunnels and 21 bridges, many of which contain narrow lanes framed with bollards or cones &ndash; the Achilles&rsquo; heel of many an AV. In the face of all that traffic &ldquo;furniture&rdquo; and even multilevel roads, Mobileye&rsquo;s crowd-sourced mapping technology and its well-trained sensing system understand all of this and handle it beautifully.\u003C/p>\n\u003Cp>\u003Cstrong>This city really never sleeps (the lights!):\u003C/strong> Though Paris gets the &ldquo;city of lights&rdquo; moniker, Manhattan is electrified at night. The visual noise and light pollution are daunting to an AV&rsquo;s sensing system. Mobileye AVs handle it easily with only a bit of algorithm tuning.\u003C/p>\n\u003Cp>\u003Cstrong>New York City Event\u003C/strong>\u003C/p>\n\u003Cp>During a media event on Tuesday, July 19, at Nasdaq in New York City, Mobileye CEO Prof. Amnon Shashua explained how Mobileye technology is preparing for commercial deployment.
To make his points, he shared videos of the AV handling New York City&rsquo;s challenging streets during both day and nighttime driving, as well as during heavy rainfall.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/20e4ea06bf6fa7066832eec72f960e17_1663143205321.jpg\" alt=\"Amnon Shashua at NASDAQ\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>Photos\u003C/strong>\u003C/p>\n\u003Cp>[**]gallery:press-kit-mobileye-new-york-city-gallery-1[**]\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>B-Roll Video\u003C/strong>\u003C/p>\n\u003Ch3 class=\"videoWrap\">\u003Ciframe src=\"https://player.vimeo.com/video/575430337?h=21c7082568\" width=\"640\" height=\"360\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/h3>\n\u003Ch3>\u003Ca href=\"https://vimeo.com/intelpr/review/575430337/c282147cbc\" target=\"_blank\" rel=\"noopener noreferrer\">Download &ldquo;Mobileye Autonomous Vehicle in New York City (B-Roll)&rdquo;\u003C/a>\u003C/h3>","2021-07-20T07:00:00.000Z","Press Kit, Autonomous Driving, Video",{"id":1739,"type":24,"url":1740,"title":1741,"description":1742,"primary_tag":190,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1742,"image":1743,"img_alt":1744,"content":1745,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1746,"tags":928},101,"germany-level-4-autonomous-vehicle-law-regulations","Germany Opens Roads to Autonomous Vehicles","Mobileye applauds new law, making Germany the first country to authorize the deployment of Level 4 self-driving vehicles on public roadways.","https://static.mobileye.com/website/us/corporate/images/ace401fb046836a71ed2a8cf42151b4e_1625642207676.jpg","Traffic at night passing by the Brandenburg Gate in Berlin, Germany","\u003Cp>There&rsquo;s no small number of complex pieces that need to fit into place to realize the 
dream of \u003Ca href=\"https://www.mobileye.com/blog/national-autonomous-vehicle-day-how-avs-will-change-your-life/\" target=\"_blank\" rel=\"noopener noreferrer\">autonomous vehicles\u003C/a>. Not the least of them is permission to operate self-driving vehicles on public roads traditionally used only by human-driven vehicles. Now, at least one country has legislated a framework for just such permission.\u003C/p>\n\u003Cp>That country is Germany, where both houses of the federal parliament have passed a bill amending the relevant laws in order to \u003Ca href=\"https://www.linkedin.com/posts/christin-eisenschmid-a9048545_mobility-innovation-automotive-activity-6807622884903665664-CN3w\" target=\"_blank\" rel=\"noopener noreferrer\">allow Level 4 autonomous vehicles to operate on the country&rsquo;s roadways\u003C/a>.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/ca4c8ad1aa4047d5b6e1fc8d337bf7bd_1625642312285.jpg\" alt=\"Mobileye in Germany\" width=\"1000\" height=\"471\" />\u003C/p>\n\u003Cp>\u003Cstrong>What is a Level 4 Autonomous Vehicle?\u003C/strong>\u003C/p>\n\u003Cp>Characterized as &ldquo;high automation,&rdquo; Level 4 is the second highest \u003Ca href=\"https://www.mobileye.com/news/autonomous-driving-hands-on-the-wheel-or-no-wheel-at-all/\" target=\"_blank\" rel=\"noopener noreferrer\">level of autonomy\u003C/a> identified by the Society of Automobile Engineers (SAE). 
A Level 4 AV is a vehicle fully capable of driving itself within specific areas and under certain conditions, and may carry passengers or \u003Ca href=\"https://www.mobileye.com/blog/udelv-transporter-autonomous-delivery-vehicles-powered-by-mobileye/\" target=\"_blank\" rel=\"noopener noreferrer\">cargo\u003C/a>, whether privately owned or made available on demand on a \u003Ca href=\"https://www.mobileye.com/solutions/drive/\" target=\"_blank\" rel=\"noopener\">Mobility-as-a-Service (MaaS)\u003C/a> basis.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/20994982f563662887a8950c31b5c139_1625642301178.jpg\" alt=\"The 6 levels of autonomous driving\" width=\"2000\" height=\"1824\" />\u003C/p>\n\u003Cp>\u003Cstrong>Are Self-Driving Vehicles Legal?\u003C/strong>\u003C/p>\n\u003Cp>Through its new legislation, Germany has become the first country in the world to allow autonomous vehicles onto public roads \u003Cem>without requiring a human backup safety driver behind the wheel\u003C/em>. The new law therefore takes a considerable step beyond the Level 3 vehicles recently authorized in Japan. Unlike higher levels of autonomy, Level 3 &ldquo;conditional automation&rdquo; requires the driver to remain behind the steering wheel and be ready to take over control of the vehicle when prompted, which \u003Ca href=\"https://amnon-shashua.medium.com/on-black-swans-failures-by-design-and-safety-of-automated-driving-systems-1401076e9027\" target=\"_blank\" rel=\"noopener noreferrer\">raises serious questions about proper implementation\u003C/a>.\u003C/p>\n\u003Cp>Elsewhere and until now, permits have been granted on a case-by-case basis in countries around the world only for \u003Cem>testing\u003C/em> autonomous vehicles.
Mobileye has been \u003Ca href=\"https://www.mobileye.com/blog/munich-av-video/\" target=\"_blank\" rel=\"noopener noreferrer\">testing in Germany\u003C/a> since \u003Ca href=\"https://www.mobileye.com/blog/mobileye-hits-the-autobahn-with-german-permit/\" target=\"_blank\" rel=\"noopener noreferrer\">receiving such a permit\u003C/a> almost a year ago, in parallel to our testing on public roadways in \u003Ca href=\"https://www.mobileye.com/news/mobileye-releases-unedited-footage-of-a-40-minute-av-ride-through-jerusalem/\" target=\"_blank\" rel=\"noopener\">Jerusalem\u003C/a>&nbsp;and \u003Ca href=\"https://youtu.be/vL_QNy25n74\" target=\"_blank\" rel=\"noopener noreferrer\">the United States\u003C/a> &ndash; with \u003Ca href=\"https://www.mobileye.com/news/ces-2021-mobileye-avs-on-move/\" target=\"_blank\" rel=\"noopener noreferrer\">additional locations slated soon to follow\u003C/a> (including France, China, and Japan).\u003C/p>\n\u003Cp>\u003Cstrong>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/a16ad07085ba97c5357ce84f58ce1c43_1625642364002.jpg\" alt=\"Mobileye testing on highways\" width=\"1000\" height=\"471\" /> \u003C/strong>\u003C/p>\n\u003Cp>\u003Cstrong>Leading the Way to the Autonomous Future\u003C/strong>\u003C/p>\n\u003Cp>The position of leadership that Germany has taken in this regard should come as little surprise. Widely regarded as the birthplace of the automobile, Germany has an extensive network of famously derestricted highways, and remains a global hub of automotive production and innovation. 
The country is home to several of the world&rsquo;s leading automakers and suppliers, many of which have partnered with Mobileye for years and still \u003Ca href=\"https://www.mobileye.com/news/mobileye-tech-makes-the-grade-under-euro-ncaps-new-assisted-driving-standard/\" target=\"_blank\" rel=\"noopener\">place their trust in our industry-leading and constantly evolving technology\u003C/a>.\u003C/p>\n\u003Cp>\u003Cstrong>Where to Next?\u003C/strong>\u003C/p>\n\u003Cp>With this new law in place, Germany has opened its doors &ndash; and its roads &ndash; to a bright autonomous future, allowing such potentially game-changing applications as self-driving mobility and autonomous delivery services to begin operating on its roadways.\u003C/p>\n\u003Cp>As leaders in the realm of autonomous mobility and \u003Ca href=\"https://www.mobileye.com/blog/responsibility-sensitive-safety-gains-traction-worldwide/\" target=\"_blank\" rel=\"noopener noreferrer\">champions of regulatory cooperation\u003C/a>, Mobileye applauds the German government&rsquo;s forward-thinking initiative. 
And we hope that Germany will be just the first of many countries to recognize and facilitate the autonomous revolution unfolding before us &ndash; a global development in which, as our Vice President of Autonomous Vehicle Standards and Intel Fellow Jack Weast laid out in a recent proposal, \u003Ca href=\"https://blogs.intel.com/policy/2021/05/18/intel-proposes-process-to-make-u-s-a-global-leader-in-automated-driving-systems/\" target=\"_blank\" rel=\"noopener noreferrer\">the United States in particular can play a leading role\u003C/a>.\u003C/p>","2021-07-12T07:00:00.000Z",{"id":1748,"type":5,"url":1749,"title":1750,"description":1751,"primary_tag":140,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1751,"image":1752,"img_alt":1753,"content":1754,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1755,"tags":1756},100,"av-scale-erez-dagan-jack-weast-reuters","Setting the Stage for Autonomous Vehicle Deployment at Scale","Mobileye execs Erez Dagan and Jack Weast go in-depth at Reuters Car of the Future 2021 on the technology and regulations required to enable autonomous vehicles at scale.","https://static.mobileye.com/website/us/corporate/images/73fcc07366163df5905c897e50310156_1625479351772.jpg","Mobileye EVP Erez Dagan and VP Jack Weast ","\u003Cp>To an outsider, ushering in the age of autonomous vehicles might seem like a rather large undertaking. But insiders know that it’s not just one task. It’s a collection of challenges, and solving them requires both technological innovation and decisive action on the part of government regulators before self-driving vehicles can be deployed at scale. 
Fortunately, huge strides have been made on both fronts, as two of our most senior executives showed in their presentations at this year’s \u003Ca href=\"https://reutersevents.com/events/automotivefuturecar/\" rel=\"noopener noreferrer\" target=\"_blank\">Car of the Future conference\u003C/a> organized by Reuters.\u003C/p>\u003Cp>On the technology front, Mobileye’s Executive Vice President for Products and Strategy (and Intel Vice President) Erez Dagan delved deep into the solutions that make up the \u003Ca href=\"https://youtu.be/B7YNj66GxRA?t=300\" rel=\"noopener noreferrer\" target=\"_blank\">Mobileye Trinity\u003C/a>. Our three-pronged approach to enabling the widespread deployment of autonomous vehicles incorporates \u003Ca href=\"https://www.mobileye.com/technology/true-redundancy/\" rel=\"noopener noreferrer\" target=\"_blank\">True Redundancy™\u003C/a>, our unique approach to AV sensing; \u003Ca href=\"https://www.mobileye.com/technology/rem/\" rel=\"noopener noreferrer\" target=\"_blank\">Road Experience Management™ (REM™)\u003C/a>, our AV mapping system using crowdsourced data; and \u003Ca href=\"https://www.mobileye.com/technology/responsibility-sensitive-safety/\" rel=\"noopener noreferrer\" target=\"_blank\">Responsibility-Sensitive Safety (RSS)\u003C/a>, our open safety model for ensuring that the decisions taken by the self-driving system are as safe and sound as they realistically can be.\u003C/p>\u003Cp>Alongside Dagan, Mobileye Vice President for Autonomous Vehicle Standards (and Intel Fellow) Jack Weast spoke about the regulatory approvals that will be required in order for AVs to begin large-scale commercial deployment in jurisdictions around the world. It’s a topic no less important than the enabling technologies, with vital gaps still left to bridge, and opposition slowing down the process. 
With the technological groundwork set, now is the time for governments to take action in order to regulate the autonomous mobility revolution that lies just around the corner.\u003C/p>\u003Cp>If you missed the event, you can still catch the replays right here, and learn more about how Mobileye is leading the industry on both fronts.\u003C/p>\u003Cp>\u003Cbr>\u003C/p>\u003Ciframe class=\"ql-video\" frameborder=\"0\" allowfullscreen=\"true\" src=\"https://www.youtube.com/embed/4ozTwatcK24\" height=\"315\" width=\"560\">\u003C/iframe>\u003Cp>\u003Cbr>\u003C/p>\u003Ciframe class=\"ql-video\" frameborder=\"0\" allowfullscreen=\"true\" src=\"https://www.youtube.com/embed/1viI_7h_cRM\" height=\"315\" width=\"560\">\u003C/iframe>\u003Cp>\u003Cbr>\u003C/p>","2021-07-04T21:00:00.000Z","AV Safety, Autonomous Driving, Video, Events",{"id":1758,"type":5,"url":1759,"title":1760,"description":1761,"primary_tag":190,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1761,"image":1762,"img_alt":1763,"content":1764,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1765,"tags":997},91,"radar-lidar-next-generation-active-sensors","The Next Generation of Active Sensors for Autonomous Driving","Mobileye revolutionized camera-based computer vision. Now we’re doing the same with the development of our own cutting-edge radar and LiDAR sensors.","https://static.mobileye.com/website/us/corporate/images/051ed8df3a68805d57b1eaa807033c77_1617719733778.jpg","Radar/LiDAR coverage for self-driving system by Mobileye","\u003Cp>Mobileye was founded on computer-vision technology, and we&rsquo;re proud of what we&rsquo;ve achieved through the combination of simple cameras and advanced algorithms. 
But if autonomous vehicles are to take over complete control from human drivers, they&rsquo;ll require \u003Ca href=\"https://www.mobileye.com/blog/av-safety-demands-true-redundancy/\" target=\"_blank\" rel=\"noopener noreferrer\">multiple types of sensors working in parallel\u003C/a>. So, in addition to our \u003Ca href=\"https://www.mobileye.com/blog/munich-av-video/\" target=\"_blank\" rel=\"noopener noreferrer\">camera-only developmental AV\u003C/a>, we&rsquo;ve also been testing another type using only radar and LiDAR. And we&rsquo;re developing each system &ndash; camera and radar/LiDAR &ndash; to be able to power the AV entirely independently of the other. But we soon found that the existing detection-and-ranging sensors on the market still leave much to be desired. That&rsquo;s why we&rsquo;re developing our own.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/f0187e762ab624edd222550866eee5be_1620042128973.png\" alt=\"Cameras and active radar &amp; LiDAR sensors form a more complete picture under Mobileye&rsquo;s approach of True Redundancy\" />\u003C/p>\n\u003Cp>\u003Cstrong>What are Radar and LiDAR?\u003C/strong>\u003C/p>\n\u003Cp>Unlike cameras, which passively perceive their environment, radar and LiDAR both work by actively emitting signals and measuring their returns. This allows them to detect other objects and road users and determine their range (or relative distance). Radar does this with radio waves, while LiDAR employs infrared light. Both types have been widely embraced across the \u003Ca href=\"https://www.mobileye.com/news/mobileye-ranked-5-in-guidehouse-insights-automated-driving-leaderboard/\" target=\"_blank\" rel=\"noopener\">autonomous-vehicle industry\u003C/a> for their unique respective capabilities, each filling in the blind spots left by the other. 
Even at this basic level, however, Mobileye&rsquo;s approach differs fundamentally from that more commonly practiced across the industry.\u003C/p>\n\u003Cp>\u003Cstrong>True Redundancy&trade;\u003C/strong>\u003C/p>\n\u003Cp>Most companies pursuing autonomous-vehicle technology feed information from all three types of sensors &ndash; cameras, radars, and LiDARs &ndash; into one sensing system. That system then produces a single model of the environment, in a method known as &ldquo;sensor fusion.&rdquo; Mobileye&rsquo;s differentiated approach of \u003Ca href=\"https://www.mobileye.com/technology/true-redundancy/\" target=\"_blank\" rel=\"noopener noreferrer\">True Redundancy\u003C/a> creates two parallel AV sub-systems, with two independent models of the driving environment &ndash; one from cameras, one from radar and LiDAR &ndash; each operating independently of the other. This method is simultaneously more robust and more streamlined, while offering a key failsafe.\u003C/p>\n\u003Cp>Mobileye&rsquo;s innovation, however, runs much deeper &ndash; extending from how we process the data from the radar and LiDAR to the essential types of these sensors we&rsquo;re employing. 
Utilizing the expertise of our parent company, Intel, we&rsquo;re developing new, more advanced versions of both types of sensors, in a more cost-effective setup, specifically designed for self-driving cars.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/4d97cc46ecea1888ea4c51afadb468d2_1620042223261.png\" alt=\"A fleet of developmental autonomous vehicles employing a variety of sensors under development by Mobileye\" />\u003C/p>\n\u003Cp>\u003Cstrong>Software-Defined Digital Imaging Radar\u003C/strong>\u003C/p>\n\u003Cp>Moving radar sensors into the digital age, our software-defined digital imaging radar \u003Ca href=\"https://www.globaldata.com/4d-imaging-radar-technology-provides-game-changing-safety-autonomous-vehicles-says-globaldata/\" target=\"_blank\" rel=\"noopener noreferrer\">promises to deliver true imaging in 4D\u003C/a>, with much higher resolution, dynamic range, and level of accuracy than conventional analog radar. This paradigm shift in architecture enables a veritable leap in performance &ndash; increasing the probability of detection, while reducing the clutter of echoes. It also allows us to detect weaker targets from farther away&hellip; even in the presence of stronger targets that are closer or in the same range and doppler, while effectively handling interferences.\u003C/p>\n\u003Cp>Software definition enables greater flexibility. Complex, proprietary algorithms, coded onto a dedicated System-on-a-Chip (SoC), allow for far lower processing power than the twelve-fold increase in resolution would otherwise dictate. 
And because ours is a true \u003Cem>imaging\u003C/em> radar, we&rsquo;re able to process what the radar detects using methods adapted from our expertise in computer-vision technology.\u003C/p>\n\u003Cp>\u003Cstrong>Silicon Photonics-based Frequency-Modulated Continuous Wave LiDAR\u003C/strong>\u003C/p>\n\u003Cp>The next frontier in LiDAR technology, \u003Ca href=\"https://www.laserfocusworld.com/home/article/16556322/lasers-for-lidar-fmcw-lidar-an-alternative-for-selfdriving-cars\" target=\"_blank\" rel=\"noopener noreferrer\">Frequency-Modulated Continuous Wave (FMCW) LiDAR\u003C/a> presents a striking array of advantages over conventional time-of-flight (ToF) LiDAR systems. To the LiDAR&rsquo;s typical capability of sampling range, elevation, and azimuth (or relative trajectory), FMCW adds velocity, elevating the most advanced type of self-driving car sensor from 3D to 4D. This allows for quicker identification of small, fast incoming targets (like motorcycles) at farther distances, more reliable measurement of detected objects&rsquo; headings, and richer velocity information to feed the AI algorithms.\u003C/p>\n\u003Cp>FMCW LiDAR is also less sensitive to interference from sunlight, reflections, and other LiDAR units. 
And because it sends a continuous wave of light instead of short pulses, FMCW LiDAR can operate at lower and safer power levels while achieving higher detection and effective dynamic ranges, minimizing unwelcome artifact interference from retroreflectors (like traffic signs and license plates).\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/2c87e5ae1f240cefaf7ff6e0eb12c0bf_1617719906265.png\" alt=\"The differences between typical LiDAR and the Frequency-Modulated Continuous Wave (FMCW) LiDAR being developed by Mobileye\" />\u003C/p>\n\u003Cp>\u003Cstrong>The Dream Team to Make It Happen\u003C/strong>\u003C/p>\n\u003Cp>Mobileye and Intel&rsquo;s combined competences put us in a unique position to advance the development of these complex-to-engineer, cutting-edge active sensors and bring them to market. Intel has a wealth of experience in developing \u003Ca href=\"https://www.sdxcentral.com/cloud/definitions/software-defined-everything-sdx-part-1-definition/\" target=\"_blank\" rel=\"noopener noreferrer\">software-defined\u003C/a> infrastructure that forms the basis of our new radar. It also has the rare \u003Ca href=\"https://physicsworld.com/a/the-promise-of-silicon-photonics/\" target=\"_blank\" rel=\"noopener noreferrer\">silicon photonics\u003C/a> fab to integrate both active and passive components onto a single chip to underpin our LiDAR. And Mobileye has the proven expertise in both automotive applications and digital-imagery analysis to implement these highly advanced sensors for autonomous driving.\u003C/p>\n\u003Cp>\u003Cstrong>Savings Through Advancement\u003C/strong>\u003C/p>\n\u003Cp>In order for self-driving vehicles to transition from science experiment to broad adoption, as we&rsquo;ve long advocated, the costs must be reduced to a practicable level. Unfortunately, LiDAR is inherently a relatively expensive type of autonomous vehicle sensor, costing roughly ten times as much as radar. 
And we don&rsquo;t expect this cost to come down significantly anytime soon.\u003C/p>\n\u003Cp>Rather than making the most of the cheapest available sensors to reduce overall cost, we&rsquo;re increasing the radar&rsquo;s capabilities, which allows us to rely less on cost-intensive LiDAR. Accordingly, our road map is aiming toward 360-degree coverage by both cameras and digital imaging radar around the vehicle, requiring only a single front-facing FMCW LiDAR with very high resolution, allowing it to map the drivable space and to detect small targets farther ahead.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/ae57e4ad34b74562c29ac1226bc456c0_1620042255536.png\" alt=\"Overlapping coverage of the different types of optical and active sensors employed in Mobileye&rsquo;s self-driving vehicle technology\" />\u003C/p>\n\u003Cp>\u003Cstrong>The Future of Autonomous Driving\u003C/strong>\u003C/p>\n\u003Cp>Using the best radar and LiDAR sensors currently available, we remain focused on our timeline of bringing our autonomous-vehicle platform to market by the end of next year &ndash; first in on-demand \u003Ca href=\"https://www.mobileye.com/blog/self-driving-maas-suite/\" target=\"_blank\" rel=\"noopener noreferrer\">self-driving Mobility-as-a-Service\u003C/a>, which in turn will pave the way towards mass-produced consumer AVs. But with development of our next-generation radar and LiDAR sensors steaming ahead, we&rsquo;re looking at the next step in the evolution of our AV sensing suite. 
Look for our software-defined digital imaging radar and Silicon Photonics-based FMCW LiDAR to further advance our AV platform in 2025.\u003C/p>","2021-06-14T07:00:00.000Z",{"id":1767,"type":5,"url":1768,"title":1769,"description":1770,"primary_tag":190,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1770,"image":1771,"img_alt":1772,"content":1773,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1774,"tags":1775},98,"national-autonomous-vehicle-day-how-avs-will-change-your-life","How Will Autonomous Vehicles Change Your Life?","To mark National Autonomous Vehicle Day, we look at five concrete ways in which self-driving vehicles stand to change the lives of people around the world.","https://static.mobileye.com/website/us/corporate/images/925f04c1d3bc5cb3f8b35008b2152585_1622535529787.jpg","National Autonomous Vehicle Day:  How AVs Will Change Your Life","\u003Cp>May 31 is designated in the United States as National Autonomous Vehicle Day. 
This year we&rsquo;re marking the occasion by looking at five concrete ways in which the daily lives of average people across the country and around the world stand to be affected for the better by the advent and widespread adoption of fully \u003Ca href=\"https://www.mobileye.com/blog/tag/autonomous-driving/\" target=\"_blank\" rel=\"noopener\">autonomous vehicles\u003C/a> (AVs) &ndash; a long-held dream that&rsquo;s rapidly approaching reality.\u003C/p>\n\u003Cp>\u003Cstrong>1) Self-driving Mobility-as-a-Service\u003C/strong>\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/4e6c272f885b14cf949f56b79889a953_1622535655910.png\" alt=\"Mobileye, Moovit, and their partners around the world are bringing self-driving mobility services to a city near you\" />\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/blog/how-robotaxis-will-lead-the-way-toward-the-fully-autonomous-future/\" target=\"_blank\" rel=\"noopener noreferrer\">Your next ride-share trip\u003C/a> could very well be in an AV. \u003Ca href=\"https://www.mobileye.com/solutions/drive/\" target=\"_blank\" rel=\"noopener\">Self-driving mobility services\u003C/a>, due to be introduced in cities around the world in the coming years, will take you where you need to go, autonomously. 
Mobileye and Moovit are already working on bringing self-driving Mobility-as-a-Service to locations including \u003Ca href=\"https://www.mobileye.com/blog/mobileye-transdev-lohr-maas-i-cristal-shuttles-robotaxis/\" target=\"_blank\" rel=\"noopener noreferrer\">France\u003C/a>, the \u003Ca href=\"https://www.mobileye.com/news/mobileye-is-bringing-driverless-maas-to-the-uae/\" target=\"_blank\" rel=\"noopener\">United Arab Emirates\u003C/a>, \u003Ca href=\"https://www.mobileye.com/blog/mobileye-to-deploy-robotaxis-in-japan-with-willer/\" target=\"_blank\" rel=\"noopener noreferrer\">Japan\u003C/a>, and South Korea&hellip; with more expected to follow.\u003C/p>\n\u003Cp>\u003Cstrong>2) Autonomous Delivery\u003C/strong>\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/82f0c1183b819dada4f8e159a0d4c74a_1622535674847.png\" alt=\"The Mobileye Drive&trade; self-driving system can enable autonomous delivery vehicles like the Udelv Transporter\" />\u003C/p>\n\u003Cp>All that stuff you order online? Don&rsquo;t be surprised if it arrives autonomously at your doorstep in the near future in a \u003Ca href=\"https://medium.com/@udelv/autonomous-delivery-vehicles-why-they-matter-and-how-they-work-79b740d5e1e9#:~:text=These%20vehicles%20enable%20a%20safer,better%20maintain%20engines%20and%20brakes\" target=\"_blank\" rel=\"noopener noreferrer\">self-driving delivery vehicle\u003C/a>. 
Like the \u003Ca href=\"https://www.mobileye.com/blog/udelv-transporter-autonomous-delivery-vehicles-powered-by-mobileye/\" target=\"_blank\" rel=\"noopener noreferrer\">Udelv Transporter\u003C/a>, which is slated to employ \u003Ca href=\"https://www.mobileye.com/blog/mobileye-drive-self-driving-system/\" target=\"_blank\" rel=\"noopener noreferrer\">Mobileye Drive&trade;\u003C/a> in order to carry out middle- and last-mile delivery of goods across the United States, starting as soon as 2023.\u003C/p>\n\u003Cp>\u003Cstrong>3) Democratizing Mobility\u003C/strong>\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/92777efdfbbe189ad8b1598fa80733bf_1622535687865.png\" alt=\"Self-driving vehicles will democratize mobility for all &ndash; including the young, the old, and people with disabilities\" />\u003C/p>\n\u003Cp>The use of a conventional private motor vehicle requires the financial means to acquire one, and the license and ability to operate it. The arrival of autonomous vehicles will change that.\u003C/p>\n\u003Cp>By removing the driver from the equation, private (or semi-private) vehicular transport will be available for use by children, the elderly, \u003Ca href=\"https://www.forbes.com/sites/gusalexiou/2021/04/11/how-passengers-with-disabilities-can-drive-the-autonomous-vehicle-revolution/\" target=\"_blank\" rel=\"noopener noreferrer\">people with disabilities\u003C/a>&hellip; pretty much anyone &ndash; whether on demand or permanently on call. 
That in and of itself will represent the democratization of mobility and a historic development that stands to rival the domestication of the horse, the building of the railroad, the advent of the automobile, and the innovation of the assembly line.\u003C/p>\n\u003Cp>\u003Cstrong>4) Freeing Up Time\u003C/strong>\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/13c13c866d4b170bb76c5364ffaac2ac_1622535701010.png\" alt=\"Autonomous vehicles will leave commuters with more time for other things, like entertainment and creativity\" />\u003C/p>\n\u003Cp>Drivers spend an inordinate amount of their time stuck in traffic behind the steering wheel &ndash; and the problem is only getting worse as traffic increases and \u003Ca href=\"https://www.mobileye.com/en/data/\" target=\"_blank\" rel=\"noopener noreferrer\">infrastructure\u003C/a> struggles to keep up. But in self-driving vehicles, \u003Ca href=\"https://www.bmw.com/en/innovation/value-of-time-via-autonomous-driving.html\" target=\"_blank\" rel=\"noopener noreferrer\">occupants will be free\u003C/a> to relax, enjoy some entertainment, and increase their creativity and productivity while the autonomous vehicle does the driving for them. Just imagine the possibilities!\u003C/p>\n\u003Cp>\u003Cstrong>5) They Already Are!\u003C/strong>\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/7ad54dc654dc5cc771907b70dba684db_1622535715572.png\" alt=\"Self-driving technologies are already trickling down to human-driven vehicles with the latest advanced driver-assistance systems\" />\u003C/p>\n\u003Cp>Perhaps the most immediate answer to the question of how AVs will change your life is that they already are. 
The \u003Ca href=\"https://www.mobileye.com/blog/everything-you-need-to-know-about-adas/\" target=\"_blank\" rel=\"noopener noreferrer\">advanced driver-assistance systems\u003C/a> in production cars today &ndash; including tens of millions of \u003Ca href=\"https://www.mobileye.com/blog/toyota-zf-adas/\" target=\"_blank\" rel=\"noopener noreferrer\">vehicles fitted with Mobileye ADAS\u003C/a> &ndash; represent the building blocks of AV technology.\u003C/p>\n\u003Cp>As integrated ADAS features like Automatic Emergency Braking (AEB), Lane Keeping Assist (LKA), and Adaptive Cruise Control (ACC) evolve and improve, the line between ADAS and AV will continue to blur into a spectrum &ndash; distinguished not by their capabilities but by the level of trust we&rsquo;re collectively prepared to place in the hands of technology as it proves its reliability, robustness, and \u003Ca href=\"https://www.forbes.com/sites/gusalexiou/2021/04/11/how-passengers-with-disabilities-can-drive-the-autonomous-vehicle-revolution/\" target=\"_blank\" rel=\"noopener noreferrer\">safety\u003C/a>.\u003C/p>","2021-05-31T07:00:00.000Z","Autonomous Driving, Driverless MaaS, AV Safety, Events",{"id":1777,"type":24,"url":1778,"title":1779,"description":1780,"primary_tag":28,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1780,"image":1781,"img_alt":1782,"content":1783,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":16,"publish_date":1784,"tags":1298},142,"zf-mobileye-safety-technology-toyota","ZF and Mobileye Safety Technology Chosen by Toyota","New deal extends both companies’ reach in advanced driver-assistance solutions globally.","https://static.mobileye.com/website/us/corporate/images/1bcacc9c23ef66ffdbed23df7c18514c_1666084253688.png","Mobileye’s EyeQ4 applies enhanced computational capabilities with computer vision algorithms while rapidly processing information from the vehicle’s front-facing 
camera.","\u003Cp>\u003Cem>This news content was originally published on the Intel Corporation Newsroom.\u003C/em>\u003C/p>\u003Cp>\u003Cstrong>What’s New:\u003C/strong>&nbsp;ZF and Mobileye, an Intel company, have been chosen by Toyota Motor Corp to develop advanced driver-assistance systems (ADAS) for use in multiple vehicle platforms starting in the next few years. As part of the agreement, ZF, one of the world’s largest producers of automotive cameras driven by Mobileye technology, will also supply its Gen 21 mid-range radar and be responsible for the integration of camera and radar in Toyota vehicles.\u003C/p>\u003Cp>“Mobileye is delighted to be working with ZF to develop leading driver-assistance and safety technology for Toyota, the world’s largest automaker.”\u003C/p>\u003Cp>–Professor Amnon Shashua, senior vice president of Intel and president and CEO of Mobileye\u003C/p>\u003Cp>\u003Cstrong>Why It Matters:\u003C/strong>&nbsp;Mobileye and ZF continue to be a winning combination for the world’s largest automakers because of their innovative approach to improving road safety with computer vision and machine learning-based sensing, localization, mapping and best-in-class lateral vehicle control technology for systems such as lane keeping/lane centering. This new relationship with Toyota, the world’s largest automaker, marks the first time that ZF and Mobileye have been nominated with their ADAS systems for Toyota and significantly extends the reach of Mobileye and ZF safety technology to enhance safety and driver convenience functions on world roadways.\u003C/p>\u003Cp>“ZF looks forward to working closely with Toyota and Mobileye to develop advanced safety systems designed to meet advanced global safety regulations,” said Christophe Marnat, executive vice president, Electronics and ADAS division at ZF. 
“Our innovative technologies will deliver outstanding performance and robustness for fusion-based systems and ADAS functions.”\u003C/p>\u003Cp>\u003Cstrong>How It Works:&nbsp;\u003C/strong>ZF and Mobileye will collaborate closely to produce advanced camera technology integrated with ZF radar technology to power key advanced driver-assistance platforms in Toyota vehicles. Mobileye’s EyeQ®4, one of the most advanced application-specific vision computing systems-on-chip (SoCs) currently available, will be combined with ZF’s Gen 21 mid-range radar technology to precisely interpret the environment around Toyota vehicles. Together, these technologies will help prevent and mitigate collisions while yielding best-in-class lateral and longitudinal vehicle control.\u003C/p>\u003Cp>Mobileye’s EyeQ4 applies enhanced computational capabilities with computer vision algorithms while rapidly processing information from the vehicle’s front-facing camera. With a variety of feature sets including vehicle detection from any angle and next-generation lane detection, EyeQ4 enables automakers to take a major step forward in autonomy, achieving the ability to support and enhance complicated driving tasks.\u003C/p>\u003Cp>ZF’s Gen21 mid-range radar is a high-performance 77 GHz front radar designed to meet 2022+ Euro NCAP 5-Star Safety Ratings and enable L2/L2+ Automated Driving functions. 
It is scalable to vehicle manufacturer needs and offers both a wide field of view at low speeds to assist in pedestrian detection and support systems like automatic emergency braking (AEB), and a longer detection range at high speeds for systems like adaptive cruise control.\u003C/p>","2021-05-18T12:00:00.000Z",{"id":1786,"type":5,"url":1787,"title":1788,"description":1789,"primary_tag":9,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1789,"image":1790,"img_alt":1791,"content":1792,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1793,"tags":1036},97,"toyota-zf-adas","Leading Automakers Rely on Mobileye – Now Including Toyota","We’re pleased to welcome the world’s largest automaker to the substantial and growing list of manufacturers placing their trust in our ADAS technology.","https://static.mobileye.com/website/us/corporate/images/830c2fdc760029c2b040eea4bcfe5040_1621340515993.jpg","Parking lot filled with new cars and trucks","\u003Cp>Over the past two decades since our founding, Mobileye has worked with the vast majority of the world&rsquo;s leading automakers to help make their vehicles and the roads on which they drive safer. Now we&rsquo;re pleased to count the largest automaker in the world among our customers as Toyota and Mobileye have signed a pivotal new deal to deploy our ADAS tech in an array of new vehicles in the coming years.\u003C/p>\n\u003Cp>With today&rsquo;s announcement, Toyota joins a substantial and growing list of automotive manufacturers that have placed their trust in Mobileye technology to power their \u003Ca href=\"https://www.mobileye.com/blog/buying-a-new-car-here-are-four-adas-features-to-look-for/\" target=\"_blank\" rel=\"noopener noreferrer\">advanced driver-assistance systems\u003C/a>. 
Tens of millions of vehicles worldwide have been manufactured with Mobileye tech inside by dozens of automakers, with hundreds of models currently on the market offering our technology. These staggering numbers underline the leading role that Mobileye has taken in the ADAS sphere since pioneering the category at the turn of the millennium.\u003C/p>\n\u003Cp>&ldquo;Our new deal with Toyota is a significant achievement for Mobileye in our drive to help make vehicles and driving safer,&rdquo; said \u003Ca href=\"https://www.mobileye.com/opinion/digitizing-the-social-contract-for-safer-roads/\" target=\"_blank\" rel=\"noopener\">Erez Dagan\u003C/a>, Executive Vice President for Products and Strategy at Mobileye and Vice President at our parent Intel Corporation. &ldquo;More than the specifics of the business deal itself, however, this latest development symbolizes how far Mobileye has come in the past two decades, and how indispensable advanced driver-assistance systems like those we pioneered and continue to develop have become to the automobile. We&rsquo;re truly honored by the trust which so many of the world&rsquo;s leading automakers have placed in the proven capabilities of our technology.&rdquo;\u003C/p>\n\u003Cp>This latest agreement is one we&rsquo;re particularly excited about. Teaming up with the largest producer of automobiles in the world presents the opportunity to place our safety technology in more vehicles and reach more drivers around the world. 
Toyota&rsquo;s clear track record of emphasizing safety reflects a commitment we share and look forward to furthering as we embark on this exciting new partnership.\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/news/zf-mobileye-safety-technology-toyota/\" target=\"_blank\" rel=\"noopener noreferrer\">Read more about our new deal with Toyota in the official news release\u003C/a>.\u003C/p>","2021-05-18T07:00:00.000Z",{"id":1795,"type":24,"url":1796,"title":1797,"description":1798,"primary_tag":190,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1798,"image":1799,"img_alt":1800,"content":1801,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1802,"tags":1803},95,"gaby-hayon-business-insider-av-power-player","Business Insider Names Mobileye R&D Chief an AV Power Player","The publication cites Gaby Hayon’s pioneering efforts in True Redundancy™ as a significant factor in placing our Executive Vice President on this year’s list.","https://static.mobileye.com/website/us/corporate/images/7c9c00e93e20a9c335405232c41caee8_1619963326277.jpg","Gaby Hayon, Executive Vice President for Research & Development, Mobileye","\u003Cp>For over two decades, Mobileye has played a leading role in both ADAS and AVs, and official recognition of this leadership has been growing. The latest member of the Mobileye team to be recognized for his efforts is Gaby Hayon, Executive Vice President of Research and Development at Mobileye. 
This week Hayon was named \u003Ca href=\"https://www.businessinsider.com/power-players-autonomous-vehicles-tesla-morgan-stanley-zoox-2021-4\" target=\"_blank\" rel=\"noopener noreferrer\">one of 11 AV power players by \u003Cem>Business Insider\u003C/em>\u003C/a>.\u003C/p>\n\u003Cp>With this, Hayon joins other talents at Mobileye who have recently been honored for their pioneering work in AV and ADAS, including our CEO \u003Ca href=\"https://www.mobileye.com/blog/tag/amnon-shashua/\" target=\"_blank\" rel=\"noopener\">Prof. Amnon Shashua\u003C/a>, who won the \u003Ca href=\"https://www.mobileye.com/news/prof-amnon-shashua-wins-the-dan-david-prize/\" target=\"_blank\" rel=\"noopener\">Dan David Prize\u003C/a> for his work in AI; our CTO, Prof. Shai Shalev-Schwartz, who won the 2020 Michael Bruno Prize for his work on RSS (among other achievements); and our entire senior leadership who were nominated by the European Patent Office for its 2019 Inventor of the Year award.\u003C/p>\n\u003Cp>The goal of the AV Power Players list is to look beyond well-known CEOs to the talents and teams supporting them. 
\u003Cem>Insider\u003C/em> put it together by surveying analysts, venture capitalists and industry specialists, asking them to name the most essential people in the AV industry, with only one caveat: no CEOs.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/f93ac36ec673036da776fb065c7bdf4d_1620022960154.png\" alt=\"Gaby Hayon, Executive Vice President, Research and Development\" width=\"1000\" height=\"471\" />\u003C/p>\n\u003Cp>This survey narrowed the list to 40 candidates, 11 of whom were then named by \u003Cem>Insider\u003C/em> as &ldquo;power players,&rdquo; with Hayon among them.\u003C/p>\n\u003Cp>According to \u003Cem>Business Insider\u003C/em>, this honor was based on Hayon&rsquo;s role in AV development, pointing to his work on \u003Ca href=\"https://www.mobileye.com/technology/true-redundancy/\" target=\"_blank\" rel=\"noopener noreferrer\">True Redundancy&trade;\u003C/a>, the Mobileye approach to developing two independent environmental models for self-driving production vehicles &ndash; one based on cameras, and one based on radar/LiDAR. 
Also cited was his work on developing the software that helps the Mobileye system identify how nearby objects are moving, allowing ADAS to warn drivers about potential collisions.\u003C/p>\n\u003Cp>Hayon's place on this list is not only an honor for him individually, but also a great example of how Mobileye has taken up the mantle of leadership in the AV industry &ndash; a role that is being increasingly acknowledged across the industry.\u003C/p>","2021-05-03T07:00:00.000Z","Autonomous Driving, Awards, News",{"id":1805,"type":5,"url":1806,"title":1807,"description":1808,"primary_tag":190,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1808,"image":1809,"img_alt":1810,"content":1811,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1812,"tags":997},94,"earth-day-autonomous-electric-vehicles-environment","How Autonomous Vehicles Can Help Save the Environment","This Earth Day, we look at how self-driving vehicles will not only be safer and more convenient than conventional vehicles, but better for the environment as well.\n","https://static.mobileye.com/website/us/corporate/images/93190dc617496f6b55486cbed9464846_1618924864571.jpg","Illustration of Mobileye Drive autonomous shuttle circling the globe","\u003Cp>When it comes to \u003Ca href=\"https://www.mobileye.com/blog/mobileye-drive-self-driving-system/\" target=\"_blank\" rel=\"noopener noreferrer\">autonomous vehicles\u003C/a>, the principal benefit on which we tend to focus most is \u003Ca href=\"https://www.mobileye.com/blog/tag/av-safety/\" target=\"_blank\" rel=\"noopener\">safety\u003C/a>. Comfort and convenience, too. 
But today being \u003Ca href=\"https://www.earthday.org/\" target=\"_blank\" rel=\"noopener noreferrer\">Earth Day\u003C/a>, we&rsquo;re looking at another potential benefit of the autonomous revolution that&rsquo;s more often overlooked, but which stands to make a major impact on the world in which self-driving vehicles will operate. And that&rsquo;s the benefit to the environment.\u003C/p>\n\u003Cp>While passenger vehicles may not be the biggest contributor to climate change, they certainly take their toll on the environment. The good news is that, while it may yet take several years to play out, there&rsquo;s ample reason to anticipate that the rise of autonomous vehicles will help reduce the carbon footprint of ground transportation around the world.\u003C/p>\n\u003Cp>\u003Cstrong> \u003C/strong>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/2ee17beaa3a6cf82070372c956c00d83_1619083282975.jpg\" alt=\"Self-driving electric vehicles like those developed by Mobileye charging up their batteries for their next autonomous journey\" />\u003C/p>\n\u003Cp>\u003Cstrong>Will All Autonomous Vehicles Be Electric?\u003C/strong>\u003C/p>\n\u003Cp>The most significant factor is the widespread expectation that autonomous vehicles will almost universally be \u003Ca href=\"https://www.caranddriver.com/research/a32781943/electric-cars-vs-gas-cars/\" target=\"_blank\" rel=\"noopener noreferrer\">electric vehicles\u003C/a> as well. And while the power to charge and propel those electric vehicles will have to come from somewhere, the overwhelming consensus is that electric powertrains are already a more efficient and environmentally friendly mode of propulsion than internal-combustion engines. 
And they'll only grow more efficient as the technology improves and clean energy becomes more prevalent over time.\u003C/p>\n\u003Cp>We strongly believe in \u003Ca href=\"https://www.wsj.com/articles/how-electric-self-driving-cars-and-ride-hailing-will-transform-the-car-industry-11619189966\" target=\"_blank\" rel=\"noopener noreferrer\">the combined promise of autonomous electric vehicles\u003C/a>. But there are yet more reasons to anticipate that self-driving vehicles will lessen the environmental impact of the automobile &ndash; regardless of what form of energy propels them.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/c39ed0d2944cbadd685674f710ee04a5_1618909491641.jpg\" alt=\"A vehicle equipped with Mobileye self-driving technology autonomously changing lanes in heavy traffic on the highway\" />\u003C/p>\n\u003Cp>\u003Cstrong>Autonomous Efficiency\u003C/strong>\u003C/p>\n\u003Cp>For starters, autonomous vehicles will be programmed to take the optimal route &ndash; unlike conventional vehicles, which go wherever their human drivers steer them. AVs will also, by design, avoid the erratic, often-unnecessary acceleration and braking to which human drivers often subject their vehicles. And the higher the proportion of AVs on the road, the smoother the overall flow of traffic ought to be, resulting in less energy-consuming stop-and-go traffic and less time spent idling in traffic jams.\u003C/p>\n\u003Cp>Autonomous vehicles (heavy-duty \u003Ca href=\"https://www.mobileye.com/us/fleets/\" target=\"_blank\" rel=\"noopener noreferrer\">trucks\u003C/a> especially) might also be able \u003Ca href=\"https://www.volpe.dot.gov/news/how-automated-car-platoon-works\" target=\"_blank\" rel=\"noopener noreferrer\">to &ldquo;platoon&rdquo; in slipstreamed packs\u003C/a>, whose combined aerodynamics will require less energy to travel along highways. 
AVs communicating (with one another and with \u003Ca href=\"https://www.mobileye.com/en/data/\" target=\"_blank\" rel=\"noopener noreferrer\">infrastructure\u003C/a>) could even eventually eliminate the need for traffic lights and stop signs, which cause vehicles to waste considerable energy decelerating to (and re-accelerating from) a complete stop.\u003C/p>\n\u003Cp>The increased safety of autonomous vehicles, by itself, also stands to reap environmental benefits. Fewer accidents would mean fewer traffic jams (and less need for towing). And the lowered risk of collisions (and decreased severity of those which do occur) could obviate the need for much of the heavy safety equipment and structural bulk built into today&rsquo;s passenger vehicles, resulting in lighter, more energy-efficient vehicles.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/bab16a08e9c02866dff5e279e32ff37b_1618924855029.png\" alt=\"Self-driving robotaxis like Mobileye&rsquo;s can be dispatched across the city to where they&rsquo;re needed most, relieving congestion and demand for parking\" />\u003C/p>\n\u003Cp>\u003Cstrong>Sharing Means Caring\u003C/strong>\u003C/p>\n\u003Cp>The biggest factor in reducing the automobile&rsquo;s environmental impact through automation, however, may lie in AVs&rsquo; increased potential for shared usage. Fewer vehicles serving more people would decrease the need for production, distribution, and maintenance of private passenger vehicles. On-demand AVs would also mean less driving time spent looking for parking. 
And (in internal-combustion vehicles) they would result in fewer energy-intensive &ldquo;cold starts&rdquo; as well, as shared vehicles would remain longer at optimal operating temperatures as they ferried passengers around.\u003C/p>\n\u003Cp>Robotaxis dispatched specifically to where they&rsquo;re needed, based on intelligent demand prediction, would mean fewer human-driven taxis and ride-hailing vehicles driving around looking for fares or idling at taxi stands, and fewer \u003Ca href=\"https://www.mobileye.com/news/moovit-2020-global-public-transport-report/\" target=\"_blank\" rel=\"noopener\">buses\u003C/a> driving along set routes and schedules (regardless of the number of passengers on board). And with rides booked in advance, \u003Ca href=\"https://www.mobileye.com/solutions/drive/\" target=\"_blank\" rel=\"noopener\">self-driving mobility services\u003C/a> would be able to dispatch vehicles right-sized for the purpose. So we could expect to see fewer seven-passenger SUVs, for example, clogging up roadways with only one or two occupants on board.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/8b5dec44d52e7f5c27ccfcde8394de5f_1618909524354.jpg\" alt=\"Automobiles are adopting growing arrays of technologies, like the self-driving systems developed by Mobileye\" />\u003C/p>\n\u003Cp>\u003Cstrong>Time Will Tell\u003C/strong>\u003C/p>\n\u003Cp>There are, of course, many questions that will be answered only as adoption of autonomous vehicles grows over time. At the outset, at least, AVs could result in \u003Cem>more\u003C/em> miles being driven overall. AVs promise, after all, to be more convenient, potentially lowering the opportunity cost of travel and encouraging longer commuting ranges. 
AVs could also broaden independent vehicle usage beyond licensed drivers (to children, the elderly, and the disabled, for example), and result in &ldquo;deadheading&rdquo; (empty vehicles driving to pick up passengers).\u003C/p>\n\u003Cp>However, we&rsquo;d argue that, on the balance of probabilities, these potentially detrimental factors will surely fall well short of outweighing the potential benefits, and are themselves likely only to lessen as \u003Ca href=\"https://www.mobileye.com/opinion/the-challenge-of-supporting-av-at-scale/\" target=\"_blank\" rel=\"noopener\">the use of autonomous vehicles becomes more widespread\u003C/a>.\u003C/p>\n\u003Cp>So, this Earth Day, we hope you&rsquo;ll join us in looking forward to the autonomous future that will not only be safer and more convenient, but stands to be better for the environment as well.\u003C/p>","2021-04-22T07:00:00.000Z",{"id":1814,"type":5,"url":1815,"title":1816,"description":1817,"primary_tag":40,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1817,"image":1818,"img_alt":1819,"content":1820,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1821,"tags":1822},92,"udelv-transporter-autonomous-delivery-vehicles-powered-by-mobileye"," Udelv Taps Mobileye to Power Autonomous Delivery Vehicles","Partnership between Mobileye and Udelv slated to put more than 35,000 autonomous delivery vehicles on the road by 2028, with commercial operations starting in 2023.\n\n","https://static.mobileye.com/website/us/corporate/images/04ac322282f96d496606333b2519b5d2_1618127013415.jpg","Udelv Transporter autonomous delivery vehicles powered by Mobileye","\u003Cp>Today Mobileye took another big step towards the future of autonomous mobility with a new partnership that will put autonomous vehicles powered by Mobileye technology on American roads in just a few short years.\u003C/p>\n\u003Cp>One of the largest AV tech deals to date, the 
collaboration will see the Mobileye Drive&trade; full-stack self-driving system deployed in Transporters &ndash; autonomous delivery vehicles developed by our new partners at \u003Ca href=\"https://www.udelv.com/\" target=\"_blank\" rel=\"noopener noreferrer\">Udelv\u003C/a>. The purpose-built, fully electric Transporters will benefit from key Mobileye innovations, including the \u003Ca href=\"https://www.mobileye.com/technology/eyeq-chip/\" target=\"_blank\" rel=\"noopener noreferrer\">EyeQ&reg;\u003C/a> SoC, \u003Ca href=\"https://www.mobileye.com/technology/true-redundancy/\" target=\"_blank\" rel=\"noopener noreferrer\">True Redundancy&trade;\u003C/a> sensing suite, \u003Ca href=\"https://www.mobileye.com/technology/rem/\" target=\"_blank\" rel=\"noopener noreferrer\">REM&trade;\u003C/a> mapping tech, and autonomous driving policy backed by the \u003Ca href=\"https://www.mobileye.com/technology/responsibility-sensitive-safety/\" target=\"_blank\" rel=\"noopener noreferrer\">Responsibility-Sensitive Safety\u003C/a> model &ndash; all part of the \u003Ca href=\"https://www.mobileye.com/blog/mobileye-drive-self-driving-system/\" target=\"_blank\" rel=\"noopener noreferrer\">Mobileye Drive system also announced today\u003C/a>.\u003C/p>\n\u003Cp>With Mobileye Drive, the Udelv Transporters will be capable of Level 4 autonomous point-to-point operation for middle- and last-mile delivery of goods. Udelv&rsquo;s proprietary tele-operations system will enable the vehicles&rsquo; maneuvering in parking lots, loading zones, apartment complexes, and private roads. Mobileye will also provide technical oversight for the integration of our \u003Ca href=\"https://www.mobileye.com/solutions/drive/\" target=\"_blank\" rel=\"noopener\">modular self-driving stack\u003C/a> with Udelv&rsquo;s Delivery Management System, and over-the-air software support for the lifetime of the vehicles.\u003C/p>\n\u003Cp>The collaboration calls for producing more than 35,000 Transporters by 2028. 
Commercial operations are slated to begin in 2023, with the first pre-order of 1,000 vehicles already placed by Donlen &ndash; one of America&rsquo;s largest commercial fleet management companies.\u003C/p>\n\u003Cp>See the \u003Ca href=\"https://www.mobileye.com/news/mobileye-udelv-deal-autonomous-delivery/\" target=\"_blank\" rel=\"noopener noreferrer\">full news release\u003C/a> for more details on the Udelv partnership, and read more about our other \u003Ca href=\"https://www.mobileye.com/blog/tag/driverless-maas/\" target=\"_blank\" rel=\"noopener\">self-driving Mobility-as-a-Service partnerships here on the Mobileye blog\u003C/a>.\u003C/p>","2021-04-12T07:00:00.000Z","Driverless MaaS, Autonomous Driving, News",{"id":1824,"type":5,"url":1825,"title":1826,"description":1827,"primary_tag":190,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1827,"image":1828,"img_alt":1829,"content":1830,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1821,"tags":997},93,"mobileye-drive-self-driving-system","Presenting the Mobileye Drive™ Self-Driving System","Combining groundbreaking technologies, Mobileye Drive™ is now commercially available to support self-driving mobility and delivery services around the world.","https://static.mobileye.com/website/us/corporate/images/de47f658215c80da581d6fc66e9d05da_1632042760288.jpg","Mobileye Drive Self-Driving System illustration","\u003Cp>Today we announced our latest self-driving mobility deal, and together with it, we&rsquo;re presenting Mobileye Drive&trade; &ndash; our Level 4 self-driving system, now commercially available for a variety of autonomous-vehicle applications.\u003C/p>\n\u003Cp>The industry&rsquo;s premier commercial self-driving system, Mobileye Drive is the result of years of research and development in autonomous vehicle technologies &ndash; rigorously tested on public roadways \u003Ca 
href=\"https://www.mobileye.com/blog/tag/autonomous-driving/\" target=\"_blank\" rel=\"noopener\">around the world\u003C/a> &ndash; and builds upon our proven leadership in driver-assistance technologies.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/6c280f44dc9242bf85a89a2c0fb1b7c1_1626618943169.jpg\" alt=\"Mobileye Drive&trade; Self-Driving System infographic\" width=\"1000\" height=\"732\" />\u003C/p>\n\u003Cp>\u003Cstrong>The Mobileye Trinity\u003C/strong>\u003C/p>\n\u003Cp>Mobileye Drive incorporates an array of groundbreaking, industry-leading technologies that together form the Mobileye Trinity of autonomous-vehicle tech. The embodiment of our approach to AV sensing, \u003Ca href=\"https://www.mobileye.com/technology/true-redundancy/\" target=\"_blank\" rel=\"noopener noreferrer\">True Redundancy&trade;\u003C/a> combines two independent perception sub-systems &ndash; one powered by cameras, another by radar and LiDAR &ndash; each capable of supporting full end-to-end autonomous capabilities.\u003C/p>\n\u003Cp>Our innovative approach to mapping, \u003Ca href=\"https://www.mobileye.com/technology/rem/\" target=\"_blank\" rel=\"noopener noreferrer\">Road Experience Management&trade;\u003C/a> (REM&trade;) leverages crowdsourced data from mass-market ADAS to build AV maps &ndash; efficiently and automatically, anywhere in the world, and on short notice. 
And our open \u003Ca href=\"https://www.mobileye.com/technology/responsibility-sensitive-safety/\" target=\"_blank\" rel=\"noopener noreferrer\">Responsibility-Sensitive Safety\u003C/a> (RSS) driving policy provides a formal mathematical model to ensure optimal safety, quickly and easily adaptable to different driving cultures.\u003C/p>\n\u003Cp>\u003Cstrong>Autonomously Transporting Passengers and Goods\u003C/strong>\u003C/p>\n\u003Cp>Powered by our \u003Ca href=\"https://www.mobileye.com/technology/eyeq-chip/\" target=\"_blank\" rel=\"noopener noreferrer\">EyeQ&reg; system-on-a-chip\u003C/a> and fed by multiple camera, radar, and LiDAR sensors, this trinity of technologies combines in Mobileye Drive to deliver an industry-leading \u003Ca href=\"https://www.mobileye.com/solutions/drive/\" target=\"_blank\" rel=\"noopener\">self-driving system\u003C/a>. Mobileye Drive is now ready for implementation in robotaxi services and self-driving commercial delivery vehicles, and eventually in consumer passenger AVs &ndash; and is already being embraced by customers around the world.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/a0ef7327dc5a8d7debd8f5513479a4a0_1672653867367.jpg\" alt=\"Udelv and Mobileye Mobility-as-a-Service (MaaS) solution\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Cp>\u003Cstrong>The Autonomous Revolution Has Already Started\u003C/strong>\u003C/p>\n\u003Cp>As part of our \u003Ca href=\"https://www.mobileye.com/blog/self-driving-maas-suite/\" target=\"_blank\" rel=\"noopener noreferrer\">full stack of self-driving Mobility-as-a-Service (MaaS) solutions\u003C/a>, Mobileye Drive is the self-driving system that will enable the launch of our autonomous shuttles that are set to hit the roads in Tel Aviv next year. 
Additional MaaS partnership agreements have already been signed in such locations as \u003Ca href=\"https://www.mobileye.com/blog/mobileye-transdev-lohr-maas-i-cristal-shuttles-robotaxis/\" target=\"_blank\" rel=\"noopener noreferrer\">France\u003C/a>, the \u003Ca href=\"https://www.mobileye.com/news/mobileye-is-bringing-driverless-maas-to-the-uae/\" target=\"_blank\" rel=\"noopener\">United Arab Emirates\u003C/a>, \u003Ca href=\"https://www.mobileye.com/news/mobileyes-global-ambitions-take-shape-new-deals-china-south-korea/\" target=\"_blank\" rel=\"noopener noreferrer\">South Korea\u003C/a>, and \u003Ca href=\"https://www.mobileye.com/blog/mobileye-to-deploy-robotaxis-in-japan-with-willer/\" target=\"_blank\" rel=\"noopener noreferrer\">Japan\u003C/a> &ndash; with more in the works. And with today&rsquo;s announcement of our \u003Ca href=\"https://www.mobileye.com/blog/udelv-transporter-autonomous-delivery-vehicles-powered-by-mobileye/\" target=\"_blank\" rel=\"noopener noreferrer\">commercial partnership with Udelv\u003C/a>, Mobileye Drive is slated to enable fully autonomous delivery services in the United States starting in 2023.\u003C/p>\n\u003Cp>For more details, see the \u003Ca href=\"https://static.mobileye.com/website/common/files/Mobileye-Drive-Fact-Sheet-675839.pdf\" target=\"_blank\" rel=\"noopener noreferrer\">Mobileye Drive fact sheet\u003C/a>, visit our \u003Ca href=\"https://www.mobileye.com/solutions/drive/\" target=\"_blank\" rel=\"noopener\">self-driving mobility services page\u003C/a>, and watch this space for further announcements.\u003C/p>",{"id":1832,"type":24,"url":1833,"title":1834,"description":1835,"primary_tag":28,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1835,"image":1836,"img_alt":1837,"content":1838,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":16,"publish_date":1821,"tags":1536},150,"mobileye-udelv-deal-autonomous-delivery","Mobileye and Udelv 
Ink Deal for Autonomous Delivery","Udelv Customers will use ‘Transporter’ for last- and middle-mile autonomous deliveries.","https://static.mobileye.com/dev/website/us/corporate/images/68fb027cb7c423b11780e33df9f803bf_1663143436244.jpg","In April 2021, Udelv announced that the Mobileye Drive self-driving system will drive the company's Transporter autonomous delivery vehicles.","\u003Cp>\u003Cem>This news content was originally published on the Intel Corporation Newsroom.\u003C/em>\u003C/p>\n\u003Cp>\u003Cstrong>NEWS HIGHLIGHTS\u003C/strong>\u003C/p>\n\u003Cul>\n\u003Cli>Mobileye&rsquo;s self-driving system ― branded Mobileye Drive&trade; ― will be the autonomous &ldquo;driver&rdquo; for Udelv&rsquo;s next-generation electric self-driving delivery vehicle, called &ldquo;Transporter,&rdquo; which was revealed today.\u003C/li>\n\u003Cli>Fleets of Transporters will begin operations in 2023; more than 35,000 Mobileye-driven Transporters will be produced between 2023 and 2028.\u003C/li>\n\u003Cli>First pre-order of 1,000 Udelv Transporters announced today by Donlen, one of America&rsquo;s largest commercial fleet leasing and management companies.\u003C/li>\n\u003Cli>Premiere deal signals commercial readiness of Mobileye Drive&trade; for large-scale deployment in the movement of goods and people and maturity of Udelv&rsquo;s delivery technology.\u003C/li>\n\u003C/ul>\n\u003Cp>JERUSALEM and BURLINGAME, Calif., April 12, 2021 &ndash; Mobileye, an Intel Company, and Udelv, a Silicon Valley venture-backed company, announced that Mobileye&rsquo;s self-driving system ― branded Mobileye Drive&trade; ― will &ldquo;drive&rdquo; the next-generation Udelv autonomous delivery vehicles (ADV), called \u003Ca href=\"https://www.udelv.com/wp-content/uploads/_pda/2020/11/Udelv-Transporter-Fact-Sheet-040821-1.pdf\" target=\"_blank\" rel=\"noopener noreferrer\">&ldquo;Transporters.&rdquo;\u003C/a>&nbsp;The companies plan to produce more than 35,000 Mobileye-driven Transporters by 2028, 
with commercial operations beginning in 2023. Today&rsquo;s news is believed to be the first large-scale deal for a self-driving system and signals that Mobileye Drive is ready for commercial deployment in solutions involving the autonomous movement of goods and people.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>[**]gallery:mobileye-udelv-deal-autonomous-delivery-gallery-1[**]\u003C/p>\n\u003Cp>&ldquo;Our deal with Udelv is significant for its size, scope and rapid deployment timeline, demonstrating our ability to deliver Mobileye Drive&trade; for commercial use now and in volume,&rdquo; said Prof. Amnon Shashua, Mobileye president and CEO. &ldquo;COVID-19 has accelerated demand for autonomous goods delivery, and we are delighted to partner with Udelv to address this demand in the near term.&rdquo;\u003C/p>\n\u003Cp>Daniel Laury, CEO and co-founder of Udelv, said: &ldquo;Mobileye is the only company providing a full-stack self-driving system with commercial viability and scale today. The readiness of Mobileye Drive&trade;, along with its vast map coverage of North America, Europe and Asia, will allow us to ramp up the production and deployment of Udelv Transporters and rapidly offer the service at scale to our expanding list of customers.&rdquo;\u003C/p>\n\u003Cp>Last-mile delivery is the&nbsp;\u003Ca href=\"https://www.businessinsider.com/last-mile-delivery-shipping-explained#:~:text=As%20a%20share%20of%20the,substantial%20%E2%80%94%20comprising%2053%25%20overall.\" target=\"_blank\" rel=\"noopener noreferrer\">most expensive\u003C/a>&nbsp;aspect of distribution, accounting for 53% of the overall cost of goods. 
At the same time, consumers are buying more and more goods online which is expected to raise urban last-mile delivery volume by 75 to 80% by 2030 and require&nbsp;\u003Ca href=\"https://www.roboticstomorrow.com/story/2020/10/global-impact-of-covid-19-on-autonomous-last-mile-delivery-market/15873/\" target=\"_blank\" rel=\"noopener noreferrer\">36% more delivery vehicles\u003C/a>. And a shortage of drivers is making it difficult for companies to keep pace. It is a service model that is ripe for improvement.\u003C/p>\n\u003Cp>Udelv&rsquo;s customers expect Transporters to dramatically improve the efficiency of last- and middle-mile delivery services for everything from baked goods and auto parts to groceries and medical supplies.\u003C/p>\n\u003Cp>Donlen, one of America&rsquo;s largest commercial fleet management companies at the forefront of fleet management innovation and technology, today placed the first pre-order for 1,000 Transporters. This pre-order is believed to be the largest to date for an autonomous delivery vehicle.\u003C/p>\n\u003Cp>&ldquo;We are thrilled to be the first customer for the Udelv Transporter,&rdquo; said Tom Callahan, president of Donlen. &ldquo;The combination of Udelv&rsquo;s zero-emissions Transporter and automated delivery management system with Mobileye Drive&trade; will enable sweeping delivery cost reductions, make our roads safer, and lower carbon emissions across America.&rdquo;\u003C/p>\n\u003Cp>Mobileye Drive comprises EyeQ&trade; system-on-chip-based level 4 (L4) compute, sensors and software, the company&rsquo;s proprietary Road Experience Management&trade; AV mapping solution and Responsibility-Sensitive Safety-based autonomous driving policy. Udelv will perform the integration with its Delivery Management System, with Mobileye providing technical oversight. Mobileye will also provide over-the-air software support.\u003C/p>\n\u003Cp>Mobileye-driven Transporters will be capable of L4 self-driving, point-to-point operation. 
Udelv&rsquo;s proprietary tele-operations system will allow for the maneuvering of the vehicles at the edges of the mission, in parking lots, loading zones, apartment complexes and private roads.\u003C/p>\n\u003Cp>Celebrated for creating the world&rsquo;s first custom-made ADV that completed the first autonomous delivery in early 2018, Udelv has quietly performed extensive deployment trials with customers across various industries.\u003C/p>\n\u003Cp>As one of Udelv&rsquo;s early customers, Mike Odell, president and CEO of XL Parts and Marubeni Automotive Aftermarket Holdings, said: &ldquo;We placed our trust in Udelv&rsquo;s technology two years ago and are thrilled to witness the progress this company has made in such a short period of time. XL Parts remains committed to expanding its partnership with Udelv and to being one of the first clients for the Transporters.&rdquo;\u003C/p>\n\u003Cp>The deal with Udelv advances Mobileye&rsquo;s global mobility-as-a-service ambitions, validating the company&rsquo;s technology and business approach. Mobileye plans to deploy&nbsp;\u003Ca href=\"https://www.mobileye.com/news/mobileye-transdev-ats-lohr-group-av-shuttles/\" target=\"_blank\" rel=\"noopener noreferrer\">autonomous shuttles\u003C/a> with Transdev ATS and Lohr Group beginning in Europe.&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>About Mobileye\u003C/strong>\u003C/p>\n\u003Cp>\u003Csup>Mobileye is leading the mobility revolution with its autonomous driving and driver-assist technologies, harnessing world-renowned expertise in computer vision, machine learning, mapping, and data analysis. Our technology enables self-driving vehicles and mobility solutions, powers industry-leading advanced driver-assistance systems, and delivers valuable intelligence to optimize mobility infrastructure. 
Mobileye pioneered such groundbreaking technologies as True Redundancy&trade; sensing, REM&trade; crowdsourced mapping, and Responsibility Sensitive Safety (RSS) technologies that are driving the ADAS and AV fields towards the future of mobility. For more information,&nbsp;\u003C/sup>\u003Ca href=\"https://www.mobileye.com/\" target=\"_blank\" rel=\"noopener\">\u003Csup>www.mobileye.com\u003C/sup>\u003C/a>\u003Csup>. \u003C/sup>\u003C/p>\n\u003Cp>\u003Cstrong>About Udelv\u003C/strong>\u003C/p>\n\u003Cp>\u003Csup>On a mission to improve people&rsquo;s lives, road safety and sustainable delivery, Udelv is revolutionizing the logistics space with its autonomous delivery vans (ADV) for last- and middle-mile delivery on public roads. Founded in California in 2017 by Daniel Laury and CTO Akshat Patel, Udelv successfully accomplished the first ever autonomous delivery on public roads in 2018. Udelv has since completed over 20,000 deliveries for multiple merchants in California, Arizona, and Texas and is preparing for expansion in many other states. Udelv&rsquo;s focus on autonomous vehicles paired with its uPod delivery technology enable long-range and high-capacity deliveries that are eco, business and customer friendly. For more information, visit&nbsp;\u003C/sup>\u003Ca href=\"http://www.udelv.com/\" target=\"_blank\" rel=\"noopener noreferrer\">\u003Csup>www.udelv.com\u003C/sup>\u003C/a>\u003Csup>. \u003C/sup>\u003C/p>\n\u003Cp>\u003Cstrong>About Donlen\u003C/strong>\u003C/p>\n\u003Cp>\u003Csup>Fleet management is moving in a new direction and you&rsquo;ll want a trusted partner for the road ahead. Headquartered in Bannockburn, Illinois, Donlen develops innovative fleet management technology solutions and offers a proactive, hands-on approach to customer service. Donlen listens to your needs, truly understanding your business, and guides you towards a more successful future. 
Donlen is a wholly-owned subsidiary of Athene, a leading financial services company with total assets of $202.8 billion (as of Dec. 31, 2020). For more information, visit&nbsp;\u003C/sup>\u003Ca href=\"http://www.donlen.com/\" target=\"_blank\" rel=\"noopener noreferrer\">\u003Csup>www.donlen.com\u003C/sup>\u003C/a>\u003Csup>. \u003C/sup>\u003C/p>",{"id":1840,"type":5,"url":1841,"title":1842,"description":1843,"primary_tag":934,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1843,"image":1844,"img_alt":1845,"content":1846,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1847,"tags":1848},79,"rem-adas-data","Our REM™ HD Mapping Technology Isn’t Just for Autonomous Vehicles","Mobileye’s Road Experience Management™ system was initially developed to digitally map the roads on which our autonomous vehicles will soon drive themselves, but it's rapidly expanding to additional applications – from driver-assistance to road works.","https://static.mobileye.com/website/us/corporate/images/b27c40147277a61e2d9df754fcc6339c_1608639206530.jpg","Illustration of a connected smart-city network.","\u003Cp>Last week we \u003Ca href=\"https://www.mobileye.com/blog/rem-mapping-avs/\" target=\"_blank\" rel=\"noopener noreferrer\">told you all about REM&trade;\u003C/a> &ndash; Mobileye&rsquo;s unique method for creating digital maps &ndash; and how our autonomous vehicles will depend on them to operate. 
But our \u003Ca href=\"https://www.mobileye.com/technology/rem/\" target=\"_blank\" rel=\"noopener noreferrer\">Road Experience Management&trade;\u003C/a> system isn&rsquo;t just benefiting the AVs of the future (as fast as it may be approaching). Like so many of the technologies we&rsquo;ve been developing for our self-driving platform, we have found real-world applications for REM &ndash; even before the long-held dream of owning a car that can drive itself becomes a reality.\u003C/p>\n\u003Cp>\u003Cstrong>ADAS\u003C/strong>\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/6323787bfb735d729ed3410f35e8b98b_1612795515569.jpg\" alt=\"Vehicles equipped with advanced driver-assistance systems enabled by Mobileye&rsquo;s Road Experience Management&trade; mapping technology\" />\u003C/p>\n\u003Cp>The same \u003Ca href=\"https://www.mobileye.com/blog/av-maps-vs-hd-maps/\" target=\"_blank\" rel=\"noopener noreferrer\">digital maps being generated by REM for AVs\u003C/a> are already available for deployment in vehicles equipped with \u003Ca href=\"https://www.mobileye.com/blog/buying-a-new-car-here-are-four-adas-features-to-look-for/\" target=\"_blank\" rel=\"noopener noreferrer\">Advanced Driver-Assistance Systems\u003C/a> (like those \u003Ca href=\"https://www.mobileye.com/news/mobileye-tech-makes-the-grade-under-euro-ncaps-new-assisted-driving-standard/\" target=\"_blank\" rel=\"noopener\">we supply to many of the world&rsquo;s leading automakers\u003C/a>). These incredibly precise, highly detailed, and constantly refreshing maps serve in ADAS as an extra layer on top of the capabilities of the vehicle&rsquo;s onboard sensors, thereby increasing their situational awareness and capacity to avoid potential collisions.\u003C/p>\n\u003Cp>Imagine, for instance, that you&rsquo;re driving in low-visibility conditions in a vehicle equipped with Mobileye-powered ADAS. 
You can&rsquo;t see the lane markers because it&rsquo;s dark, or raining heavily, or the road is covered in snow. But other vehicles equipped with our technology may have already informed our system of where the lines are painted on the roadway, and shown the pattern of traffic driving in between them. So even if you can&rsquo;t see them with your own eyes, and even if the onboard cameras can&rsquo;t see them, a vehicle tapped into the Mobileye Roadbook&trade; (our database of REM-generated maps) would know if you&rsquo;re \u003Ca href=\"https://www.consumerreports.org/car-safety/lane-departure-warning-lane-keeping-assist-guide/\" target=\"_blank\" rel=\"noopener noreferrer\">drifting out of your lane\u003C/a>, and alert you to that effect, or even steer the vehicle automatically back onto the proverbial straight and narrow path.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/04dd6906bb376af0b2b9d3c1b25b21dd_1626183582464.png\" alt=\"Road Experience Management&trade; creates the Mobileye Roadbook&trade;, a highly specific digital map of the driving environment\" />\u003C/p>\n\u003Cp>That&rsquo;s just one of many examples of how vehicles with our \u003Ca href=\"https://www.mobileye.com/blog/understanding-l2-in-five-questions/\" target=\"_blank\" rel=\"noopener noreferrer\">Level 2+ ADAS technology\u003C/a> stands to benefit \u003Cem>from\u003C/em> REM &ndash; but these vehicles are also being equipped to feed information back \u003Cem>into\u003C/em> REM in turn. 
This vital crowdsourcing function is at the core of REM&rsquo;s capability to map the world&rsquo;s roadways in a manner we&rsquo;ve found to be more efficient, cost-effective, scalable, and up-to-date than the more common industry practice of mapping by dedicated LiDAR vehicles.\u003C/p>\n\u003Cp>REM&rsquo;s proliferation across our product portfolio will be especially beneficial as part of \u003Ca href=\"https://www.mobileye.com/solutions/super-vision/\" target=\"_blank\" rel=\"noopener noreferrer\">Mobileye SuperVision&trade;\u003C/a>. Our latest, most advanced driver-assistance system yet, Mobileye SuperVision borrows an arsenal of features from our AV development program, including the REM-generated Roadbook (along with its eleven cameras, dual EyeQ&reg;5 system-on-a-chip devices, and more). These features bring Mobileye SuperVision incrementally closer to self-driving capabilities (while still requiring the human driver to supervise the vehicle&rsquo;s operation and remain prepared to take over its control).\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/solutions/super-vision/\" target=\"_blank\" rel=\"noopener noreferrer\">Read more about Mobileye SuperVision here\u003C/a>.\u003C/p>\n\u003Cp>\u003Cstrong>Data Services\u003C/strong>\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/16e5c7f8275d9179642be4ca192316c5_1612795527388.jpg\" alt=\"The same technology behind Road Experience Management&trade; also powers Mobileye Data Services for infrastructure surveying and smart city planning\" />\u003C/p>\n\u003Cp>Though automotive safety remains a main focus at Mobileye, it wasn&rsquo;t long into the process of developing REM that we realized the same technology could serve even more purposes beyond four wheels.\u003C/p>\n\u003Cp>Through \u003Ca href=\"https://www.mobileye.com/en/data/\" target=\"_blank\" rel=\"noopener noreferrer\">Mobileye Data Services&trade;\u003C/a>, we&rsquo;re able to offer such solutions 
as pavement-condition monitoring, road-network and utility-asset surveying, mobility intelligence, and live traffic data. Enabled by the same technology behind REM, these services can provide invaluable insights to those responsible for installing, maintaining, and operating crucial infrastructure &ndash; delivering meaningful data at a high refresh rate and more efficiently than the methods those stakeholders have traditionally practiced themselves.\u003C/p>\n\u003Cp>Say, for example, that you work at an electric company, and need to know where all the electrical poles are located in the area you serve. You could go out in a truck and painstakingly log their individual locations, a task most utility companies have to undertake periodically. Or you could simply fit Mobileye devices into your fleet vehicles already out on the road and let our computer-vision technology gather the data for you automatically in a digitally streamlined process.\u003C/p>\n\u003Cp>The same goes for road-works authorities responsible for fixing potholes and cracked pavement, transportation departments that need to keep traffic flowing smoothly, municipalities eager to upgrade their urban infrastructure into \u003Ca href=\"https://www.mckinsey.com/business-functions/operations/our-insights/smart-cities-digital-solutions-for-a-more-livable-future\" target=\"_blank\" rel=\"noopener noreferrer\">smart cities\u003C/a>, and a variety of other applications.\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/en/data/\" target=\"_blank\" rel=\"noopener noreferrer\">Read more about Mobileye Data Services here\u003C/a>.\u003C/p>\n\u003Cp>\u003Cstrong>Self-Driving MaaS\u003C/strong>\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/32d57d388b600a9a314ec56cec4fec60_1612795543713.png\" alt=\"Self-driving mobility services like those offered by Mobileye and Moovit stand to revolutionize the way people move\" />\u003C/p>\n\u003Cp>As beneficial as our mapping and 
surveying technology can be for ADAS and infrastructure, it&rsquo;ll be \u003Ca href=\"https://www.ft.com/content/2a8941a4-1625-11e8-9e9c-25c814761640\" target=\"_blank\" rel=\"noopener noreferrer\">most crucial to autonomous vehicles\u003C/a>. But we needn&rsquo;t wait until self-driving cars are put into mass production and offered for sale to the public for that application to be realized.\u003C/p>\n\u003Cp>The key steppingstone between ADAS-enhanced passenger vehicles and consumer AVs, robotaxis will rely just as heavily on REM-generated maps (in addition to our \u003Ca href=\"https://www.mobileye.com/technology/true-redundancy/\" target=\"_blank\" rel=\"noopener noreferrer\">redundant camera-based and radar/LiDAR sensing systems\u003C/a>). In fact, it&rsquo;s largely due to REM&rsquo;s scalability and geographic adaptability that we&rsquo;ll soon be rolling out driverless Mobility-as-a-Service (MaaS) solutions with local partners and in locations as widespread as \u003Ca href=\"https://www.mobileye.com/blog/mobileye-hosts-its-first-investor-summit-since-the-intel-acquisition/\" target=\"_blank\" rel=\"noopener noreferrer\">France\u003C/a>, the \u003Ca href=\"https://www.mobileye.com/news/mobileye-is-bringing-driverless-maas-to-the-uae/\" target=\"_blank\" rel=\"noopener\">United Arab Emirates\u003C/a>, \u003Ca href=\"https://www.mobileye.com/news/mobileyes-global-ambitions-take-shape-new-deals-china-south-korea/\" target=\"_blank\" rel=\"noopener noreferrer\">South Korea\u003C/a>, and \u003Ca href=\"https://www.mobileye.com/blog/mobileye-to-deploy-robotaxis-in-japan-with-willer/\" target=\"_blank\" rel=\"noopener noreferrer\">Japan\u003C/a>.\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/solutions/drive/\" target=\"_blank\" rel=\"noopener\">Read more about self-driving mobility solutions here\u003C/a>.\u003C/p>\n\u003Cp>Between driver-assistance systems, data services, driverless MaaS, and consumer AVs, the mapping and surveying 
capabilities unlocked by Mobileye&rsquo;s \u003Ca href=\"https://www.mobileye.com/blog/were-in-the-midst-of-an-ai-revolution-says-shashua/\" target=\"_blank\" rel=\"noopener noreferrer\">expertise in computer-vision technology\u003C/a> are growing almost as rapidly as our Roadbook is expanding. So don&rsquo;t be surprised if we find even more applications for this groundbreaking technology as its development drives forward.\u003C/p>","2021-03-18T07:00:00.000Z","Mapping & REM, Driverless MaaS, ADAS",{"id":1850,"type":5,"url":1851,"title":1852,"description":1853,"primary_tag":190,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1853,"image":1854,"img_alt":1855,"content":1856,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1857,"tags":1858},78,"rem-mapping-avs","REM™ Gives Our Autonomous Vehicles the HD Maps They Need","Mobileye’s Road Experience Management™ system leverages our expertise in computer vision and the power of the crowd to supply our autonomous vehicles with an ever-growing and constantly updating database of highly precise, high-definition digital maps.","https://static.mobileye.com/website/us/corporate/images/50e9fb3b4ffde37e3e7ec5ce479ac77a_1607859883475.jpg","Illustration of a vehicle traveling on a digitized highhway","\u003Cp>From memory to environmental cues to asking for directions, human drivers have many ways of figuring out where they&rsquo;re going &ndash; of which consulting a map is just one. 
But for computer-operated autonomous vehicles, \u003Ca href=\"https://youtu.be/B7YNj66GxRA?t=1199\" target=\"_blank\" rel=\"noopener noreferrer\">maps will be absolutely essential\u003C/a> to supplement the AV&rsquo;s onboard sensors and increase the self-driving system&rsquo;s level of reliable safety.\u003C/p>\n\u003Cp>And not just any maps will do: to effectively and safely drive themselves, AVs will require a database of maps far more detailed, precise, and up-to-date than the app on your phone or the \u003Ca href=\"https://www.esa.int/Applications/Navigation/How_satellite_navigation_works\" target=\"_blank\" rel=\"noopener noreferrer\">satellite navigation system\u003C/a> built into your car&rsquo;s dashboard display (never mind that old road atlas in your glovebox).\u003C/p>\n\u003Cp>That much we determined early on in our research into autonomous vehicles, and \u003Ca href=\"https://www.mobileye.com/news/mobileye-ranked-5-in-guidehouse-insights-automated-driving-leaderboard/\" target=\"_blank\" rel=\"noopener\">the rest of the industry\u003C/a> has by now largely come around to the same conclusion. The question is how to best create those maps. Mobileye&rsquo;s answer is REM&trade;.\u003C/p>\n\u003Cp>\u003Cstrong>Mapping by Convention vs the Mobileye Way\u003C/strong>\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/1c7c00b6a75889e5774cdf2543aab227_1612794804595.jpg\" alt=\"Creating the right kind of digital map like the Mobileye Roadbook&trade; is crucial to the future of self-driving mobility\" />\u003C/p>\n\u003Cp>The digital-mapping method commonly practiced across the industry revolves around dispatching fleets of dedicated mapping vehicles. Those vehicles are hugely expensive to acquire and operate, and generate an enormous amount of data to store, transmit, and process. 
As a result, those (typically LiDAR-intensive) mapping vehicles can only be feasibly deployed on so many sections of roadway at a time, and can only remap those roads so often while the driving environment changes constantly.\u003C/p>\n\u003Cp>By contrast, Mobileye&rsquo;s \u003Ca href=\"https://www.mobileye.com/news/mobileye-wins-prestigious-2020-pace-award-for-rem-mapping-tech/\" target=\"_blank\" rel=\"noopener\">Road Experience Management&trade;\u003C/a> (REM) system draws data from the masses of cars already on the road equipped with our cameras and chips. The fully anonymized data is uploaded to the cloud in small packets and processed on a continuous basis to create the Mobileye Roadbook&trade; &ndash; a database of highly precise, high-definition maps by which our AVs will be able to drive in the not-so-distant future.\u003C/p>\n\u003Cp>\u003Cstrong>More than Meets the AV&rsquo;s Eye\u003C/strong>\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/bded89d504a479bedd6ca18c1c8c4aca_1612794823693.jpg\" alt=\"The Mobileye Roadbook&trade; provides the self-driving vehicle with a valuable extra layer of information on top of the vehicle&rsquo;s built-in sensors\" />\u003C/p>\n\u003Cp>Like today&rsquo;s advanced \u003Ca href=\"https://www.mobileye.com/blog/everything-you-need-to-know-about-adas/\" target=\"_blank\" rel=\"noopener noreferrer\">driver-assistance systems\u003C/a>, autonomous vehicles will rely on sensors onboard the vehicle to perceive the surrounding environment in real time, which in turn serves as the basis for its decisions on how it should operate. 
But on AVs, the sensors are augmented by the digital map, providing an \u003Ca href=\"https://www.mobileye.com/blog/moving-our-machine-learning-to-the-cloud-inspired-innovation/\" target=\"_blank\" rel=\"noopener noreferrer\">additional layer\u003C/a> of up-to-date information to serve as a point of comparison for what the vehicle should expect to see in front of itself.\u003C/p>\n\u003Cp>REM not only creates that map efficiently, but yields a map with more (and more up-to-date) information. By furnishing \u003Ca href=\"https://www.mobileye.com/news/mobileye-releases-unedited-footage-of-a-40-minute-av-ride-through-jerusalem/\" target=\"_blank\" rel=\"noopener\">our AV\u003C/a> with these living, breathing maps, the vehicle will be able to draw on records of how other vehicles drive on the same roadway, localize itself more precisely than a conventional GPS signal affords, and &ldquo;see&rdquo; around the next bend and over the next rise &ndash; regardless of obstructions, adverse visibility conditions, or other potential impediments to its onboard sensing capabilities.\u003C/p>\n\u003Cp>This crucial development is achieved through an irreplicable confluence of three of Mobileye&rsquo;s core strengths.\u003C/p>\n\u003Cp>\u003Cstrong>1) Harnessing the Power of the Crowd\u003C/strong>\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/8b00d630da9ff27ee0e7cf92451f3324_1612794840935.jpg\" alt=\"Millions of vehicles on the road equipped with Mobileye technology collect data through Road Experience Management&trade; (REM&trade;)\" />\u003C/p>\n\u003Cp>To date our technology has been fitted into more than 65 million vehicles, currently offered on hundreds of models from dozens of the world&rsquo;s leading automakers &ndash; including many of those \u003Ca href=\"https://www.mobileye.com/news/mobileye-tech-makes-the-grade-under-euro-ncaps-new-assisted-driving-standard/\" target=\"_blank\" rel=\"noopener\">deemed the safest and most 
advanced\u003C/a>. With all those cars traveling the roads we need to map on an ongoing basis, all we need in order to harvest the raw data, in essence, is for even a small proportion of those vehicles to do what they&rsquo;re already doing and relay what they&rsquo;re &ldquo;seeing&rdquo; back to our network. REM does the rest: creating the Roadbook, constantly refreshing it, and imbuing it with a wide range of relevant parameters.\u003C/p>\n\u003Cp>Among those, REM can track traffic patterns, recognize colors, and read text (like on road signs) along given segments of roadway &ndash; regardless of whether those signs were present when the original map was rendered. So a Roadbook-enabled vehicle approaching a construction zone, for example, stands to already know what&rsquo;s coming up in advance, because other REM-harvesting vehicles will likely have already reported developments like the emergence of orange construction-zone warning signs and the shuffling of traffic out of closed-off lanes &ndash; details you wouldn&rsquo;t get from a conventional static map.\u003C/p>\n\u003Cp>\u003Cstrong>2) Our (Computer) Vision for the Future\u003C/strong>\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/77ece54278fb11accb6ccafba5499f1c_1612794868506.jpg\" alt=\"Inside one of Mobileye&rsquo;s self-driving autonomous development vehicles, employing computer vision mobility technology\" />\u003C/p>\n\u003Cp>Mobileye was founded on \u003Ca href=\"https://towardsdatascience.com/everything-you-ever-wanted-to-know-about-computer-vision-heres-a-look-why-it-s-so-awesome-e8a58dfb641e\" target=\"_blank\" rel=\"noopener noreferrer\">computer-vision technology\u003C/a>, and that expertise has only grown in the decades since. 
Our uncommon proficiency in this cutting-edge discipline of \u003Ca href=\"https://www.mobileye.com/blog/were-in-the-midst-of-an-ai-revolution-says-shashua/\" target=\"_blank\" rel=\"noopener noreferrer\">artificial intelligence\u003C/a> propels the continued advancement of our driver-assistance technologies, our ongoing development of autonomous vehicles, and the rapid expansion of our mapping and road-surveying initiatives.\u003C/p>\n\u003Cp>In stark contrast to LiDAR, the cameras that serve (both literally and figuratively) as the lens through which we apply our computer-vision technology are incredibly cost-effective &ndash; especially in relation to the high resolution and vivid color in which they&rsquo;re capable of capturing the environment on which they&rsquo;re trained. But what truly sets Mobileye apart is what we&rsquo;re able to do with that imagery, and the agility with which our highly specialized and efficient algorithms are able to glean meaningful insights from what \u003Ca href=\"https://www.mobileye.com/news/nissan-rogue-to-showcase-mobileyezf-100-degree-adas-camera/\" target=\"_blank\" rel=\"noopener\">our cameras\u003C/a> are picking up out on the open road.\u003C/p>\n\u003Cp>\u003Cstrong>3) Light &amp; Lean \u003C/strong>\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/5322aaf5fbf6e282ec2f6f44fd7d102a_1612794886538.png\" alt=\"REM&trade; creates Road Segment Data &ndash; small packets of information that are transmitted to the cloud at low bandwidth\" />\u003C/p>\n\u003Cp>The most straightforward method of transmitting what those cameras are detecting might be to send the raw imagery or dense 3D representations of the driving environment \u003Ca href=\"https://www.mobileye.com/blog/moving-our-machine-learning-to-the-cloud-inspired-innovation/\" target=\"_blank\" rel=\"noopener noreferrer\">directly to the cloud\u003C/a>. 
But that would pose potentially serious privacy concerns and generate massive amounts of data, which in turn would require enormous bandwidth to transmit.\u003C/p>\n\u003Cp>Mobileye&rsquo;s solution is to \u003Ca href=\"https://www.mobileye.com/blog/why-tops-arent-tops-when-it-comes-to-av-processors/\" target=\"_blank\" rel=\"noopener noreferrer\">process the data at the &ldquo;edge,&rdquo;\u003C/a> on board the vehicle, and transmit the relevant information in much &ldquo;lighter&rdquo; packets of pre-classified and inherently anonymous data. These packets, which we call Road Segment Data (RSD), amount to only 10 kilobytes (roughly the size of this article&rsquo;s plain text) per kilometer, and can be transmitted over the cellular modems with which new cars already commonly come equipped, requiring only a sparse connection over existing 3G infrastructure, without having to wait for the widespread implementation of \u003Ca href=\"https://www.zdnet.com/article/connected-cars-how-5g-and-iot-will-affect-the-auto-industry/\" target=\"_blank\" rel=\"noopener noreferrer\">the emerging 5G standard\u003C/a>.\u003C/p>\n\u003Cp>While these detail-rich RSDs are used first and foremost for creating the map, we also use them for refreshing the map continuously. The parameters in each of these roadway &ldquo;signatures&rdquo; are compared automatically to the existing map, and where changes are detected, the Roadbook is updated. 
This method is not only more efficient than remapping entire sections of roadway from scratch, but results in a database more reliably current and accurate than we&rsquo;ve determined achievable with LiDAR-equipped mapping vehicles.\u003C/p>\n\u003Cp>\u003Cstrong>Real-World Applications\u003C/strong>\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/cb8d94dafb12cc70d2f50dbe48db469a_1612794902831.jpg\" alt=\"The Mobileye Roadbook&trade; is rapidly being compiled by and for millions of vehicles already on the road around the world\" />\u003C/p>\n\u003Cp>The convergence in REM of these capabilities is how we were able, for example, to map \u003Ca href=\"https://www.mobileye.com/blog/mobileye-to-deploy-robotaxis-in-japan-with-willer/\" target=\"_blank\" rel=\"noopener noreferrer\">all of Japan\u003C/a>, with the push of a button, in only 24 hours. That entire map, covering some 25,000 kilometers of road, takes up just 400 megabytes, and has already been in use in consumer vehicles from a major domestic automaker for well over a year now. And even more applications of REM technology are being deployed today, before autonomous vehicles arrive at scale. 
Watch this space for more to come.\u003C/p>","2021-03-11T08:00:00.000Z","Autonomous Driving, Mapping & REM",{"id":1860,"type":5,"url":1861,"title":1862,"description":1863,"primary_tag":934,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1863,"image":1864,"img_alt":1865,"content":1866,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1867,"tags":1868},88,"av-maps-vs-hd-maps","HD Maps  vs AV Maps - The Crucial Differences","Powered by REM™, the Mobileye Roadbook™ provides AVs with the accurate, up-to-date information they need to operate effectively and safely.\n","https://static.mobileye.com/website/us/corporate/images/35493487c9d025c81f4e9ed21722d458_1612794598859.jpg","Mobileye Roadbook™, powered by REM™","\u003Cp>An HD map is not the same as an AV map. Compared to the map created by Mobileye specifically for autonomous vehicles, the high-definition maps prevalent across \u003Ca href=\"https://www.mobileye.com/news/mobileye-ranked-5-in-guidehouse-insights-automated-driving-leaderboard/\" target=\"_blank\" rel=\"noopener\">the industry\u003C/a> are at once both over-specified and under-specified &ndash; offering a lot of information that isn&rsquo;t necessary, while omitting details that are critical for an AV to navigate within its surroundings.\u003C/p>\n\u003Cp>The common approach taken with most HD maps employed for self-driving applications is to digitize everything on and around the road and locate the AV within that map, offering global accuracy. But the AV doesn&rsquo;t need to know about a traffic sign, for example, that&rsquo;s miles away from \u003Ca href=\"https://www.mobileye.com/blog/munich-av-video/\" target=\"_blank\" rel=\"noopener noreferrer\">where it is\u003C/a> at any given moment. 
What it really needs to know is what&rsquo;s within a radius of approximately 200 meters (about 650 feet) around itself in relation to the vehicle&rsquo;s current location, and how it needs to operate accordingly.\u003C/p>\n\u003Cp>For all its impressive global accuracy, the typical HD map falls short of offering the AV semantic information &ndash; how to understand the best way to navigate the road ahead. Put another way, HD maps don&rsquo;t offer cues on \u003Cem>how\u003C/em> to drive &ndash; the intuition that human drivers accrue by experience. A conventional HD map can&rsquo;t tell an AV which traffic light is relevant to the path on which it&rsquo;s driving, for instance, or where the best place is to stop for an unobstructed view at an intersection (unless such information is entered manually or ported over from an external source). The fundamental question, then, is how to develop a mapping solution that offers the accuracy where it matters, with the information that&rsquo;s actually relevant to the autonomous vehicle&rsquo;s operation. 
Enter: the Mobileye Roadbook&trade;.\u003C/p>\n\u003Cp>\u003Cstrong>Mobileye Roadbook &ndash; Our Autonomous Vehicle Map \u003C/strong>\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/ca496a63bcdec040434119dc796339a2_1612794661279.png\" alt=\"Road Experience Management&trade; (REM&trade;) is rapidly mapping roadways around the world and building the Mobileye Roadbook&trade;\" />\u003C/p>\n\u003Cp>Built specifically for its end use of enabling \u003Ca href=\"https://www.mobileye.com/blog/munich-av-video/\" target=\"_blank\" rel=\"noopener noreferrer\">self-driving vehicles\u003C/a>, the Mobileye Roadbook draws on data collected, compiled, aligned, and modeled by \u003Ca href=\"https://www.mobileye.com/news/mobileye-wins-prestigious-2020-pace-award-for-rem-mapping-tech/\" target=\"_blank\" rel=\"noopener\">REM&trade;\u003C/a> &ndash; Mobileye&rsquo;s technology for mapping the world&rsquo;s roadways from crowdsourced data. Like most any road map, the Roadbook contains a wealth of relevant information on \u003Ca href=\"https://civilread.com/road-types/\" target=\"_blank\" rel=\"noopener noreferrer\">city streets, rural roads, and interurban highways\u003C/a> &ndash; all at an exacting level of accuracy to allow for pinpoint localization down to 10 centimeters. But in order to provide our AVs with the precise information they need to inform their decision-making on the move, the Roadbook encompasses a very different set of parameters from the typical database of street names, gas stations, and points of interest.\u003C/p>\n\u003Cp>Like most roadway maps, the Mobileye Roadbook includes basic details like curbs, lane markers, and \u003Ca href=\"https://www.mobileye.com/blog/avs-and-the-drive-for-pedestrian-safety/\" target=\"_blank\" rel=\"noopener noreferrer\">crosswalks\u003C/a>. 
Our map, however, goes a crucial step or two farther &ndash; informing the AV not only about the road it needs to drive on, but how it needs to drive on it. Which street sign or traffic light belongs to which lane? Which lane has \u003Ca href=\"https://www.safemotorist.com/Articles/Right_of_Way/\" target=\"_blank\" rel=\"noopener noreferrer\">right-of-way at the intersection\u003C/a>? Where do bottlenecks frequently occur in traffic? \u003Ca href=\"https://ec.europa.eu/transport/road_safety/eu-road-safety-policy/priorities/safe-road-use/safe-speed/archive/many-drivers-exceed-speed-limit_en\" target=\"_blank\" rel=\"noopener noreferrer\">What speed does traffic commonly travel\u003C/a> down any given stretch of road (notwithstanding the posted speed limit)? This type of information might be intuitive to a human driver, but would not inherently be included in a static HD map. The Mobileye Roadbook reflects these real-world parameters, in near-real time, drawn from data uploaded by legions of \u003Ca href=\"https://www.mobileye.com/news/mobileye-tech-makes-the-grade-under-euro-ncaps-new-assisted-driving-standard/\" target=\"_blank\" rel=\"noopener\">cars out on the road equipped with our technology\u003C/a>.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/a05f04600bd035255ad240cc894f04a9_1612794726959.png\" alt=\"/\" />\u003C/p>\n\u003Cp>What the autonomous vehicle doesn&rsquo;t need to know, on the other hand, the Mobileye Roadbook (quite literally) leaves by the wayside. Because the AV doesn&rsquo;t need to access photographic images of the driving environment, for example, the Roadbook contains only the distilled and usable points of data that are meaningful to the AV&rsquo;s operation, and requires less space and bandwidth as a result. 
The highly specific set of parameters captured by REM yields an AV map that optimizes by design the relationship between level of detail and the digital &ldquo;weight&rdquo; of the map. This method provides our AV platform with the precise information it needs &ndash; nothing more, nothing less.\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/technology/rem/\" target=\"_blank\" rel=\"noopener noreferrer\">Visit the Mobileye REM webpage\u003C/a> and watch this space for more.\u003C/p>","2021-03-01T08:00:00.000Z","Mapping & REM, Autonomous Driving",{"id":1870,"type":5,"url":1871,"title":1872,"description":1873,"primary_tag":40,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1873,"image":1874,"img_alt":1875,"content":1876,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1877,"tags":1878},89,"mobileye-transdev-lohr-maas-i-cristal-shuttles-robotaxis","Mobileye & Lohr to Deploy Autonomous Shuttles by 2023","i-Cristal autonomous electric shuttle to incorporate Mobileye’s self-driving system to offer driverless mobility services across Europe and around the world.\n\n","https://static.mobileye.com/website/us/corporate/images/aced3daaa3585348ff5b661365d42a7a_1614084154600.jpg","i-Cristal robotaxi shuttle by Lohr Group, Transdev ATS and Mobileye","\u003Cp>The path to the widespread, worldwide adoption of autonomous vehicles, we firmly believe, \u003Ca href=\"https://www.mobileye.com/blog/how-robotaxis-will-lead-the-way-toward-the-fully-autonomous-future/\" target=\"_blank\" rel=\"noopener noreferrer\">leads straight through self-driving Mobility-as-a-Service\u003C/a>. 
That drive is now taking another big step forward in the form of our latest collaboration with two new partners in France: Transdev, one of the largest transport operators in the world, and the Lohr Group, a major global manufacturer of commercial vehicles.\u003C/p>\n\u003Cp>The partnership will see Mobileye&rsquo;s self-driving system integrated into the i-Cristal electric shuttle, manufactured by the Strasbourg-based Lohr Group and to be deployed and operated by Transdev &ndash; specifically the Parisian company&rsquo;s Autonomous Transport Systems (ATS) division. The i-Cristal offers space for up to 16 passengers, with a ramp to enable full accessibility for the less mobile. The electric vehicle can travel at speeds of up to 50 kilometers per hour (or 31 miles per hour), all with zero emissions. Mobileye&rsquo;s self-driving system features our \u003Ca href=\"https://www.mobileye.com/technology/true-redundancy/\" target=\"_blank\" rel=\"noopener noreferrer\">True Redundancy&trade;\u003C/a> sensing system and \u003Ca href=\"https://www.mobileye.com/technology/responsibility-sensitive-safety/\" target=\"_blank\" rel=\"noopener noreferrer\">Responsibility-Sensitive Safety\u003C/a> model.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/d5a1217471494322a72cb4e0034b8985_1614179590376.jpg\" alt=\"Transdev and Mobileye Mobility-as-a-Service (MaaS) solution\" width=\"2000\" height=\"1244\" />\u003C/p>\n\u003Cp>Following initial testing in Jerusalem and France, Mobileye, Transdev, and Lohr aim to prepare the i-Cristal with Mobileye Inside for production by 2022. 
Commercial operations are slated to begin in 2023 &ndash; initially in Europe, before extending to locations across the globe.\u003C/p>\n\u003Cp>This new collaboration between Mobileye, Transdev ATS, and the Lohr Group represents the latest in a string of \u003Ca href=\"https://www.mobileye.com/solutions/drive/\" target=\"_blank\" rel=\"noopener\">self-driving MaaS\u003C/a> partnerships inked with local operators around the world, including RATP (also in France), \u003Ca href=\"https://www.mobileye.com/blog/mobileye-to-deploy-robotaxis-in-japan-with-willer/\" target=\"_blank\" rel=\"noopener noreferrer\">Willer\u003C/a> (in Japan), the municipality of \u003Ca href=\"https://www.mobileye.com/news/mobileyes-global-ambitions-take-shape-new-deals-china-south-korea/\" target=\"_blank\" rel=\"noopener noreferrer\">Daegu City\u003C/a> (in South Korea), \u003Ca href=\"https://www.mobileye.com/news/mobileye-is-bringing-driverless-maas-to-the-uae/\" target=\"_blank\" rel=\"noopener\">Al Habtoor\u003C/a> (in the United Arab Emirates), and our local project in Tel Aviv.\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/news/mobileye-transdev-ats-lohr-group-av-shuttles/\" target=\"_blank\" rel=\"noopener noreferrer\">Read more about the deal with Transdev and Lohr in the news release\u003C/a>.\u003C/p>","2021-02-25T08:00:00.000Z","Driverless MaaS, News",{"id":1880,"type":24,"url":1881,"title":1882,"description":1883,"primary_tag":28,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1883,"image":1884,"img_alt":1885,"content":1886,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":16,"publish_date":1877,"tags":1887},151,"mobileye-transdev-ats-lohr-group-av-shuttles","Mobileye, Transdev ATS and Lohr Group to Develop AV Shuttles","Mobileye, Transdev Autonomous Transport System, and Lohr Group have formed a strategic collaboration to develop and deploy autonomous 
shuttles.","https://static.mobileye.com/dev/website/us/corporate/images/1c559452aa38c8d4e91f6def9279436e_1663143546097.jpg","Mobileye, Transdev ATS and Lohr Group will integrate Mobileye’s self-driving system into the i-Cristal autonomous electric shuttle.","\u003Cp>\u003Cem>This news content was originally published on the Intel Corporation Newsroom.\u003C/em>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>What&rsquo;s New:\u003C/strong>&nbsp;Mobileye, an Intel Company; Transdev Autonomous Transport System (ATS), part of Transdev Group dedicated to autonomous mobility solutions; and Lohr Group, a mobility solutions manufacturer, have formed a strategic collaboration to develop and deploy autonomous shuttles. The companies are integrating Mobileye&rsquo;s self-driving system into the i-Cristal electric shuttle, manufactured by Lohr Group, with plans to integrate it into public transportation services across the globe, starting in Europe.\u003C/p>\n\u003Cp>&ldquo;Our collaboration with Transdev ATS and Lohr Group serves to grow Mobileye&rsquo;s global footprint as the autonomous vehicle (AV) technology partner of choice for pioneers in the transportation industry.&rdquo;&nbsp;&nbsp;&ndash;Johann Jungwirth, vice president of Mobility-as-a-Service at Mobileye&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>Why It Matters:\u003C/strong>&nbsp;By integrating the autonomous i-Cristal shuttle into Transdev&rsquo;s existing mobility service networks, the companies aim to improve the efficiency and convenience of mass transportation solutions. 
Autonomous mobility can be woven into the fabric of transportation networks to distribute service when and where it&rsquo;s needed, while also optimizing the fleets, lowering transportation costs and improving customer experiences.\u003C/p>\n\u003Cp>\u003Cstrong>How It Works:\u003C/strong> Over the next year, Mobileye will work with Transdev ATS and Lohr Group to integrate and deploy i-Cristal autonomous shuttles leveraging Mobileye&rsquo;s AV technology, Transdev ATS&rsquo;s technology and Lohr Group&rsquo;s industrial expertise. The three companies will initially test vehicles on roadways in France and Jerusalem, aiming to ready technology designs for production by 2022. The companies expect to deploy the self-driving i-Cristal shuttles in public transportation networks by 2023.\u003C/p>\n\u003Cp>\u003Cstrong>More About Mobileye&rsquo;s Self-Driving System:\u003C/strong>&nbsp;Mobileye&rsquo;s self-driving system is a turnkey AV solution that delivers safety via two core concepts: Mobileye&rsquo;s formal Responsibility-Sensitive Safety model for the safety of the system&rsquo;s decision-making, and a perception system featuring True Redundancy&trade; whereby two independent subsystems (cameras and radars+lidars) combine to enable robust perception. The self-driving system can also be deployed without geographical limitation thanks to Mobileye&rsquo;s Road Experience Management&trade; AV mapping technology through which a proprietary, crowdsourced AV map of the global road network is created and then continuously and automatically updated using data gathered from mass-market advanced driver-assistance systems.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>About Mobileye\u003C/strong>\u003C/p>\n\u003Cp>Mobileye is the global leader in the development of computer vision and machine learning, data analysis, localization and mapping for advanced driver-assistance systems and automated driving. 
Mobileye&rsquo;s technology helps keep people safer on the road, reduces the risks of traffic accidents, saves lives and aims to revolutionize the driving experience by enabling autonomous driving. Mobileye&rsquo;s proprietary software algorithms and EyeQ&reg; chips perform detailed interpretations of the visual field in order to anticipate possible collisions with other vehicles, pedestrians, cyclists, animals, debris and other obstacles. Mobileye&rsquo;s products are also able to detect roadway markings such as lanes, road boundaries, barriers and similar items; identify and read traffic signs, directional signs and traffic lights; create a RoadBook&trade; of localized drivable paths and visual landmarks using REM&trade;; and provide mapping for autonomous driving.\u003C/p>\n\u003Cp>\u003Cstrong>About Transdev ATS\u003C/strong>\u003C/p>\n\u003Cp>Transdev ATS is an integrator of autonomous transport systems, including AV Supervision, autonomous vehicles and connected infrastructure. Transdev ATS also provides technologies and services to local operators and cities for the day-to-day operation of autonomous mobility services on a large scale. Transdev ATS is part of Transdev Group.\u003C/p>\n\u003Cp>As an operator and global integrator of mobility, Transdev &ndash; The mobility company &ndash; gives people the freedom to move whenever and however they choose. We are proud to provide 11 million passenger trips every day thanks to efficient, easy-to-use and environmentally friendly transportation services that connect people and communities. Our approach is rooted in long-term partnerships with businesses and public authorities, and in the relentless pursuit of the safest and most innovative mobility solutions. We are a team of people serving people, and mobility is what we do. Transdev is jointly held by Caisse des D&eacute;p&ocirc;ts Group (66%) and the Rethmann Group (34%). 
In 2019, with 85,000 employees in 18 countries, the Group generated total revenues of 7.4 billion euros. For more information:&nbsp;\u003Ca href=\"http://www.transdev.com/\" target=\"_blank\" rel=\"noopener noreferrer\">www.transdev.com\u003C/a>.\u003C/p>\n\u003Cp>\u003Cstrong>About Lohr Group\u003C/strong>\u003C/p>\n\u003Cp>Lohr Group is a mobility solution manufacturer designing, building and selling passenger and cargo transportation systems. Today its global industrial presence includes six factories on three continents, 2,000 employees and one R&amp;D hub. Lohr maintains its position as the global leader in mixed vehicle carriers. It is also developing business in rail transport, and leads projects promoting sustainable mobility.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Ca href=\"https://static.mobileye.com/website/corporate/zip/mobileye-av-shuttles.zip\" target=\"_blank\" rel=\"noopener noreferrer\">&raquo; Download all images (ZIP, 8 MB)\u003C/a>\u003C/p>","News, Autonomous Driving, Driverless MaaS",{"id":1889,"type":5,"url":1890,"title":1891,"description":1892,"primary_tag":40,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1892,"image":1893,"img_alt":1894,"content":1895,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1896,"tags":1897},87,"self-driving-maas-suite","The Full Suite of Self-Driving Mobility Solutions","Mobileye and Moovit come together to offer modular and end-to-end autonomous mobility solutions that are poised to transform the future of transportation.","https://static.mobileye.com/website/us/corporate/images/baccc42178b578f0e723fcd9fda6fa9c_1612257360150.jpg","Self-driving mobility solutions by Mobileye","\u003Cp>Realizing the dream of \u003Ca href=\"https://www.mobileye.com/blog/munich-av-video/\" target=\"_blank\" rel=\"noopener noreferrer\">riding in a self-driving vehicle\u003C/a> will, in the not-so-distant future, be as simple as a 
few taps on your smartphone. But what may be just that easy for the passenger demands a series of highly advanced technologies to execute. And we&rsquo;re proud to support our partners around the world with key individual components of the mobility stack or a full-stack, end-to-end solution. Here&rsquo;s a quick look at each of the individual layers comprising Mobileye&rsquo;s modular \u003Ca href=\"https://www.mobileye.com/blog/how-robotaxis-will-lead-the-way-toward-the-fully-autonomous-future/\" target=\"_blank\" rel=\"noopener noreferrer\">self-driving Mobility-as-a-Service\u003C/a> (MaaS) suite.\u003C/p>\n\u003Cp>\u003Cstrong>MaaS Layer 1: Self-Driving System\u003C/strong>\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/eee55226f5069faf30b120897681e21a_1612261972733.png\" alt=\"The most essential element to an autonomous vehicle is the self-driving system, like the one developed by Mobileye\" />\u003C/p>\n\u003Cp>Stemming from years of autonomous-vehicle development and decades of experience in driver-assistance technologies, Mobileye&rsquo;s turnkey self-driving system can be integrated into a wide range of vehicles from our automotive manufacturing partners. 
The system includes several of Mobileye&rsquo;s state-of-the-art technologies, including our \u003Ca href=\"https://www.mobileye.com/blog/why-tops-arent-tops-when-it-comes-to-av-processors/\" target=\"_blank\" rel=\"noopener noreferrer\">EyeQ&reg;\u003C/a> System-on-a-Chip devices, \u003Ca href=\"https://www.mobileye.com/news/mobileye-wins-prestigious-2020-pace-award-for-rem-mapping-tech/\" target=\"_blank\" rel=\"noopener\">REM&trade;\u003C/a>-powered Mobileye Roadbook&trade; AV maps, \u003Ca href=\"https://www.mobileye.com/blog/av-safety-demands-true-redundancy/\" target=\"_blank\" rel=\"noopener noreferrer\">True Redundancy&trade;\u003C/a> sensing systems, and \u003Ca href=\"https://www.mobileye.com/blog/responsibility-sensitive-safety-gains-traction-worldwide/\" target=\"_blank\" rel=\"noopener noreferrer\">Responsibility-Sensitive Safety\u003C/a> (RSS) framework.\u003C/p>\n\u003Cp>\u003Cstrong>MaaS Layer 2: Self-Driving Vehicles\u003C/strong>\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/22232b514b108d42354a20ffad16f317_1612261992976.png\" alt=\"Fully integrated self-driving vehicles will come in many shapes and sizes, incorporating various technologies developed by Mobileye\" />\u003C/p>\n\u003Cp>Our self-driving system, like all Mobileye technologies, is built to be easily and seamlessly integrated into a variety of vehicles for the transportation of people as well as goods. Mobileye is partnering with automotive OEMs to develop and produce multiple, fully driverless, Level-4 vehicle platforms, suitable for anything from a self-driving car to an autonomous shuttle bus. 
These ready-for-deployment vehicles can be provided to private and public transportation operators.\u003C/p>\n\u003Cp>\u003Cstrong>MaaS Layer 3: Fleet &amp; Tele-Operations\u003C/strong>\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/1237500967558ed44738f744744401ea_1612262008370.png\" alt=\"Mobileye and Moovit offer tele-operation capabilities to enable the smooth integration of self-driving vehicles on today&rsquo;s roadways\" />\u003C/p>\n\u003Cp>Our fleet management services provide fleet operators with the tools they need to make informed decisions in keeping their vehicles charged, maintained, and clean. The \u003Ca href=\"https://www.eetasia.com/why-autonomous-vehicles-will-need-teleoperation/\" target=\"_blank\" rel=\"noopener noreferrer\">tele-operation\u003C/a> application and functionality allows for decision support should human operators be required to assist with rules, routing, or semantic decisions, or to approve specific maneuvers.\u003C/p>\n\u003Cp>\u003Cstrong>MaaS Layer 4: Mobility Intelligence Platform &amp; Services\u003C/strong>\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/010fbd46a94df2486fb647d2c7aeebdb_1612262027036.png\" alt=\"Mobileye and Moovit offer a full stack of mobility services to deploy self-driving vehicles where they&rsquo;re needed, when they&rsquo;re needed\" />\u003C/p>\n\u003Cp>With \u003Ca href=\"https://www.mobileye.com/news/welcoming-moovit-to-the-fold/\" target=\"_blank\" rel=\"noopener\">Moovit\u003C/a>&rsquo;s Urban Mobility Analytics, the self-driving vehicles can be deployed precisely \u003Ca href=\"https://moovit.com/blog/answering-the-big-microtransit-questions-how-where-and-why-on-demand-works/\" target=\"_blank\" rel=\"noopener noreferrer\">where they&rsquo;re needed, when they&rsquo;re needed\u003C/a>. 
Based on predicted and real-time passenger demand, this crucial layer of mobility intelligence allows our service to be organically woven into the existing fabric of a city&rsquo;s public transportation network, \u003Ca href=\"https://www.weforum.org/agenda/2020/01/will-robo-taxis-bring-radical-disruption-to-our-streets-or-gridlock/\" target=\"_blank\" rel=\"noopener noreferrer\">reducing (instead of adding to) traffic congestion\u003C/a>. In addition, the mobility platform includes a wide range of services from booking, payment, and ticketing to routing and even cloud infrastructure.\u003C/p>\n\u003Cp>\u003Cstrong>MaaS Layer 5: Rider Experience &amp; Services\u003C/strong>\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/d3159d6882666f2ec5ba52b3b0f9d1af_1612262042319.png\" alt=\"Moovit&rsquo;s end-user app will enable passengers to order up a Mobileye self-driving robotaxi on demand\" />\u003C/p>\n\u003Cp>Moovit&rsquo;s multi-modal urban mobility end-user apps for web, Android, and iOS offer convenient and seamless user interaction. 
The solution is designed to enable best-in-class user experience, inform passengers, and generate confidence and trust in the safety and convenience of the self-driving services.\u003C/p>\n\u003Cp>Combining the strengths of Mobileye&rsquo;s cutting-edge self-driving technologies with \u003Ca href=\"https://www.mobileye.com/news/moovit-2020-global-public-transport-report/\" target=\"_blank\" rel=\"noopener\">Moovit\u003C/a>&rsquo;s proven mobility services yields a wide variety of engagement options and a complete end-to-end solution, already embraced by local partners around the world: \u003Ca href=\"https://www.mobileye.com/blog/mobileye-hosts-its-first-investor-summit-since-the-intel-acquisition/\" target=\"_blank\" rel=\"noopener noreferrer\">France\u003C/a>, the \u003Ca href=\"https://www.mobileye.com/news/mobileye-is-bringing-driverless-maas-to-the-uae/\" target=\"_blank\" rel=\"noopener\">United Arab Emirates\u003C/a>, \u003Ca href=\"https://www.mobileye.com/news/mobileyes-global-ambitions-take-shape-new-deals-china-south-korea/\" target=\"_blank\" rel=\"noopener noreferrer\">South Korea\u003C/a>, and \u003Ca href=\"https://www.mobileye.com/blog/mobileye-to-deploy-robotaxis-in-japan-with-willer/\" target=\"_blank\" rel=\"noopener noreferrer\">Japan\u003C/a> (with more partnerships soon to follow).\u003C/p>\n\u003Cp>For more information, \u003Ca href=\"https://www.mobileye.com/solutions/drive/\" target=\"_blank\" rel=\"noopener\">visit the Mobileye Self-Driving Mobility Services page\u003C/a>.\u003C/p>","2021-02-02T08:00:00.000Z","Driverless MaaS",{"id":1899,"type":24,"url":1900,"title":1901,"description":1902,"primary_tag":40,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1902,"image":1903,"img_alt":1904,"content":1905,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1906,"tags":1878},86,"moovit-2020-global-public-transport-report","Moovit Releases 2020 State of 
Public Transport Report","The 2020 Global Public Transport Report released by our affiliate Moovit sets the baseline for self-driving mobility. \n","https://static.mobileye.com/website/us/corporate/images/6e7feedf137553cbc48ccdd374e25210_1611747997915.jpg","Moovit 2020 Global Public Transport Report","\u003Cp>There&rsquo;s a lot to be said for public transportation and its ability to move people around cities &ndash; but there&rsquo;s a lot of room for improvement there, too. That&rsquo;s the goal of self-driving Mobility-as-a-Service (MaaS). But before \u003Ca href=\"https://www.mobileye.com/blog/how-robotaxis-will-lead-the-way-toward-the-fully-autonomous-future/\" target=\"_blank\" rel=\"noopener noreferrer\">robotaxis\u003C/a> can be deployed en masse, we first need to understand how people commute &ndash; especially via public transportation. Those are precisely the kinds of insights you'll find in the \u003Ca href=\"https://moovitapp.com/insights/en/Moovit_Insights_Public_Transit_Index-countries\" target=\"_blank\" rel=\"noopener noreferrer\">2020 Global Public Transport Report\u003C/a> recently released by Moovit.\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/news/welcoming-moovit-to-the-fold/\" target=\"_blank\" rel=\"noopener\">Moovit\u003C/a>, an Intel company, is a leading provider of MaaS solutions. Its flagship app allows users to plan and pay for trips using a variety of transportation modes &ndash; including local bike services, ride hailing, and public transit. This in turn has also made Moovit the world&rsquo;s largest repository of transit data.\u003C/p>\n\u003Cp>In this report, Moovit uses the data it has collected from tens of millions of trip requests to analyze the state of public transportation in 2020. The topics they cover include commute times, wait times, micro-mobility use, and barriers to micro-mobility.\u003C/p>\n\u003Cp>The report also allows users to compare the 2020 statistics to those of 2019. 
Many of the differences reflect the effect of COVID-19 on the use of public transportation, covered in two sections devoted specifically to the pandemic&rsquo;s effects.\u003C/p>\n\u003Cp>The partnership between Mobileye with its prowess in self-driving technology and Moovit with its ability to collect and analyze this data brings the goal of efficient \u003Ca href=\"https://www.mobileye.com/solutions/drive/\" target=\"_blank\" rel=\"noopener\">self-driving MaaS\u003C/a> closer than ever, helping turn the promise of smart mobility into reality.\u003C/p>\n\u003Cp>\u003Ca href=\"https://moovitapp.com/insights/en/Moovit_Insights_Public_Transit_Index-countries\" target=\"_blank\" rel=\"noopener noreferrer\">Click here to read the full report\u003C/a>.\u003C/p>","2021-01-27T08:00:00.000Z",{"id":1908,"type":24,"url":1909,"title":1910,"description":1911,"primary_tag":397,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1911,"image":1912,"img_alt":1913,"content":1914,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1915,"tags":402},85,"mobileye-ces-2021-in-the-news","Mobileye Had a 'Big Week' at CES","Here’s what leading news outlets had to say about our revelations at this year’s show.\n","https://static.mobileye.com/website/us/corporate/images/448864fd678f8f8a474dc13e63d2904c_1611571080172.jpg","Mobileye in the news from CES 2021","\u003Cp>Our chief executive \u003Ca href=\"https://www.mobileye.com/blog/mobileye-ces-2021-recap/\" rel=\"noopener noreferrer\" target=\"_blank\">spoke extensively during CES\u003C/a> this year, holding multiple \u003Ca href=\"https://www.mobileye.com/blog/ceo-amnon-shashua-on-the-technological-megashifts-impacting-our-world/\" rel=\"noopener noreferrer\" target=\"_blank\">online sessions\u003C/a> to showcase the latest technological developments we’re working on at Mobileye. 
Several leading publications picked up on the news; here’s what they had to say about Mobileye’s revelations at this year's virtual tech expo.\u003C/p>\u003Cp>\u003Cspan style=\"color: black;\">“Aspirations to place self-driving technology in the hands of ordinary vehicle owners had largely been relegated to the back burner,” \u003C/span>\u003Ca href=\"https://www.autonews.com/ces/mobileye-intends-put-self-driving-tech-consumers-hands-2025\" rel=\"noopener noreferrer\" target=\"_blank\" style=\"color: black;\">\u003Cem>Automotive News\u003C/em>\u003C/a>\u003Cspan style=\"color: black;\"> noted. “That's changing. Global supplier Mobileye unwrapped plans last week to make self-driving technology available in personally owned vehicles in 2025.” \u003C/span>\u003C/p>\u003Cp>\u003Cspan style=\"color: black;\">“A lecture by Mobileye CEO Amnon Shashua on Tuesday at CES 2021 was one of the more ‘believable presentations’ ever given on the topic of AVs,” reported \u003C/span>\u003Ca href=\"https://www.eetimes.com/why-consumer-av-in-2025-is-believable/\" rel=\"noopener noreferrer\" target=\"_blank\">\u003Cem>EE Times\u003C/em>\u003C/a>\u003Cspan style=\"color: black;\">, citing analyst Egil Juliussen.\u003C/span>\u003C/p>\u003Cp>\u003Cspan style=\"color: black;\">“Mobileye, a subsidiary of Intel, is scaling up its autonomous vehicle program and plans to launch test fleets in at least four more cities over the next several months,” \u003C/span>\u003Ca href=\"https://techcrunch.com/2021/01/11/mobileye-is-bringing-its-autonomous-vehicle-test-fleets-to-at-least-four-more-cities-in-2021/\" rel=\"noopener noreferrer\" target=\"_blank\" style=\"color: black;\">\u003Cem>TechCrunch\u003C/em>\u003C/a>\u003Cem style=\"color: black;\"> \u003C/em>\u003Cspan style=\"color: black;\">reported. 
“The expansion announcement, along with details about a new lidar System on Chip product that is under development and will come to market in 2025, illustrates Mobileye’s ambitions to commercialize automated vehicle technology and bring it to the masses.”\u003C/span>\u003C/p>\u003Cp>\u003Cspan style=\"color: black;\">“As part of Intel corporation, Mobileye can tap into manufacturing resources few can match, not only to produce the chips on schedule but at scale,” noted \u003C/span>\u003Ca href=\"https://www.wired.com/story/mobileye-lidar-on-a-chip-intel/\" rel=\"noopener noreferrer\" target=\"_blank\" style=\"color: black;\">\u003Cem>Wired\u003C/em>\u003C/a>\u003Cspan style=\"color: black;\">. “More importantly, the lidar SoC is emblematic of Intel’s way forward: looking beyond the CPU.\"\u003C/span>\u003C/p>\u003Cp>“Mobileye aims to advance multiple segments of the autonomous vehicle market and bring AVs closer to mass adoption,” \u003Ca href=\"https://www.forbes.com/sites/marcochiappetta/2021/01/12/intels-mobileye-achieves-lidar-breakthrough-with-new-photonic-integrated-circuit/?sh=21cecf683e90\" rel=\"noopener noreferrer\" target=\"_blank\">\u003Cem>Forbes\u003C/em>\u003C/a>\u003Cem> \u003C/em>reported. “The strategies and technologies revealed during CES 2021 seem poised to do just that.”\u003C/p>\u003Cp>It's been a “pretty big week for you so far at CES,” \u003Ca href=\"https://finance.yahoo.com/video/mobileye-ceo-lidar-chip-autonomous-150131831.html\" rel=\"noopener noreferrer\" target=\"_blank\">\u003Cem>Yahoo! Finance Live\u003C/em>\u003C/a> host Brian Sozzi said in a video interview with Professor Shashua. 
Sozzi closed by saying “count me as one that is excited to take an Instagram story as a driverless car goes right by me, maybe at 65 miles an hour.”\u003C/p>\u003Cp>\u003Cspan style=\"color: black;\">“If Mobileye's plans for 2021 are anything to go by,” concluded \u003C/span>\u003Ca href=\"https://www.cnet.com/roadshow/news/mobileye-ces-2021-radar-lidar-mapping/\" rel=\"noopener noreferrer\" target=\"_blank\" style=\"color: black;\">\u003Cem>CNET\u003C/em>\u003C/a>\u003Cspan style=\"color: black;\">, “it's probably going to be a big year for self-driving car development and we're excited to see it.”\u003C/span>\u003C/p>\u003Cp>\u003Cspan style=\"color: black;\">Watch the highlights and full replays from CES below, and \u003C/span>\u003Ca href=\"https://www.mobileye.com/news/ces-2021-mobileye-avs-on-move/\" rel=\"noopener noreferrer\" target=\"_blank\" style=\"color: black;\">read the news release\u003C/a>\u003Cspan style=\"color: black;\"> for more.\u003C/span>\u003C/p>\u003Cp>\u003Cbr>\u003C/p>\u003Ciframe class=\"ql-video\" frameborder=\"0\" allowfullscreen=\"true\" src=\"https://www.youtube.com/embed/videoseries?list=PLWCfS_Yhbvs40PTH9__2xtzvRApXa4axr\" height=\"315\" width=\"560\">\u003C/iframe>\u003Cp>\u003Cbr>\u003C/p>","2021-01-24T22:00:00.000Z",{"id":1917,"type":5,"url":1918,"title":1919,"description":1920,"primary_tag":32,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1920,"image":1921,"img_alt":1922,"content":1923,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1924,"tags":1566},83,"mobileye-ces-2021-recap","Hear What Our CEO Had to Say at CES 2021","This year’s virtual show saw Prof. 
Shashua share groundbreaking updates on Mobileye’s automated driving technologies – including our crowdsourced AV mapping and active sensor development.","https://static.mobileye.com/website/us/corporate/images/d5f42e5414969cfa455b9b3045822334_1610537601061.jpg","Mobileye CEO Prof. Amnon Shashua shows our upcoming LiDAR SoC during \"Under the Hood\" at CES 2021","\u003Cp>This year&rsquo;s \u003Ca href=\"https://www.mobileye.com/news/mobileye-ces-2021/\" target=\"_blank\" rel=\"noopener\">CES took place in a virtual online format\u003C/a>, including several events with our chief executive, \u003Ca href=\"https://www.mobileye.com/blog/tag/amnon-shashua/\" target=\"_blank\" rel=\"noopener\">Prof. Amnon Shashua\u003C/a>, who shared \u003Ca href=\"https://www.mobileye.com/news/ces-2021-mobileye-avs-on-move/\" target=\"_blank\" rel=\"noopener noreferrer\">updates on the groundbreaking progress\u003C/a> made on several fronts of Mobileye&rsquo;s automated driving technologies.\u003C/p>\n\u003Cp>As in years past, Mobileye&rsquo;s presence revolved principally around &ldquo;Under the Hood,&rdquo; Prof. Shashua&rsquo;s annual hour-long press conference, which you can watch in full right here.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/B7YNj66GxRA\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>The main event was previewed during &ldquo;It&rsquo;s Time to Go,&rdquo; a half-hour conversation between Shashua and Ed Niedermeyer, Director of Communications at PAVE. (Click \u003Ca href=\"https://youtu.be/MVU0tG1yJTU\" target=\"_blank\" rel=\"noopener noreferrer\">here\u003C/a> to watch the highlights, or \u003Ca href=\"https://youtu.be/N5pJMMO6i4w\" target=\"_blank\" rel=\"noopener noreferrer\">here\u003C/a> for the full recording). 
These will be followed up today by &ldquo;\u003Ca href=\"https://twitter.com/Mobileye/status/1349082966031560705\" target=\"_blank\" rel=\"noopener noreferrer\">Technological Megashifts Impacting our World\u003C/a>,&rdquo; Prof. Shashua&rsquo;s interview with \u003Cem>New York Times\u003C/em> columnist Thomas Friedman on the future of artificial intelligence.\u003C/p>\n\u003Cp>The primary \u003Ca href=\"https://www.linkedin.com/feed/update/urn:li:activity:6754849064845639680\" target=\"_blank\" rel=\"noopener noreferrer\">focus this year was on REM&trade;\u003C/a> &ndash; Mobileye&rsquo;s innovative, disruptive method for crowd-sourced mapping of the world&rsquo;s roadways to support autonomous vehicles. Thanks to the universality of REM&rsquo;s approach, we&rsquo;re expanding testing of our camera-only test AV from Jerusalem and Munich to several more cities in the coming months, including Detroit, Tokyo, Shanghai, Paris, and (pending regulation) New York.\u003C/p>\n\u003Cp>Alongside REM, Shashua also detailed the development of \u003Ca href=\"https://static.mobileye.com/website/corporate/media/radar-lidar-fact-sheet.pdf\" target=\"_blank\" rel=\"noopener noreferrer\">our own radar and LiDAR sensing systems\u003C/a>, which utilize our parent company Intel&rsquo;s expertise. This new take on radar and LiDAR, designed specifically for self-driving applications, is meant to address the limitations in environmental-modeling capability that both sensor types must overcome to support AVs. 
The developments, which include a new LiDAR SoC, will open the door to more accurate and cheaper ways to bring the autonomous future to the masses.\u003C/p>\n\u003Cp>Shashua also provided updates on our camera-only AV subsystem, \u003Ca href=\"https://www.mobileye.com/solutions/super-vision/\" target=\"_blank\" rel=\"noopener noreferrer\">latest advanced driver-assistance system\u003C/a>, \u003Ca href=\"https://www.mobileye.com/technology/responsibility-sensitive-safety/\" target=\"_blank\" rel=\"noopener noreferrer\">Responsibility-Sensitive Safety\u003C/a> framework, and \u003Ca href=\"https://www.mobileye.com/solutions/drive/\" target=\"_blank\" rel=\"noopener\">driverless Mobility-as-a-Service\u003C/a> solutions.\u003C/p>\n\u003Cp>Watch the highlight reels and the full recordings of all the sessions (and more) in the playlist below.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/videoseries?list=PLWCfS_Yhbvs40PTH9__2xtzvRApXa4axr\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>","2021-01-13T08:00:00.000Z",{"id":1926,"type":5,"url":1927,"title":1928,"description":1929,"primary_tag":32,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1929,"image":1930,"img_alt":1931,"content":1932,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1924,"tags":1933},84,"ceo-amnon-shashua-on-the-technological-megashifts-impacting-our-world","Shashua on the Technological Megashifts Impacting our World","In his final session of CES 2021, Prof. 
Amnon Shashua sat down virtually with prizewinning journalist Thomas Friedman to discuss emerging developments in the field of artificial intelligence.","https://static.mobileye.com/website/us/corporate/images/eea87ba7ec584dc961c4111e6acfe706_1610558687074.jpg","Amnon Shashua and Thomas Friedman","\u003Cp>Artificial intelligence is fundamental to the development of autonomous vehicles, and is the life&rsquo;s work of our CEO, Amnon Shashua. As a world-renowned expert in the field of artificial intelligence, he was recently \u003Ca href=\"https://www.mobileye.com/news/prof-amnon-shashua-wins-the-dan-david-prize/\" target=\"_blank\" rel=\"noopener\">awarded the Dan David Prize\u003C/a> for his groundbreaking research in the field, he lectures on the subject at the Hebrew University of Jerusalem, and he has founded several companies (in addition to Mobileye!) developing \u003Ca href=\"https://www.mobileye.com/blog/were-in-the-midst-of-an-ai-revolution-says-shashua/\" target=\"_blank\" rel=\"noopener noreferrer\">real-world applications for AI\u003C/a>. In his final session at CES this year, &ldquo;Technological Megashifts Impacting our World,&rdquo;\u003Cspan style=\"color: black;\"> Shashua \"sat down\" &ndash; remotely &ndash; with Pulitzer Prize-winning columnist and author Thomas Friedman to peer into the future of artificial intelligence &ndash; how it is changing the world and affecting our daily lives.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: black;\">The two minds covered a surprisingly wide range of AI-related topics. 
Over the course of half an hour, they tackled issues including shared values between man and machine, the imminent arrival of \"General AI,\" the value of brute-force computational power, the next frontier of natural language processing, the dangers posed by AI, privacy issues, and more.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: black;\">Among the most relevant applications to Mobileye was how we are harnessing the power of AI &ndash; not only with our computer-vision algorithms, but for our Responsibility-Sensitive Safety framework. Shashua shared that questions \u003C/span>\u003Cspan style=\"color: #000000;\">&ndash; \u003C/span>\u003Cspan style=\"color: black;\">like: &ldquo;How do you go ahead and translate &lsquo;be careful&rsquo; into code? Into a mathematical formulism?&rdquo; \u003C/span>\u003Cspan style=\"color: #000000;\">&ndash; \u003C/span>\u003Cspan style=\"color: black;\">were what Mobileye and Intel researchers were asking themselves as they developed the concepts behind the \u003C/span>\u003Ca href=\"https://arxiv.org/pdf/1708.06374.pdf\" target=\"_blank\" rel=\"noopener noreferrer\">first paper on RSS\u003C/a>\u003Cspan style=\"color: black;\">, three years ago. \u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: black;\">You can read more about RSS \u003C/span>\u003Ca href=\"https://www.mobileye.com/technology/responsibility-sensitive-safety/\" target=\"_blank\" rel=\"noopener noreferrer\">here\u003C/a>\u003Cspan style=\"color: black;\">. 
Be sure to watch the full session in the video below.\u003C/span>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/fDiivbomPHA\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>","Events, Opinion, Video, From our CEO",{"id":1935,"type":24,"url":1936,"title":1937,"description":1938,"primary_tag":397,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1938,"image":1939,"img_alt":1938,"content":1940,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":16,"publish_date":1941,"tags":1942},159,"ces-2021-under-the-hood-with-prof-amnon-shashua","CES 2021: Under the Hood with Prof. Amnon Shashua (Replay & Live Blog)","Prof. Amnon Shashua takes a deeper and more technical dive into the company’s latest progress on automated driving technologies.","https://static.mobileye.com/website/us/corporate/images/ca855b5d928c09e8c4c089b57aada668_1666086081120.png","\u003Cp>\u003Cem>This news content was originally published on the Intel Corporation Newsroom.\u003C/em>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>On the heels of&nbsp;\u003Ca href=\"https://www.mobileye.com/news/ces-2021-under-the-hood-with-prof-amnon-shashua/\" target=\"_blank\" rel=\"noopener noreferrer\">Monday&rsquo;s Intel news conference\u003C/a>, Prof. Amnon Shashua takes a deeper and more technical dive into the company&rsquo;s latest progress on automated driving technologies.\u003C/p>\n\u003Cp>He gets under the hood of Mobileye Roadbook&trade; and how Mobileye is building a scalable, autonomous vehicle map that differs greatly from traditional high-definition maps, from how the data is harvested to the state-of-the-art algorithms that automate the map creation. 
He also discusses the company&rsquo;s lidar ambitions, its RSS progress and the recent announcement of SuperVision&trade;, Mobileye&rsquo;s most advanced hands-free driving system.\u003C/p>\n\u003Cul>\n\u003Cli>\u003Ca href=\"https://static.mobileye.com/website/common/files/Under-the-hood-deck-compressed.pdf\" target=\"_blank\" rel=\"noopener noreferrer\">Under the Hood Presentation Slides\u003C/a>\u003C/li>\n\u003C/ul>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>Live Blog:\u003C/strong> Follow along below for real-time updates of this virtual event.\u003C/p>\n\u003Cp>10:00 a.m.: Hello and welcome! This is Jeremy Schultz, communications manager at Intel, and thank you for tuning in for Prof. Amnon Shashua&rsquo;s one-hour master class in Mobileye&rsquo;s unique and fascinating approach to bringing autonomous vehicles (AVs) to the world.\u003C/p>\n\u003Cp>In person or virtual, Amnon&rsquo;s CES presentation has been one of my favorite hours at the big show the last few years running. Let&rsquo;s ride into 2021!\u003C/p>\n\u003Cp>Coming to us from Mobileye HQ in Jerusalem, it&rsquo;s Mobileye CEO and Intel Senior VP Amnon Shashua taking the podium.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/cee31e735ed735fe2a628e5d6f4e6161_1663144644715.jpg\" alt=\"Mobileye CEO prof. 
Amnon Shashua\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>Amnon shared some big news during yesterday&rsquo;s news conference, and today Amnon promises &ldquo;a topic or two to deep dive.&rdquo;\u003C/p>\n\u003Cp>But first a quick business update.\u003C/p>\n\u003Cp>\u003Cstrong>10:01 a.m.\u003C/strong>: Mobileye&rsquo;s business is based on three intermingling pillars:\u003C/p>\n\u003Col>\n\u003Cli>\u003Cstrong>Driving assist\u003C/strong>, &ldquo;which goes from a simple front-facing camera up to a 360, multi-camera setting with very advanced functions&rdquo;\u003C/li>\n\u003Cli>&ldquo;\u003Cstrong>The data\u003C/strong> we collect from crowdsourcing&rdquo; &mdash; now millions of miles per day &shy;&mdash; powering not only &ldquo;our maps but also creating a new data business&rdquo;\u003C/li>\n\u003Cli>\u003Cstrong>The full-stack self-driving system\u003C/strong>: computing, algorithms, perception, driving policy, safety, hardware on up to mobility as a service.\u003C/li>\n\u003C/ol>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/daa37f354e473d33ebe69f52d6e50aef_1663144671602.png\" alt=\"Mobileye business pillars\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Cp>\u003Cstrong>10:02 a.m.\u003C/strong>: &lsquo;Twas a stop-and-go 2020: &ldquo;we finished 2020 with 10% year on year growth in shipping EyeQ chips,&rdquo; Amnon says, &ldquo;very impressive in my mind, given that three months we had a shutdown of auto production facilities.&rdquo;\u003C/p>\n\u003Cp>Mobileye earned 37 new design wins &mdash; to account for an additional 36 million lifetime units &mdash; joining 49 running production programs.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/626e887ce6aa402d381bb50f3cb81e80_1663144721378.png\" alt=\"Mobileye ADAS business in 2020\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Cp>\u003Cstrong>10:04 
a.m.\u003C/strong>:&nbsp;Mobileye&rsquo;s product portfolio has expanded, Amnon explains, including: EyeQ chips and associated software; a new domain controller; a full high-end driving assist board (PCB); the full-stack self-driving system with hardware and sensors; and finally, with Moovit, which Mobileye acquired in 2020, &ldquo;the customer-facing portion of mobility as a service.&rdquo;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/0cab50ed5d04c74f7dfaef82f75722c8_1663144776733.png\" alt=\"Mobileye product portfolio\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Cp>\u003Cstrong>10:05 a.m.\u003C/strong>: &ldquo;This slide, I think is one of the most critical slides in this presentation.&rdquo;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/0fe28b9bab48a1ced00dc0bd18b9047d_1663144839064.png\" alt=\"The trinity of Mobileye's approach\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Cp>&ldquo;We call this the trinity of our approach, the three components that are very unique to how Mobileye sees this world.&rdquo; Ooh, love a clever double meaning.\u003C/p>\n\u003Cp>Firstly, Mobileye doesn&rsquo;t see the difference between driving assist and autonomous driving in terms of capability or performance, but rather \u003Ca href=\"https://en.wikipedia.org/wiki/Mean_time_between_failures\" target=\"_blank\" rel=\"noopener noreferrer\">mean time between failures\u003C/a>.\u003C/p>\n\u003Cp>The same system can serve both functions, but &ldquo;if you remove the driver from the experience, the mean time between failure should be astronomically higher than the mean time between failure when a driver is there.&rdquo;\u003C/p>\n\u003Cp>\u003Cstrong>10:06 a.m.\u003C/strong>: &ldquo;In order to reach those astronomical levels of mean time between failure, we build redundancies.&rdquo;\u003C/p>\n\u003Cp>This is achieved by having separate camera and lidar-radar subsystems &mdash; 
each capable of end-to-end autonomous driving alone, and each with its own internal redundancies. (Imagine if you could run as many alternative and parallel algorithms on the raw sensor input from your eyes as you wanted, like a \u003Ca href=\"https://en.wikipedia.org/wiki/Predator_(fictional_species)\" target=\"_blank\" rel=\"noopener noreferrer\">Predator\u003C/a> - Mobileye&rsquo;s computers can and do.)\u003C/p>\n\u003Cp>\u003Cstrong>10:07 a.m.\u003C/strong>: Trinity component two: &ldquo;the high-definition maps, which we call AV maps.&rdquo; While other companies drive specialized vehicles to collect data, Mobileye crowdsources data from current cars on the road. &ldquo;It is a very, very difficult task,&rdquo; Amnon says.\u003C/p>\n\u003Cp>&ldquo;Today we passed a threshold in which all that development is becoming very useful to our business.&rdquo; Mark that note.\u003C/p>\n\u003Cp>\u003Cstrong>10:08 a.m.\u003C/strong>: Trinity component three: safety.\u003C/p>\n\u003Cp>The car&rsquo;s &ldquo;driving policy&rdquo; governs its decisions. &ldquo;How do we define mathematically what it means to drive carefully?&rdquo; Mobileye has an answer in \u003Ca href=\"https://www.mobileye.com/technology/responsibility-sensitive-safety/\" target=\"_blank\" rel=\"noopener\">RSS\u003C/a>, and &ldquo;we&rsquo;re evangelizing it through regulatory bodies, industry players around the world with quite great success.&rdquo;\u003C/p>\n\u003Cp>\u003Cstrong>10:09 a.m.\u003C/strong>: So, how do you build an AV with nothing but a few common 8-megapixel cameras? 
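That redundancy point from a few minutes ago is worth pausing on. Here's a back-of-the-envelope sketch of why two truly independent subsystems make the combined mean time between failures "astronomical" — all numbers below are illustrative assumptions of mine, not Mobileye figures:

```python
# Back-of-the-envelope: why 1-out-of-2 redundancy multiplies MTBF.
# Every number here is an illustrative assumption, not a Mobileye figure.

def combined_mtbf(mtbf_a_hours: float, mtbf_b_hours: float,
                  exposure_hours: float) -> float:
    """Approximate MTBF of a redundant pair of independent subsystems.

    The pair fails only when one subsystem fails while the other is
    already down within the same exposure window, giving the standard
    approximation MTBF_a * MTBF_b / (2 * exposure).
    """
    return (mtbf_a_hours * mtbf_b_hours) / (2 * exposure_hours)

# Suppose each subsystem alone fails once per 10,000 driving hours, and
# a fault is detected and handled within 0.01 hours (36 seconds):
each = 10_000.0
pair = combined_mtbf(each, each, 0.01)
print(f"{pair:,.0f} hours")  # ~5 billion hours
```

The key assumption is independence — which is exactly why the two subsystems use disjoint sensors (cameras vs. lidar-radar) rather than fusing everything into one stack.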
(Tiny camera trivia: the first iPhone camera to reach 8MP was way back in 2011 with \u003Ca href=\"https://en.wikipedia.org/wiki/IPhone_4S\" target=\"_blank\" rel=\"noopener noreferrer\">the 4S\u003C/a>.)\u003C/p>\n\u003Cp>A rare look inside.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/160b623a52fc2831f7eda714140fb86a_1663144891955.png\" alt=\"The camera-only subsystem\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Cp>Start with two EyeQ5 chips and add seven long range cameras and four parking cameras, all with varying fields of view. This &ldquo;gives us both long-range and short-range vision perception, no radars, no lidars.&rdquo;\u003C/p>\n\u003Cp>In our first rides of the hour, Amnon shows clips of autonomous testing in Jerusalem, Munich and Detroit. Mobileye&rsquo;s in the motor city!\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/d535f842efc38d04edf258fa51813b46_1663144932989.png\" alt=\"Mobileye car driving in Detroit\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Cp>&ldquo;What you see here is complex driving in deep urban settings, and it&rsquo;s all done by the camera subsystem.&rdquo;\u003C/p>\n\u003Cp>\u003Cstrong>10:10 a.m.\u003C/strong>: Mobileye calls this multi-camera system &ldquo;vidar&rdquo; for visual lidar. 
It builds an instantaneous 3D map and then uses lidar-like algorithms to detect road users.\u003C/p>\n\u003Cp>Mobileye then duplicates this stream to perform different algorithmic functions in parallel &mdash; an example of &ldquo;internal redundancies&rdquo; in the system.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/060b45a554cd553257a0fc5d6fff7344_1663144974768.png\" alt=\"Achieving the MTBF goal for L2+ with Vidar\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Cp>\u003Cstrong>10:12 a.m.\u003C/strong>: This camera-only self-driving system isn&rsquo;t just going into future AVs &mdash; Mobileye is making it available now for driving assist as SuperVision. &ldquo;It can do much more than simple lane-keeping assist.&rdquo;\u003C/p>\n\u003Cp>&ldquo;The first productization is going to be with Geely,&rdquo; Amnon says, \u003Ca href=\"https://www.mobileye.com/news/mobileye-av-stack/\" target=\"_blank\" rel=\"noopener noreferrer\">launching in Q4\u003C/a> of 2021. &ldquo;We&rsquo;re not talking about something really futuristic &mdash; it&rsquo;s really around the corner.&rdquo;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/d4402520f0fd20dac3ed75c2557676b0_1663145014751.png\" alt=\"Mobileye SuperVision\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Cp>\u003Cstrong>10:14 a.m.\u003C/strong>: When it comes to testing and expanding to new countries, &ldquo;The pandemic actually allowed us to be much more efficient.&rdquo; Wuzzat?\u003C/p>\n\u003Cp>In Munich, two non-engineer field support employees got things up and running with remote help in just a couple weeks (normally performed by a couple dozen engineers). 
&ldquo;It gave us a lot of confidence that we can scale much, much faster.&rdquo;\u003C/p>\n\u003Cp>Mobileye customers have since taken more than 300 test drives (well, rides) in Munich, and now Mobileye is expanding testing to Detroit, Tokyo, Shanghai, Paris and as soon as local regulations allow it, New York City.\u003C/p>\n\u003Cp>&ldquo;New York City is a very, very interesting geography, driving culture, complexity to test,&rdquo; Amnon says. &ldquo;We want to test in more difficult places.&rdquo;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/8cf1c544ab48b6bbb29a32f55b2045a3_1663145055278.png\" alt=\"Expanding footprint of SuperVision\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Cp>\u003Cstrong>10:16 a.m.\u003C/strong>: Sub-system two uses only lidar and radar.\u003C/p>\n\u003Cp>While most other AV companies rely on the combination of cameras and lidar, &ldquo;we excluded cameras from this subsystem.&rdquo; This &ldquo;makes life a bit more difficult.&rdquo;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/7c84a8c36e6bec2038a810bb64ee5e16_1663145103724.png\" alt=\"Achieving the MTBF goal for L4\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Cp>But the result is that much higher mean time between failures needed to remove the driver. &ldquo;Everything is done at the same performance level as the camera subsystem that we have been showing, and here there are no cameras at all.&rdquo;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/3fc5c15572e5e5cdc39a1572eb588e00_1663145148216.png\" alt=\"Different cameras\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Cp>\u003Cstrong>10:18 a.m.\u003C/strong>: Time for the first deep dive: maps.\u003C/p>\n\u003Cp>A quick history: in 2015, Mobileye announced crowdsourced mapping. 
In 2018, cars from BMW, Nissan and Volkswagen started sending data &mdash; not images but privacy-preserving &ldquo;snippets, lanes, landmarks&rdquo; that only add up to 10 kilobytes per kilometer.\u003C/p>\n\u003Cp>\u003Cstrong>10:20 a.m.\u003C/strong>: Why are these ultra-detailed maps needed? &ldquo;The current state of AI can detect road users at fidelity approaching human perception,&rdquo; Amnon says. But do that and understand the many complexities of the driving environment &shy;&mdash; lanes, crosswalks, signals, signs, curves, right-of-way, turns &mdash; &ldquo;right now it&rsquo;s not realistic.&rdquo;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/d3f1e17f96b497ef3402cab5e626fabb_1663145195495.png\" alt=\"The motivation behind high resolution maps\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Cp>The big challenge with building AV maps is scale. Covering a city or two, &ldquo;that&rsquo;s fine,&rdquo; but supporting millions of cars with driving assist means &ldquo;they need to drive everywhere.&rdquo;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/fd92c3fc56c573b8ade8467b2943aa6a_1663145240548.png\" alt=\"The challenges\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Cp>\u003Cstrong>10:22 a.m.\u003C/strong>: This will be on the test: Amnon predicts it&rsquo;ll be &ldquo;2025, in which a self-driving system can reach the performance and the cost level for consumer cars.&rdquo; Adjust your calendar accordingly.\u003C/p>\n\u003Cp>\u003Cstrong>10:24 a.m.\u003C/strong>: Maps also need to be fresh &mdash; today Mobileye&rsquo;s maps are updated monthly but the eventual target is &ldquo;a matter of minutes.&rdquo;\u003C/p>\n\u003Cp>And maps need &ldquo;centimeter-level accuracy.&rdquo;\u003C/p>\n\u003Cp>\u003Cstrong>10:25 a.m.\u003C/strong>: The problem with maps built by driving specialized cars with huge 360-degree lidar sensors and cameras? 
The precision is overkill and the semantics are absent.\u003C/p>\n\u003Cp>Semantics? &ldquo;We divide semantics into these five layers: drivable paths, lane priority, association between traffic light and crosswalks to lane association, stopping and yield points and common speed.&rdquo;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/1f39114a49f6037c288ce28ea52c89c5_1663145347532.png\" alt=\"What's wrong with the typical HD map approach? Part 1\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Cp>Common speed is how fast all the humans are going, &ldquo;in order to drive in a way that doesn&rsquo;t obstruct traffic.&rdquo; Or drive people crazy.\u003C/p>\n\u003Cp>\u003Cstrong>10:26 a.m.\u003C/strong>: &ldquo;Now these semantic layers are very, very difficult to automate. This is where the bulk of the non-scalability of building high-definition maps comes into.&rdquo;\u003C/p>\n\u003Cp>\u003Cstrong>10:27 a.m.\u003C/strong>: In short: &ldquo;it&rsquo;s a zoo out there.&rdquo;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/2e194248f4ee266296ff208c51a32c63_1663145377311.png\" alt=\"What's wrong with the typical HD map approach? 
Part 2\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Cp>\u003Cstrong>10:29 a.m.\u003C/strong>: And that&rsquo;s why you need not a &ldquo;high definition map,&rdquo; but rather an &ldquo;AV map&rdquo; &mdash; with local accuracy and detailed semantic features.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/e36b6350e3155f87aa39f1d6602b0d62_1663145536593.png\" alt=\"Mobileye approach: &quot;AV map&quot;, not &quot;HD map&quot;\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Cp>\u003Cstrong>10:30 a.m.\u003C/strong>: The steps to build an AV map begin with &ldquo;harvesting&rdquo; what Mobileye calls &ldquo;road segmented data.&rdquo; It&rsquo;s sent to the cloud for the &ldquo;automatic map creation that we have been working on for the past five years.&rdquo; (This is the milestone Amnon started the drumroll for earlier.)\u003C/p>\n\u003Cp>And finally there&rsquo;s localization: where is the car, right now, on the map?\u003C/p>\n\u003Cp>\u003Cstrong>10:31 a.m.\u003C/strong>: Amnon walks through how data is combined from hundreds or thousands of cars to identify drivable paths, signs and landmarks.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/ae46afc1c9b96634f14e61f52e6aa624_1663145578476.png\" alt=\"REM&trade; under the hood\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Cp>\u003Cstrong>10:33 a.m.\u003C/strong>: What if there are no lane markings? The crowdsourced data reveals details like multi-path unmarked lanes and critically, the road edge.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/ceee2d3723696bfeebe70885aed0bd01_1663145622304.png\" alt=\"Aligning drivers\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Cp>\u003Cstrong>10:34 a.m.\u003C/strong>: A big busy roundabout with several pedestrian crossings? 
It makes me sweat but with enough data, Mobileye gets &ldquo;very close to perfection,&rdquo; Amnon assures.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/630d365b077bc9e03090b57b8a8d0626_1663145666075.png\" alt=\"Modeling process\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Cp>\u003Cstrong>10:35 a.m.\u003C/strong>: Which light goes with which lane?! Still not a problem.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/57c77235c16640748776622423926bbc_1663145723701.png\" alt=\"Why crowd sourcing is perfect for semantic understanding? Part 1\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Cp>\u003Cstrong>10:37 a.m.\u003C/strong>: Amnon is cruising through a number of scenarios where expected driving behavior is set not by signs or markings but rather &ldquo;can be inferred from the crowd.&rdquo; It&rsquo;s accurate and my goodness, must&rsquo;ve saved years of Mobileye engineers&rsquo; time.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/4925d3a60fdc4030205904c028980591_1663145764306.png\" alt=\"Why crowd sourcing is perfect for semantic understanding? Part 2\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Cp>\u003Cstrong>10:38 a.m.\u003C/strong>: I&rsquo;ve joked that AVs should have passenger-chosen driving modes, ranging possibly from &ldquo;Driving Miss Daisy&rdquo; to &ldquo;Bullitt.&rdquo; What&rsquo;s great about Mobileye&rsquo;s maps, though, is they include how fast people normally go on every stretch of local road, &ldquo;very important in order to create a good and smooth driving experience.&rdquo;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/949af4a89270c165a593bebc5ba36781_1663145797108.png\" alt=\"Why crowd sourcing is perfect for semantic understanding? 
Part 3\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Cp>\u003Cstrong>10:39 a.m.\u003C/strong>: The numbers are bonkers: &ldquo;we have about eight million kilometers of road being sent every day.&rdquo;\u003C/p>\n\u003Cp>&ldquo;In 2024, it&rsquo;s going to be one billion kilometers of roads being sent daily, so we are really on our way to map the entire planet.&rdquo;\u003C/p>\n\u003Cp>\u003Cstrong>10:40 a.m.\u003C/strong>: Deep dive part 2! Radar and lidar.\u003C/p>\n\u003Cp>&ldquo;Why do we think that we need to get into development of radars and lidars? First let me explain that.&rdquo;\u003C/p>\n\u003Cp>For 2022, &ldquo;we are all set,&rdquo; Amnon says, to use Luminar lidar and &ldquo;stock radars&rdquo; that are sufficient for end-to-end driving with a high MTBF (speaking of MTB, Amnon does enjoy mountain bikes and motorcycles &mdash; he&rsquo;s a fan of wheels of all shapes and sizes).\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/41a0a8ff27a5284990a7308792de362b_1663145842583.png\" alt=\"The motivation behind LiDAR and Radar development\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Cp>\u003Cstrong>10:41 a.m.\u003C/strong>: For 2025, though, Mobileye wants lower costs &mdash; to reach &ldquo;this level of consumer AV&rdquo; &shy;&mdash; and more capability, &ldquo;closer to level 5&rdquo; (here&rsquo;s a handy graphic of the six levels). &ldquo;It&rsquo;s contradictory.&rdquo;\u003C/p>\n\u003Cp>The goal is not two-way redundancy but three, where radar and lidar can each function as standalone systems like cameras. But &ldquo;radar today doesn&rsquo;t have the resolution or the dynamic range.&rdquo;\u003C/p>\n\u003Cp>\u003Cstrong>10:42 a.m.\u003C/strong>: We believe radars need to evolve to &ldquo;imaging radar&rdquo; that could stand alone, Amnon says. &ldquo;This is very very bold thinking.&rdquo;\u003C/p>\n\u003Cp>\u003Cstrong>10:43 a.m.\u003C/strong>: Radar is 10 times cheaper than lidar. 
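Quick aside before we go deeper on sensors — those harvesting numbers from a moment ago are easy to sanity-check. At the roughly 10 kilobytes per kilometer quoted earlier, even planetary-scale harvesting is surprisingly light on bandwidth (a rough sketch using only the figures from the talk):

```python
# Rough bandwidth math for crowdsourced map "harvesting", using the
# figures quoted in the talk: ~10 KB per km, ~8 million km per day
# today, and a projected 1 billion km per day in 2024.

KB_PER_KM = 10

def daily_volume_gb(km_per_day: float) -> float:
    """Total daily upload volume in gigabytes (1 GB = 1e6 KB here)."""
    return km_per_day * KB_PER_KM / 1e6

today = daily_volume_gb(8e6)    # 8 million km/day
future = daily_volume_gb(1e9)   # 1 billion km/day
print(f"today: {today:,.0f} GB/day, 2024: {future:,.0f} GB/day")
# today: 80 GB/day, 2024: 10,000 GB/day (~10 TB)
```

Eighty gigabytes a day across an entire fleet — that's the payoff of sending "snippets, lanes, landmarks" instead of raw video.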
(Point of reference: the first iPhone with a lidar sensor is the 12 Pro just introduced in Q4 of 2020.)\u003C/p>\n\u003Cp>So, we want to drastically reduce the cost for lidars and &ldquo;push the envelope much further with radars.&rdquo; Conveniently, &ldquo;Intel has the know-how to build cutting-edge radars and lidars.&rdquo;\u003C/p>\n\u003Cp>\u003Cstrong>10:44 a.m.\u003C/strong>: Time for a quick seminar on radar. I studied engineering but it was still handy for me to dig up a reminder: radar and lidar are both used to identify objects. Radar uses radio waves to do so, lidar uses infrared light from lasers. Bats and dolphins use sound &mdash; that&rsquo;s sonar.\u003C/p>\n\u003Cp>What the cars use is &ldquo;software-defined imaging radar.&rdquo; The signal it receives from an object is not a set of hard points as from a camera or lidar but rather &ldquo;all over the place.&rdquo; Separating noise from the actual object &ldquo;is very, very tricky.&rdquo;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/d1f38918ad2c7519171dfcf7c64cb32e_1663145882132.png\" alt=\"The goal\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Cp>The echoes, or signals reflected back, contain noise called &ldquo;side lobes.&rdquo; A better radar should have both more resolution (able to detect small objects) and more dynamic range through higher &ldquo;side lobe level&rdquo; (more accuracy, through increased &ldquo;probability of detection&rdquo;). And thus, it should produce an image as useful as a lidar or camera.\u003C/p>\n\u003Cp>Yep, also on the test &mdash; stay frosty!\u003C/p>\n\u003Cp>\u003Cstrong>10:45 a.m.\u003C/strong>: Current radars have 192 virtual channels thanks to 12 by 16 transmitters and receivers. The goal is &ldquo;much more massive&rdquo;: 2,304 virtual channels based on 48 by 48 transmitters and receivers. 
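Those channel and dynamic-range jumps are easy to make concrete. The 12-by-16 and 48-by-48 antenna counts and the 25-to-40 dBc figures are from the talk; the virtual-channel product rule is standard MIMO radar math, and the decibel conversion is just arithmetic:

```python
# Sanity-checking the radar numbers from the talk.
# In a MIMO radar, T transmitters x R receivers yield T*R virtual channels.

def virtual_channels(tx: int, rx: int) -> int:
    return tx * rx

assert virtual_channels(12, 16) == 192    # today's radars
assert virtual_channels(48, 48) == 2304   # the "much more massive" goal

# Side-lobe suppression is logarithmic: going from 25 dBc to 40 dBc is a
# 15 dB improvement, i.e. roughly a 31.6x ratio in linear power terms.
improvement_db = 40 - 25
linear_ratio = 10 ** (improvement_db / 10)
print(f"{linear_ratio:.1f}x")  # ~31.6x — "night and day, basically"
```

A 12x jump in channels plus a ~32x jump in side-lobe suppression is why the compute load "increases a lot."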
This brings &ldquo;significant challenges&hellip;computational complexity increases a lot.&rdquo;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/b2f48412046c13997585eb8907ec02a2_1663145922647.png\" alt=\"The required capabilities\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Cp>\u003Cstrong>10:46 a.m.\u003C/strong>: For the dynamic range, side lobe levels should rise from 25 dBc to 40. The scale is &ldquo;logarithmic, so it&rsquo;s night and day, basically,&rdquo; to make that jump.\u003C/p>\n\u003Cp>\u003Cstrong>10:47 a.m.\u003C/strong>: What can better radars do? Let&rsquo;s see what they can see.\u003C/p>\n\u003Cp>As a motorcycle rider myself, this example puts a smile under my helmet.\u003C/p>\n\u003Cp>&ldquo;We want our radar to be able to pick up this motorcycle, even though there are many, many more powerful targets that have a much higher RCS signal.&rdquo; RCS is &ldquo;radar cross section.&rdquo;\u003C/p>\n\u003Cp>\u003Cstrong>10:48 a.m.\u003C/strong>: The higher sensitivity means a lot more data to process, not suitable for &ldquo;a brute force, na&iuml;ve way.&rdquo; Advanced digital filters, however, can be &ldquo;more accurate and powerful than what you can do in an analog domain.&rdquo;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/922546a9b2038cfc18ae4dec6fc0efae_1663145962864.png\" alt=\"The solution: SW-defined imaging radar\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Cp>\u003Cstrong>10:49 a.m.\u003C/strong>: In another example, the better radar can identify pedestrians in a scene where &ldquo;visually, you can hardly see them.&rdquo;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/c14a1665e14110ead8be66243909009a_1663146008904.png\" alt=\"Detecting two closed pedestrians behind a vehicle\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Cp>\u003Cstrong>10:50 a.m.\u003C/strong>: Here the radar 
detects an old tire on the road, 140 meters away. &ldquo;We want the radars also to be able to detect hazards, and hazards could be low and small and far away.&rdquo;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/0c4da3234405796df1bbc51878765ee5_1663146050326.png\" alt=\"Stable tire detection at 140m\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Cp>\u003Cstrong>10:51 a.m.\u003C/strong>: The timeline for these radars? &ldquo;2024, 2025 in terms of standard production.&rdquo;\u003C/p>\n\u003Cp>\u003Cstrong>10:52 a.m.\u003C/strong>: Sensor seminar number two: lidar.\u003C/p>\n\u003Cp>Current lidars use a method called &ldquo;time of flight,&rdquo; providing 3D data &mdash; objects&rsquo; size, shape and distance (this basic Wikipedia graphic shows how they &ldquo;see&rdquo;).\u003C/p>\n\u003Cp>But a new kind of lidar &mdash; frequency-modulated continuous wave, or FMCW &mdash; is &ldquo;the next frontier.&rdquo;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/39a08838feb70049e7d124c6755ced37_1663146096468.png\" alt=\"FMCW LiDar\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Cp>\u003Cstrong>10:53 a.m.\u003C/strong>: To compress a lot of detail into one line: FMCW lidar has many advantages and it&rsquo;s also 4D &mdash; it can capture the velocity of objects.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/43c22480af82cd6831e43ddd1816ce21_1663146141224.png\" alt=\"The required capabilities for FMCW LiDARs\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Cp>\u003Cstrong>10:54 a.m.\u003C/strong>: Amnon shows how FMCW lidar captures velocity, has longer range and is more resilient.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/8a9ba8056139e8c9dbc667b151877a7a_1663146179261.png\" alt=\"Examples of FMCW LiDARs\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Cp>\u003Cstrong>10:55 
a.m.\u003C/strong>: What does Intel bring to lidar? Silicon photonics. Intel is &ldquo;able to put active and passive laser elements on a chip, and this is really game-changing.&rdquo;\u003C/p>\n\u003Cp>Amnon is showing a photonic integrated circuit with &ldquo;184 vertical lines, and then those vertical lines are moved through optics.&rdquo;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/0cfef0194485f295f095ae411468072f_1663146219461.png\" alt=\"Harnessing Intel's Si photonics Leadership to FMCW sensor development\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Cp>The ability to fabricate this kind of chip is &ldquo;very rare,&rdquo; Amnon says. &ldquo;This gives Intel a significant advantage in building these lidars.&rdquo;\u003C/p>\n\u003Cp>\u003Cstrong>10:56 a.m.\u003C/strong>: And that&rsquo;s that for deep dives, &ldquo;I&rsquo;m now going back to update mode.&rdquo;\u003C/p>\n\u003Cp>\u003Cstrong>10:57 a.m.\u003C/strong>: The RSS safety model is gaining momentum worldwide. It&rsquo;s being applied to and influencing standards with IEEE, ISO, the U.S. Dept. of Transportation and the U.K. Law Commission.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/f753c63785e2f6a07752cbe556a3be11_1663146351437.png\" alt=\"What is RSS?\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/56b3924263b338af523859ebfa050517_1663146383801.png\" alt=\"Industry standardization efforts\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/d367326596290b2187b6913afe6cef92_1663146419643.png\" alt=\"Government efforts\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Cp>\u003Cstrong>10:58 a.m.\u003C/strong>: Mobileye is building not only vehicle- and ride-as-a-service, but mobility-as-a-service (MAAS), too. 
That&rsquo;s where Moovit comes in.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/4d801705472325f6dbb23ffbea2f54f1_1663146460681.png\" alt=\"Mobility supply is shaping in two main streams\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Cp>\u003Cstrong>10:59 a.m.\u003C/strong>: Moovit is &ldquo;the biggest trip planner,&rdquo; with 950 million users active in 3,400 cities and 112 countries. It enables MAAS by adding tele ops, fleet optimization, central control, mobility intelligence and then the user experience and payment.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/c5107eef20ade6d65524a6385d289320_1663146534059.png\" alt=\"Harnessing the world's leading mobility platform to power our robotaxi service\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/4093f2e906e34444f5de61c68098c951_1663146549498.png\" alt=\"Mobileye-Moovit Driverless Maas\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Cp>\u003Cstrong>11:00 a.m.\u003C/strong>: Almost there, don&rsquo;t pack your stuff yet!\u003C/p>\n\u003Cp>The first deployment will be in Jerusalem in 2022. Following close will be France, where testing is starting next month. In Daegu City, South Korea, testing begins mid-year. 
And in collaboration with the Willer Group in Japan, the goal is a 2023 launch in Osaka.\u003C/p>\n\u003Cp>&ldquo;And we will expand more.&rdquo;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/c58164628585bb7e9f8f79ce79fa9322_1663146591166.png\" alt=\"Mobility-as-a-service global footprint\" width=\"1650\" height=\"928\" />\u003C/p>\n\u003Cp>\u003Cstrong>11:01 a.m.\u003C/strong>: &ldquo;I think this is all what I had to say.&rdquo;\u003C/p>\n\u003Cp>Amnon&rsquo;s last word: &ldquo;We would like 2025 to be the year in which we can start giving the experience of people buying a car and sitting in the back seat whenever they want and have the car drive everywhere &mdash; not just in one particular location.&rdquo;\u003C/p>\n\u003Cp>\u003Cstrong>11:02 a.m.\u003C/strong>: Class dismissed! Thank you for following along &mdash; I hope you learned as much as I did.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cdiv style=\"padding: 56.25% 0 0 0; position: relative;\">\u003Ciframe style=\"position: absolute; top: 0; left: 0; width: 100%; height: 100%;\" title=\"CES 2021: Under the Hood with Prof. 
Amnon Shashua\" src=\"https://player.vimeo.com/video/772986667?h=c20ace1902&amp;badge=0&amp;autopause=0&amp;player_id=0&amp;app_id=58479\" width=\"640\" height=\"360\" frameborder=\"0\" data-mce-fragment=\"1\">\u003C/iframe>\u003C/div>","2021-01-12T08:00:00.000Z","News, Events, Video, From our CEO",{"id":1944,"type":24,"url":1945,"title":1946,"description":1947,"primary_tag":28,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1947,"image":1948,"img_alt":1947,"content":1949,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":16,"publish_date":1950,"tags":1951},152,"ces-2021-mobileye-avs-on-move","CES 2021: Mobileye Innovation Will Bring AVs to Everyone, Everywhere","Progress Includes Automated Crowdsourced Mapping, New Lidar SoC, Software-Defined Radar and AV Test Vehicles in Four New Countries.","https://static.mobileye.com/website/us/corporate/images/b09d2fd9e19f45002e77e366d374c5ec_1666086194082.png","\u003Cp>\u003Cem>This news content was originally published on the Intel Corporation Newsroom.\u003C/em>\u003C/p>\u003Cp>\u003Cbr>\u003C/p>\u003Cp>\u003Cstrong>NEWS HIGHLIGHTS\u003C/strong>\u003C/p>\u003Cul>\u003Cli>Automated, worldwide autonomous vehicle (AV) mapping capability allows Mobileye to expand its AV test fleets; new vehicles expected in Detroit, Tokyo, Shanghai, Paris and (pending regulation) New York City early this year.\u003C/li>\u003Cli>Intel brings its XPU strategy, expertise and manufacturing capability in silicon photonics to develop a lidar system-on-chip (SoC) for Mobileye use in AVs starting in 2025.\u003C/li>\u003Cli>Mobileye plans a software-defined radar customized to autonomous vehicles.\u003C/li>\u003Cli>Mobileye reveals that cars using its existing technology have mapped nearly 1 billion kilometers globally, with more than 8 million kilometers mapped daily.\u003C/li>\u003C/ul>\u003Cp>Jerusalem, Jan. 
11, 2021 — Mobileye, an Intel Company, today previewed the strategy and technology that will enable autonomous vehicles (AVs) to fulfill their lifesaving promise globally. During two sessions at this week’s Consumer Electronics Show, Mobileye president and chief executive officer Amnon Shashua will explain how Mobileye is set up to win globally in the AV industry.\u003C/p>\u003Cp>“The backing of Intel and the trinity of our approach means that Mobileye can scale in an unprecedented manner,” Shashua said. “From the beginning, every part of our plan aims for rapid geographic and economic scalability – and today’s news shows how our innovations are enabling us to execute on that strategy.”\u003C/p>\u003Cp>\u003Cstrong>The Mobileye Trinity\u003C/strong>\u003C/p>\u003Cp>In describing the trinity of the Mobileye approach, Shashua will explain the importance of delivering a sensing solution that is orders of magnitude more capable than human drivers. He will describe how Mobileye’s technology – including Road Experience Management™ (REM™) mapping technology, rules-based Responsibility-Sensitive Safety (RSS) driving policy and two separate, truly redundant sensing subsystems based on world-leading camera, radar and lidar technology – combines to deliver such a solution.\u003C/p>\u003Cp>\u003Cbr>\u003C/p>\u003Cp>Mobileye’s approach solves the scale challenge from both a technology and business perspective. Getting the technology down to an affordable cost in line with the market for future AVs is crucial to enabling global proliferation. Mobileye’s solution starts with the inexpensive camera as the primary sensor, combined with a secondary, truly redundant sensing system enabling safety-critical performance that is at least three orders of magnitude safer than humans. 
Using True Redundancy™, Mobileye can validate this level of performance faster and at a lower cost than those who are doing so with a fused system.\u003C/p>\u003Cp>\u003Cstrong>New Radar and Lidar Technology\u003C/strong>\u003C/p>\u003Cp>Shashua explained that the company envisions a future with AVs achieving enhanced radio- and light-based detection-and-ranging sensing, which is key to further raising the bar for road safety. Mobileye and Intel are introducing solutions that will deliver such advanced capabilities in radar and lidar for AVs while optimizing computing- and cost-efficiencies.\u003C/p>\u003Cp>As described in Shashua’s “Under the Hood” session, Mobileye’s software-defined imaging radar technology features 2,304 channels, 100 dB dynamic range and a 40 dBc side-lobe level, which together enable the radar to build a sensing state good enough for a driving policy supporting autonomous driving. With fully digital, state-of-the-art signal processing, different scanning modes, rich raw detections and multi-frame tracking, Mobileye’s software-defined imaging radar represents a paradigm shift in architecture, enabling a significant leap in performance.\u003C/p>\u003Cp>Shashua also will explain how Intel’s specialized silicon photonics fab is able to put active and passive laser elements on a silicon chip. “This is really game-changing,” Shashua said of the lidar SoC expected in 2025. “And we call this a photonic integrated circuit, PIC. It has 184 vertical lines, and then those vertical lines are moved through optics. Having fabs that are able to do that, that’s very, very rare. 
So this gives Intel a significant advantage in building these lidars.”\u003C/p>\u003Cp>\u003Ca href=\"https://static.mobileye.com/website/us/corporate/images/fd52c097bc091a5dd21a4bb95184c385_1668609050126.jpg\" rel=\"noopener noreferrer\" target=\"_blank\">» Click for full image\u003C/a>\u003C/p>\u003Cp>\u003Cstrong>Worldwide Maps Bring AVs Everywhere\u003C/strong>\u003C/p>\u003Cp>In Monday’s session, Shashua will explain the thinking behind Mobileye’s crowdsourced mapping technology. Mobileye’s unique and unprecedented technology can now map the world automatically with nearly 8 million kilometers tracked daily and nearly 1 billion kilometers completed to date. This mapping process differs from other approaches in its attention to semantic details that are crucial to an AV’s ability to understand and contextualize its environment.\u003C/p>\u003Cp>For AVs to realize their life-saving promise, they must proliferate widely and be able to drive almost everywhere. Mobileye’s automated map-making process uses technology deployed on nearly 1 million vehicles already equipped with Mobileye advanced driver-assistance technology.\u003C/p>\u003Cp>To demonstrate the scalable benefits of these automatic AV maps, Mobileye will start driving its AVs in four new countries without sending specialized engineers to those new locations. The company will instead send vehicles to local teams that support Mobileye customers. After appropriate training for safety, those vehicles will be able to drive. This approach was used in 2020 to enable AVs to start driving in&nbsp;\u003Ca href=\"https://www.mobileye.com/news/mobileye-avs-go-anywhere-germany/\" rel=\"noopener noreferrer\" target=\"_blank\">Munich\u003C/a>&nbsp;and Detroit within a few days.\u003C/p>\u003Cp>\u003Cstrong>Find out More\u003C/strong>\u003C/p>\u003Cp>Watch Mobileye’s day one press conference at 10 a.m. PST Monday, Jan. 11, and the deeper technology session at 10 a.m. PST Tuesday, Jan. 12. 
Both will be broadcast on the&nbsp;Intel Newsroom.\u003C/p>\u003Cp>\u003Cstrong>About Mobileye\u003C/strong>\u003C/p>\u003Cp>Mobileye is the global leader in the development of computer vision and machine learning, data analysis, localization and mapping for Advanced Driver Assistance Systems and autonomous driving. Mobileye’s technology helps keep passengers safer on the roads, reduces the risks of traffic accidents, saves lives and has the potential to revolutionize the driving experience by enabling autonomous driving. Mobileye’s proprietary software algorithms and&nbsp;EyeQ® chips perform detailed interpretations of the visual field in order to anticipate possible collisions with other vehicles, pedestrians, cyclists, animals, debris and other obstacles. Mobileye’s products are also able to detect roadway markings such as lanes, road boundaries, barriers and similar items; identify and read traffic signs, directional signs and traffic lights; create a&nbsp;Mobileye Roadbook™ of localized drivable paths and visual landmarks using REM™; and provide mapping for autonomous driving.\u003C/p>\u003Cp>\u003Cbr>\u003C/p>\u003Cp>[**]gallery:ces-2021-mobileye-avs-on-move-gallery-1[**]\u003C/p>","2021-01-11T15:00:00.000Z","News, Events, Driverless MaaS, Autonomous Driving",{"id":1953,"type":24,"url":1954,"title":1955,"description":1956,"primary_tag":397,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1956,"image":1957,"img_alt":1958,"content":1959,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1960,"tags":1961},81,"mobileye-ces-2021","Join Mobileye Online at CES 2021","See what we have in store for you during this year’s virtual tech expo. 
\n","https://static.mobileye.com/website/us/corporate/images/cc023f2bc0712ccdc0e51c6c6cdac1fc_1609765812622.jpg","Mobileye at CES 2021","\u003Cp>\u003Cspan style=\"background-color: inherit;\">The year&rsquo;s biggest tech expo is almost upon us. And while this year&rsquo;s CES is being held exclusively online in a virtual format, Mobileye will still be present in a big way with several events that participants can tune into from the comfort of their homes, offices, and home offices. \u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: inherit;\">All events will be available to view at \u003C/span>\u003Ca style=\"background-color: inherit;\" href=\"http://www.intel.com/CES\" target=\"_blank\" rel=\"noopener noreferrer\">intel.com/CES\u003C/a>\u003Cspan style=\"background-color: inherit;\">. Here&rsquo;s what we have in store for you this year.\u003C/span>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>City of Doers &ndash; Intel &amp; Mobileye's Virtual CES Booth\u003C/strong>\u003C/p>\n\u003Cp>This year, all of CES is virtual &ndash; including the booths. Explore how Intel and Mobileye technology impact the way people work and live at the \"City of Doers\" &ndash; including a virtual Mobileye Self-Driving Shuttle.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong style=\"background-color: inherit;\">Amnon Shashua: It&rsquo;s Time to Go &ndash; Intel CES News Conference\u003C/strong>\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: inherit;\">Monday, January 11, 2021 @ 10:00-10:20am PST\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: inherit;\">Take an intimate virtual tour of the Mobileye garage lab with our CEO \u003C/span>\u003Ca style=\"background-color: inherit;\" href=\"https://www.mobileye.com/blog/tag/amnon-shashua/\" target=\"_blank\" rel=\"noopener\">Prof. 
Amnon Shashua\u003C/a>\u003Cspan style=\"background-color: inherit;\"> as he connects today&rsquo;s progress and technology with tomorrow&rsquo;s vision. Prof. Shashua will discuss the unique interplay between \u003C/span>\u003Ca style=\"background-color: inherit;\" href=\"https://www.mobileye.com/blog/buying-a-new-car-here-are-four-adas-features-to-look-for/\" target=\"_blank\" rel=\"noopener noreferrer\">advanced driver-assistance systems\u003C/a>\u003Cspan style=\"background-color: inherit;\"> (ADAS) and our \u003C/span>\u003Ca style=\"background-color: inherit;\" href=\"https://www.mobileye.com/blog/munich-av-video/\" target=\"_blank\" rel=\"noopener noreferrer\">autonomous vehicle\u003C/a>\u003Cspan style=\"background-color: inherit;\"> technology, and how this connection has shaped Mobileye&rsquo;s strategy. \u003C/span>This session will feature a 1:1 with Ed \u003Cspan style=\"color: #000000;\">Niedermeyer\u003C/span>, Director of Communications at PAVE (Partners for Automated Vehicle Education).\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong style=\"background-color: inherit;\">Under the Hood with Amnon\u003C/strong>\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: inherit;\">Tuesday, January 12, 2021 @ 10:00-11:00am PST\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: inherit;\">Join Prof. Shashua as he reveals our latest progress on automated driving technologies &ndash; including roadmap revelations, mapping milestones, RSS/safety progress, new and growing \u003C/span>\u003Ca style=\"background-color: inherit;\" href=\"https://www.mobileye.com/news/mobileye-is-bringing-driverless-maas-to-the-uae/\" target=\"_blank\" rel=\"noopener\">partnerships for mobility-as-a-service\u003C/a>\u003Cspan style=\"background-color: inherit;\">, and more. He will provide an in-depth look at how Mobileye delivers today on its global promises and technologies. 
He will also shed light on \u003C/span>\u003Ca style=\"background-color: inherit;\" href=\"https://www.mobileye.com/solutions/super-vision/\" target=\"_blank\" rel=\"noopener noreferrer\">Mobileye SuperVision&trade;\u003C/a>\u003Cspan style=\"background-color: inherit;\">, our most advanced hands-free driving system, and perform a deep dive into Mobileye&rsquo;s disruptive high-definition mapping technology. As in previous years, viewers can expect revealing new videos that demonstrate Mobileye&rsquo;s progress.\u003C/span>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong style=\"background-color: inherit;\">Technological Megashifts Impacting our World &ndash; Prof. Amnon Shashua &amp; Thomas Friedman on the Future of AI \u003C/strong>\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: inherit;\">Wednesday, January 13, 2021 @ 9:45am PST\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: inherit;\">Prof. Shashua and Thomas Friedman of \u003C/span>\u003Cem style=\"background-color: inherit;\">The New York Times\u003C/em>\u003Cspan style=\"background-color: inherit;\"> explore the global impact of Artificial Intelligence. 
Shashua and Friedman will tackle fundamental questions related to the ethics and values governing today&rsquo;s technology, the challenges facing a rapid pace of change and automation, and solutions for maximizing opportunities in a world that is fast, fused and deep.\u003C/span>\u003C/p>","2021-01-07T08:00:00.000Z","Events, Amnon Shashua, News",{"id":1963,"type":5,"url":1964,"title":1965,"description":1966,"primary_tag":397,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1966,"image":1967,"img_alt":1968,"content":1969,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1970,"tags":1546},82,"mobileye-new-logo","New Logo, Same Vision","Mobileye’s new logo and tagline – Mobileye, Driven by Vision™ – reflects the company’s continued leadership in autonomous driving and driver assist technologies. ","https://static.mobileye.com/website/us/corporate/images/c482d51403950ec18a4d92f0f7821e7f_1609840667353.jpg","Mobileye's new logo","\u003Cp>\u003Cspan style=\"background-color: inherit;\">Mobileye started over two decades ago in a garage with only five employees. While the company was small, the drive and the vision were big: to leverage camera-based technology to make mobility safer. \u003C/span>\u003C/p>\u003Cp>\u003Cspan style=\"background-color: inherit;\">As the years went by, the drive and the vision remained the same, but the company’s groundbreaking work in computer vision and AI vastly expanded what the technology could accomplish. From its roots in advanced driver-assistance systems, Mobileye has developed an AV capable of navigating the crowded streets of Jerusalem, which has also been able to quickly adapt to driving around Munich. 
\u003C/span>\u003C/p>\u003Cp>\u003Cspan style=\"background-color: inherit;\">Building scalable AV technology meant we needed to develop a crowdsourced mapping solution, resulting in our award-winning \u003C/span>\u003Ca href=\"https://www.mobileye.com/technology/rem/\" rel=\"noopener noreferrer\" target=\"_blank\" style=\"background-color: inherit; color: rgb(5, 99, 193);\">Road Experience Management™\u003C/a>\u003Cspan style=\"background-color: inherit;\"> technology for AV mapping. REM\u003C/span>™\u003Cspan style=\"background-color: inherit;\"> also lies at the heart of \u003C/span>\u003Ca href=\"https://www.mobileye.com/en/data/\" rel=\"noopener noreferrer\" target=\"_blank\" style=\"background-color: inherit; color: rgb(5, 99, 193);\">Mobileye Data Services™\u003C/a>,\u003Cspan style=\"background-color: inherit;\"> which provides asset and mobility data to local governments, road operators, utilities and others.\u003C/span>\u003C/p>\u003Cp>\u003Cspan style=\"background-color: inherit;\">While REM was one major building block of AV technology, our focus has always been on balancing safety and usefulness. To harmonize these two goals, we developed the \u003C/span>\u003Ca href=\"https://www.mobileye.com/technology/responsibility-sensitive-safety/\" rel=\"noopener noreferrer\" target=\"_blank\" style=\"background-color: inherit; color: rgb(5, 99, 193);\">Responsibility-Sensitive Safety\u003C/a>\u003Cspan style=\"background-color: inherit;\"> (RSS) model. 
These technologies, in turn, led to our newest ADAS system: \u003C/span>\u003Ca href=\"https://www.mobileye.com/solutions/super-vision/\" rel=\"noopener noreferrer\" target=\"_blank\" style=\"background-color: inherit; color: rgb(5, 99, 193);\">Mobileye SuperVision™\u003C/a>\u003Cspan style=\"background-color: inherit;\">.\u003C/span>\u003C/p>\u003Cp>\u003Cspan style=\"background-color: inherit;\">Next on the horizon is the robotaxi, which we believe will be the first scalable, practical implementation of AV technology. These on-demand self-driving vehicles are due to begin driverless Mobility-as-a-Service first in Tel Aviv before branching out around the world, from Paris to Dubai to Japan.\u003C/span>\u003C/p>\u003Cp>\u003Cspan style=\"background-color: inherit;\">All of this grew out of a camera and AI combined with boundless drive and vision. And now it’s time for our logo and tagline to reflect where we are, where we’re going and, perhaps most crucially, how we’ll get there: \u003C/span>\u003Cstrong style=\"background-color: inherit;\">Mobileye, Driven by Vision™\u003C/strong>\u003Cspan style=\"background-color: inherit;\"> reflects the company’s commitment to computer vision as the heart of mobility technology along with the drive to implement our vision of a world with equal access to safe, efficient mobility. \u003C/span>\u003C/p>\u003Cp>\u003Cspan style=\"background-color: inherit;\">For the past 20 years our drive and vision have taken us places we could hardly have dreamt of. 
Now we’re looking forward to the next 20 years, and beyond.\u003C/span>\u003C/p>\u003Cp>\u003Cbr>\u003C/p>\u003Ciframe class=\"ql-video\" frameborder=\"0\" allowfullscreen=\"true\" src=\"https://www.youtube.com/embed/lt5GxUpPC-s\" height=\"315\" width=\"603\">\u003C/iframe>\u003Ciframe class=\"ql-video\" frameborder=\"0\" allowfullscreen=\"true\" src=\"https://www.youtube.com/embed/qCUAEDvaJsY\" height=\"315\" width=\"560\">\u003C/iframe>\u003Cp>\u003Cbr>\u003C/p>","2021-01-04T22:00:00.000Z",{"id":1972,"type":5,"url":1973,"title":1974,"description":1975,"primary_tag":658,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1975,"image":1976,"img_alt":1977,"content":1978,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1979,"tags":1980},80,"looking-back-on-2020-at-mobileye","Looking Back on 2020 at Mobileye","2020 was a challenging year, but Mobileye had a number of significant successes in ADAS development, data collection and self-driving cars. ","https://static.mobileye.com/website/us/corporate/images/0d07ef857616738b0879d5ceb68f2df7_1609674908709.jpg","2020 at Mobileye","\u003Cp>2020... what a year. It was difficult for all of us. Despite all the obstacles, and without minimizing the undoubted suffering worldwide, at Mobileye we&rsquo;re proud of the fact that we were able to not only weather the challenges of 2020, but even to make enormous strides forward. In fact, when we look back at the year, there is a lot to be proud of. So, as 2020 comes to a close, it&rsquo;s a good time to reflect on this year&rsquo;s highlights:\u003C/p>\n\u003Cp>1. \u003Ca href=\"https://www.mobileye.com/news/welcoming-moovit-to-the-fold/\" target=\"_blank\" rel=\"noopener\">Intel&rsquo;s purchase of Moovit\u003C/a> &ndash; Moovit brought with it hundreds of millions of users, strong relationships with public transportation companies worldwide, and a large proprietary transportation dataset. 
Working together, \u003Ca href=\"https://www.mobileye.com/opinion/there-is-more-to-our-moovit-acquisition-than-meets-the-eye/\" target=\"_blank\" rel=\"noopener\">we&rsquo;re aiming towards offering the full stack of MaaS technology\u003C/a>, from powering third-party fleets to running our own robotaxi fleet.\u003C/p>\n\u003Cp>2. \u003Ca href=\"https://www.mobileye.com/solutions/super-vision/\" target=\"_blank\" rel=\"noopener noreferrer\">Mobileye SuperVision\u003C/a>&trade; &ndash; In 2020, our \u003Ca href=\"https://www.mobileye.com/blog/everything-you-need-to-know-about-adas/\" target=\"_blank\" rel=\"noopener noreferrer\">ADAS technology\u003C/a> took a major leap forward with the launch of Mobileye SuperVision. This system allows for a more comfortable as well as safer drive, using both REM and our \u003Ca href=\"https://www.mobileye.com/blog/responsibility-sensitive-safety-gains-traction-worldwide/\" target=\"_blank\" rel=\"noopener noreferrer\">RSS safety model\u003C/a>. Not just an amazing product in and of itself, its development showed that we can use our self-driving car technology now to develop advanced products for our ADAS customers. In fact, this technology was the linchpin of a major deal we struck with \u003Ca href=\"https://www.mobileye.com/opinion/our-new-deal-with-geely-is-a-game-changer-says-shashua/\" target=\"_blank\" rel=\"noopener\">Geely\u003C/a> &ndash; one of China&rsquo;s largest car manufacturers.\u003C/p>\n\u003Cp>3. \u003Ca href=\"https://www.mobileye.com/news/mobileye-is-bringing-driverless-maas-to-the-uae/\" target=\"_blank\" rel=\"noopener\">Driverless MaaS in the UAE\u003C/a> &ndash; At Mobileye, we believe that the \u003Ca href=\"https://www.mobileye.com/blog/how-robotaxis-will-lead-the-way-toward-the-fully-autonomous-future/\" target=\"_blank\" rel=\"noopener noreferrer\">first deployment of true self-driving cars will be as robotaxis\u003C/a>. 
Plans to deploy these vehicles got a significant boost this year when Mobileye signed a strategic cooperation agreement for driverless Mobility-as-a-Service with the Al-Habtoor Group based in Dubai, one of the first deals signed after the Abraham Accords. The agreement between Al-Habtoor and Mobileye calls for equipping vehicles with Mobileye 8 Connect&trade; in early 2021 in order to map Dubai&rsquo;s roads and infrastructure, with initial deployment of self-driving cars there later in the year and the introduction of robotaxis in 2022.\u003C/p>\n\u003Cp>4. \u003Ca href=\"https://www.mobileye.com/blog/munich-av-video/\" target=\"_blank\" rel=\"noopener noreferrer\">REM&trade; demo in Munich\u003C/a> &ndash; Mapping is one of the keys to self-driving cars and Mobileye had some major successes with our Road Experience Management&trade; (REM) technology this year. Thanks to REM, our self-driving test vehicles were able to drive in Munich within just days of their arrival there. In addition, REM was recognized as a breakthrough technology, earning a \u003Ca href=\"https://www.mobileye.com/news/mobileye-wins-prestigious-2020-pace-award-for-rem-mapping-tech/\" target=\"_blank\" rel=\"noopener\">2020 Pace Award\u003C/a>.\u003C/p>\n\u003Cp>5. \u003Ca href=\"https://www.mobileye.com/blog/\" target=\"_blank\" rel=\"noopener\">Launch of the Mobileye.com blog\u003C/a> &ndash; As a leader in ADAS, self-driving cars and roadway data collection, it&rsquo;s important for our voice to be heard regarding the vital issues affecting these technologies. 
To help get the word out we introduced the Mobileye.com blog, which covers important technological developments such as \u003Ca href=\"https://www.mobileye.com/blog/av-safety-demands-true-redundancy/\" target=\"_blank\" rel=\"noopener noreferrer\">True Redundancy\u003C/a>&trade;, \u003Ca href=\"https://www.mobileye.com/blog/moving-our-machine-learning-to-the-cloud-inspired-innovation/\" target=\"_blank\" rel=\"noopener noreferrer\">how Mobileye moved machine learning to the cloud\u003C/a> and our \u003Ca href=\"https://www.mobileye.com/blog/mobileye-leads-the-industry-in-embracing-linux-for-safety-related-applications/?utm_source=facebook&amp;utm_medium=post&amp;utm_term=organic\" target=\"_blank\" rel=\"noopener noreferrer\">adoption of Linux for safety-related applications\u003C/a>.\u003C/p>\n\u003Cp>6. \u003Ca href=\"https://www.jpost.com/israel-news/mobileye-appeals-for-social-leadership-during-the-corona-crisis-622034\" target=\"_blank\" rel=\"noopener noreferrer\">Mobileye for the Community\u003C/a> &ndash; The pandemic affected us all, but certain groups were particularly hard-hit. Mobileye committed itself to do its part to help those in need through &ldquo;Mobileye for the Community.&rdquo; This program distributed over $3 million to non-profit organizations helping the economy and society, with grants going towards everything from food distribution to online psychological services.\u003C/p>\n\u003Cp>7. Moving On(line) &ndash; Like many companies around the globe, not least our parent Intel, Mobileye had to adjust to having many of our employees work from home. This demanded an extraordinary effort by our IT department to quickly develop an infrastructure allowing for more than a thousand secure home workstations, with almost no advance warning. Fortunately, they were more than up to the task, allowing the company to continue functioning (and even thriving) with barely a hitch.\u003C/p>\n\u003Cp>8. 
\u003Ca href=\"https://www.mobileye.com/en/data/\" target=\"_blank\" rel=\"noopener noreferrer\">IMS and Smart Cities\u003C/a> &ndash; Mobility is a critical part of smart cities, and that doesn&rsquo;t mean just robotaxis. Mobileye Data Services&trade; helps cities, road operators, and others track and manage their assets, including roads &ndash; even offering insights into \u003Ca href=\"https://www.mobileye.com/en/data/blog/keeping-cyclists-safe-in-their-lanes/\" target=\"_blank\" rel=\"noopener noreferrer\">bike lanes in New York\u003C/a>, for example. In 2020, this division began a number of new projects, surveying roadside assets and monitoring mobility data in cities worldwide, including \u003Ca href=\"https://www.mobileye.com/en/data/webinar/webcast-road-asset-monitoring/\" target=\"_blank\" rel=\"noopener noreferrer\">Rome\u003C/a> and \u003Ca href=\"https://www.mobileye.com/en/data/case-study/actionable-information-what-mobileye-data-services-can-tell-you-about-your-roads/\" target=\"_blank\" rel=\"noopener noreferrer\">Barcelona\u003C/a>.\u003C/p>\n\u003Cp>9. The New Mobileye Campus in Jerusalem &ndash; While this year many employees were forced to work from home, as usual Mobileye was looking to the future. Construction on our new Jerusalem headquarters went full-steam ahead throughout 2020. This new state-of-the-art complex has plenty of room to meet the needs of our growing company: it will hold 2,700 employees, will have 8 floors, and include 65 conference rooms, 64 kitchenettes, 2 dining halls, 7 parking levels, numerous labs, a green roof, a gym and a spa.\u003C/p>\n\u003Cp>10. Despite the challenges of 2020, Mobileye has managed not only to survive, but thrive, expanding into new markets and new regions. What&rsquo;s next? 
Follow us on \u003Ca href=\"https://twitter.com/Mobileye\" target=\"_blank\" rel=\"noopener noreferrer\">Twitter\u003C/a>, \u003Ca href=\"https://www.facebook.com/Mobileye\" target=\"_blank\" rel=\"noopener noreferrer\">Facebook\u003C/a> or \u003Ca href=\"https://www.linkedin.com/company/24017/admin/\" target=\"_blank\" rel=\"noopener noreferrer\">LinkedIn\u003C/a> (and of course here on our blog!) to learn more.\u003C/p>","2020-12-31T08:00:00.000Z","Opinion",{"id":1982,"type":24,"url":1983,"title":1984,"description":1985,"primary_tag":28,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1985,"image":1986,"img_alt":1985,"content":1987,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":16,"publish_date":1988,"tags":928},153,"mobileye-avs-go-anywhere-germany","Mobileye AVs Can Go Anywhere in Germany","Mobileye today published a nearly hourlong video of its autonomous driving test vehicle (AV) navigating the complexities of urban and highway driving in Munich, Germany.","https://static.mobileye.com/website/us/corporate/images/e5bc128f6386cf209c3e4cd5a2b28e8c_1666085670292.png","\u003Cp>\u003Cem>This news content was originally published on the Intel Corporation Newsroom.\u003C/em>\u003C/p>\u003Cp>\u003Cbr>\u003C/p>\u003Cp>\u003Cstrong>What’s New:\u003C/strong>&nbsp;Mobileye today published a nearly hourlong&nbsp;\u003Ca href=\"https://youtu.be/A1qNdHPyHu4\" rel=\"noopener noreferrer\" target=\"_blank\">video\u003C/a>&nbsp;of its autonomous driving test vehicle (AV) navigating the complexities of urban and highway driving in Munich, Germany. The video demonstrates the company’s unmatched ability to drive AVs “everywhere” due in large part to Mobileye’s crowd-sourced, high-definition (HD) mapping technology known as Road Experience Management™ (REM™).\u003C/p>\u003Cp>“High-definition maps are crucial to a safe and robust self-driving system. 
Because it is crowd-sourced across production vehicles in large volume, our Road Experience Management technology satisfies the near real-time and scale challenges necessary for an effective map.” –Prof. Amnon Shashua, president and chief executive officer of Mobileye, an Intel Company\u003C/p>\u003Cp>\u003Cstrong style=\"font-family: intelone-display-regular, Inter, sans-serif;\">How Mobileye Set an Unprecedented Timetable:\u003C/strong>&nbsp;Because Mobileye has invested in a scalable, sustainable approach to mapping, the company was able to drive its cars autonomously on highways and urban roads – pretty much everywhere – within just a few days of delivering the AV to Munich. The ability to land in a new region and drive almost immediately is unprecedented as most companies must spend weeks – if not months – using special vehicles to build maps before their vehicles can start driving. What’s more, competing mapping approaches are quickly outdated and unable to account for road changes such as detours.\u003C/p>\u003Cp>In Mobileye’s case, the worldwide HD map already exists thanks to unique technology that uses the power of the crowd to generate and continuously update the map. REM technology is actively generating data about more than 15 million kilometers of roads daily to build the map and keep it up to date. 
This crucial tool for AV deployment is unique to Mobileye and made possible because of the company’s established global advanced driver-assistance systems footprint.\u003C/p>\u003Cp>\u003Cstrong>How to Watch It Now:\u003C/strong>&nbsp;As seen in the&nbsp;\u003Ca href=\"https://youtu.be/A1qNdHPyHu4\" rel=\"noopener noreferrer\" target=\"_blank\">video\u003C/a>, Mobileye’s Munich-based AV conducts many challenging maneuvers using only a camera-based self-driving system (more below).&nbsp;Watch the full video to see how the AV reaches speeds of up to 130 kilometers per hour, navigates a left turn on green through a busy intersection, maneuvers to avoid an open door, executes an unprotected left turn, navigates to avoid a bus pulled to the side of the road, changes lanes on a highway at high speeds, navigates a congested street, maneuvers around a vehicle that is parallel parking, and safely moves around stopped emergency vehicles.\u003C/p>\u003Cp>\u003Cstrong>About the AV:\u003C/strong>&nbsp;The vehicle shown in the video is using a camera-only subsystem of Mobileye’s&nbsp;level-4&nbsp;development AV. This camera subsystem runs on two of Mobileye’s EyeQ®5 systems-on-chips (SoCs) processing 11 cameras. It’s important to note that Mobileye’s production L4 solution (not shown in this video) includes a second sensing subsystem using radar and lidar for True Redundancy™.\u003C/p>\u003Cp>\u003Cstrong>More About Mobileye in Germany:\u003C/strong>&nbsp;Germany’s independent technical service provider granted Mobileye a permit for AV testing in Germany in July 2020. 
The permit allows the company to test vehicles in real-world traffic on all German roads at speeds up to 130 kilometers per hour.\u003C/p>\u003Cp>\u003Cbr>\u003C/p>\u003Cp>\u003Ca href=\"http://youtu.be/A1qNdHPyHu4\" rel=\"noopener noreferrer\" target=\"_blank\">» Watch video: \"Unedited 1-Hour Mobileye AV Ride in Munich\"\u003C/a>\u003C/p>","2020-12-15T17:00:00.000Z",{"id":1990,"type":5,"url":1991,"title":1992,"description":1993,"primary_tag":190,"author_name":16,"is_hidden":11,"lang":12,"meta_description":1993,"image":1994,"img_alt":1995,"content":1996,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":1997,"tags":1998},76,"munich-av-video","Watch How our Camera-Only AV Handles the Streets of Munich","The power of REM™, making our AV tech effective in Germany.\n","https://static.mobileye.com/website/us/corporate/images/15074ce19e918f824927715a15bf5989_1607853931572.jpg","Screen capture from video showing Mobileye's developmental AV navigating the Autobahn near Munich.","\u003Cp>Though it&rsquo;s being engineered here in our headquarters, we&rsquo;re developing our AV to be able to function anywhere in the world. So after having released footage this past summer of our rolling testbed \u003Ca href=\"https://www.mobileye.com/news/mobileye-releases-unedited-footage-of-a-40-minute-av-ride-through-jerusalem/\" target=\"_blank\" rel=\"noopener\">handling the often-chaotic streets of Jerusalem\u003C/a>, we&rsquo;re pleased to debut this clip of our camera-only AV doing what it&rsquo;s designed to do on roadways in and around Munich.\u003C/p>\n\u003Cp>The key to its adaptability is the universality of \u003Ca href=\"https://www.mobileye.com/news/mobileye-wins-prestigious-2020-pace-award-for-rem-mapping-tech/\" target=\"_blank\" rel=\"noopener\">REM&trade;\u003C/a>. 
Crowdsourcing from the masses of vehicles already on the road and equipped with Mobileye technology, REM is \u003Cspan style=\"color: black;\">speedily building a comprehensive and regularly updated database of high-definition\u003C/span>, highly precise, and highly detailed maps of roadways around the world, in \u003Ca href=\"https://www.mobileye.com/blog/mobileye-hits-the-autobahn-with-german-permit/\" target=\"_blank\" rel=\"noopener noreferrer\">Germany\u003C/a> and far beyond. \u003Ca href=\"https://www.mobileye.com/news/mobileye-avs-go-anywhere-germany/\" target=\"_blank\" rel=\"noopener noreferrer\">Read more in the news release\u003C/a>, or watch the video below to see our AV drive itself around Munich.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/A1qNdHPyHu4\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>","2020-12-15T08:00:00.000Z","Autonomous Driving, Mapping & REM, Video",{"id":2000,"type":5,"url":2001,"title":2002,"description":2003,"primary_tag":190,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2003,"image":2004,"img_alt":2005,"content":2006,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":2007,"tags":583},70,"av-safety-demands-true-redundancy","Autonomous-Vehicle Safety Demands True Redundancy™","At Mobileye we’re developing two completely separate self-driving systems, an approach that will create an AV that’s both safer and can be deployed sooner.","https://static.mobileye.com/website/us/corporate/images/135898959f97768edf3d2cec56daa9a0_1606650607791.jpg","True Redundancy is Mobileye's approach to fail-safe sensors for autonomous vehicles.","\u003Cp>\u003Cspan style=\"background-color: inherit;\">Redundancy is essential when engineering safety-critical systems. 
The goal is to equip a system with multiple components or subsystems that perform the same function, so that if one were to fail, the overall system could still complete its task safely. Autonomous vehicles, more so than many other technological engineering feats, require exceptional precision, accuracy, and sophistication, and their tolerance for failure is virtually zero. This makes \u003C/span>\u003Ca style=\"background-color: inherit;\" href=\"https://www.edn.com/redundancy-for-safety-compliant-automotive-other-devices/\" target=\"_blank\" rel=\"noopener noreferrer\">redundancies\u003C/a>\u003Cspan style=\"background-color: inherit;\"> critical to understand and implement. But when it comes to the \u003C/span>\u003Cem>true\u003C/em> redundancy of sensing systems, \u003Ca href=\"https://www.mobileye.com/opinion/when-it-comes-to-av-safety-experience-counts/\" target=\"_blank\" rel=\"noopener\">not all AV platforms are the same\u003C/a>.\u003C/p>\n\u003Cp>\u003Cstrong>Fusion vs Redundancy\u003C/strong>\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: inherit;\">The common practice across the industry is to equip autonomous vehicles with a multitude of sensors, including \u003C/span>\u003Ca style=\"background-color: inherit;\" href=\"https://www.wired.com/story/the-know-it-alls-how-do-self-driving-cars-see/\" target=\"_blank\" rel=\"noopener noreferrer\">cameras, radar, and LiDAR\u003C/a>\u003Cspan style=\"background-color: inherit;\">. 
In many AV platforms, these sensors combine to create a single world model (or digital construct of the vehicle&rsquo;s environment) in a process known as &ldquo;\u003C/span>\u003Ca style=\"background-color: inherit;\" href=\"https://www.sciencedirect.com/topics/engineering/sensor-fusion\" target=\"_blank\" rel=\"noopener noreferrer\">sensor fusion\u003C/a>\u003Cspan style=\"background-color: inherit;\">.&rdquo; While sensor fusion for autonomous driving may appear to offer redundancy, in essence, it is really offering \u003C/span>\u003Cem>complementary\u003C/em> sensors, as all the sensors together are relied upon to create a single world model. Multiple sensors, one world model, one AV system.\u003C/p>\n\u003Cp>Mobileye&rsquo;s differentiated approach of \u003Ca href=\"https://www.mobileye.com/technology/true-redundancy/\" target=\"_blank\" rel=\"noopener noreferrer\">True Redundancy&trade;\u003C/a> is to separate the sensors into two channels &ndash; one for cameras and another for radar and LiDAR &ndash; and task both with sensing all elements of the driving environment. In this way, we achieve full system redundancy by having each of those channels create their own independent and diverse world models, each filtered independently through our \u003Ca href=\"https://www.mobileye.com/technology/responsibility-sensitive-safety/\" target=\"_blank\" rel=\"noopener noreferrer\">Responsibility-Sensitive Safety\u003C/a> framework. Multiple sensors, multiple world models, multiple AV subsystems.\u003C/p>\n\u003Cp>To ensure that each subsystem is capable of operating independently of the other, our R&amp;D team is running two separate fleets of developmental AVs: one using only cameras (with no radar or LiDAR), and another using only radar and LiDAR (with no cameras). 
When combined into a complete, production-ready AV, the camera-only subsystem becomes the backbone, while the radar/LiDAR subsystem serves as a diversified and redundant safety back-up.\u003C/p>\n\u003Cp>You can \u003Ca href=\"https://www.mobileye.com/technology/true-redundancy/\" target=\"_blank\" rel=\"noopener noreferrer\">read more about True Redundancy here\u003C/a>.\u003C/p>\n\u003Cp>\u003Cstrong>Safer, Faster\u003C/strong>\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: inherit;\">By splitting the self-driving platform into two subsystems that can each operate on their own, we are able to build a more reliable (and therefore safer) AV. As \u003C/span>\u003Ca href=\"https://www.mobileye.com/opinion/the-challenge-of-supporting-av-at-scale/\" target=\"_blank\" rel=\"noopener\">our CEO Prof. Amnon Shashua framed it\u003C/a>, True Redundancy &ldquo;is like having both iOS and Android smartphones in my pocket and asking myself: What is the probability that they both crash simultaneously?&rdquo; By the same token, the likelihood of a complete system failure is drastically reduced when you have two redundant and diverse subsystems operating independently.\u003C/p>\n\u003Cp>True Redundancy also yields a faster, more agile development process. Perfecting the technology required for a vehicle to operate autonomously demands extensive testing and validation. By separating our AV platform into two independent subsystems, development of (and subsequent updates to) each subsystem can be validated on a much smaller data set. 
That can mean the difference between tens of thousands (instead of millions) of hours of data for validation, and that comparative agility means we can safely get our AV platform out on the road faster than we&rsquo;ve found we could with a sensor-fusion approach.\u003C/p>\n\u003Cp>\u003Cstrong>Tomorrow&rsquo;s Tech Today\u003C/strong>\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: inherit;\">An added benefit of developing our self-driving platform based on these two independent pillars is the ability to employ the camera-based subsystem for ADAS as well. With \u003C/span>\u003Ca href=\"https://www.mobileye.com/solutions/super-vision/\" target=\"_blank\" rel=\"noopener noreferrer\">Mobileye SuperVision&trade;\u003C/a>, we have taken the surround-view array from our AV R&amp;D program and applied it to our most advanced driver-assistance system to date, offering hands-free ADAS capabilities. So not only will True Redundancy make AVs safer tomorrow; it can make human-driven passenger vehicles safer today.\u003C/span>\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/solutions/super-vision/\" target=\"_blank\" rel=\"noopener noreferrer\">Click here to learn more about Mobileye SuperVision\u003C/a> and how it benefits from our AV R&amp;D.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>","2020-12-09T08:00:00.000Z",{"id":2009,"type":5,"url":2010,"title":2011,"description":2012,"primary_tag":954,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2012,"image":2013,"img_alt":2014,"content":2015,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":2016,"tags":1566},73,"were-in-the-midst-of-an-ai-revolution-says-shashua","We’re in the Midst of an AI Revolution, says Shashua","Mobileye’s CEO sits down (remotely) with Samsung president Young Sohn to talk business, autonomous vehicles, and advancements in the field of Artificial 
Intelligence.","https://static.mobileye.com/website/us/corporate/images/b2489e886d005c1c472e53ce900a6f76_1606658919899.jpg","Mobileye CEO Prof. Amnon Shashua in video conference with Samsung Electronics president Young Sohn.","\u003Cp>&ldquo;We are in the midst of a revolution&rdquo; in the field of artificial intelligence. &ldquo;Things are moving very, very fast.&rdquo; If there&rsquo;s one thing to take away from the recent webcast conversation between our CEO and one of Samsung&rsquo;s top executives, let that be it.\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/blog/tag/amnon-shashua/\" target=\"_blank\" rel=\"noopener\">Professor Amnon Shashua\u003C/a> recently joined Young Sohn, the President and Chief Strategy Officer at Samsung Electronics, for the latest episode of the latter&rsquo;s web series &ldquo;The Next Wave.&rdquo; Over the course of 21 minutes, the two discussed a broad range of subjects, from the \u003Ca href=\"https://www.mobileye.com/opinion/the-challenge-of-supporting-av-at-scale/\" target=\"_blank\" rel=\"noopener\">challenges inherent in developing autonomous vehicles\u003C/a> to the latest breakthroughs in AI and even a bit of advice for budding entrepreneurs eager to follow in Shashua&rsquo;s footsteps.\u003C/p>\n\u003Cp>As Prof. Shashua relates, our CTO Prof. Shai Shalev-Shwartz once observed that &ldquo;There are two types of products. One family of products is very, very sophisticated from a technological point of view, but low accuracy&rdquo; &ndash; like smartphones and personal computers, which (for all their capabilities) can afford to crash from time to time with little risk beyond annoyance and inconvenience. &ldquo;The second family of products are not so much sophisticated but very accurate&rdquo; &ndash; like airplanes, which are based on time-tested technology, but cannot afford to fail. 
&ldquo;Autonomous driving is both,&rdquo; noted Shashua: &ldquo;very, very sophisticated but on the other hand very, very accurate. The tolerance for failure is almost zero. And this is where the conundrum, this is where the big challenge is.&rdquo;\u003C/p>\n\u003Cp>Successfully and reliably replacing the human driver with sensors and computers requires harnessing the latest advancements in \u003Ca href=\"https://www.mobileye.com/blog/moving-our-machine-learning-to-the-cloud-inspired-innovation/\" target=\"_blank\" rel=\"noopener noreferrer\">artificial intelligence\u003C/a>. Not the least of those developments are being undertaken at companies that Shashua has cofounded &ndash; including Mobileye, of course, but also OrCam (which develops computer-vision systems for the visually impaired), AI21Labs (which specializes in natural language processing), and a digital bank &ndash; on top of his academic career and \u003Ca href=\"https://www.mobileye.com/news/prof-amnon-shashua-wins-the-dan-david-prize/\" target=\"_blank\" rel=\"noopener\">awards received\u003C/a> in the field.\u003C/p>\n\u003Cp>&ldquo;The kind of AI that we are used to is Narrow AI. Solve particular problems: play chess, play Go, have a perception system for autonomous driving, do decision-making for autonomous driving&hellip; these are all narrowly defined problems. The new frontier right now is language: natural language understanding.&rdquo;\u003C/p>\n\u003Cp>Whereas current methods of training AI require manually labeling elements for the system to recognize, said Shashua, &ldquo;the way you train these language models is completely self-supervised. 
It means that you can build these monster networks that are trained on unlimited amount of data.&rdquo; That, in turn, unlocks virtually unlimited potential &ndash; and it&rsquo;s all advancing at a rapid pace: &ldquo;\u003Cspan style=\"color: black;\">If we simply continue this path, within this decade we will have Broad AI &ndash; which is something that two-three years ago I would not say.&rdquo;\u003C/span>\u003C/p>\n\u003Cp>Watch the full interview in the video below.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/-mHzI8-ACMM\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>","2020-12-08T08:00:00.000Z",{"id":2018,"type":5,"url":2019,"title":2020,"description":2021,"primary_tag":658,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2021,"image":2022,"img_alt":2023,"content":2024,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":2025,"tags":1980},72,"moving-our-machine-learning-to-the-cloud-inspired-innovation","Moving our Machine Learning to the Cloud Inspired Innovation","Mobileye algorithm developer Chaim Rand details the novel approaches his team took to conducting our Machine Learning through Amazon Web Services. 
\n","https://static.mobileye.com/website/us/corporate/images/5a768ca0c6f4680206a8e16c154d0e6b_1606651444190.jpg","Computers in the trunk of Mobileye's autonomous development vehicle","\u003Cp>Mobileye embraces \u003Ca href=\"https://www.zdnet.com/article/what-is-cloud-computing-everything-you-need-to-know-about-the-cloud/\" target=\"_blank\" rel=\"noopener noreferrer\">the power of the cloud\u003C/a> &ndash; not only for connecting \u003Ca href=\"https://www.mobileye.com/news/mobileye-tech-makes-the-grade-under-euro-ncaps-new-assisted-driving-standard/\" target=\"_blank\" rel=\"noopener\">vehicles equipped with our technology\u003C/a> out on the road, but for our own internal development as well. The switch has unlocked enormous potential, but also came with no small share of \u003Ca href=\"https://www.mobileye.com/opinion/the-challenge-of-supporting-av-at-scale/\" target=\"_blank\" rel=\"noopener\">challenges\u003C/a>. As our machine-learning algorithm developer Chaim Rand outlined in a series of recent articles and presentations, overcoming those challenges has required coming up with novel solutions. And some of those solutions innovated by our team have \u003Ca href=\"https://www.mobileye.com/blog/mobileye-leads-the-industry-in-embracing-linux-for-safety-related-applications/\" target=\"_blank\" rel=\"noopener noreferrer\">charted new territory\u003C/a> in the fields of cloud computing, artificial intelligence, and deep learning &ndash; blazing a trail for others to follow across various industries.\u003C/p>\n\u003Cp>We first started using \u003Ca href=\"https://searchaws.techtarget.com/definition/Amazon-Web-Services\" target=\"_blank\" rel=\"noopener noreferrer\">Amazon Web Services\u003C/a> at large scales a little over two years ago in order to supplement our own on-premises data center. 
Whereas an &ldquo;on-prem&rdquo; server farm is inherently limited to the amount of hardware physically installed, AWS can provide practically \u003Cem>un\u003C/em>limited storage and computational resources, on demand, and is constantly upgrading the installed infrastructure available to its users. By tapping into these resources, we have greatly increased our \u003Ca href=\"https://www.mobileye.com/news/mobileye-releases-unedited-footage-of-a-40-minute-av-ride-through-jerusalem/\" target=\"_blank\" rel=\"noopener\">development\u003C/a> capabilities. But the switch also meant that our large volumes of \u003Ca href=\"https://www.mobileye.com/en/data/\" target=\"_blank\" rel=\"noopener noreferrer\">data\u003C/a> would now need to be transmitted for use on servers separate from where they would be stored. Moving all that data around in an effective and efficient manner presented its own unique challenges.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/24d3b68376ebd6284179acdd87a30186_1626185509437.jpg\" alt=\"Mobileye is forging a new path by moving from an on-premises server farm to cloud computing with Amazon Web Services\" />\u003C/p>\n\u003Cp>As Rand presented at last year&rsquo;s AWS re:Invent conference, our machine-learning team found a solution using &ldquo;Pipe Mode&rdquo; in SageMaker (Amazon&rsquo;s cloud-based machine-learning platform), which allows for feeding the data directly to the algorithm, without significant delay or the need for (even temporary) local storage. Converting all the necessary data into a single supported format also brought the added benefit of significantly streamlining our data-creation flow in the process.\u003C/p>\n\u003Cp>This new mechanism for streaming training data also required employing new data-manipulation techniques. 
For example, Rand&rsquo;s team implemented three levels of data shuffling, each using a different method, and boosted under-represented classes of data by compiling customized manifest files. They also leveraged \u003Ca href=\"https://towardsdatascience.com/how-to-scale-training-on-multiple-gpus-dae1041f49d2\" target=\"_blank\" rel=\"noopener noreferrer\">multiple GPUs\u003C/a> to speed up more \u003Ca href=\"https://www.mobileye.com/blog/why-tops-arent-tops-when-it-comes-to-av-processors/\" target=\"_blank\" rel=\"noopener noreferrer\">compute-intensive\u003C/a> training jobs, and saved resources by both running less intensive, single-GPU evaluations separately on Amazon EC2 and taking advantage of unused capacity by using Spot Instances. All of this had to be done without exceeding the number of available &ldquo;pipes,&rdquo; and (as should come as little surprise to any developer) still required establishing procedures for debugging.\u003C/p>\n\u003Cp>Of course this is a deeply abridged summary of just one aspect of the extensive work being undertaken by our machine-learning team, which in turn is just one of the many departments operating at Mobileye. But it ought to give you a bit of a glimpse into some of what goes on behind the scenes here. 
For more detail on how we run our machine-learning operations in the cloud, you can watch Rand&rsquo;s full presentation in the video below and read his \u003Ca href=\"https://medium.com/@julsimon/making-amazon-sagemaker-and-tensorflow-work-for-you-893365184233\" target=\"_blank\" rel=\"noopener noreferrer\">in-depth guest post\u003C/a>, shared by AWS technical evangelist Julien Simon &ndash; along with Rand&rsquo;s subsequent articles on \u003Ca href=\"https://medium.com/@julsimon/deep-dive-on-tensorflow-training-with-amazon-sagemaker-and-amazon-s3-12038828075c\" target=\"_blank\" rel=\"noopener noreferrer\">training\u003C/a>, \u003Ca href=\"https://towardsdatascience.com/tensorflow-performance-analysis-314b56dceb59\" target=\"_blank\" rel=\"noopener noreferrer\">performance analysis\u003C/a>, and \u003Ca href=\"https://towardsdatascience.com/debugging-in-tensorflow-392b193d0b8\" target=\"_blank\" rel=\"noopener noreferrer\">debugging\u003C/a> in TensorFlow (with more soon to follow).\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/iW0RASdjnOk\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>","2020-12-03T08:00:00.000Z",{"id":2027,"type":5,"url":2028,"title":2029,"description":2030,"primary_tag":40,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2030,"image":2031,"img_alt":2032,"content":2033,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":2034,"tags":45},63,"how-robotaxis-will-lead-the-way-toward-the-fully-autonomous-future","Robotaxis on the Road to the Fully Autonomous Future","AVs will be at your disposal before they’ll be available to purchase. 
Here are three reasons why we believe self-driving robotaxis are the right way forward.","https://static.mobileye.com/website/us/corporate/images/03a2d036055667dc213b37800870fe7c_1602679632002.jpg","An illustration of a future robotaxi powered by Mobileye technology","\u003Cp>\u003Cspan style=\"color: #333333;\">Before we do anything here at Mobileye, we think long and hard about the right path forward. And our conclusion, when it comes to ramping up to the \u003C/span>\u003Ca style=\"color: black;\" href=\"https://www.mobileye.com/news/mobileye-releases-unedited-footage-of-a-40-minute-av-ride-through-jerusalem/\" target=\"_blank\" rel=\"noopener\">fully autonomous future\u003C/a>\u003Cspan style=\"color: #333333;\">, is that \u003C/span>\u003Ca style=\"color: #333333;\" href=\"https://www.mobileye.com/opinion/navigating-the-winding-road-toward-driverless-mobility/\" target=\"_blank\" rel=\"noopener\">robotaxis must come first\u003C/a>\u003Cspan style=\"color: #333333;\">.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #333333;\">Our \u003C/span>\u003Ca style=\"color: black;\" href=\"https://www.mobileye.com/news/welcoming-moovit-to-the-fold/\" target=\"_blank\" rel=\"noopener\">acquisition of Moovit\u003C/a>\u003Cspan style=\"color: #333333;\"> represents a major step in our pursuit of that path. 
As does every new robotaxi program we embark upon in locations already including France, \u003C/span>\u003Ca style=\"color: black;\" href=\"https://www.mobileye.com/blog/mobileye-to-deploy-robotaxis-in-japan-with-willer/\" target=\"_blank\" rel=\"noopener noreferrer\">Japan\u003C/a>\u003Cspan style=\"color: #333333;\">, South Korea, and the \u003C/span>\u003Ca style=\"color: black;\" href=\"https://www.mobileye.com/news/mobileye-is-bringing-driverless-maas-to-the-uae/\" target=\"_blank\" rel=\"noopener\">United Arab Emirates\u003C/a>\u003Cspan style=\"color: #333333;\">&hellip; with more sure to follow.\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong>What is a Robotaxi?\u003C/strong>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #333333;\">A robotaxi, for those unfamiliar with the concept, is essentially a driverless car (or shuttle bus) that you order to your location when you need it, instead of buying it and keeping it on hand for when you will. Or to put it another way, it&rsquo;s like a taxi, but driven by computer (instead of a human cab driver).\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong class=\"ql-size-large\">3 Reasons Why Robotaxis are the Right Way Forward\u003C/strong>\u003C/p>\n\u003Cp>\u003Cstrong>1) Cost\u003C/strong>\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/56dd0ccb06e1a96177df7e6c5c356bed_1626251554505.jpg\" alt=\"Mobileye&rsquo;s robotaxis represent a more cost-effective way to introduce self-driving mobility to cities around the world\" />\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #333333;\">So why are we convinced that driverless robotaxis represent the right way forward, before autonomous vehicles can enter mass production? For a number of reasons &ndash; the first of which is cost. The hardware alone that&rsquo;s required for a vehicle to operate entirely autonomously will, at the outset, be very expensive. 
AVs, per the Mobileye approach, will have two fully independent subsystems &ndash; one with cameras only, one with radars and LiDARs &ndash; in addition to \u003C/span>\u003Ca style=\"color: black;\" href=\"https://www.mobileye.com/blog/why-tops-arent-tops-when-it-comes-to-av-processors/\" target=\"_blank\" rel=\"noopener noreferrer\">powerful processors\u003C/a>\u003Cspan style=\"color: #333333;\"> in order to completely and safely replace the human driver.\u003C/span>\u003C/p>\n\u003Cp>\u003Ca style=\"color: black;\" href=\"https://www.mobileye.com/us/fleets/\" target=\"_blank\" rel=\"noopener noreferrer\">Fleet operators\u003C/a>\u003Cspan style=\"color: #333333;\"> can afford that cost, which they&rsquo;ll recuperate incrementally with each ride they serve up, amortizing the expense over time as the vehicles spend far more time on the road than a privately owned vehicle would. As development and production of all this equipment ramps up, the cost will eventually come down to a level at which it will be affordable enough for private customers.\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong>2) Regulation\u003C/strong>\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/e680f218f38b21bb39782bb77c78b3d5_1626251570269.jpg\" alt=\"Mobileye is leading the way in helping governments formulate the right new policies to regulate self-driving vehicles\" />\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #333333;\">The second is regulation. Current laws \u003C/span>\u003Ca style=\"color: black;\" href=\"https://www.mobileye.com/blog/mobileye-hits-the-autobahn-with-german-permit/\" target=\"_blank\" rel=\"noopener noreferrer\">governing motor-vehicle use\u003C/a>\u003Cspan style=\"color: #333333;\"> are almost universally predicated on the notion of their being operated by human drivers. 
Changing that model to one in which the vehicle will be essentially operated by computer won&rsquo;t happen overnight, but can more readily be phased in gradually with limited fleets of robotaxis.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #333333;\">Professional fleet operators can be more closely monitored and regulated than private consumers realistically can be. And based on their records, regulators will be better able to prepare for more widespread deployment of autonomous vehicles.\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong>3) Scale\u003C/strong>\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/64e9f1b56390eeadaf7b364ab65b1dce_1626251584612.jpg\" alt=\"Robotaxis like Mobileye&rsquo;s will be able to be rolled out at scale around the world thanks to Mobileye&rsquo;s scalable approach\" />\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #333333;\">Finally, there&rsquo;s the question of geographic scale &ndash; which principally comes down to map coverage. The general consensus across the industry is that autonomous vehicles will require highly accurate road maps, far more precise than existing GPS-based geolocation technology. 
Just think of how many times your satellite navigation system has gotten confused about where exactly you are and which direction you&rsquo;re facing (let alone which lane you&rsquo;re in), and you&rsquo;ll understand why ordinary GPS isn&rsquo;t nearly accurate enough for a fully autonomous vehicle.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #333333;\">That&rsquo;s why we developed our \u003C/span>\u003Ca style=\"color: black;\" href=\"https://www.mobileye.com/technology/rem/\" target=\"_blank\" rel=\"noopener noreferrer\">Road Experience Management (REM)\u003C/a>\u003Cspan style=\"color: #333333;\">&trade; system, which is steadily gathering the required information for compilation into our Roadbook&trade; database through our extensive crowd of Mobileye-equipped vehicles&hellip; but the process naturally takes time. Robotaxis will initially be geofenced within digitally delineated areas that will already have been sufficiently mapped. All the while, we (and our colleagues across the industry) can work on intricately mapping the rest of the world&rsquo;s roads.\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong>Unlocking Enormous Potential\u003C/strong>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #333333;\">Each of those challenges would be difficult enough to overcome on its own. Taken together, they pose an enormous roadblock on the highway to fully autonomous mobility and the potential benefits it holds in store. \u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #333333;\">With the intermediate step of deploying robotaxi services, parties both private and public will be better positioned to drive down the cost of the technology, formulate and enact the required regulations, and create the extensive high-precision maps that autonomous vehicles will need in order to proliferate into the mainstream. 
Far from a mere stopgap, though, robotaxis represent a total available market (TAM) \u003C/span>\u003Ca style=\"color: black;\" href=\"https://www.mobileye.com/opinion/there-is-more-to-our-moovit-acquisition-than-meets-the-eye/\" target=\"_blank\" rel=\"noopener\">estimated to be worth $160 billion by 2030\u003C/a>\u003Cspan style=\"color: #333333;\">, in turn presenting an economic opportunity far too promising to dismiss.\u003C/span>\u003C/p>","2020-11-12T08:00:00.000Z",{"id":2036,"type":5,"url":2037,"title":2038,"description":2039,"primary_tag":190,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2039,"image":2040,"img_alt":2041,"content":2042,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":2043,"tags":583},66,"avs-and-the-drive-for-pedestrian-safety","Autonomous Vehicles and the Drive for Pedestrian Safety","Autonomous vehicles should lead to reduced pedestrian collision rates. For National Pedestrian Safety Month, we look at the challenges of this technology.","https://static.mobileye.com/website/us/corporate/images/db4fe5def0f67a6070e5e7f28e028074_1603807413745.jpg","Digital illustration showing pedestrian-safety technology in a future autonomous vehicle","\u003Cp>Since the NHTSA has declared October \u003Ca href=\"https://www.nhtsa.gov/press-releases/national-pedestrian-safety-month-october#:~:text=U.S.%20Department%20of%20Transportation%20Designates%20October%20as%20National%20Pedestrian%20Safety%20Month%20%7C%20NHTSA\" target=\"_blank\" rel=\"noopener noreferrer\">Pedestrian Safety Month\u003C/a>, it&rsquo;s a good time to look at the impact that \u003Ca href=\"https://www.mobileye.com/news/mobileye-ranked-5-in-guidehouse-insights-automated-driving-leaderboard/\" target=\"_blank\" rel=\"noopener\">autonomous vehicles\u003C/a> (AVs) are expected to have on \u003Ca href=\"https://www.mobileye.com/blog/how-adas-and-data-can-lead-the-way-in-pedestrian-safety/\" 
target=\"_blank\" rel=\"noopener noreferrer\">pedestrian safety\u003C/a> and what Mobileye has learned in its two decades of developing advanced driver assistance technology for vehicles.\u003C/p>\n\u003Cp>According to the NHTSA, 94% of all collisions are due to human error, so naturally the expectation is that AVs will lead to lower collision rates in general &ndash; and specifically fewer pedestrian accidents &ndash; by removing the human actor from the equation. However, the challenges pedestrians pose to AVs themselves are quite complex.\u003C/p>\n\u003Cp>\u003Cstrong>Eliminating Pedestrian Deaths\u003C/strong>\u003C/p>\n\u003Cp>Gabi Hayon, Mobileye&rsquo;s Executive Vice President for Research and Development, discussed the likely effects of the advent of AVs on pedestrian safety in a \u003Ca href=\"https://pavecampaign.org/events-backup/pave-virtual-panel-autonomous-vehicles-and-the-long-road-to-eliminating-pedestrian-deaths/\" target=\"_blank\" rel=\"noopener noreferrer\">PAVE panel\u003C/a>, entitled \u003Cem>Autonomous Vehicles and the Long Road to Eliminating Pedestrian Deaths\u003C/em>.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/KrxhXzpc5cw\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>Hayon described some of these challenges: &ldquo;Usually a vehicle drives on a route. It&rsquo;s rare for a vehicle to make a very erratic movement. A walking pedestrian [however] can stop suddenly, start running, surprise you.&rdquo;\u003C/p>\n\u003Cp>Mobileye&rsquo;s AV technology tackles the problem of protecting these potentially erratic pedestrians in a few different ways. First, Mobileye&rsquo;s AV system is very robust. According to Hayon, &ldquo;We take multiple approaches. 
Some are based on object detection, others on stereoscopic analysis, our VIDAR concept, and others on pixel labeling.&rdquo;\u003C/p>\n\u003Cp>In addition, the pedestrian detection component of the system is learning how to interpret a human&rsquo;s actions in order to understand their intentions. This second element entails trying to predict how pedestrians are going to act or what they seek to achieve in given scenarios. For instance, the hand gestures of a policeman in the street guiding traffic may mean something completely different from similar gestures by someone waiting at a crosswalk.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/637ee9cacd2efb00b9c6d54158b5fa95_1626253263009.jpg\" alt=\"The self-driving technologies pioneered by Mobileye promise to keep pedestrians safer than human-driven vehicles\" />\u003C/p>\n\u003Cp>\u003Cstrong>A Mathematical Model for AV Safety\u003C/strong>\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/technology/responsibility-sensitive-safety/\" target=\"_blank\" rel=\"noopener noreferrer\">Responsibility-Sensitive Safety\u003C/a>&trade; is the third element in Mobileye&rsquo;s approach to ensuring pedestrian safety. RSS&trade; is a technology-neutral safety model developed by Mobileye. It oversees an AV&rsquo;s driving decisions, ensuring that the vehicle is making safe choices.\u003C/p>\n\u003Cp>One of the big challenges to pedestrian safety, especially in urban areas, is that pedestrians often appear suddenly, out of occluded areas &ndash; for example, from between two parked cars, not necessarily next to a pedestrian crossing sign. RSS takes this into account in the 4\u003Csup>th\u003C/sup> RSS rule, creating a &ldquo;visibility area&rdquo; and calculating a reasonable &ldquo;worst case scenario&rdquo; so the vehicle, as it were, expects the unexpected. 
This may involve giving parked vehicles a wider berth or slowing down in low-visibility areas.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/763ba3e71c489921f4579d0e9575dba6_1603870361244.png\" alt=\"The Responsibility-Sensitive Safety model (RSS) formulated by Mobileye is designed to safeguard pedestrians\" />\u003C/p>\n\u003Cp>Here, it&rsquo;s worth noting one of the most important features of RSS: it&rsquo;s an open system. This means that the calculations defining a &ldquo;worst case scenario&rdquo; are open and can even be approved by regulators. In addition, RSS is a flexible system, able to adapt to local driving conditions. Dr. Hayon mentions that Mobileye&rsquo;s AV technology is being tested in two widely divergent urban settings &ndash; \u003Ca href=\"https://www.mobileye.com/blog/mobileye-hits-the-autobahn-with-german-permit/\" target=\"_blank\" rel=\"noopener noreferrer\">Munich\u003C/a> and \u003Ca href=\"https://www.mobileye.com/news/mobileye-releases-unedited-footage-of-a-40-minute-av-ride-through-jerusalem/\" target=\"_blank\" rel=\"noopener\">Jerusalem\u003C/a> &ndash; each of which poses unique challenges of its own.\u003C/p>\n\u003Cp>The AV revolution promises to significantly improve safety for pedestrians &ndash; not to mention other road users &ndash; and it may, indeed, help us reach Vision Zero&rsquo;s goal of no casualties on our roads.
The obstacles along the way will not be easy to navigate, but Mobileye is proud to be playing its part in that effort.\u003C/p>","2020-10-28T07:00:00.000Z",{"id":2045,"type":5,"url":2046,"title":2047,"description":2048,"primary_tag":9,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2048,"image":2049,"img_alt":2050,"content":2051,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":2052,"tags":563},65,"how-adas-and-data-can-lead-the-way-in-pedestrian-safety","How ADAS and Data Can Lead the Way in Pedestrian Safety","During Pedestrian Safety Month we look back at the technologies Mobileye has developed to help ensure pedestrian safety and forward to future safety tech. ","https://static.mobileye.com/website/us/corporate/images/3309591f754b2c57122fb9ee0578684d_1603360789882.jpg","Pedestrian-detection technology in action at a busy downtown intersection crosswalk.","\u003Cp>According to the NHTSA, October is Pedestrian Safety Awareness Month – and this year the issue of pedestrian safety takes on a special importance. In a \u003Ca href=\"https://www.ghsa.org/resources/Pedestrians20\" rel=\"noopener noreferrer\" target=\"_blank\">report\u003C/a> issued early in 2020, the Governors Highway Safety Association (GHSA) announced that, according to their preliminary statistics, US pedestrian deaths in 2019 rose to 6,590. 
That is the highest number of pedestrian fatalities in more than 30 years.\u003C/p>\u003Cp>Perhaps equally concerning, the group’s statistics showed that safety advances have had far less effect on pedestrians than on other road users such as vehicle occupants – pedestrian fatalities rose by 53% between 2009 and 2018, compared with a 2% rise in other traffic deaths.\u003C/p>\u003Cp>Mobileye is committed to fighting this trend both through collision-avoidance technologies and through Mobileye Data Services, which provides the data needed by localities, authorities and others to plan and maintain safer roads.\u003C/p>\u003Cp>\u003Cstrong>In the Vehicle\u003C/strong>\u003C/p>\u003Cp>\u003Cstrong>﻿\u003C/strong>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/d1151788612dd2ff2f5f9a9941550625_1626259805916.jpg\" alt=\"Mobileye introduced the first pedestrian detection and warning systems with automatic emergency braking\">\u003C/p>\u003Cp>Throughout its history, Mobileye has been at the forefront of developing safety technology designed to protect road users, such as pedestrians. A decade ago, Mobileye introduced the first pedestrian detection and warning systems with automatic emergency braking. This allowed car manufacturers to implement systems that warn drivers of potential collisions with pedestrians in front of their vehicle and then, if the driver doesn’t take action, apply the brakes to slow the vehicle, even if the driver is distracted.\u003C/p>\u003Cp>In 2019, Mobileye released the Mobileye 8 Connect, an aftermarket product providing pedestrian collision warnings even at night*.
This is especially important as \u003Ca href=\"https://injuryfacts.nsc.org/motor-vehicle/road-users/pedestrians/\" rel=\"noopener noreferrer\" target=\"_blank\" style=\"color: windowtext;\">75% of pedestrian\u003C/a> fatalities occur at night.\u003C/p>\u003Cp>\u003Cstrong>Around the Vehicle\u003C/strong>\u003C/p>\u003Cp>\u003Cstrong>﻿\u003C/strong>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/53717d21562263536a2660d497412510_1626259823718.jpg\" alt=\"Collision-avoidance systems like Mobileye Shield+™ help bus drivers watch out for pedestrians in their huge blind spots\">\u003C/p>\u003Cp>\u003Ca href=\"https://www.mobileye.com/us/fleets/blog/driving-blind-blind-spot-protection-and-collision-avoidance/\" rel=\"noopener noreferrer\" target=\"_blank\">Blind spots, especially around large vehicles\u003C/a> such as trucks and buses, pose a particular hazard to pedestrians. According to the Amalgamated Transit Union, one \u003Ca href=\"https://www.atu.org/\" rel=\"noopener noreferrer\" target=\"_blank\">pedestrian is killed in America\u003C/a> every 10 days due to dangerous blind spots in buses.\u003C/p>\u003Cp>To help protect pedestrians from these large vehicles, we offer Mobileye Shield+™, which is equipped with our standard collision-avoidance features, helping eliminate the dangerous blind spot in front of the vehicle. In addition, cameras mounted on the vehicle’s sides warn drivers of pedestrians in the blind spots on the vehicle’s left and right. 
This feature helps prevent the hazards these vehicles pose when the vehicle is turning right while a pedestrian is passing on the right.\u003C/p>\u003Cp>\u003Cstrong>On the Roads \u003C/strong>\u003C/p>\u003Cp>\u003Cstrong>﻿\u003C/strong>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/501317895f06c2daa104d821f29eecad_1626259839682.jpg\" alt=\"Mobileye Data Services collects data on where road traffic comes close to hitting pedestrians\">\u003C/p>\u003Cp>Mobileye doesn’t just protect pedestrians through collision warnings, we’re also helping make the roads themselves safer. \u003Ca href=\"https://www.mobileye.com/en/data/\" rel=\"noopener noreferrer\" target=\"_blank\">Mobileye Data Services\u003C/a> is able to conduct a road risk assessment, collecting data on where vehicles are experiencing numerous “near-misses” with pedestrians and cyclists, allowing local authorities to identify these dangerous areas and take steps to avoid pedestrian accidents before they occur. These steps might include installation of pedestrian crosswalks, increasing crosswalk visibility, adding traffic lights or other safety measures.\u003C/p>\u003Cp>\u003Cstrong>In City Planning Rooms\u003C/strong>\u003C/p>\u003Cp>\u003Cstrong>﻿\u003C/strong>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/bb2d240da08444a45762f551e29855b3_1626259856728.png\" alt=\"Mobileye Data Services help city planners make their urban infrastructure safer for pedestrians\">\u003C/p>\u003Cp>Leveraging the power of our data collection system, Mobileye can supply local planners with highly detailed data about pedestrian mobility along with road assets like pedestrian crossing signs. This information, for instance, can have a significant impact on pedestrian safety. 
For example, if large numbers of pedestrians are using a street with no (or very narrow) sidewalks, managers may want to install (or widen) sidewalks.\u003C/p>\u003Cp>Data may show areas so heavily trafficked that they should be turned into pedestrian boulevards. Mobility and asset data can be combined for even deeper insights, locating areas where pedestrians are crossing streets without crosswalks. Since this data is updated on a recurring basis (depending on the customer’s plan and data availability), localities can look at historical data to judge how effective their actions have been.\u003C/p>\u003Cp>Pedestrian Safety Month is a good time to look back and take stock of the challenges we face, and to look forward to solutions, both technological and human, that will help keep pedestrians safer.\u003C/p>\u003Cp>\u003Cem class=\"ql-size-small\">*\u003C/em>\u003Cem class=\"ql-size-small\" style=\"color: black;\">The system can work in extremely low light but not in complete darkness. Exact specifications and limitations are set out in the relevant user manual.\u003C/em>\u003C/p>","2020-10-21T21:00:00.000Z",{"id":2054,"type":24,"url":2055,"title":2056,"description":2057,"primary_tag":9,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2057,"image":2058,"img_alt":2059,"content":2060,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":2061,"tags":2062},61,"mobileye-tech-makes-the-grade-under-euro-ncaps-new-assisted-driving-standard","Mobileye Tech Makes the Grade by Euro NCAP’s New Standard","Eight of the ten vehicles selected for evaluation feature Mobileye technology, including two of the three top performers.","https://static.mobileye.com/website/us/corporate/images/70e85490da9a9bae07ee1ef671e9cdcf_1603114274084.jpg","An illustration of Highway Assist ADAS functions like those supported by Mobileye","\u003Cp>Automotive safety is increasingly broadening in focus from equipping a vehicle to 
\u003Cem>withstand\u003C/em> the forces of impact in a collision to helping \u003Cem>avoid\u003C/em> collisions altogether and making the driving experience as safe as possible. That in turn demands an equally fundamental change in the way automotive safety is evaluated, which fortunately has been gradually taking place over the past several years. This latest development marks a significant step forward on that front, and we applaud both the initiative and the initial findings, which were dominated by vehicles equipped with Mobileye technology.\u003C/p>\n\u003Cp>This month, the European New Car Assessment Programme (Euro NCAP) rolled out its \u003Ca href=\"https://www.euroncap.com/en/vehicle-safety/safety-campaigns/2020-assisted-driving-tests/\" target=\"_blank\" rel=\"noopener noreferrer\">new assisted-driving grading\u003C/a> framework. Following \u003Ca href=\"https://www.euroncap.com/en/vehicle-safety/safety-campaigns/2018-automated-driving-tests/\" target=\"_blank\" rel=\"noopener noreferrer\">preliminary testing\u003C/a> conducted in 2018, Euro NCAP&rsquo;s new grading system aims to evaluate the \u003Ca href=\"https://www.mobileye.com/blog/buying-a-new-car-here-are-four-adas-features-to-look-for/\" target=\"_blank\" rel=\"noopener noreferrer\">Highway Assist\u003C/a> (or Traffic Jam Assist) features offered on a wide and growing range of new cars. These systems combine Adaptive Cruise Control and Lane Centering functions to keep a vehicle so-equipped cruising at a steady speed while maintaining a safe distance from the vehicle in front of it and remaining well within the boundaries of its lane.\u003C/p>\n\u003Cp>Euro NCAP chose \u003Ca href=\"https://www.euroncap.com/en/ratings-rewards/assisted-driving-gradings/\" target=\"_blank\" rel=\"noopener noreferrer\">ten vehicles\u003C/a>, equipped with some of the most advanced highway-assist systems on the market, to evaluate in this round of testing.
As one of the leading suppliers of \u003Ca href=\"https://www.mobileye.com/blog/everything-you-need-to-know-about-adas/\" target=\"_blank\" rel=\"noopener noreferrer\">ADAS technology\u003C/a> in the industry, we&rsquo;re proud to report that eight of those ten incorporate Mobileye tech &ndash; including the Audi Q8 and BMW 3 Series, representing two of the three vehicles that received Very Good ratings (the highest of four grades assigned under the new framework).\u003C/p>\n\u003Cp>&ldquo;Assisted driving technologies offer enormous benefits by reducing fatigue and encouraging safe driving,&rdquo; Euro NCAP Secretary General Dr. Michiel van Ratingen said in \u003Ca href=\"https://www.euroncap.com/en/press-media/press-releases/euro-ncap-launches-assisted-driving-grading/\" target=\"_blank\" rel=\"noopener noreferrer\">announcing\u003C/a> the new framework. &ldquo;The best systems offer a balance between the amount of assistance they provide and the level of driver engagement &ndash; and should be supported by an effective safety backup.&rdquo;\u003C/p>\n\u003Cp>&ldquo;The results of this round of tests demonstrate that driving assistance is fast becoming better and more readily available, but until driver monitoring is significantly improved, the driver needs to remain responsible at all times,&rdquo; noted van Ratingen.\u003C/p>\n\u003Cp>We enthusiastically share the emphasis that Euro NCAP has placed on the human driver&rsquo;s centrality in the codification of its new assisted-driving standard. It is, to a large degree, with these same considerations in mind that we are rolling out \u003Ca href=\"https://www.mobileye.com/solutions/super-vision/\" target=\"_blank\" rel=\"noopener noreferrer\">Mobileye SuperVision&trade;\u003C/a>. 
Our most advanced driver-assistance system yet, Mobileye SuperVision leverages technology derived directly from \u003Ca href=\"https://www.mobileye.com/news/mobileye-releases-unedited-footage-of-a-40-minute-av-ride-through-jerusalem/\" target=\"_blank\" rel=\"noopener\">our autonomous-vehicle development program\u003C/a> to deliver an array of hands-free driving capabilities, while still crucially leaving the driver in overall control of the vehicle.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/XLGv8V_c2o0\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>","2020-10-19T07:00:00.000Z","Awards, Industry, ADAS, Mobileye Inside, News",{"id":2064,"type":24,"url":2065,"title":2066,"description":2067,"primary_tag":190,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2067,"image":2068,"img_alt":2069,"content":2070,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":2071,"tags":928},64,"ee-times-publicly-examines-mobileyes-av-technology","EE Times Publicly Examines Mobileye’s AV Technology ","Noted tech journalist Junko Yoshida scrutinizes two unedited films of Mobileye’s AV navigating the crowded streets of Jerusalem. 
","https://static.mobileye.com/website/us/corporate/images/7d896a22d5ca12c3b397cc6a7565b83e_1602779851200.jpg","A screen capture from the 40-minute video showing Mobileye's autonomous vehicle driving through the streets of Jerusalem","\u003Cp>Over the past year or so,&nbsp;Mobileye&nbsp;has&nbsp;released two \u003Ca href=\"https://www.mobileye.com/news/mobileye-releases-unedited-footage-of-a-40-minute-av-ride-through-jerusalem/\">unedited&nbsp;films&nbsp;of our AV\u003C/a> navigating the busy streets of Jerusalem,&nbsp;which&nbsp;\u003Cem>EE Times&nbsp;\u003C/em>has dubbed us&nbsp;&ldquo;unique&rdquo;&nbsp;within the industry.&nbsp;The primary goal&nbsp;of these&nbsp;films, released&nbsp;(and still available)&nbsp;\u003Ca href=\"https://youtu.be/kJD5R_yQ9aw\" target=\"_blank\" rel=\"noopener noreferrer\">on YouTube\u003C/a>, was to help the general public understand where our AV technology stood at the time of publication. It was also an opportunity for&nbsp;anyone to freely &ldquo;look under the hood&rdquo; and examine, on their own, how our autonomous vehicle functions under real&nbsp;world circumstances. 
While these&nbsp;films&nbsp;were well received by both the public and experts,&nbsp;only recently were they really&nbsp;put to the test&nbsp;when&nbsp;\u003Cem>EE Times\u003C/em>,&nbsp;a&nbsp;leading&nbsp;online magazine covering the electronics industry,&nbsp;undertook to scrutinize them, frame-by-frame,&nbsp;to&nbsp;try and understand&nbsp;if&nbsp;AV technology&nbsp;is&nbsp;really&nbsp;&ldquo;seeing&rdquo; the world&nbsp;as&nbsp;humans&nbsp;do.&nbsp;\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: inherit;\">The online magazine notes that while other companies have also released footage of their AVs traveling city streets, &ldquo;[t]he&nbsp;trouble&nbsp;with these movies is that, often, some scenes are either obviously edited out or cleverly sped up.&rdquo;&nbsp;\u003C/span>&nbsp;\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: inherit;\">In an&nbsp;\u003C/span>\u003Ca style=\"background-color: inherit; color: #0563c1;\" href=\"https://www.eetimes.com/is-av-software-driver-detecting-what-we-are-seeing/\" target=\"_blank\" rel=\"noopener noreferrer\">article\u003C/a>\u003Cspan style=\"background-color: inherit;\">&nbsp;outlining the&nbsp;publication's findings,&nbsp;respected&nbsp;tech journalist Junko Yoshida notes that analyzing the films is particularly important in light of Mobileye&rsquo;s \u003C/span>\u003Ca style=\"background-color: inherit;\" href=\"https://www.mobileye.com/opinion/our-new-deal-with-geely-is-a-game-changer-says-shashua/\">new agreement with Geely\u003C/a>\u003Cspan style=\"background-color: inherit;\">&nbsp;which, in&nbsp;her&nbsp;words, &ldquo;is the first time AV software and hardware intended for self-driving vehicles are directly targeting consumer ADAS vehicles&hellip;.&rdquo;&nbsp;\u003C/span>&nbsp;\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: inherit;\">The&nbsp;article&nbsp;mentions that&nbsp;detailed analysis of&nbsp;Mobileye&rsquo;s&nbsp;video&nbsp;was made possible&nbsp;because&nbsp;the 
video&nbsp;included&nbsp;footage from&nbsp;three&nbsp;different angles,&nbsp;allowing&nbsp;comparison of the video footage with&nbsp;a representation of how the&nbsp;technology&nbsp;perceives the same scene.&nbsp;In other words,&nbsp;they could compare between the \"real world\" and the&nbsp;world as seen by the technology.\u003C/span>&nbsp;\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: inherit;\">Its intense analysis left the \u003C/span>\u003Cem style=\"background-color: inherit;\">EE Times \u003C/em>\u003Cspan style=\"background-color: inherit;\">impressed overall, but Yoshida does note&nbsp;a number of&nbsp;what she&nbsp;calls&nbsp;&ldquo;red flags,&rdquo;&nbsp;and&nbsp;so she interviews&nbsp;several experts in&nbsp;the field. They&nbsp;point&nbsp;out that&nbsp;these&nbsp;&ldquo;red flags,&rdquo;&nbsp;such as the technology&rsquo;s apparent inability to distinguish between a truck and bus,&nbsp;are not&nbsp;&ldquo;potential problems.&rdquo;&nbsp;&nbsp;\u003C/span>&nbsp;\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: inherit;\">Yoshida also interviews&nbsp;\u003C/span>Mobileye CTO Shai Shalev-Shwartz\u003Cspan style=\"background-color: inherit;\">, asking him about these&nbsp;same&nbsp;issues.&nbsp;Shai&nbsp;gives&nbsp;detailed explanations,&nbsp;noting, for&nbsp;example, that while&nbsp;an&nbsp;object may seem to &ldquo;disappear&rdquo; from the&nbsp;visualization footage, it does not&nbsp;disappear&nbsp;from the AV&rsquo;s sensors. 
He&nbsp;also&nbsp;clarifies&nbsp;that the&nbsp;technology&nbsp;understands&nbsp;that&nbsp;in reality &ldquo;things don&rsquo;t simply disappear&rdquo;&nbsp;and&nbsp;our algorithms&nbsp;account&nbsp;for that.&nbsp;\u003C/span>&nbsp;\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: inherit;\">Yoshida concludes by expressing&nbsp;her&nbsp;appreciation&nbsp;that Mobileye has&nbsp;boldly&nbsp;made this kind of detailed information available for public&nbsp;scrutiny.&nbsp;For us&nbsp;at Mobileye, this&nbsp;transparency is a crucial component in AV development. We believe there is little chance of the&nbsp;public accepting these vehicles&nbsp;without&nbsp;first&nbsp;understanding&nbsp;how they make&nbsp;the decisions they make.&nbsp;We&nbsp;encourage all those developing AVs&nbsp;to&nbsp;take up this challenge,&nbsp;which we believe is a critical step&nbsp;toward&nbsp;bringing&nbsp;AVs onto&nbsp;our streets.\u003C/span>&nbsp;\u003C/p>\n\u003Cp>&nbsp;\u003C/p>","2020-10-15T07:00:00.000Z",{"id":2073,"type":5,"url":2074,"title":2075,"description":2076,"primary_tag":9,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2076,"image":2077,"img_alt":2078,"content":2079,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":2080,"tags":563},59,"buying-a-new-car-here-are-four-adas-features-to-look-for","Buying a New Car? 
Here Are Four ADAS Features to Look For","The array of available Advanced Driver-Assistance System features keeps getting longer (and more confusing), so allow us to help you out with this list of what to look for when shopping for your next set of wheels.","https://static.mobileye.com/website/us/corporate/images/11b56659cae4c6f7f9e24b2f53aecf36_1601558405765.jpg","Shopping for a new car can be bewildering, especially with the proliferation of ADAS features.","\u003Cp>From ABS to ZEV, decoding all the acronyms on a new car&rsquo;s spec sheet can be bewildering &ndash; especially when it comes to ADAS.\u003C/p>\n\u003Cp>That&rsquo;s the first one you need to know. It stands for \u003Ca href=\"https://www.mobileye.com/blog/everything-you-need-to-know-about-adas/\" target=\"_blank\" rel=\"noopener noreferrer\">Advanced Driver-Assistance Systems\u003C/a>, and it encompasses all the electronic features that can make your car safer and easier to operate out on the road, with all its attendant dangers and hassles.\u003C/p>\n\u003Cp>But even within the realm of ADAS features, there are still ever-more acronyms to decipher and technologies to understand. So we&rsquo;ve highlighted four of the most crucial ADAS features that &ndash; in our opinion &ndash; you should look for when shopping for a new car.\u003C/p>\n\u003Cp>\u003Cstrong class=\"ql-size-large\">Lane-Keeping Assist (LKA)\u003C/strong>\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/b2c2922916bbafff8f97cb6266e11f9d_1629787928614.jpg\" alt=\"Lane-Keeping Assist (LKA) enabled by Mobileye computer-vision ADAS technology helps keep vehicles in their lane\" />\u003C/p>\n\u003Cp>\u003Cstrong>\u003Cem>What is LKA?\u003C/em>\u003C/strong>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: black;\">Veering unintentionally out of your lane (or off the road entirely) is one of the most dangerous events that can occur behind the wheel. 
In fact \u003C/span>\u003Ca style=\"color: black;\" href=\"https://www.iihs.org/news/detail/drivers-who-drift-from-lane-and-crash-often-dozing-or-ill\" target=\"_blank\" rel=\"noopener noreferrer\">the Insurance Institute for Highway Safety (IIHS) reported\u003C/a>\u003Cspan style=\"color: black;\"> that incidents involving just single vehicles departing the roadway accounted for a staggering 40 percent of fatal crashes in America in 2014 alone&hellip; and even more head-on collisions and sideswipes with other vehicles.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: black;\">Drifting out of lane is also one of the first indicators of a driver&rsquo;s drowsiness (along with other forms of impairment or distraction), which brings with it additional dangers. Lane-Keeping Assist, as the name implies, is designed to help mitigate those risks by keeping vehicles so enabled in their lane.\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong>\u003Cem>How does LKA work?\u003C/em>\u003C/strong>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: black;\">Whereas passive Lane-Departure Warning technology is designed to help mitigate these risks by alerting the driver when drifting out of lane without signaling, Lane-Keeping Assist takes it a step further by monitoring the road ahead, detecting if the vehicle is veering out of lane, and (if necessary) gently guiding it back onto the proverbial straight-and-narrow.\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong>\u003Cem>What are the benefits of LKA?\u003C/em>\u003C/strong>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: black;\">The evidence supports the importance of this ADAS feature. 
A \u003C/span>\u003Ca style=\"color: black;\" href=\"https://www.itskrs.its.dot.gov/its/benecost.nsf/ID/7178f19ffd0a24ff852583240068b379\" target=\"_blank\" rel=\"noopener noreferrer\">2015 study\u003C/a>\u003Cspan style=\"color: black;\">, for instance, determined that 67 percent of incidents in which the driver drifted out of lane and departed the roadway could (assuming a sufficient shoulder) have been avoided with the intervention of Lane-Keeping Assist technology. And that doesn&rsquo;t even account for other dangers that this technology can help mitigate, like the nightmare scenario of veering into oncoming traffic.\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong class=\"ql-size-large\">Automatic Emergency Braking (AEB)\u003C/strong>\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/eadb0217e613f472f6b919477d8586df_1629787957925.jpg\" alt=\"Automatic Emergency Braking (AEB) enabled by Mobileye computer-vision ADAS technology automatically applies the brakes to avoid collision\" />\u003C/p>\n\u003Cp>\u003Cstrong>\u003Cem>What is AEB?\u003C/em>\u003C/strong>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: black;\">Most experienced drivers have felt the shock of having to suddenly hit the brakes. In those moments, we are beyond grateful to have been able to react quickly enough. Far too often, though, human drivers are too slow to react, and a crash ensues. That&rsquo;s where Automatic Emergency Braking comes in. \u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong>\u003Cem>How does AEB work?\u003C/em>\u003C/strong>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: black;\">AEB is based on the same idea as Forward Collision Warning (FCW): detecting and preventing an imminent collision. 
But where FCW \u003C/span>\u003Cem style=\"color: black;\">passively\u003C/em>\u003Cspan style=\"color: black;\"> warns the driver to take action, AEB \u003C/span>\u003Cem style=\"color: black;\">actively\u003C/em>\u003Cspan style=\"color: black;\"> intervenes by automatically applying the brakes to avoid collision &ndash; typically reacting much more quickly than a human driver ever could.\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong>\u003Cem>What are the benefits of AEB?\u003C/em>\u003C/strong>\u003C/p>\n\u003Cp>No wonder, then, that a \u003Ca href=\"https://www.itskrs.its.dot.gov/its/benecost.nsf/ID/63713f1fa62dfaf185258466005e5db3\" target=\"_blank\" rel=\"noopener noreferrer\">recent survey\u003C/a> found AEB to be one of the most desirable driver-assistance technologies on the market, with 46 percent of respondents indicating they were &ldquo;very interested&rdquo; in this feature &ndash; second only to Blind-Spot Monitoring.\u003C/p>\n\u003Cp>Yet AEB is demonstrably more effective: the IIHS found &ldquo;forward collision warning systems plus autobrake&rdquo; (as it terms AEB) to be capable of reducing rear-end collisions by 50 percent (and resulting injuries by 56 percent), representing a far greater benefit than Blind-Spot Monitoring&hellip; or any other ADAS feature it evaluated, for that matter.\u003C/p>\n\u003Cp>\u003Cstrong class=\"ql-size-large\">Adaptive Cruise Control (ACC)\u003C/strong>\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/f02d00abe75b6572bc72842221957472_1629787973307.jpg\" alt=\"Adaptive Cruise Control (ACC) enabled by Mobileye technology makes minute adjustments to the vehicle&rsquo;s speed to adapt to the flow of traffic\" />\u003C/p>\n\u003Cp>\u003Cstrong>\u003Cem>What is ACC?\u003C/em>\u003C/strong>\u003C/p>\n\u003Cp>We&rsquo;re all familiar by now with cruise control &ndash; the feature that allows the driver to set a speed for the vehicle to maintain. 
Cruise control has been available for decades already, but Adaptive Cruise Control takes the idea even further.\u003C/p>\n\u003Cp>\u003Cstrong>\u003Cem>How does ACC work?\u003C/em>\u003C/strong>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: black;\">ACC makes minute adjustments to the vehicle&rsquo;s speed to adapt to the flow of traffic, thus making the drive as smooth as possible. If an ACC-enabled vehicle comes up on slower-moving traffic, the technology will slow the vehicle down, then bring it back up to the set speed again once the way is clear, so the driver doesn&rsquo;t have to switch the system off and on again.\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong>\u003Cem>What are the benefits of ACC?\u003C/em>\u003C/strong>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: black;\">In the realm of ADAS, ACC might be considered more of a &ldquo;convenience&rdquo; than a &ldquo;safety&rdquo; feature. But this technological feature can be rather convenient indeed, especially for drivers who spend a lot of time cruising down (or stuck in traffic on) the highway. 
Thirty-one percent of respondents to that same survey mentioned above indicated they were &ldquo;very interested&rdquo; in ACC, placing it third in the study (tied with Pedestrian Detection, and behind only AEB and Blind-Spot Monitoring).\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong class=\"ql-size-large\">Traffic-Jam Assist (TJA)\u003C/strong>\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/cf41ebcbe8aea461c8c6930b5a09f141_1629787993068.jpg\" alt=\"Traffic-Jam Assist (TJA) enabled by Mobileye computer-vision ADAS technology combines Adaptive Cruise Control with Lane-Centering\" />\u003C/p>\n\u003Cp>\u003Cstrong>\u003Cem>What is TJA?\u003C/em>\u003C/strong>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: black;\">Another &ldquo;convenience&rdquo; feature, Traffic-Jam Assist (sometimes called Highway Assist) is essentially Adaptive Cruise Control combined with Lane-Centering (another technology, which automatically steers the vehicle to help it stay centered between the lane markings). 
But where ACC is helpful mostly for smoothly cruising the highway, TJA (as its name suggests) is more helpful when encountering heavy traffic.\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong>\u003Cem>How does TJA work?\u003C/em>\u003C/strong>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: black;\">When the highway begins to resemble a parking lot, TJA manages the throttle, brakes, and steering to help alleviate the nuisance of crawling through stop-and-go traffic, while monitoring surrounding traffic to help the vehicle keep clear of other road users.\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong>\u003Cem>What are the benefits of TJA?\u003C/em>\u003C/strong>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: black;\">Traffic-Jam Assist can be particularly helpful for drivers who regularly encounter heavy congestion on their daily commute, and also helps reduce the prospect of lower-speed fender-benders.\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong class=\"ql-size-large\">Where Can I Find These Features?\u003C/strong>\u003C/p>\n\u003Cp>Each of these ADAS features is available today from various manufacturers, in many cases enabled by a single forward-facing camera powered by Mobileye technology. And \u003Ca href=\"https://www.mobileye.com/us/fleets/\" target=\"_blank\" rel=\"noopener noreferrer\">fleet operators\u003C/a> that don&rsquo;t replace their vehicles at the rapid pace of ADAS development need not be left behind: they can take advantage of ADAS technologies by retrofitting our stand-alone aftermarket device to their existing vehicles.\u003C/p>\n\u003Cp>We&rsquo;ve also taken a big step forward with the launch of \u003Ca href=\"https://www.mobileye.com/solutions/super-vision/\" target=\"_blank\" rel=\"noopener noreferrer\">Mobileye SuperVision&trade;\u003C/a>, which combines a surround-view multi-camera array, high-definition digital maps, and driving policy &ndash; all derived directly from our autonomous-vehicle program &ndash; to deliver an array of hands-free driving capabilities. 
It&rsquo;s our most comprehensive ADAS suite yet, and it&rsquo;s already heading into its first production application thanks to our new \u003Ca href=\"https://www.mobileye.com/opinion/our-new-deal-with-geely-is-a-game-changer-says-shashua/\" target=\"_blank\" rel=\"noopener\">collaboration with Chinese automotive giant Geely\u003C/a>.\u003C/p>\n\u003Cp>No matter which of these technologies your vehicle may be equipped with, though, remember that they still fall short of \u003Ca href=\"https://www.mobileye.com/news/mobileye-releases-unedited-footage-of-a-40-minute-av-ride-through-jerusalem/\" target=\"_blank\" rel=\"noopener\">full automation\u003C/a>. So stay alert out there, and continue to heed Jim Morrison&rsquo;s lyrical advice, which rings as true today as it did when The Doors first recorded &ldquo;Roadhouse Blues&rdquo; over half a century ago: &ldquo;Keep your eyes on the road, your hands upon the wheel.&rdquo;\u003C/p>","2020-10-05T07:00:00.000Z",{"id":2082,"type":654,"url":2083,"title":2084,"description":2085,"primary_tag":9,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2085,"image":2086,"img_alt":2087,"content":2088,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":2089,"tags":2090},58,"our-new-deal-with-geely-is-a-game-changer-says-shashua","Our New Deal with Geely is a Game-Changer, Says Shashua","Mobileye CEO Prof. Amnon Shashua details exactly why our latest deal, signed with China’s Geely Auto Group, is so pivotal.","https://static.mobileye.com/website/us/corporate/post/images/ffeb78641d16c2f14c868212848c77ec_1600919776892.jpg","Lynk & Co Zero Concept premium electric vehicle with advanced ADAS features powered by Mobileye","\u003Cp>The cutting-edge technologies we develop at Mobileye, we firmly believe, are only as valuable as their implementation. 
That&rsquo;s what makes this latest development such a &ldquo;game changer,&rdquo; in the words of our chief executive.\u003C/p>\n\u003Cp>This long-term agreement will extend to enabling ADAS features on numerous models across multiple brands under the vast worldwide umbrella of Geely &ndash; one of China&rsquo;s largest carmakers and an increasingly vital player in the global automotive industry. But beyond the scope of the deal itself, this development marks a major milestone for Mobileye with the launch of an entirely new product &ndash; one that brings multiple core competencies together in a way we have never brought to market before. In the words of our CEO Amnon Shashua, &ldquo;the Geely deal is significant for its size, scope, and timeline. But most importantly, it proves that there is a valuable use case for AV technology in the most advanced driver assistance systems.&rdquo;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/post/images/65b0388f192d76fe2c84e525ea1f2a70_1600938960066.png\" alt=\"Mobileye SuperVision&trade; for hands-free ADAS\" width=\"1240\" height=\"791\" />\u003C/p>\n\u003Cp>\u003Ca href=\"https://www.mobileye.com/solutions/super-vision/\" target=\"_blank\" rel=\"noopener noreferrer\">Mobileye SuperVision&trade;\u003C/a> is our new full-stack solution that puts tomorrow&rsquo;s technology on the road today. It incorporates a surround-view camera array, processors, driving policy, and high-definition maps &ndash; all derived directly from our ongoing autonomous-vehicle program. And with over-the-air update capabilities, vehicles so-equipped will be able to benefit from further technological advancements as they come down the pipeline.\u003C/p>\n\u003Cp>Many of these features will power the ADAS in the new Lynk &amp; Co Zero Concept premium electric vehicle. 
Because we&rsquo;re basing these features on our proven technologies, we&rsquo;re able to support Geely in bringing this new model into production on an unusually tight timeframe.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/post/images/99cd79b1d2f320fed7a493dded57092e_1600938978364.png\" alt=\"Mobileye SuperVision&trade; for hands-free ADAS - system architecture\" width=\"1240\" height=\"720\" />\u003C/p>\n\u003Cp>This development, in short, is the rolling manifestation of a key concept we both espouse and embrace at Mobileye &ndash; namely, in \u003Ca href=\"https://www.mobileye.com/blog/tag/amnon-shashua/\" target=\"_blank\" rel=\"noopener\">the words of our chief executive\u003C/a>, that &ldquo;it is critical to first get the technology right before bringing it to production to enable rapid scaling.&rdquo;\u003C/p>\n\u003Cp>Read more context on the deal in the \u003Ca href=\"https://www.mobileye.com/opinion/shashua-mobileye-av-stack/\" target=\"_blank\" rel=\"noopener\">latest editorial\u003C/a> from Intel senior vice president and Mobileye CEO Professor Amnon Shashua, and the \u003Ca href=\"https://www.mobileye.com/opinion/shashua-mobileye-av-stack/\" target=\"_blank\" rel=\"noopener\">details of the deal itself\u003C/a> in the news release.\u003C/p>","2020-09-24T07:00:00.000Z","ADAS, Mobileye Inside, Opinion, From our CEO",{"id":2092,"type":654,"url":2093,"title":2094,"description":2095,"primary_tag":190,"author_name":1367,"is_hidden":11,"lang":12,"meta_description":2095,"image":2096,"img_alt":2097,"content":2098,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":16,"publish_date":2099,"tags":2100},144,"shashua-mobileye-av-stack","Why the Geely Auto Group Win is a Game Changer","Fully Integrated Mobileye SuperVision System Enables Faster Production for 
OEMs","https://static.mobileye.com/website/us/corporate/images/0b83dfc359e3a4c7d85dfb4c6bc6f241_1666086729179.png","Amnon Shashua - \nSenior Vice President at Intel and President and CEO of Mobileye, an Intel Company","\u003Cp>\u003Cem>This news content was originally published on the Intel Corporation Newsroom.\u003C/em>\u003C/p>\u003Ch2>\u003Cstrong>By Amnon Shashua\u003C/strong>\u003C/h2>\u003Cp>As I have said consistently, it is critical to first get the technology right before bringing it to production to enable rapid scaling. The deal we announced with Geely Auto Group today, which utilizes our proven surround-view advanced driver-assistance system (ADAS), bears out that philosophy.\u003C/p>\u003Cp>Geely Auto Group will use Mobileye’s full-stack, 360-degree camera solution in its brand-new, premium-model, L2+ electric vehicle (EV) from Lynk &amp; Co – the Zero Concept – reaching consumers in late 2021. This system, which today we are launching as Mobileye SuperVision™, is a direct derivative of our autonomous driving program and utilizes the camera-only portion of our \u003Ca href=\"https://arxiv.org/abs/2009.03301\" rel=\"noopener noreferrer\" target=\"_blank\">truly redundant\u003C/a> sensing suite that we are developing for \u003Ca href=\"https://static.mobileye.com/dev/website/us/corporate/images/be013ffc23e3d75babbda0ed4a5019ea_1663241548611.jpg\" rel=\"noopener noreferrer\" target=\"_blank\">Level 4\u003C/a> autonomous vehicles (AVs).\u003C/p>\u003Cp>\u003Cbr>\u003C/p>\u003Cp>Delivery of such a complex solution in about a year – just one-third the usual design time – is unprecedented in the automotive industry but crucial to Geely’s ability to meet customer demand. 
Mobileye can meet such a tight schedule because of the time and effort spent refining and testing our camera-only, self-driving system for our AVs.\u003C/p>\u003Cp>Because we have remained heads-down in perfecting this technology and are not distracted by various go-to-market schemes, we can now deliver Mobileye SuperVision for commercial use quickly and in volume to our OEM partners.\u003C/p>\u003Cp>This win marks the first time Mobileye will be responsible for the full solution stack, including hardware and software, driving policy and control. Due to the complexity of the project, we will also supply a multidomain controller that will be validated for automotive and serve as a subsystem for very advanced ADAS solutions worldwide.\u003C/p>\u003Cp>It also marks the first time that an OEM has publicly noted Mobileye’s plan to provide over-the-air updates to the system after deployment. While this capability has always been in our repertoire, Geely and Mobileye want to assure customers that we can easily scale their driving-assistance features and keep everything up to date across the car’s lifetime.\u003C/p>\u003Cp>Our SuperVision camera-only solution is based on two Mobileye EyeQ5® system-on-chips – complete with seven long-range and four close-range cameras – and delivers a 360-degree surround view to enable a scalable feature bundle supporting highway hands-free, navigation-based highway-to-highway, arterial, and up to urban hands-free driving. We’re also providing our \u003Ca href=\"https://static.mobileye.com/website/common/files/rss-fact-sheet.pdf\" rel=\"noopener noreferrer\" target=\"_blank\">Responsibility-Sensitive Safety (RSS)\u003C/a> based driving policy, which helps the vehicle operate safely where lane markings may not be visible and other road users might pose a hazard. 
The Mobileye SuperVision system supports the capabilities we have shown in our \u003Ca href=\"https://youtu.be/kJD5R_yQ9aw\" rel=\"noopener noreferrer\" target=\"_blank\">drone-view videos\u003C/a> of our AVs driving in Jerusalem.\u003C/p>\u003Cp>The Geely deal is significant for its size, scope and timeline. But most importantly, it proves that there is a valuable use case for AV technology in the most advanced driver-assistance systems. We are thrilled that Geely selected Mobileye SuperVision for its new Lynk &amp; Co Zero Concept EV, and we look forward to quickly commercializing what is arguably the world’s most advanced driver-assistance solution.\u003C/p>\u003Cp>\u003Ca href=\"https://www.mobileye.com/amnon-shashua/\" rel=\"noopener noreferrer\" target=\"_blank\">\u003Cem>Professor Amnon Shashua\u003C/em>\u003C/a>\u003Cem> is senior vice president at Intel Corporation and president and chief executive officer of Mobileye, an Intel company. \u003C/em>\u003C/p>","2020-09-23T15:00:00.000Z","Autonomous Driving, News, From our CEO",{"id":2102,"type":24,"url":2103,"title":2104,"description":2105,"primary_tag":28,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2105,"image":2106,"img_alt":2105,"content":2107,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":16,"publish_date":2108,"tags":1298},143,"mobileye-av-stack","Mobileye, Geely to Offer Most Robust Driver-Assistance Features","Geely Auto Group unveiled the highly anticipated premium electric vehicle (EV), Zero Concept.","https://static.mobileye.com/website/us/corporate/images/4c47ea521e7b549bb3f46caacc51ba94_1666087211328.png","\u003Cp>\u003Cem>This news content was originally published on the Intel Corporation Newsroom.\u003C/em>\u003C/p>\u003Ch2>\u003Cstrong>New Lynk &amp; Co Electric Vehicle to Feature Mobileye SuperVision for Scalable ADAS\u003C/strong>\u003C/h2>\u003Cp>Chinese automaker Geely Auto Group unveiled its premium 
electric vehicle, the Zero Concept from Lynk &amp; Co, in September 2020 at the Beijing Auto Show. The Zero Concept EV will feature Lynk &amp; Co’s CoPilot solution powered by Mobileye SuperVision surround-view advanced driver-assistance system with over-the-air update capabilities. (Credit: Geely Auto Group)\u003C/p>\u003Cp>BEIJING, Sept. 24, 2020 – Geely Auto Group, the largest privately held auto manufacturer in China, unveiled the highly anticipated premium electric vehicle (EV), Zero Concept, from Lynk &amp; Co – a brand under Geely Auto Group – at a Lynk &amp; Co brand event held in conjunction with the Beijing Auto Show. The new Zero Concept EV will feature Lynk &amp; Co’s CoPilot solution powered by Mobileye SuperVision™ surround-view advanced driver-assistance system (ADAS) with over-the-air (OTA) update capabilities. Utilizing Mobileye’s production-ready SuperVision system based on Mobileye’s leading EyeQ5® system-on-chip (SoC) alongside Geely’s accelerated production capabilities will enable Geely Auto Group to deliver a new suite of advanced driver-assist features to consumers beginning in fall 2021.\u003C/p>\u003Cp>“We created the Lynk &amp; Co brand in 2016 with the goal of providing a new, premium experience for global consumers; to date, we have delivered over 300,000 Lynk &amp; Co units to customers. In the next phase of our growth, we will collaborate with Mobileye to deliver an entirely new driving experience that is truly unmatched,” said An Conghui, Geely Auto Group chief executive officer. 
“Lynk &amp; Co CoPilot powered by Mobileye’s SuperVision system will bring the most advanced vision-based driving-assistance technology to the production version of the Lynk &amp; Co Zero Concept, making it soon to be one of the world’s leading premium vehicles with the most robust driver-assist features.”\u003C/p>\u003Cp>\u003Cbr>\u003C/p>\u003Cp>“Our collaboration with Geely is a game changer for the global automotive industry as it brings our industry-leading surround-vision technology to market in one of the most advanced driver-assistance systems,” said Amnon Shashua, senior vice president at Intel and president and chief executive officer of Mobileye, an Intel company. “We are thrilled to help Geely offer Lynk &amp; Co drivers an exciting and advanced package of high-level driver aids and safety features, including point-to-point highway pilot and traffic-jam assist, all powered by Mobileye’s SuperVision surround-view driver-assistance system and kept current with OTA updates.”\u003C/p>\u003Cp>The collaboration between Geely and Mobileye comes amid a growing demand for electric vehicles in China and beyond, as well as increased interest in safer, cleaner transportation solutions. The future production-ready Zero Concept EV featuring Mobileye SuperVision ADAS technology will present a new, groundbreaking option for consumers as China’s EV market rapidly expands.\u003C/p>\u003Cp>Lynk &amp; Co CoPilot, powered by Mobileye’s SuperVision system, is a first-of-its-kind ADAS-to-AV scalable system, supported by the unprecedented use of surround-view cameras and other driving policy and navigation technologies powered by two EyeQ5 SoCs, Mobileye’s most advanced SoC. 
The solution brings cutting-edge safety technology to assist human drivers in a multitude of different driving scenarios.\u003C/p>\u003Cp>In addition to enabling high-level driver assistance in the Zero Concept EV over several years, Geely and Mobileye announced a high-volume ADAS agreement to equip a variety of Geely Auto Group makes and models with Mobileye vision-sensing technology. The long-term agreement will see multiple Geely Auto Group brands and vehicles outfitted with Mobileye-powered ADAS features such as automatic emergency braking and lane-keeping assist.\u003C/p>\u003Cp>\u003Cbr>\u003C/p>\u003Cp>\u003Csup>\u003Cstrong>About Geely Auto Group\u003C/strong>\u003C/sup>\u003C/p>\u003Cp>\u003Csup>Geely Auto Group is a leading automobile manufacturer based in Hangzhou, China, and was founded in 1997 as a subsidiary unit of Zhejiang Geely Holding Group (ZGH). The group manages several leading brands including Geely Auto, Lynk &amp; Co, Proton Cars, Lotus, and Geometry. The group employs more than 50,000 people and operates 12 plants and five global R&amp;D centers in Hangzhou, Ningbo, Gothenburg, Coventry and Frankfurt. The Group also boasts five global design studios in Shanghai, Gothenburg, Barcelona, California and Coventry, with over 900 members of staff in total. The Geely Auto brand has been listed on the Hong Kong stock exchange since 2005. In 2019, the brands under Geely Auto Group management sold over 1.46 million units, with Geely Auto retaining its position as the best-selling Chinese brand for three consecutive years, Lynk &amp; Co setting a new annual sales record, and a revitalized Proton returning to second place in its home market of Malaysia. The controlling shareholder in Geely Auto is Zhejiang Geely Holding Group, which is also the parent company of Volvo Car Group, Geely Commercial Vehicles Group, Geely New Technology Group and Mitime Group. 
Zhejiang Geely Holding Group is committed to vigorously pushing the development of world-renowned automotive and mobility technology brands providing high-quality products in multiple market segments to meet different levels of consumer demand. For more information, refer to \u003C/sup>\u003Ca href=\"http://global.geely.com/\" rel=\"noopener noreferrer\" target=\"_blank\">\u003Csup>\u003Cstrong>http://global.geely.com\u003C/strong>\u003C/sup>\u003C/a>\u003Csup>. \u003C/sup>\u003C/p>\u003Cp>\u003Cbr>\u003C/p>\u003Cp>\u003Csup>\u003Cstrong>About Mobileye\u003C/strong>\u003C/sup>\u003C/p>\u003Cp>\u003Csup>Mobileye is the global leader in the development of computer vision and machine learning for advanced driver assistance. Mobileye’s proven technology helps keep passengers safer on the roads, reduces the risks of traffic accidents and saves lives. Mobileye’s proprietary software algorithms and EyeQ® chips perform detailed interpretations of the visual field in order to anticipate possible collisions with other vehicles, pedestrians, cyclists, animals, debris and other obstacles. Mobileye’s products are also able to detect roadway markings such as lanes, road boundaries, barriers and similar items, as well as identify and read traffic signs, directional signs and traffic lights to assist drivers.\u003C/sup>\u003C/p>","2020-09-23T12:00:00.000Z",{"id":2110,"type":24,"url":2111,"title":2112,"description":2113,"primary_tag":934,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2113,"image":2114,"img_alt":2115,"content":2116,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":2117,"tags":2118},57,"mobileye-is-bringing-driverless-maas-to-the-uae","Mobileye is Bringing Driverless MaaS to the UAE","New collaboration with Al Habtoor Motors will ramp up to deploying self-driving robotaxis in the United Arab Emirates in 2023. 
","https://static.mobileye.com/website/us/corporate/post/images/c08944f8f74812e84d546544b5b3b2ed_1600864841534.jpg","Mobileye CEO Prof. Amnon Shashua signs deal with Al Habtoor Motors in the United Arab Emirates","\u003Cp>\u003Cspan style=\"background-color: inherit;\">Today our CEO \u003C/span>\u003Ca style=\"background-color: inherit; color: #0563c1;\" href=\"https://www.mobileye.com/blog/tag/amnon-shashua/\" target=\"_blank\" rel=\"noopener\">Prof. Amnon Shashua\u003C/a>\u003Cspan style=\"background-color: inherit;\"> revealed the details of a far-reaching collaboration with Al Habtoor Group to deploy robotaxis in Dubai in the coming years. The deal brings together both our \u003C/span>data services\u003Cspan style=\"background-color: inherit;\"> and \u003C/span>MaaS operations\u003Cspan style=\"background-color: inherit;\"> to ramp up to full service in four phases.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: inherit;\">The first will involve the creation of a comprehensive set of smart-city solutions, enabled by our \u003C/span>\u003Ca style=\"background-color: inherit; color: #0563c1;\" href=\"https://www.mobileye.com/us/fleets/products/mobileye-8-connect/\" target=\"_blank\" rel=\"noopener noreferrer\">Mobileye&reg; 8 Connect\u003C/a>\u003Cspan style=\"background-color: inherit; color: #0563c1;\">&trade;\u003C/span>\u003Cspan style=\"background-color: inherit;\">, including Road Pavement Condition Monitoring, Infrastructure Assets Monitoring, and Dynamic Mobility Mapping (which in turn includes traffic-flow management and real-time emergency response). 
This first phase will enable Mobileye to create high-definition maps of the UAE&rsquo;s roadways and the digital infrastructure necessary to implement the following phases &ndash; the second of which will involve local \u003C/span>\u003Ca style=\"background-color: inherit; color: #0563c1;\" href=\"https://www.mobileye.com/blog/mobileye-hits-the-autobahn-with-german-permit/\" target=\"_blank\" rel=\"noopener noreferrer\">testing of our autonomous vehicles\u003C/a>\u003Cspan style=\"background-color: inherit;\">, targeted to begin mid-2021.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: inherit;\">In the third phase, we&rsquo;ll begin testing our MaaS platform on the ground, initially with safety drivers on board, including tele-operation and mobility intelligence, expected to begin in 2022. Once that step is complete, we aim to launch our fully autonomous, on-demand robotaxis on the streets of the UAE in 2023.\u003C/span>\u003C/p>\n\u003Cp>\u003Ca style=\"background-color: inherit; color: #0563c1;\" href=\"https://www.habtoor.com/en/\" target=\"_blank\" rel=\"noopener noreferrer\">Al Habtoor Group\u003C/a>\u003Cspan style=\"background-color: inherit;\"> is a well-established business with a half-century of experience. Its interests in the United Arab Emirates and around the world encompass real estate, hospitality, education, publishing, and automobiles. 
\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: inherit;\">We look forward to working with Al Habtoor as we bring our Mobility-as-a-Service to yet another global location, following previously announced ventures underway in \u003C/span>\u003Ca style=\"background-color: inherit; color: #0563c1;\" href=\"https://www.mobileye.com/blog/mobileye-to-deploy-robotaxis-in-japan-with-willer/\" target=\"_blank\" rel=\"noopener noreferrer\">Japan\u003C/a>\u003Cspan style=\"background-color: inherit;\">, \u003C/span>\u003Ca style=\"background-color: inherit; color: #0563c1;\" href=\"https://techcrunch.com/2020/01/07/mobileye-expands-its-robotaxi-footprint-with-a-new-deal-in-south-korea/\" target=\"_blank\" rel=\"noopener noreferrer\">South Korea,\u003C/a> and France.\u003C/p>","2020-09-23T07:00:00.000Z","Mapping & REM, Driverless MaaS, News",{"id":2120,"type":5,"url":2121,"title":2122,"description":2123,"primary_tag":190,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2123,"image":2124,"img_alt":2125,"content":2126,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":2127,"tags":997},56,"why-tops-arent-tops-when-it-comes-to-av-processors","Why TOPs Aren’t Tops When It Comes to AV Processors","Overall performance is more important (and far more complex) than the peak number of operations a processor can execute in synthetic conditions.","https://static.mobileye.com/website/us/corporate/post/images/888e49790af667ad90401e213a6709bc_1600244948545.jpg","An illustration of Mobileye's EyeQ device on integrated circuitry","\u003Cp>There’s a race underway in the \u003Ca href=\"https://www.mobileye.com/technology/true-redundancy/\" rel=\"noopener noreferrer\" target=\"_blank\">autonomous-vehicle\u003C/a> industry. 
Not just to see who can make \u003Ca href=\"https://www.mobileye.com/technology/responsibility-sensitive-safety/\" rel=\"noopener noreferrer\" target=\"_blank\">the safest AV\u003C/a>, or who can get one to market first, but a “race for TOPs.” Is the outcome of that race, however, what really matters most when selecting the right processor for an AV?\u003C/p>\u003Cp>Shorthand for Trillions of Operations Per Second, TOPs is one metric often cited for measuring the \u003Cem>power\u003C/em> of a processor. But it’s not the only metric – or, we’d argue, even the most relevant one – in measuring the processor’s actual \u003Cem>performance\u003C/em>.\u003C/p>\u003Cp>\u003Cstrong>Looking Beyond TOPs\u003C/strong>\u003C/p>\u003Cp>To demonstrate what we mean, we commissioned Strategy Analytics to produce this paper, which poses the vital question: \u003Ca href=\"https://www.strategyanalytics.com/access-services/automotive/autonomous-vehicles/reports/report-detail/TOPS_AV_Processors\" rel=\"noopener noreferrer\" target=\"_blank\">“Should TOPs be Top of Your List When Choosing an AV Processor?”\u003C/a> The report takes a much broader look at the question than a simple TOPs count will tell you.\u003C/p>\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/075c14194bade990a161498d8ff4f5fb_1626618526033.jpg\" alt=\"Mobileye’s EyeQ System-on-a-Chip (SoC) prioritizes performance over outright power, sidestepping the industry race for TOPs\">\u003C/p>\u003Cp>At least as important is understanding the multiple workloads that an AV platform has to support. Or the impact which the level of integration between hardware and software will have on performance. 
Over-optimizing for today’s network topologies can limit future use; what’s good for one application may not be right for another; computing power impacts both cost and usability; and scalability brings definite benefits.\u003C/p>\u003Cp>\u003Cstrong>Far More Complex\u003C/strong>\u003C/p>\u003Cp>In the end, the report finds, “Raw TOPs count is not a useful metric when it comes to evaluating how much ‘headroom’ your system has for future feature growth.” That’s why we’ve avoided competing in the “race for TOPs” in developing successive iterations of our \u003Ca href=\"https://www.mobileye.com/technology/eyeq-chip/\" rel=\"noopener noreferrer\" target=\"_blank\">EyeQ series\u003C/a> of system-on-a-chip devices, which have proven capable of supporting complex and computationally intense vision processing, while crucially maintaining relatively low levels of power consumption.\u003C/p>","2020-09-15T21:00:00.000Z",{"id":2129,"type":5,"url":2130,"title":2131,"description":2132,"primary_tag":140,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2132,"image":2133,"img_alt":2134,"content":2135,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":2136,"tags":2137},22,"mobileye-leads-the-industry-in-embracing-linux-for-safety-related-applications","Embracing Linux for Safety-Related Applications","In an industry first, we’re switching to a Linux-based operating system for our next-generation EyeQ®5 chip, and opening the door for others to join us.","https://static.mobileye.com/website/us/corporate/post/images/9defe9050b35e7a1efa0cd1e6b11c745_1597827154229.jpg","Mobileye Embraces the Potential of Linux","\u003Cp>Openness and transparency are integral to everything we do at Mobileye. 
That’s why we’re switching to a Linux-based operating system for our next-generation \u003Ca href=\"https://www.mobileye.com/technology/eyeq-chip/\" rel=\"noopener noreferrer\" target=\"_blank\">EyeQ®5 chip\u003C/a> – the workhorse and backbone for all our technologies, from \u003Ca href=\"https://www.mobileye.com/solutions/super-vision/\" rel=\"noopener noreferrer\" target=\"_blank\">ADAS\u003C/a> to \u003Ca href=\"https://www.mobileye.com/technology/true-redundancy/\" rel=\"noopener noreferrer\" target=\"_blank\">autonomous vehicles\u003C/a>. And we’re opening it up to the world to both contribute to and benefit from the advancements we’re making.\u003C/p>\u003Cp>This pivotal, industry-first approach has already been verified by an external safety assessor and by our longtime collaborator BMW, both of which have crucially assisted us in proving the use of Linux in automotive safety-related applications.\u003C/p>\u003Cp>By switching from a proprietary, custom operating system to Linux for our new state-of-the-art EyeQ5, we stand to leverage an already robust and proven platform, improve dependability, and increase development flexibility, while (crucially) opening up to the entire global Linux community to help spread and fine-tune our approach to kernel qualification.\u003C/p>\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/c1e7657b4e70db81bc03e90ed2c7f592_1626618068955.jpg\" alt=\"Mobileye leads the industry by switching its EyeQ5 System-on-Chip to a Linux-based operating system\">\u003C/p>\u003Cp>\u003Cstrong>Enabling Linux In Safety Applications\u003C/strong>\u003C/p>\u003Cp>Once the hard work is done, we plan to share our approach with the world through the \u003Ca href=\"https://elisa.tech/\" rel=\"noopener noreferrer\" target=\"_blank\">Enabling Linux In Safety Applications (ELISA)\u003C/a> project, in which Mobileye and Intel have taken an active role. 
This key step will enable members of the automotive and other industries to adopt a similar approach in developing their future safety systems, and encourage standards organizations (like the ISO 26262 committee) to increase the scope of existing automotive safety standards to include a process for open-source software qualification, with the hope of increasing safety for everyone, beyond our own applications.\u003C/p>\u003Cp>“Intel and Mobileye see the Linux operating system as an important player in the functional safety software ecosystem,” said Simone Fabris, Senior Director of System Safety at Mobileye, an Intel Company, and one of five members of the ELISA governing board. “The impact and skills of the open source community will be harnessed through the ELISA project to increase the safety integrity of future embedded systems while, at the same time, contributing to a better quality, reduction of development costs and speed up the delivery of complex functional safety systems across multiple industry domains including autonomous driving and avionics.”\u003C/p>","2020-08-22T21:00:00.000Z","AV Safety, Autonomous Driving, Industry, ADAS",{"id":2139,"type":5,"url":2140,"title":2141,"description":2142,"primary_tag":140,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2142,"image":2143,"img_alt":2144,"content":2145,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":2146,"tags":2147},25,"responsibility-sensitive-safety-gains-traction-worldwide","Responsibility-Sensitive Safety Gains Traction Worldwide","Mobileye’s RSS autonomous-vehicle safety model is being embraced by a growing number of public and private companies and organizations around the world.","https://static.mobileye.com/website/us/corporate/post/images/6b190b0a2109e194bb495fd17ca8188e_1597842432531.jpg","Mobileye VP Jack Weast speaks at Mobileye DRIVES summit","\u003Cp>\u003Cspan style=\"background-color: 
inherit;\">Since its introduction in 2017, \u003C/span>\u003Ca href=\"https://www.mobileye.com/technology/responsibility-sensitive-safety/\" rel=\"noopener noreferrer\" target=\"_blank\" style=\"background-color: inherit;\">Responsibility-Sensitive Safety (RSS)\u003C/a>\u003Cspan style=\"background-color: inherit;\"> has become a leading model for global AV safety frameworks. Numerous standards bodies are beginning to include RSS in their standards. Regulators and policymakers are looking at RSS as a tool for defining what it means for an AV to drive \"safely.\" Researchers are digging into the application of RSS and looking for the boundaries of its efficacy.\u003C/span>\u003C/p>\u003Cp>\u003Cspan style=\"background-color: inherit;\">Standards progress has been especially robust, as RSS has advanced into both IEEE and ISO standards efforts recently. Intel Senior Principal Engineer and Mobileye VP of Automated Vehicle Standards \u003C/span>\u003Ca href=\"https://www.mobileye.com/blog/the-very-definition-of-safe-driving/\" rel=\"noopener noreferrer\" target=\"_blank\" style=\"background-color: inherit;\">Jack Weast\u003C/a>\u003Cspan style=\"background-color: inherit;\"> is chairing the IEEE effort to adopt a formal technical standard known as IEEE P2846: A Formal Model for Safety Considerations in Automated Vehicle Decision Making. The workgroup includes other industry representatives from, amongst others, Aptiv, Uber, FCA, Google, Nvidia and more. The full membership list (still growing) and progress on the newly formed IEEE workgroup can be found \u003C/span>\u003Ca href=\"https://sagroups.ieee.org/2846/members/\" rel=\"noopener noreferrer\" target=\"_blank\" style=\"background-color: inherit; color: inherit;\">here\u003C/a>\u003Cspan style=\"background-color: inherit;\">. If all goes as planned, IEEE 2846 1.0 will be published early next year. 
\u003C/span>\u003C/p>\u003Ciframe class=\"ql-video\" frameborder=\"0\" allowfullscreen=\"true\" src=\"https://www.youtube.com/embed/EceAB6TUYzo\" height=\"315\" width=\"560\">\u003C/iframe>\u003Cp>\u003Cbr>\u003C/p>\u003Cp>\u003Cspan style=\"background-color: inherit;\">In addition to the IEEE effort, ISO (the International Organization for Standardization) has also adopted the \u003C/span>\u003Cspan style=\"background-color: inherit; color: inherit;\">Safety First for Automated Driving (SaFAD) \u003C/span>\u003Cspan style=\"background-color: inherit;\"> paper as a technical report, which is widely seen as a first step toward turning it into a standard. This paper was published by Intel and 10 other automotive industry representatives (BMW, Daimler, VW, and more), and \u003C/span>includes RSS in the Drive Planning Element\u003Cspan style=\"background-color: inherit;\">. China ITS, the standards body for the world’s largest passenger vehicle market, has approved a proposal to use RSS as the basis for a forthcoming AV safety standard. \u003C/span>\u003C/p>\u003Cp>\u003Cspan style=\"background-color: inherit;\">In addition to these standards organizations, businesses and think-tanks are also working with RSS: \u003C/span>\u003C/p>\u003Cul>\u003Cli>\u003Cstrong style=\"background-color: inherit;\">Baidu\u003C/strong>\u003Cspan style=\"background-color: inherit;\"> – the Chinese technology company demonstrated a successful implementation of RSS during CES 2019. This was the world’s first open-source implementation of RSS. 
\u003C/span>\u003C/li>\u003Cli>\u003Ca href=\"https://valeo.com/en/valeo-signs-an-agreement-with-mobileye-to-develop-a-new-autonomous-vehicle-safety-standard/\" rel=\"noopener noreferrer\" target=\"_blank\" style=\"background-color: inherit;\">\u003Cstrong>Valeo\u003C/strong>\u003C/a>\u003Ca href=\"https://valeo.com/en/valeo-signs-an-agreement-with-mobileye-to-develop-a-new-autonomous-vehicle-safety-standard/\" rel=\"noopener noreferrer\" target=\"_blank\"> \u003C/a>– \u003Cspan style=\"background-color: inherit;\">the top-tier European-based automotive supplier will collaborate on policies and technologies intended to bolster the adoption of AV safety standards in Europe, the U.S. and China. This involves drafting of frameworks for the verification and commercial deployment of safe AVs and funding of public research on the RSS model. \u003C/span>\u003C/li>\u003Cli>\u003Ca href=\"https://www.rand.org/content/dam/rand/pubs/research_reports/RR2600/RR2662/RAND_RR2662.pdf\" rel=\"noopener noreferrer\" target=\"_blank\" style=\"background-color: inherit;\"> \u003Cstrong>The RAND Corporation\u003C/strong>\u003C/a>\u003Cstrong style=\"background-color: inherit;\"> \u003C/strong>\u003Cspan style=\"background-color: inherit;\">– the leading think tank cited RSS as a way to implement a “safety envelope,” which \u003C/span>\u003Cspan style=\"background-color: inherit; color: inherit;\">RAND\u003C/span> s\u003Cspan style=\"background-color: inherit;\">ays is needed for AVs to achieve “roadmanship.” \u003C/span>\u003C/li>\u003Cli>\u003Cspan style=\"background-color: inherit;\"> \u003C/span>\u003Cstrong style=\"background-color: inherit; color: inherit;\">Arizona’s \u003C/strong>\u003Ca href=\"https://www.azcommerce.com/news-events/news/arizona-governor-doug-ducey-creates-institute-for-automated-mobility-in-arizona/\" rel=\"noopener noreferrer\" target=\"_blank\" style=\"background-color: inherit; color: inherit;\">\u003Cstrong>Institute \u003C/strong>\u003C/a>\u003Ca 
href=\"https://www.azcommerce.com/news-events/news/arizona-governor-doug-ducey-creates-institute-for-automated-mobility-in-arizona/\" rel=\"noopener noreferrer\" target=\"_blank\" style=\"background-color: inherit;\">\u003Cstrong>of Automated Mobility (IAM)\u003C/strong> \u003C/a>\u003Cspan style=\"background-color: inherit;\">will use RSS as the foundation for its research and testing of AV safety, and recently published a proposed set of safety assessment metrics based on RSS to assess the performance of an AV in the real world. \u003C/span>\u003C/li>\u003C/ul>\u003Ciframe class=\"ql-video\" frameborder=\"0\" allowfullscreen=\"true\" src=\"https://www.youtube.com/embed/HYMnIkqYEIM\" height=\"315\" width=\"560\">\u003C/iframe>\u003Cp>\u003Cbr>\u003C/p>","2020-08-09T21:00:00.000Z","AV Safety",{"id":2149,"type":24,"url":2150,"title":2151,"description":2152,"primary_tag":9,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2152,"image":2153,"img_alt":2154,"content":2155,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":2156,"tags":2157},41,"nissan-rogue-to-showcase-mobileyezf-100-degree-adas-camera","Nissan Rogue Showcases Mobileye/ZF 100-Degree ADAS Camera","The latest implementation of our ADAS vision technology employs three cameras and a wider field of view.","https://static.mobileye.com/website/us/corporate/post/images/ec3497a7e307b9af85135ef8c3935dba_1597840019899.jpg","Mobileye/ZF 100-Degree ADAS Camera","\u003Cp>At Mobileye, we pride ourselves on offering a wide range of advanced driver assistance capabilities based solely on cameras. 
Now that range has grown even wider with \u003Ca href=\"https://www.prnewswire.com/news-releases/worlds-leading-automotive-camera-producer-zf-launches-next-generation-adas-cameras-301102043.html\" rel=\"noopener noreferrer\" target=\"_blank\">the launch of longtime partner ZF’s new S-Cam 4.8\u003C/a>.\u003Cspan style=\"color: rgb(209, 52, 56);\"> \u003C/span>\u003C/p>\u003Cp>Powered by our \u003Ca href=\"https://www.mobileye.com/technology/eyeq-chip/\" rel=\"noopener noreferrer\" target=\"_blank\">EyeQ4 processor\u003C/a>, this latest \u003Ca href=\"https://www.mobileye.com/solutions/super-vision/\" rel=\"noopener noreferrer\" target=\"_blank\">ADAS\u003C/a> camera is among the first such devices to offer a 100-degree horizontal field of view. In addition to the principal sensor, the S-Cam 4.8 incorporates an additional telephoto lens (for improved long-distance sensing capabilities) and a wide-angle fish-eye lens (for improved short-range sensing). Those added capabilities enable the deployment of more advanced semi-automated driving and represent an important step toward achieving future Euro NCAP 5-Star Safety Ratings and IIHS Top Safety Pick+ certification.\u003C/p>\u003Cp>Among the first applications of the new technology is the US-market Nissan Rogue – an immensely popular vehicle which represents an ideal platform for expanding the reach and market penetration of Mobileye technology. Last year Nissan sold over 350,000 Rogue and Rogue Sport crossovers in the United States alone, accounting for roughly a third of its total sales. 
Those numbers only stand to grow as the all-new, more technologically advanced 2021 Nissan Rogue begins reaching dealers this fall.\u003C/p>\u003Cp>\"Mobileye continues to lead the industry as it transitions towards wider FOV cameras which enhance the capacity of automatic emergency braking to address a wider range of scenarios where objects are crossing into the car's path or where the car is turning,\" said Tomer Baba, Mobileye’s Vice President for Sensing Algorithms. \"The wider FOV also allows lane-keeping and lane-centering applications to better handle sharp curves.\"\u003C/p>\u003Cp>\"The S-Cam 4.8 will offer ZF customers the opportunity to further refine systems like Automatic Emergency Braking for pedestrians and cyclists while offering best-in-class lane keeping system performance,\" added Christophe Marnat, executive vice president and general manager of ZF's Electronics and ADAS division. \"It will also offer the prospect of more semi-automated driving convenience functions like Highway Driving and Traffic Jam Assist, and ZF can provide these technologies across the full spectrum of light vehicles.\"\u003C/p>","2020-07-28T21:00:00.000Z","ADAS, Mobileye Inside, News",{"id":2159,"type":24,"url":2160,"title":2161,"description":2162,"primary_tag":9,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2162,"image":2163,"img_alt":2164,"content":2165,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":2166,"tags":2167},42,"a-big-step-for-both-ford-and-mobileye","‘A Big Step for Both Ford and Mobileye’","Over the past week, journalists and analysts have had a lot to say about the importance of our new deal with Dearborn.","https://static.mobileye.com/website/us/corporate/post/images/e4a983c421e94b3847cb5fcd2d15dd9f_1597840133165.jpg","Mobileye CEO Prof. 
Amnon Shashua on The First Trade","\u003Cp>With over 60 million vehicles (and counting) on the road around the world already equipped with our technology, Mobileye has emerged as a leader in the field of automotive safety tech. To date, more than 25 major global automakers rely on our innovations to make their vehicles safer, offering between them more than 300 models with \u003Ca href=\"https://www.mobileye.com/blog/tag/mobileye-inside/\" target=\"_blank\" rel=\"noopener\">Mobileye inside\u003C/a>. But that doesn&rsquo;t mean we don&rsquo;t have room to grow. And that&rsquo;s precisely what our newly expanded partnership with Ford, \u003Ca href=\"https://www.mobileye.com/blog/ford-bets-big-on-mobileye-tech/\" target=\"_blank\" rel=\"noopener noreferrer\">announced just last week\u003C/a>, represents. Here&rsquo;s a quick roundup of what&rsquo;s been said about this milestone in the first week following the announcement:\u003C/p>\n\u003Cp>Speaking on \u003Cem>Yahoo! Finance \u003C/em>show\u003Cem> \u003C/em>\u003Ca href=\"https://finance.yahoo.com/video/ford-intels-mobileye-expand-partnership-141043427.html\" target=\"_blank\" rel=\"noopener noreferrer\">\u003Cem>The First Trade\u003C/em>\u003C/a>, our CEO \u003Ca href=\"https://www.mobileye.com/blog/tag/amnon-shashua/\" target=\"_blank\" rel=\"noopener\">Prof. 
Amnon Shashua\u003C/a> called this &ldquo;the biggest deal that Mobileye has ever done.&rdquo; But you don&rsquo;t have to take his word for it alone.\u003C/p>\n\u003Cp>&ldquo;Mobileye continues to dominate in computer-vision-based safety systems in automotive,&rdquo; Gartner analyst Michael Ramsey told \u003Ca href=\"https://www.autonews.com/automakers-suppliers/ford-chooses-mobileye-pave-way-enhanced-driver-assist-features\" target=\"_blank\" rel=\"noopener noreferrer\">\u003Cem>Automotive News\u003C/em>\u003C/a> and \u003Ca href=\"https://www.fierceelectronics.com/electronics/intel-mobileye-and-ford-pair-driver-assist-tech-despite-car-sales-slump\" target=\"_blank\" rel=\"noopener noreferrer\">\u003Cem>Fierce Electronics\u003C/em>\u003C/a>. &ldquo;Even if higher levels of vehicle autonomy aren&rsquo;t near at hand, there is a lot of opportunity in the remainder of the fleet.&rdquo;\u003C/p>\n\u003Cp>&ldquo;As the battle to supply the global auto industry with technology for self-driving vehicles heats up, Intel&rsquo;s Mobileye notched a big win on Monday,&rdquo; \u003Ca href=\"https://fortune.com/2020/07/20/ford-intel-mobileye-self-driving-cars/\" target=\"_blank\" rel=\"noopener noreferrer\">\u003Cem>Fortune\u003C/em>\u003C/a> reported. \u003Ca href=\"https://finance.yahoo.com/news/ford-picks-intels-mobileye-technology-for-autonomous-driving-100116814.html\" target=\"_blank\" rel=\"noopener noreferrer\">\u003Cem>Yahoo! 
Finance\u003C/em>\u003C/a> echoed the sentiment, saying &ldquo;Intel&rsquo;s autonomous driving unit Mobileye continues to park some big business wins in its garage.&rdquo; \u003Ca href=\"https://www.slashgear.com/ford-intel-mobileye-eyeq-adas-smart-car-vision-crowdsource-road-changes-20629605/\" target=\"_blank\" rel=\"noopener noreferrer\">\u003Cem>SlashGear\u003C/em>\u003C/a> called the Ford deal &ldquo;a big win for Intel, which has been positioning Mobileye as its major play in ADAS tech for some time now.&rdquo;\u003C/p>\n\u003Cp>A key element of the Ford deal will see Mobileye branding appear in every new vehicle the automaker produces with our tech on board. &ldquo;The deal could also help Mobileye raise its profile among consumers,&rdquo; noted \u003Ca href=\"https://www.zdnet.com/article/ford-expands-partnership-with-mobileye-intels-autonomous-driving-business/\" target=\"_blank\" rel=\"noopener noreferrer\">\u003Cem>ZDNet\u003C/em>\u003C/a>: &ldquo;Ford plans to display the Mobileye logo in its SYNC driver-assist communication displays&rdquo; &ndash; a pivotal development that \u003Ca href=\"https://www.zacks.com/stock/news/1010391/intels-intc-mobileye-and-ford-expand-adas-partnership\" target=\"_blank\" rel=\"noopener noreferrer\">\u003Cem>Zacks\u003C/em>\u003C/a> said &ldquo;reflects the auto-maker&rsquo;s confidence-building exercise among its customers as well as growing popularity of Mobileye.&rdquo;\u003C/p>\n\u003Cp>&ldquo;Ford vehicles with Mobileye EyeQ systems will display the Mobileye logo on the SYNC infotainment screen during the boot up, much as PCs with Intel CPUs have had an &lsquo;Intel Inside&rsquo; sticker for the past couple of decades,&rdquo; \u003Ca href=\"https://www.forbes.com/sites/samabuelsamid/2020/07/20/ford-goes-mobileye-inside-for-driver-assist-systems/#54466fb738ee\" target=\"_blank\" rel=\"noopener noreferrer\">\u003Cem>Forbes\u003C/em>\u003C/a> noted, adding that Mobileye &ldquo;dominates the field of machine vision for 
ADAS systems.&rdquo;\u003C/p>\n\u003Cp>Equally important is the long-term bet that Ford is placing on Mobileye technology. &ldquo;While the two have worked together in the past, this is the first time Ford will commit to the company's technology for an entire lifecycle of vehicles,&rdquo; reported \u003Ca href=\"https://www.cnet.com/roadshow/news/ford-co-pilot360-mobileye-intel-active-safety/\" target=\"_blank\" rel=\"noopener noreferrer\">\u003Cem>CNET\u003C/em>\u003C/a>, characterizing the partnership as &ldquo;a pretty big step for both Ford and Mobileye.&rdquo;\u003C/p>\n\u003Cp>&ldquo;Nobody is close to Mobileye in ADAS. Its hardware-software solution has been the leader for a decade or so,&rdquo; automotive industry analyst Egil Juliussen told \u003Ca href=\"https://www.eetimes.com/ford-goes-all-in-with-mobileye-in-adas/\" target=\"_blank\" rel=\"noopener noreferrer\">\u003Cem>EE Times\u003C/em>\u003C/a>, whose editors discussed the development in the \u003Ca href=\"https://www.eetimes.com/podcasts/more-than-moore-ford-bets-on-mobileye-the-srn1-serial-g-12-4/\" target=\"_blank\" rel=\"noopener noreferrer\">\u003Cem>EE Times On Air\u003C/em>\u003C/a> podcast: &ldquo;Mobileye isn&rsquo;t just a hardware company. It actually offers an entire lineup of software they built on top of their hardware platform. So it&rsquo;s a very trusted partner. And in this sense, it&rsquo;s kind of interesting that Ford actually publicly said that they&rsquo;re going to do all the ADAS products in their lineup with Mobileye. 
That commitment is actually really unheard-of.\"\u003C/p>","2020-07-27T07:00:00.000Z","ADAS, Amnon Shashua, Mobileye Inside, News",{"id":2169,"type":5,"url":2170,"title":2171,"description":2172,"primary_tag":9,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2172,"image":2173,"img_alt":2174,"content":2175,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":2176,"tags":673},52,"understanding-l2-in-five-questions","Understanding L2+ in Five Questions","With Level 2+ ADAS, Mobileye is bridging the gap between Level 2 and Level 3 on the spectrum of assisted and autonomous driving capabilities.","https://static.mobileye.com/website/us/corporate/post/images/14074d9035b1e878b1e00150b127e114_1598166368222.jpg","Mobileye technology in action","\u003Cp>In 2016, as manufacturers were moving from \u003Ca href=\"https://www.mobileye.com/solutions/super-vision/\" target=\"_blank\" rel=\"noopener noreferrer\">ADAS\u003C/a> toward \u003Ca href=\"https://www.mobileye.com/technology/true-redundancy/\" target=\"_blank\" rel=\"noopener noreferrer\">autonomy\u003C/a>, the Society of Automotive Engineers (SAE) published its &ldquo;Levels of Driving Automation.&rdquo; The goal was to establish common benchmarks as technology progressed toward autonomous vehicles. This standard has been widely accepted by the automotive industry. These levels go from 0 (no autonomy), to 5 (fully autonomous). Recently a new concept, L2+ (Level 2 Plus), has been introduced into these levels.\u003C/p>\n\u003Cp>\u003Cstrong>1. When did the term L2+ enter the ADAS lexicon? \u003C/strong>\u003C/p>\n\u003Cp>It&rsquo;s a little-known fact that Mobileye introduced the term L2+ at CES in 2018. The L2+ category was conceived in 2017 when the team realized that we could apply our REM technology &ndash; which was developed originally for the AV &ndash; back to ADAS.\u003C/p>\n\u003Cp>\u003Cstrong>2. What is the 'Plus' in L2+? 
\u003C/strong>\u003C/p>\n\u003Cp>Examining the established \u003Cspan style=\"color: blue;\">six Levels of Driving Automation\u003C/span> reveals that the biggest gap in those levels is between Level 2 and Level 3. This is essentially the crossover from driver assist to some level of autonomy. In the jump between these two levels, liability shifts from the driver to the system. While it&rsquo;s not quite that simple, Mobileye found that by adding our REM maps to our ADAS solutions, we could offer something in between Level 2 and Level 3, something like enhanced ADAS that can even empower pseudo-autonomy &ndash; things like hands-free highway driving. Mobileye CEO \u003Ca href=\"https://www.mobileye.com/blog/tag/amnon-shashua/\" target=\"_blank\" rel=\"noopener\">Amnon Shashua\u003C/a> coined the term L2+ to refer to this &ndash; still technically in the driver-assist realm, but incorporating a whole new layer on top of the traditional ADAS features.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/991de65d4f631b228038486dbc828165_1674732653501.jpg\" alt=\"The 6 levels of autonomous driving\" width=\"1173\" height=\"1070\" />\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>3. How does it work? \u003C/strong>\u003C/p>\n\u003Cp>The basis of L2+ lies in heightening a vehicle&rsquo;s understanding of its path by taking data collected from the front ADAS camera and combining it with our \u003Ca href=\"https://www.mobileye.com/technology/rem/\" target=\"_blank\" rel=\"noopener noreferrer\">Mobileye Roadbook\u003C/a>, a crowdsourced high-precision map designed to help autonomous vehicles (AVs) drive. The L2+ category enhances L1-L2 driver assistance safety features with location intelligence, providing greater utility to drivers in all driving environments. 
With REM maps on-board, vehicles utilize crowdsourced data to augment their sensing, reduce uncertainties, enhance advanced driving maneuvers and enable their use in more complex driving settings.\u003C/p>\n\u003Cp>\u003Cstrong>4. What functionality does L2+ offer the driver? \u003C/strong>\u003C/p>\n\u003Cp>The enhanced functionality of L2+ can be seen in the lane-keeping assist feature, for example. In L2+ vehicles, map data enables lane-centering to remain effective even in areas where sensing-only lane centering systems may have a hard time; for example, in areas without visible lane marks or low-quality lane markings, such as ramps with sharp turns, junctions, roundabouts, newly paved roads or urban settings. L2+ also supports automatic lane changes by providing information such as lane-marking types and adjusting the drive speed according to road speed/curvature. This capability is available day or night, during challenging weather conditions such as fog, low sun, heavy rain, snow or reflecting roads, despite their impact on the front camera&rsquo;s visibility.\u003C/p>\n\u003Cp>In addition to lane detection, using a map in L2+ vehicles can improve the CIPV (current in-path vehicle) selection, measurement accuracy for pedestrians and vehicles, and even speed limit traffic signs and high/low beam (HLB) functionality.\u003C/p>\n\u003Cp>With L2+ vehicles, REM also supports new features for enhanced ADAS in urban driving, such as identifying which traffic light controls which lane and various subtle driving questions (e.g. where to stop when entering a junction, and whether a rural road is one-way or bidirectional). In essence, L2+ is a new layer of safety and convenience added on to our current ADAS offerings.\u003C/p>\n\u003Cp>\u003Cstrong>5. Is L2+ already in production or is this a plan for the future? 
\u003C/strong>\u003C/p>\n\u003Cp>The first REM map in production vehicles is on the JDM (Japanese Domestic Market) \u003Ca href=\"https://www.mobileye.com/news/nissan-rogue-to-showcase-mobileyezf-100-degree-adas-camera/\" target=\"_blank\" rel=\"noopener\">Nissan\u003C/a> Skyline &ndash; a model closely related to the Infiniti Q50 sold overseas. Although the Nissan setup is not using the map for enhanced sensing, this first application of L2+ enables hands-free driving on Japan's highways.\u003C/p>\n\u003Cp>A Volkswagen L2+ vehicle with enhanced sensing and improved control (implemented by VW) is expected in the near future. Several more major OEMs are expected to join the L2+ revolution soon as well. In fact, the mapping goals set out at CES 2018 are definitely within reach. As of today, with data from three OEMs, over 6 million kilometers of data are being mapped every day, and Mobileye AVs are driving on crowdsourced maps on a daily basis. L2+ is currently ramping up and Mobileye expects this technology to be its strongest revenue driver for years to come.\u003C/p>","2020-07-23T07:00:00.000Z",{"id":2178,"type":24,"url":2179,"title":2180,"description":2181,"primary_tag":28,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2181,"image":2182,"img_alt":2181,"content":2183,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":16,"publish_date":2184,"tags":1298},145,"mobileye-ford-high-volume-agreement-adas","Mobileye and Ford Announce High-Volume Agreement for ADAS in Global Vehicles","Mobileye, an Intel company, collaborates with Ford on cutting-edge driver-assistance systems","https://static.mobileye.com/website/us/corporate/images/ac9fb8b182a615af4f963666971d384f_1666085362056.png","\u003Cp>\u003Cem>This news content was originally published on the Intel Corporation Newsroom.\u003C/em>\u003C/p>\u003Cp>Mobileye, an Intel company, and Ford Motor Company are collaborating on cutting-edge 
driver-assistance systems across Ford’s global product lineup.\u003C/p>\u003Cp>As the chosen supplier of vision-sensing technology for Ford’s advanced driver-assistance systems (ADAS), Mobileye will provide its EyeQ\u003Csup>®\u003C/sup>&nbsp;family of devices, together with vision-processing software, to support&nbsp;\u003Ca href=\"https://www.mobileye.com/blog/understanding-l2-in-five-questions/\" rel=\"noopener noreferrer\" target=\"_blank\">Level 1 and Level 2 ADAS\u003C/a>&nbsp;in Ford vehicles globally.\u003C/p>\u003Cp>\u003Cbr>\u003C/p>\u003Cp>“It is a privilege to extend and expand our long-standing collaboration with a company that is so committed to safety on behalf of its global customer base,” said Professor Amnon Shashua, president and CEO of Mobileye. “We look forward to working closely together to bring these functionalities to market in the full Ford product lineup.”\u003C/p>\u003Cp>Working together, Ford and Mobileye have agreed to the following:\u003C/p>\u003Cul>\u003Cli>Ford and Mobileye will offer better camera-based detection capabilities for ADAS, including improved forward-collision warning; vehicle, pedestrian and cyclist detection; plus lane-keeping features.\u003C/li>\u003Cli>Mobileye will provide its suite of EyeQ sensing technology to support Ford Co-Pilot360\u003Csup>™\u003C/sup>&nbsp;Technology available ADAS features, such as Lane-Keeping System, auto high-beam headlamps, Pre-Collision Assist with Automatic Emergency Braking, and Adaptive Cruise Control with Stop-and-Go and Lane-Centering.\u003C/li>\u003Cli>Ford will display Mobileye’s name in vehicles through the inclusion of its logo in the automaker’s SYNC\u003Csup>®\u003C/sup>&nbsp;ADAS communication displays, making customers aware that some Ford Co-Pilot360 Technology features use sensing capabilities provided by Mobileye.\u003C/li>\u003C/ul>\u003Cp>Read the full news release on Ford’s website:&nbsp;\u003Ca 
href=\"https://media.ford.com/content/fordmedia/fna/us/en/news/2020/07/20/ford-mobileye-camera-based-collision-avoidance.html\" rel=\"noopener noreferrer\" target=\"_blank\">Ford and Mobileye Expand Relationship to Offer Better Camera-Based Collision Avoidance in Global Vehicles\u003C/a>\u003C/p>","2020-07-20T12:00:00.000Z",{"id":2186,"type":5,"url":2187,"title":2188,"description":2189,"primary_tag":9,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2189,"image":2190,"img_alt":2191,"content":2192,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":2193,"tags":2194},43,"ford-bets-big-on-mobileye-tech","Ford & Mobileye Sign Pivotal New ADAS Deal","This latest agreement will see Mobileye technology integrated into a broad range of Ford vehicles… and lets their drivers know about it, too.  ","https://static.mobileye.com/website/us/corporate/post/images/8b4ecb71f5ebe369117e1c7273e53f3b_1597840267575.jpg","Ford Bronco, Mustang Mach-E, and F-150... now with Mobileye Inside","\u003Cp>Does your new car or truck have Mobileye technology built in? The chances are good that it does, and that technology is only getting better&nbsp;&ndash; especially if you&rsquo;re driving a Ford. 
And what&rsquo;s more is that it&rsquo;s getting easier to tell the tech is in there&nbsp;too.&nbsp;\u003C/p>\n\u003Cp>Today the Ford Motor Company \u003Ca href=\"https://www.mobileye.com/news/mobileye-ford-high-volume-agreement-adas/\" target=\"_blank\" rel=\"noopener noreferrer\">announced a new deal\u003C/a> with Mobileye that will see our&nbsp;cutting-edge&nbsp;technology integrated into an array of current and future production models, from the latest F-150 pickup to the new Mustang Mach-E electric crossover&nbsp;and beyond.&nbsp;\u003C/p>\n\u003Cp>Building on a longstanding relationship between the two companies, this latest&nbsp;high-volume&nbsp;agreement&nbsp;will see Mobileye supply our \u003Ca href=\"https://www.mobileye.com/technology/eyeq-chip/\" target=\"_blank\" rel=\"noopener noreferrer\">EyeQ&nbsp;chip\u003C/a>&nbsp;and accompanying software to&nbsp;power&nbsp;Level 1 and Level 2 \u003Ca href=\"https://www.mobileye.com/solutions/super-vision/\" target=\"_blank\" rel=\"noopener noreferrer\">driver&nbsp;assistance features\u003C/a>&nbsp;that form&nbsp;part of&nbsp;Ford&rsquo;s Co-Pilot360 system. That&nbsp;system&nbsp;includes&nbsp;features&nbsp;such as&nbsp;adaptive cruise control, automatic emergency braking, lane-keeping assist, forward collision warning, pedestrian and cyclist detection, and automatic high-beam headlights.&nbsp;\u003C/p>\n\u003Cp>For the first time, drivers of Ford vehicles so equipped&nbsp;will&nbsp;be able to tell whose technology is enabling these features as the Mobileye logo will be showcased in the vehicles&rsquo; SYNC dashboard display. 
We hope it will be the first of many such developments across the industry that stands to increase both transparency and public awareness&nbsp;of&nbsp;Advanced&nbsp;Driver-Assistance System (ADAS) technology.&nbsp;&nbsp;\u003C/p>\n\u003Cp>This also marks the first time that Ford is committing to using Mobileye tech throughout the life of its next-generation models, setting the two companies up for&nbsp;continued&nbsp;collaboration for years to come. Beyond this agreement, the Dearborn-based automaker is also currently evaluating implementing Mobileye&rsquo;s \u003Ca href=\"https://www.mobileye.com/technology/rem/\" target=\"_blank\" rel=\"noopener noreferrer\">Roadbook\u003C/a> system in its vehicles. Roadbook leverages crowd-sourced data from our technology already inside vehicles on the road to build the incredibly precise high-definition maps&nbsp;required for&nbsp;next-generation driver-assist features&nbsp;like hands-free driving&nbsp;and&nbsp;the autonomous vehicles of the future.&nbsp;&nbsp;\u003C/p>\n\u003Cp>\u003Cspan style=\"color: black;\">&ldquo;It is a privilege to extend and expand our long-standing collaboration with a company that is so committed to safety on behalf of its global customer base,&rdquo;&nbsp;Mobileye&nbsp;CEO&nbsp;\u003C/span>and&nbsp;Intel SVP \u003Ca style=\"color: black;\" href=\"https://www.mobileye.com/blog/tag/amnon-shashua/\" target=\"_blank\" rel=\"noopener\">Professor Amnon Shashua\u003C/a>\u003Cspan style=\"color: black;\"> said upon the announcement. 
&ldquo;We look forward to working closely together to bring these functionalities to market in the full Ford product lineup.&rdquo;\u003C/span>&nbsp;\u003C/p>\n\u003Cp>\u003Cspan style=\"color: black;\">&ldquo;Providing people with extra confidence while driving is invaluable, and it&rsquo;s exactly what our available Ford Co-Pilot360 features are designed to do,&rdquo;&nbsp;added&nbsp;Lisa Drake, Ford&rsquo;s Vice President for Global Purchasing.&nbsp;&ldquo;By customizing Mobileye&rsquo;s excellent software and sensing technology, Ford&rsquo;s great driver-assist features will continue to evolve and provide customers with confidence on the road throughout the life of their vehicles.\"\u003C/span>\u003C/p>","2020-07-20T07:00:00.000Z","ADAS, Mapping & REM, Mobileye Inside",{"id":2196,"type":5,"url":2197,"title":2198,"description":2198,"primary_tag":9,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2198,"image":2199,"img_alt":2198,"content":2200,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":2201,"tags":563},162,"fact-sheet-mobileye-advanced-driver-assistance-systems-adas","Fact Sheet: Mobileye Advanced Driver-Assistance Systems (ADAS)","https://static.mobileye.com/website/us/corporate/images/bbab75153cc89b4b5f4e0b99419537be_1666085941005.png","\u003Cp>\u003Cem>This news content was originally published on the Intel Corporation Newsroom.\u003C/em>\u003C/p>\n\u003Cp>\u003Cstrong style=\"color: #252525; background-color: #ffffff;\">Inspired by Human Vision\u003C/strong>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #555555; background-color: #ffffff;\">Founded in 1999, Mobileye&reg; revolutionized the advanced driver-assistance system (ADAS) market by introducing a solution inspired by human visual perception. 
Using a vehicle-mounted camera to interpret the environment by &ldquo;sight,&rdquo; much as humans do, the Mobileye approach offered an economical way for global automakers to enhance safety in their new cars. Largely replacing legacy systems that required more expensive hardware, Mobileye began to play a leading role in the growth of cost-effective ADAS worldwide, thereby democratizing road safety for drivers, passengers, other road-users and pedestrians everywhere.\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong style=\"color: #252525; background-color: #ffffff;\">Powered by Mobileye\u003C/strong>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #555555; background-color: #ffffff;\">Since its founding, Mobileye, an Intel Company, has built on progress in sensors, computing power and artificial intelligence to enable millions of vehicles to monitor, warn, brake and steer out of harm&rsquo;s way. At the core of Mobileye-powered ADAS are camera sensors integrated into a vehicle that constantly scan the road and stream footage to Mobileye&rsquo;s custom-built processor, the EyeQ&reg;. Using real-time inference, the onboard EyeQ detects road features and driving hazards and, if necessary, alerts the driver, or, in the case of more advanced active systems, directs the vehicle to actively react to prevent unsafe driving and collisions.\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong style=\"color: #252525; background-color: #ffffff;\">Next-Generation ADAS\u003C/strong>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #555555; background-color: #ffffff;\">Now on its fifth-generation EyeQ chip, Mobileye has a proven track record of developing new solutions to meet the challenges of each new level of ADAS and automation. For example, Mobileye&rsquo;s latest EyeQ chip, the EyeQ&reg;5, will support 120-degree vision at an ultra-high resolution. 
This wider view and higher resolution enhance an ADAS system&rsquo;s performance, ensuring robust detection and response to a wider range of objects at higher speeds.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #555555; background-color: #ffffff;\">At the same time, Mobileye continuously strives to pioneer cutting-edge features and functionalities in ADAS and is applying technologies developed for future autonomous vehicles to cars already on the road today. In 2017 Mobileye introduced a new level of ADAS called \u003C/span>&ldquo;L2+&rdquo; w\u003Cspan style=\"color: #555555; background-color: #ffffff;\">hereby Mobileye technologies, including&nbsp;\u003C/span>\u003Ca style=\"color: #0071c5; background-color: #ffffff;\" href=\"https://www.mobileye.com/technology/rem/\" target=\"_blank\" rel=\"noopener noreferrer\">Mobileye&rsquo;s REM&trade; mapping technology\u003C/a>\u003Cspan style=\"color: #555555; background-color: #ffffff;\">, extend the benefits of ADAS into a broader range of driving scenarios (while also providing the necessary redundancy for autonomous driving). L2+ can include the application of 360-degree surround vision and Mobileye&rsquo;s model for safe decision-making &ndash;&nbsp;\u003C/span>\u003Ca style=\"color: #0071c5; background-color: #ffffff;\" href=\"https://www.mobileye.com/technology/responsibility-sensitive-safety/\" target=\"_blank\" rel=\"noopener\">Responsibility-Sensitive Safety (RSS)\u003C/a>\u003Cspan style=\"color: #555555; background-color: #ffffff;\">&nbsp;&ndash; whereby 360-degree surround view supports the human driver when choosing not to drive and RSS helps the driver avoid crashes due to inattention or carelessness while driving.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #555555; background-color: #ffffff;\">Bringing ADAS to more cars on the roads in more driving scenarios is the goal, and L2+ is gaining traction among the world&rsquo;s OEMs. 
Mobileye is the technology supplier for eight of the 11 L2+ production programs publicly disclosed to date, including those of Volkswagen and Nissan. And automakers with vehicles already on the road leveraging Mobileye technology to power next-generation ADAS include GM (with SuperCruise), Nissan (with ProPILOT Assist 2.0) and NIO (with Pilot).\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong style=\"color: #252525; background-color: #ffffff;\">Mobileye-Powered ADAS Features\u003C/strong>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #555555; background-color: #ffffff;\">A single Mobileye-powered camera mounted on the windshield can support the majority of common ADAS functions available in cars today. Common ADAS features supported by Mobileye vision perception include:\u003C/span>\u003C/p>\n\u003Cul>\n\u003Cli>\u003Cstrong style=\"color: #252525;\">Automatic Emergency Braking (AEB):\u003C/strong>&nbsp;Identifies an imminent collision and applies the brakes without any driver intervention.\u003C/li>\n\u003Cli>\u003Cstrong style=\"color: #252525;\">Adaptive Cruise Control (ACC):\u003C/strong>&nbsp;Automatically reduces the host vehicle&rsquo;s speed from its preset value (as in standard cruise control) when a slower vehicle is in its path, then returns it to the original preset speed when it is safe to do so.\u003C/li>\n\u003Cli>\u003Cstrong style=\"color: #252525;\">Evasive Steering Support:\u003C/strong>&nbsp;Enhances a driver&rsquo;s emergency steering when a collision is imminent.\u003C/li>\n\u003Cli>\u003Cstrong style=\"color: #252525;\">Forward Collision Warning (FCW):\u003C/strong>&nbsp;Alerts the driver that, under the current dynamics relative to the vehicle ahead, a collision is imminent.\u003C/li>\n\u003Cli>\u003Cstrong style=\"color: #252525;\">Highway Pilot/Highway Assist:\u003C/strong>&nbsp;Combines ACC and LC, allowing the vehicle to control itself during highway driving.\u003C/li>\n\u003Cli>\u003Cstrong style=\"color: #252525;\">Intelligent 
High-Beam Control (IHBC):\u003C/strong>&nbsp;Controls the vehicle&rsquo;s headlights on dark, unlit roads, automatically switching them from low beam to high beam and back according to whether there is oncoming traffic.\u003C/li>\n\u003Cli>\u003Cstrong style=\"color: #252525;\">Intelligent Speed Assist (ISA):\u003C/strong>&nbsp;Detects and classifies various traffic signs and warns the driver of speeding (in passive systems) or automatically adjusts the vehicle&rsquo;s speed (in active systems).\u003C/li>\n\u003Cli>\u003Cstrong style=\"color: #252525;\">Lane Centering (LC):\u003C/strong>&nbsp;Automatically steers the vehicle to maintain a central path within the lane.\u003C/li>\n\u003Cli>\u003Cstrong style=\"color: #252525;\">Lane Departure Warning (LDW):\u003C/strong>&nbsp;Alerts the driver to an unindicated (and therefore presumably unintended) lane departure.\u003C/li>\n\u003Cli>\u003Cstrong style=\"color: #252525;\">Lane Keeping Assist (LKA):\u003C/strong>&nbsp;Automatically steers the vehicle to stay within lane boundaries.\u003C/li>\n\u003Cli>\u003Cstrong style=\"color: #252525;\">Traffic Jam Assist (TJA):\u003C/strong>&nbsp;A combination of ACC and LC, TJA allows the vehicle to control itself under certain traffic-jam conditions.\u003C/li>\n\u003C/ul>\n\u003Cp>\u003Cstrong style=\"color: #252525; background-color: #ffffff;\">Valued Partner of Choice\u003C/strong>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #555555; background-color: #ffffff;\">Today Mobileye powers ADAS systems in more than 300 car models with 28 OEM partners, supplying almost all major global automakers with computer vision solutions. 
As of mid-2020, Mobileye collision avoidance technology had been deployed in more than 60 million vehicles worldwide, including hundreds of new car models from Audi, BMW, Ford, General Motors, Honda, Hyundai, Nissan, Volkswagen and others.\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong style=\"color: #252525; background-color: #ffffff;\">About Mobileye\u003C/strong>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #555555; background-color: #ffffff;\">Mobileye is the global leader in the development of computer vision and machine learning, data analysis, localization and mapping for advanced-driver assistance systems and autonomous driving. Mobileye&rsquo;s technology helps keep passengers safer on the roads, reduces the risks of traffic accidents, saves lives and has the potential to revolutionize the driving experience by enabling autonomous driving. Mobileye&rsquo;s proprietary software algorithms and EyeQ&reg; chips perform detailed interpretations of the visual field in order to anticipate possible collisions with other vehicles, pedestrians, cyclists, animals, debris and other obstacles. 
Mobileye&rsquo;s products are also able to detect roadway markings such as lanes, road boundaries, barriers and similar items; identify and read traffic signs, directional signs and traffic lights; create a RoadBook&trade; of localized drivable paths and visual landmarks using REM&trade;; and provide mapping for autonomous driving.\u003C/span>\u003C/p>","2020-07-19T07:00:00.000Z",{"id":2203,"type":5,"url":2204,"title":2205,"description":2206,"primary_tag":190,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2206,"image":2207,"img_alt":2208,"content":2209,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":2210,"tags":1095},44,"mobileye-hits-the-autobahn-with-german-permit","Mobileye Hits the Autobahn with German Permit","Our latest AV testing permit will allow us to evaluate our self-driving system on public roads in Germany, from the streets of Munich to the country’s world-famous highway network. ","https://static.mobileye.com/website/us/corporate/post/images/94464aabd03d7d01efc97ac03831e547_1597840480697.jpg","Mobileye AV in Germany","\u003Cp>Obtaining a license to test autonomous vehicle prototypes in Germany is no simple feat. But \u003Ca href=\"https://www.mobileye.com/news/mobileye-testing-self-driving-vehicles-germany/\" target=\"_blank\" rel=\"noopener noreferrer\">Mobileye is proud to have achieved exactly that\u003C/a>.\u003C/p>\n\u003Cp>After completing a series of rigorous safety tests and technical briefs, Mobileye has obtained authorization to test its AV prototypes on public roads across Germany. 
That includes not only urban and rural roads, but also the country&rsquo;s famed Autobahn highway network, giving Mobileye the rare ability and authorization to test its self-driving system (SDS) in traffic at speeds as high as 130 kilometers per hour (over 80 mph).\u003C/p>\n\u003Cp>It&rsquo;s a permit typically granted only to OEMs &ndash; that is, the automakers that design, engineer, and manufacture complete automobiles... many with Mobileye technology on board. Other companies have primarily had to make do with testing in closed environments or in simulators. This rare exception makes Mobileye one of the first technology suppliers to make the proverbial grade.\u003C/p>\n\u003Cp>&ldquo;With the T&Uuml;V S&Uuml;D AV-permit we bring in our broad expertise as a neutral and independent third party on the way to safe and secure automated mobility of the future,&rdquo; said Patrick Fruth, head of the mobility division at Munich-based technical service provider T&Uuml;V S&Uuml;D, whose independent assessment made the permit possible. &ldquo;Our demanding assessment framework and test procedure considers state-of-the-art approaches to safety and combines physical real-world tests and scenario-based simulations.&rdquo;\u003C/p>\n\u003Cp>Permit in hand, Mobileye will begin testing on the streets in and around Munich &ndash; BMW&rsquo;s hometown, just an hour&rsquo;s drive south along Autobahn 9 from Audi&rsquo;s seat in Ingolstadt, and a few hours east on the A8 from the Stuttgart headquarters of Daimler and Porsche. From the Bavarian capital, AV testing can expand to other parts of Germany. 
Mobileye hopes to begin open-road testing in additional countries by the end of the year, among them France, \u003Ca href=\"https://www.mobileye.com/blog/mobileye-to-deploy-robotaxis-in-japan-with-willer/\" target=\"_blank\" rel=\"noopener noreferrer\">Japan\u003C/a>, and South Korea.\u003C/p>\n\u003Cp>These tests help inform the development of Mobileye&rsquo;s \u003Ca href=\"https://www.mobileye.com/technology/true-redundancy/\" target=\"_blank\" rel=\"noopener noreferrer\">self-driving system\u003C/a> for use in robotaxis (leveraging \u003Ca href=\"https://www.mobileye.com/news/welcoming-moovit-to-the-fold/\" target=\"_blank\" rel=\"noopener\">newly acquired sister-company Moovit&rsquo;s mobility platform\u003C/a>) and the consumer autonomous vehicles to follow.\u003C/p>\n\u003Cp>&ldquo;Mobileye is eager to show the world our best-in-class self-driving vehicle technology and safety solutions as we get closer to making safe, affordable self-driving mobility solutions and consumer vehicles a reality,&rdquo; noted Johann Jungwirth, vice president of Mobility-as-a-Service (MaaS) at Mobileye. &ldquo;The new AV Permit provides us an opportunity to instill even more confidence in autonomous driving with future riders, global automakers and international transportation agencies. 
We thank T&Uuml;V S&Uuml;D for their trusted collaboration as we expand our AV testing to public roads in Germany.&rdquo;\u003C/p>","2020-07-16T07:00:00.000Z",{"id":2212,"type":24,"url":2213,"title":2214,"description":2215,"primary_tag":190,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2215,"image":2216,"img_alt":2217,"content":2218,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":16,"publish_date":2210,"tags":928},154,"mobileye-testing-self-driving-vehicles-germany","Mobileye Starts Testing Self-Driving Vehicles in Germany","Mobileye receives an automated vehicle testing permit recommendation from the independent technical service provider TÜV SÜD. ","https://static.mobileye.com/website/us/corporate/images/11a41ffe1fe0c15ab65657b962d84bbf_1666085036814.png","In July 2020, Mobileye announced that Germany’s independent technical service provider, TÜV Süd, had awarded it an automated vehicle testing permit.","\u003Cp>\u003Cem>This news content was originally published on the Intel Corporation Newsroom.\u003C/em>\u003C/p>\n\u003Cp>\u003Cstrong>What&rsquo;s New:\u003C/strong>&nbsp;Mobileye, an Intel company, received an automated vehicle (AV) testing permit recommendation from the independent technical service provider T&Uuml;V S&Uuml;D. As one of the leading experts in the field of safe and secure automated driving, T&Uuml;V S&Uuml;D enabled Mobileye to obtain approval from German authorities by validating the vehicle and functional safety concepts of Mobileye&rsquo;s AV test vehicle. This allows Mobileye to perform AV testing anywhere in Germany, including urban and rural areas as well as the Autobahn at regular driving speed of up to 130 kilometers per hour. 
The AV testing in Germany in real-world traffic is starting now in and around Munich.\u003C/p>\n\u003Cp>&ldquo;Mobileye is eager to show the world our best-in-class self-driving vehicle technology and safety solutions as we get closer to making safe, affordable self-driving mobility solutions and consumer vehicles a reality. The new AV Permit provides us an opportunity to instill even more confidence in autonomous driving with future riders, global automakers and international transportation agencies. We thank T&Uuml;V S&Uuml;D for their trusted collaboration as we expand our AV testing to public roads in Germany.&rdquo; &ndash;Johann Jungwirth, vice president, Mobility-as-a-Service (MaaS), Mobileye\u003C/p>\n\u003Cp>\u003Cstrong>Why It Matters:\u003C/strong> Mobileye is one of the first non-OEM companies to receive a permit to test AVs on open roads in Germany. Until now, AV test drives in Germany have primarily taken place in closed and simulated environments. The basis for the independent vehicle assessment by T&Uuml;V S&Uuml;D in Germany builds on Mobileye&rsquo;s existing program.\u003C/p>\n\u003Cp>&ldquo;With the T&Uuml;V S&Uuml;D AV-permit we bring in our broad expertise as a neutral and independent third party on the way to safe and secure automated mobility of the future,&rdquo; said Patrick Fruth, CEO Division Mobility, T&Uuml;V S&Uuml;D. &ldquo;Our demanding assessment framework and test procedure considers state-of-the-art approaches to safety and combines physical real-world tests and scenario-based simulations.&rdquo;\u003C/p>\n\u003Cp>With the ability to test automated vehicles with a safety operator on public roads in Germany, Mobileye is taking another significant step toward the goal of a driverless future. The move comes on the heels of Mobileye&rsquo;s acquisition of Moovit, a leading MaaS solutions company, as well as recent collaborations to test and deploy self-driving vehicles in France, Japan and Korea. 
The new testing permit strengthens Mobileye&rsquo;s growing global leadership position as both an AV technology provider and a complete mobility solutions provider.\u003C/p>\n\u003Cp>\u003Cstrong>How It Works:\u003C/strong>&nbsp;The new permit will allow Mobileye to demonstrate to the global automotive industry and partners the safety, functionality and scalability of its unique self-driving system (SDS) for MaaS and consumer autonomous vehicles. The Mobileye SDS comprises the industry&rsquo;s most advanced vision sensing technology, True Redundancy with two independent perception sub-systems, crowd-sourced mapping in the form of Road Experience Management&trade; (REM&trade;) and its pioneering Responsibility-Sensitive Safety (RSS) driving policy.\u003C/p>\n\u003Cp>Although the first tests of AVs using Mobileye&rsquo;s SDS will be completed in Munich, the company also plans to perform AV testing in other parts of Germany. In addition, Mobileye expects to scale open-road testing in other countries before the end of 2020.\u003C/p>\n\u003Cp>In order to obtain the authorization, Mobileye-powered AV test vehicles underwent a series of rigorous safety tests and provided comprehensive technical documentation. Part of the application also included a detailed hazard analysis, vehicle safety and functional safety concepts, and proof that the cars can be safely integrated into public road traffic &ndash; an assessment that was made possible using Mobileye&rsquo;s RSS.\u003C/p>\n\u003Cp>\u003Cstrong>More Context:\u003C/strong>&nbsp;As Mobileye begins self-driving vehicle testing in Germany, Mobileye and Moovit will start demonstrating full end-to-end ride hailing mobility services based on Moovit&rsquo;s mobility platform and apps using Mobileye&rsquo;s AVs. 
Intel is pursuing the goal of continuing to develop pioneering technologies together with Mobileye and Moovit that will make roads safer for all road users while also improving mobility access for all.\u003C/p>\n\u003Cp>In addition to the development of market-ready technologies, an important prerequisite is the worldwide mapping of roads. Mobileye has already successfully laid the foundations with REM. In cooperation with various automobile manufacturers, data from 25 million vehicles is expected to be collected by 2025. Mobileye is creating high-definition maps of the worldwide road infrastructure as the basis for safe autonomous driving. Millions of kilometers of roads across the globe are mapped every day with the REM technology.\u003C/p>\n\u003Cp>Together, Intel, Mobileye and Moovit are driving forward the implementation of their mobility-as-a-service strategy. This strategy offers society and individuals solutions to today&rsquo;s major social costs of transportation. The goal is to make mobility safe, accessible, clean, affordable and convenient, so that people can travel efficiently, flexibly and smartly from Point A to Point B. All means of transport &mdash; from public transport to car and bike sharing services to ride hailing and ride sharing with self-driving vehicles &mdash; will be bundled within one service offering of Moovit and Mobileye, smartly managed by Moovit&rsquo;s mobility intelligence platform. 
The advantages are manifold: traffic congestion is minimized, emissions are reduced, and people are given equal and affordable access to mobility &mdash; an approach that is a top priority at Intel.\u003C/p>",{"id":2220,"type":5,"url":2221,"title":2222,"description":2223,"primary_tag":140,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2223,"image":2224,"img_alt":2225,"content":2226,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":2227,"tags":2228},45,"the-very-definition-of-safe-driving","The Very Definition of Safe Driving","Jack Weast, Mobileye’s Vice President of Autonomous Vehicle Standards, speaks with PAVE about the challenge of defining the very concept of safety for self-driving cars. ","https://static.mobileye.com/website/us/corporate/post/images/bd6c5078935f19c00dd0cd4e2011053a_1597840674337.jpg","PAVE online panel discussion","\u003Cp>Safety isn&rsquo;t just another issue at Mobileye, or even a mere priority. It&rsquo;s our \u003Cem>raison d&rsquo;&ecirc;tre\u003C/em>, integral to our very purpose. After all, we&rsquo;ve been developing car-safety technology for two decades already. But before we can develop the technologies to make \u003Ca href=\"https://www.mobileye.com/technology/true-redundancy/\" target=\"_blank\" rel=\"noopener noreferrer\">autonomous vehicles\u003C/a>, we first need to define what &ldquo;safe&rdquo; actually means &ndash; not in general terms, of course, but within the context of autonomous vehicle technology.\u003C/p>\n\u003Cp>Seems pretty simple, right? The idea of safety is, after all, one of the first concepts we learn to embrace as young children. 
But when you&rsquo;re talking about&nbsp;human drivers&nbsp;surrendering&nbsp;any&nbsp;degree of&nbsp;control of&nbsp;their&nbsp;vehicles to a computerized system, the concept of &ldquo;safety&rdquo; and what it entails is anything but straightforward.&nbsp;\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cdiv style=\"padding: 56.25% 0 0 0; position: relative;\">\u003Ciframe style=\"position: absolute; top: 0; left: 0; width: 100%; height: 100%;\" title=\"Jack-Weast-PAVE-panel\" src=\"https://player.vimeo.com/video/789962551?h=7294b42000&amp;badge=0&amp;autopause=0&amp;player_id=0&amp;app_id=58479\" width=\"100%\" height=\"315\" frameborder=\"0\" data-mce-fragment=\"1\">\u003C/iframe>\u003C/div>\n\u003Cp>Fortunately,&nbsp;we have some of our best minds working on it. Like that of Jack&nbsp;Weast, Mobileye&rsquo;s&nbsp;Vice&nbsp;President of Autonomous Vehicle Standards and a&nbsp;Senior&nbsp;Principal&nbsp;Researcher at our parent company Intel. And&nbsp;far from keeping our thinking a closely guarded secret,&nbsp;at Mobileye we believe it&rsquo;s critical for the industry to tackle this issue as one, and&nbsp;thereby to&nbsp;\u003Ca href=\"https://www.mobileye.com/technology/responsibility-sensitive-safety/\" target=\"_blank\" rel=\"noopener noreferrer\">establish industry standards\u003C/a> on how to define and program for safety.&nbsp;In that vein,&nbsp;Weast recently&nbsp;took part in&nbsp;an online panel discussion hosted by Partners for Automated Vehicle Education&nbsp;(PAVE), where he was joined by Toyota researcher&nbsp;Jennifer Dawson and&nbsp;Mike&nbsp;Scrudato&nbsp;of&nbsp;Munich Reinsurance America.&nbsp;\u003C/p>\n\u003Cp>If you missed&nbsp;the live webcast, you&nbsp;can&nbsp;watch the full discussion below.\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/Gw8Jde2Nx3Y\" width=\"560\" height=\"315\" frameborder=\"0\" 
allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>","2020-07-15T07:00:00.000Z","AV Safety, Video, Events",{"id":2230,"type":5,"url":2231,"title":2232,"description":2233,"primary_tag":9,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2233,"image":2234,"img_alt":2235,"content":2236,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":2237,"tags":563},54,"everything-you-need-to-know-about-adas","Everything You Need to Know About ADAS","This handy fact sheet tells you all about Advanced Driver Assistance Systems and how Mobileye has become a leader in the field.","https://static.mobileye.com/website/us/corporate/post/images/1257cb1160fca1d734fc58ec56677baa_1599371702476.jpg","Mobileye ADAS tech on the highway.","\u003Cp>Mobileye has been developing \u003Ca href=\"https://www.mobileye.com/solutions/super-vision/\" target=\"_blank\" rel=\"noopener noreferrer\">Advanced Driver Assistance Systems (ADAS)\u003C/a> for over two decades now, since our founding at the turn of the millennium. No fewer than twenty-eight global automakers today depend on our advancements to power the ADAS features in more than 300 models currently on the market.\u003C/p>\n\u003Cp>By this point, our technology has been integrated into more than 60 million vehicles. To put that number into perspective, consider that if you lined all those cars up end to end, they&rsquo;d lap around the world seven times. And what&rsquo;s more, our numbers are only growing.\u003C/p>\n\u003Cp>Even as we broaden our horizons to new and additional realms of AI-enhanced mobility (like \u003Ca href=\"https://www.mobileye.com/technology/true-redundancy/\" target=\"_blank\" rel=\"noopener noreferrer\">autonomous driving\u003C/a> and Mobility-as-a-Service), ADAS remains at the core of our business. 
And far from resting on our hard-earned laurels and &ldquo;simply&rdquo; expanding the application of our existing ADAS features to more vehicles, we&rsquo;re continuously honing our technology and pioneering \u003Ca href=\"https://www.mobileye.com/blog/understanding-l2-in-five-questions/\" target=\"_blank\" rel=\"noopener noreferrer\">entirely new categories of driver-assistance technologies\u003C/a> to keep moving the game forward to the benefit of everyone on the road.\u003C/p>\n\u003Cp>But what, exactly, is ADAS? What specific features does our technology enable? And how are we leveraging our leadership in the field to drive the industry forward? For the answers to these questions and more, \u003Ca href=\"https://newsroom.intel.com/articles/fact-sheet-mobileye-advanced-driver-assistance-systems-adas/\" target=\"_blank\" rel=\"noopener noreferrer\">check out this handy fact sheet\u003C/a> over in the Intel Newsroom. Even if you&rsquo;re well-schooled on the subject, you just might learn something new.\u003C/p>","2020-07-09T07:00:00.000Z",{"id":2239,"type":5,"url":2240,"title":2241,"description":2242,"primary_tag":40,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2242,"image":2243,"img_alt":2244,"content":2245,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":2246,"tags":1878},40,"mobileye-to-deploy-robotaxis-in-japan-with-willer","Mobileye to Deploy Robotaxis in Japan with Willer","Self-driving Mobility-as-a-Service to expand across Asia via collaboration between Mobileye and Willer, one of the largest transport operators in the Far East.","https://static.mobileye.com/website/us/corporate/post/images/6aecb50b9282124f2a61538a4c0a79f3_1597839641581.jpg","Willer Express","\u003Cp>While it may be some time before you can buy a car that can drive itself, all by itself, in any environment, that doesn&rsquo;t mean you won&rsquo;t be able to ride in a self-driving vehicle 
before then. We&rsquo;re referring, of course, to robotaxis &ndash; \u003Ca href=\"https://www.mobileye.com/technology/true-redundancy/\" target=\"_blank\" rel=\"noopener noreferrer\">autonomous vehicles\u003C/a> available on demand. And this latest collaboration will see Mobileye&rsquo;s technology driving robotaxis in yet further locations around the world.\u003C/p>\n\u003Cp>Mobileye has inked \u003Ca href=\"https://www.mobileye.com/news/mobileye-willer-self-driving-mobility-solutions/\" target=\"_blank\" rel=\"noopener noreferrer\">a new partnership with Willer\u003C/a>, one of the largest transportation operators in the Far East, which will see the two companies test and then deploy autonomous transportation solutions in Japan, Taiwan, and additional countries in Southeast Asia. Mobileye will supply the vehicles equipped with our self-driving system, which Willer will deploy and operate together with its local partners, specially catered to the local markets, regulations, and user preferences.\u003C/p>\n\u003Cp>We aim to begin testing robotaxis on public roads in Japan starting next year, with an eye towards launching fully self-driving ride-hailing and ride-sharing services in the Pacific island nation starting in 2023. Based on the potential success of that initial program, Mobileye and Willer can evaluate expanding to additional countries in the region. In addition to its 150 local partners in Japan, Willer currently operates a range of transportation solutions with partners in Taiwan, Vietnam, and Singapore.\u003C/p>\n\u003Cp>&ldquo;Collaboration with Mobileye is highly valuable for Willer and a big step moving forward to realize our vision of innovating transportation services: travel anytime and anywhere by anybody,&rdquo; commented Shigetaka Murase, founder and CEO of Willer. 
&ldquo;Innovation of transportation will lead to a smarter, safer and more sustainable society where people enjoy higher quality of life.&rdquo;\u003C/p>\n\u003Cp>The partnership with Willer will enable Mobileye to further extend the geographic reach of its \u003Ca href=\"https://www.mobileye.com/news/welcoming-moovit-to-the-fold/\" target=\"_blank\" rel=\"noopener\">Mobility-as-a-Service (MaaS)\u003C/a> ambitions. We&rsquo;re currently working with Volkswagen and its local importer Champion Motors to launch a fleet of robotaxis in Tel Aviv, and with Daegu City to launch another such program on the streets of the South Korean metropolis.\u003C/p>\n\u003Cp>&ldquo;Our new collaboration with Willer brings a meaningful addition to Mobileye&rsquo;s growing global network of transit and mobility ecosystem partners,&rdquo; added Mobileye CEO and Intel SVP \u003Ca href=\"https://www.mobileye.com/blog/tag/amnon-shashua/\" target=\"_blank\" rel=\"noopener\">Prof. Amnon Shashua\u003C/a>. &ldquo;We look forward to collaborating with Willer as we work together for new mobility in the region by bringing self-driving mobility services to Japan, Taiwan and ASEAN markets.&rdquo;\u003C/p>","2020-07-07T07:00:00.000Z",{"id":2248,"type":24,"url":2249,"title":2250,"description":2251,"primary_tag":28,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2251,"image":2252,"img_alt":2251,"content":2253,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":16,"publish_date":2246,"tags":1016},155,"mobileye-willer-self-driving-mobility-solutions","Mobileye and WILLER Partner on Self-Driving Mobility Solutions for Japan, Southeast Asia","Mobileye and WILLER announce a strategic collaboration to launch an autonomous robotaxi service in Japan and markets across Southeast Asia, including Taiwan. 
","https://static.mobileye.com/website/us/corporate/images/d0d93a6dcabc3195a2b3a6c42d4192d0_1666086943989.png","\u003Cp>\u003Cem>This news content was originally published on the Intel Corporation Newsroom.\u003C/em>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>JERUSALEM and OSAKA, Japan, July 8, 2020 &ndash; Mobileye, an Intel Company, and WILLER, one of the largest transportation operators in Japan, Taiwan and the Southeast Asian region, today announced a strategic collaboration to launch an autonomous robotaxi service in Japan and markets across Southeast Asia, including Taiwan. Beginning in Japan, the companies will collaborate on the testing and deployment of autonomous transportation solutions based on Mobileye&rsquo;s automated vehicle (AV) technology.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>&ldquo;Our new collaboration with WILLER brings a meaningful addition to Mobileye&rsquo;s growing global network of transit and mobility ecosystem partners,&rdquo; said Prof. Amnon Shashua, Intel senior vice president and president and CEO of Mobileye. &ldquo;We look forward to collaborating with WILLER as we work together for new mobility in the region by bringing self-driving mobility services to Japan, Taiwan and ASEAN markets.&rdquo;\u003C/p>\n\u003Cp>&ldquo;Collaboration with Mobileye is highly valuable for WILLER and a big step moving forward to realize our vision of innovating transportation services: travel anytime and anywhere by anybody,&rdquo; said Shigetaka Murase, founder and CEO of WILLER. &ldquo;Innovation of transportation will lead to a smarter, safer and more sustainable society where people enjoy higher quality of life.&rdquo;\u003C/p>\n\u003Cp>Together, Mobileye and WILLER are seeking to commercialize self-driving taxis and autonomous on-demand shared shuttles in Japan, while leveraging each other&rsquo;s strengths. 
Mobileye will supply autonomous vehicles integrating its self-driving system and WILLER will offer services adjusted to each region and user tastes, ensure the necessary regulatory framework, and provide mobility services and solutions for fleet operation companies.\u003C/p>\n\u003Cp>The two companies aim to begin testing robotaxis on public roads in Japan in 2021, with plans to launch fully self-driving ride-hailing and ride-sharing mobility services in 2023, while exploring opportunities for similar services in Taiwan and other Southeast Asian markets.\u003C/p>\n\u003Cp>For Mobileye, the collaboration with WILLER advances the company&rsquo;s global mobility-as-a-service (MaaS) ambitions. Since announcing its intention to become a complete mobility provider, Mobileye has begun a series of collaborations with cities, transportation agencies and mobility technology companies to develop and deploy self-driving mobility solutions in key markets. The agreement with WILLER builds on Mobileye&rsquo;s existing MaaS partnerships. Examples include the&nbsp;\u003Ca href=\"https://www.mobileye.com/news/mobileyes-global-ambitions-take-shape-new-deals-china-south-korea/\" target=\"_blank\" rel=\"noopener noreferrer\">agreement with Daegu Metropolitan City, South Korea\u003C/a>, to deploy robotaxis based on Mobileye&rsquo;s self-driving system, and the&nbsp;joint venture with Volkswagen and Champion Motors&nbsp;to operate an autonomous ride-hailing fleet in Jerusalem. The collaboration with WILLER greatly expands and strengthens the company&rsquo;s global MaaS ambitions.\u003C/p>\n\u003Cp>WILLER aims to unify user experiences across countries in the region; it released a MaaS app in 2019 and enabled a QR-code-based payment system this year. WILLER has partnered with Kuo-Kuang Motor Transportation, the largest bus operator in Taiwan, and Mai Linh, the largest taxi company in Vietnam, as well as invested in Car Club, a car-sharing service provider in Singapore. 
WILLER also partners with 150 local transportation providers in Japan. On top of these partnerships, WILLER will provide self-driving ride-hailing and ride-sharing services in the region and provide the best customer-ride experiences together with Mobileye.\u003C/p>\n\u003Cp>The collaboration between WILLER and Mobileye will add a new transportation mode to the existing range of transportation services, including highway buses, railways and car-sharing. Adding self-driving vehicles, on-demand features and sharing services will improve customer ride experiences and address social challenges such as traffic accidents, congestion and, especially, the shortage of drivers and the challenges resulting from Japan&rsquo;s aging society. Together Mobileye and WILLER will accelerate the social benefits of self-driving transportation solutions that contribute to higher quality of daily lives, making society smarter, safer and more sustainable.\u003C/p>\n\u003Cp>\u003Cstrong>About Mobileye\u003C/strong>\u003C/p>\n\u003Cp>Mobileye is the global leader in the development of computer vision and machine learning, data analysis, localization and mapping for advanced driver-assistance systems and automated driving. Mobileye&rsquo;s technology helps keep people safer on the road, reduces the risks of traffic accidents, saves lives and aims to revolutionize the driving experience by enabling autonomous driving. Mobileye&rsquo;s proprietary software algorithms and EyeQ&reg; chips perform detailed interpretations of the visual field in order to anticipate possible collisions with other vehicles, pedestrians, cyclists, animals, debris and other obstacles. Mobileye&rsquo;s products are also able to detect roadway markings such as lanes, road boundaries, barriers and similar items; identify and read traffic signs, directional signs and traffic lights; create a RoadBook&trade; of localized drivable paths and visual landmarks using REM&trade;; and provide mapping for autonomous driving. 
More information is available in&nbsp;Mobileye&rsquo;s press kit.\u003C/p>\n\u003Cp>\u003Cstrong>About WILLER\u003C/strong>\u003C/p>\n\u003Cp>WILLER was established in 1994 to provide society- and community-centric transportation services. WILLER pursues cutting-edge technology and marketing strategies to better customers&rsquo; ride experiences and create innovative value for society and the local community. In Japan, WILLER has the largest intercity bus network and operates a railway in Kyoto as well as unique restaurant buses that offer local cuisine area by area. Besides Japan, WILLER operates car-sharing services in Singapore and ride-hailing taxis in Vietnam.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Ca href=\"https://player.vimeo.com/video/436204436\" target=\"_blank\" rel=\"noopener noreferrer\">&raquo;&nbsp;Watch video: Mobileye maps millions of KM with REM Technology\u003C/a>\u003C/p>",{"id":2255,"type":24,"url":2256,"title":2257,"description":2258,"primary_tag":2259,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2258,"image":2260,"img_alt":2261,"content":2262,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":2263,"tags":2264},23,"mobileye-ranked-5-in-guidehouse-insights-automated-driving-leaderboard","Mobileye Ranked #5 in Guidehouse AV Leaderboard ","Mobileye placed high among autonomous-vehicle technology companies in this survey from a leading management consulting firm thanks to our long-term vision.",5,"https://static.mobileye.com/website/us/corporate/post/images/cb9854631f84743a9892f342affdcfa2_1597827774257.jpg","Mobileye Ranked #5 on AV Leaderboard","\u003Cp>\u003Cspan style=\"background-color: inherit;\">In Q1 2020 Guidehouse Insights, a leading market intelligence and advisory firm \u003C/span>(formerly Navigant Consulting Inc)\u003Cspan style=\"background-color: inherit;\">, published its annual report which looks at the top companies 
developing \u003C/span>\u003Ca style=\"background-color: inherit;\" href=\"https://www.mobileye.com/technology/true-redundancy/\" target=\"_blank\" rel=\"noopener noreferrer\">autonomous vehicles\u003C/a>\u003Cspan style=\"background-color: inherit;\">, ranking them according to 10 criteria: vision; go-to-market strategy; partners; production strategy; technology; sales, marketing, and distribution; product capability; product quality and reliability; product portfolio; and staying power. \u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: inherit;\">Using their own proprietary leaderboard methodology, \u003C/span>\u003Ca href=\"https://guidehouseinsights.com/subscription-services/automated-vehicles\" target=\"_blank\" rel=\"noopener\">Guidehouse\u003C/a> ranked Mobileye fifth\u003Cspan style=\"background-color: inherit;\"> among all firms working in the autonomous vehicle field. \u003C/span>Guidehouse\u003Cspan style=\"background-color: inherit;\"> was impressed with Mobileye&rsquo;s consumer &amp; MaaS-based approach to self-driving vehicles. Mobileye was also credited for its long-term vision to bring self-driving vehicles to market, as well as leveraging its position as an \u003C/span>\u003Ca style=\"background-color: inherit;\" href=\"https://www.mobileye.com/solutions/super-vision/\" target=\"_blank\" rel=\"noopener noreferrer\">ADAS \u003C/a>\u003Cspan style=\"background-color: inherit;\">leader to make the autonomous future a reality. \u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: inherit;\">Guidehouse&rsquo;s analysis also noted Mobileye&rsquo;s ability to combine vision and lidar sensing to provide what \u003C/span>\u003Ca style=\"background-color: inherit;\" href=\"https://www.mobileye.com/blog/tag/amnon-shashua/\" target=\"_blank\" rel=\"noopener\">CEO Amnon Shashua\u003C/a>\u003Cspan style=\"background-color: inherit;\"> calls &ldquo;true redundancy&rdquo; for autonomous vehicles. 
The company&rsquo;s proprietary \u003C/span>\u003Ca style=\"background-color: inherit;\" href=\"https://www.mobileye.com/technology/eyeq-chip/\" target=\"_blank\" rel=\"noopener noreferrer\">EyeQ chips\u003C/a>\u003Cspan style=\"background-color: inherit;\"> and software were also factors in Mobileye&rsquo;s ranking.\u003C/span>\u003C/p>","2020-06-29T07:00:00.000Z","Autonomous Driving, Industry, News, Awards",{"id":2266,"type":5,"url":2267,"title":2268,"description":2269,"primary_tag":9,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2269,"image":2270,"img_alt":2271,"content":2272,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":2273,"tags":2274},24,"how-startups-can-become-successful-oem-partners","How Startups Can Become Successful OEM Partners","Mobileye VP Tal Babaioff offers some words of wisdom on the latest PACEpilot online panel","https://static.mobileye.com/website/us/corporate/post/images/c173ead7a8278e68fd968e1abbb5f5eb_1597843841773.jpg"," PACEpilot online panel","\u003Cp>\u003Cspan style=\"background-color: inherit;\">Mobileye has \u003C/span>\u003Ca style=\"background-color: inherit;\" href=\"https://www.mobileye.com/about/\" target=\"_blank\" rel=\"noopener noreferrer\">come a long way\u003C/a>\u003Cspan style=\"background-color: inherit;\"> in the past couple of decades since our founding. From a scrappy startup, we&rsquo;ve grown into a major player in the automotive industry, securing partnerships with dozens of major global automakers and placing our technology in tens of millions of vehicles around the world. \u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: inherit;\">As focused as we are on the future, we&rsquo;re proud of the successes we&rsquo;ve reached until now. 
So when \u003C/span>\u003Cem style=\"background-color: inherit;\">Automotive News\u003C/em>\u003Cspan style=\"background-color: inherit;\"> asked us to participate in its latest \u003C/span>\u003Ca style=\"background-color: inherit; color: #0563c1;\" href=\"https://www.autonews.com/pacepilot/pacepilot-innovative-business-models-and-oem-startup-partnerships\" target=\"_blank\" rel=\"noopener noreferrer\">PACEpilot virtual panel discussion\u003C/a>\u003Cspan style=\"background-color: inherit;\"> and share our experience for the benefit of others eager to follow in our footsteps, we were only too glad to take part. Representing Mobileye was Tal Babaioff, our Vice President of Mapping &amp; Localization and Co-General Manager of \u003C/span>\u003Ca style=\"background-color: inherit;\" href=\"https://www.mobileye.com/technology/rem/\" target=\"_blank\" rel=\"noopener noreferrer\">REM\u003C/a>\u003Cspan style=\"background-color: inherit;\"> &ndash; the crowdsourced mapping project which recently \u003C/span>\u003Ca style=\"background-color: inherit;\" href=\"https://www.mobileye.com/news/mobileye-wins-prestigious-2020-pace-award-for-rem-mapping-tech/\" target=\"_blank\" rel=\"noopener\">won a 2020 PACE Award\u003C/a>\u003Cspan style=\"background-color: inherit;\"> from the same publication.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: inherit;\">&ldquo;We see OEMs to be accepting our ability to come and innovative,&rdquo; said Babaioff. &ldquo;They always make sure that they have a backup plan, but in some cases they understand that moving into this technology of the future is the way forward. And once they embrace this approach, we have [seen] great progress with them together.&rdquo;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: inherit;\">&ldquo;The one thing that I would recommend is having great focus,&rdquo; Babaioff concluded. &ldquo;Make sure that you understand what you are doing. 
Find a real problem in the world that actually is looking to be solved and solve it. And make sure that your system is actually working in a real-world environment and not just on the test track.&rdquo;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: inherit;\">To hear what else Tal and his peers had to say, watch the full hour-long two-part discussion in the video below. Tal takes part in the second half, alongside Deloitte&rsquo;s Marcus Holzer, Grove Ventures partner Sigalit Klimovsky, and Caresoft Global CEO Mathew Vachaparampil &ndash; moderated by Steve Schmith, Executive Director of Custom Research &amp; Data Strategy at \u003C/span>\u003Cem style=\"background-color: inherit;\">Automotive News\u003C/em>\u003Cspan style=\"background-color: inherit;\">.\u003C/span>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"//players.brightcove.net/716708064/59xVcV4Zd_default/index.html?videoId=6166894720001\" height=\"470\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>","2020-06-25T07:00:00.000Z","Industry, ADAS, Video, Events, Mobileye Inside",{"id":2276,"type":24,"url":2277,"title":2278,"description":2279,"primary_tag":190,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2279,"image":2280,"img_alt":2281,"content":2282,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":2283,"tags":1517},28,"mobileye-releases-unedited-footage-of-a-40-minute-av-ride-through-jerusalem","Take a 40-Minute Ride Through Jerusalem in Mobileye's AV","In this full unedited video, Mobileye AV shows it can negotiate difficult traffic challenges.","https://static.mobileye.com/website/us/corporate/post/images/e749391b03bae24deb926f5ec08d307b_1597842901863.jpg","Unedited Footage of a 40+ Minute AV Ride Through Jerusalem","\u003Cp>\u003Cspan style=\"background-color: inherit;\">Today Mobileye released 
an \u003C/span>\u003Ca style=\"background-color: inherit; color: #0563c1;\" href=\"https://www.youtube.com/watch?v=kJD5R_yQ9aw\" target=\"_blank\" rel=\"noopener noreferrer\">unedited clip\u003C/a> \u003Cspan style=\"background-color: inherit;\">of a 40+ minute ride through the streets of Jerusalem by an \u003C/span>\u003Ca style=\"background-color: inherit;\" href=\"https://www.mobileye.com/technology/true-redundancy/\" target=\"_blank\" rel=\"noopener noreferrer\">autonomous vehicle\u003C/a>\u003Cspan style=\"background-color: inherit;\"> (AV). The AV was retrofitted in Mobileye&rsquo;s workshop with our AV Kit, made up of 12 cameras and a self-driving system powered by Mobileye&rsquo;s proprietary software and two of the company&rsquo;s \u003C/span>\u003Ca style=\"background-color: inherit;\" href=\"https://www.mobileye.com/technology/eyeq-chip/\" target=\"_blank\" rel=\"noopener noreferrer\">EyeQ5 chips\u003C/a>\u003Cspan style=\"background-color: inherit;\">. Future versions of these vehicles will incorporate radar and LiDAR to provide what Mobileye CEO \u003C/span>\u003Ca style=\"background-color: inherit;\" href=\"https://www.mobileye.com/blog/tag/amnon-shashua/\" target=\"_blank\" rel=\"noopener\">Prof. Amnon Shashua\u003C/a>\u003Cspan style=\"background-color: inherit;\"> calls &ldquo;true redundancy.&rdquo; \u003C/span>&nbsp;\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: inherit;\">The video shows a split screen featuring an overhead view taken by a drone flying above the vehicle, a view from inside the vehicle and a 3D-representation of the environmental model created based on what the 12 cameras &ldquo;see.&rdquo; During the ride, the AV maneuvers through busy intersections, around multi-lane roundabouts, and negotiates other tricky situations that would challenge even the most adept human drivers. 
\u003C/span>&nbsp;\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: inherit;\">Throughout the video, narration explains how Mobileye technology and driving philosophy, based on \u003C/span>\u003Ca style=\"background-color: inherit; color: inherit;\" href=\"https://www.mobileye.com/technology/responsibility-sensitive-safety/\" target=\"_blank\" rel=\"noopener noreferrer\">Responsibility-Sensitive Safety (RSS)\u003C/a>\u003Cspan style=\"background-color: inherit;\">, make for assertive, yet  safe, driving. The sensing display gives viewers an insight into the system&rsquo;s state-of-the-art detection and how vehicles, pedestrians and other objects are understood and classified. \u003C/span>&nbsp;\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: inherit;\">For more context to the video, \u003C/span>\u003Ca style=\"background-color: inherit;\" href=\"https://medium.com/@amnon.shashua/the-challenge-of-supporting-av-at-scale-7c06196cced2\" target=\"_blank\" rel=\"noopener noreferrer\">read Prof. Shashua&rsquo;s \u003C/a>\u003Ca style=\"background-color: inherit; color: #0563c1;\" href=\"https://medium.com/@amnon.shashua/the-challenge-of-supporting-av-at-scale-7c06196cced2\" target=\"_blank\" rel=\"noopener noreferrer\">editorial\u003C/a> \u003Cspan style=\"background-color: inherit;\">detailing the philosophy behind Mobileye&rsquo;s AV development. 
\u003C/span>&nbsp;\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/kJD5R_yQ9aw\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>","2020-05-28T07:00:00.000Z",{"id":2285,"type":654,"url":2286,"title":2287,"description":2288,"primary_tag":32,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2288,"image":2289,"img_alt":2290,"content":2291,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":2292,"tags":2293},32,"the-challenge-of-supporting-av-at-scale","The Challenge of Supporting AV at Scale ","Mobileye CEO Prof. Amnon Shashua addresses the less obvious implications of safety on system architectural design","https://static.mobileye.com/website/us/corporate/post/images/2b1c4186e73c17b7523f12376c1c3d76_1597831379316.jpg","A Mobileye AV prototype on the road in Jerusalem","\u003Cp>\u003Cspan style=\"background-color: inherit;\">Make no mistake about it: developing a vehicle that can effectively drive itself is a huge undertaking &ndash; one that might have seemed like science fiction not long ago. 
But even if you get a prototype working properly, deploying \u003C/span>\u003Ca style=\"background-color: inherit;\" href=\"https://www.mobileye.com/technology/true-redundancy/\" target=\"_blank\" rel=\"noopener noreferrer\">autonomous vehicles\u003C/a>\u003Cspan style=\"background-color: inherit;\"> en masse presents an entirely new set of difficulties to overcome.\u003C/span>&nbsp;\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: inherit;\">Fortunately,&nbsp;we here at Mobileye aren&rsquo;t merely cognizant of the challenges inherent in deploying autonomous vehicles at&nbsp;scale; we&rsquo;re actively working on tackling the problem. The approach we&rsquo;re taking is unique in the industry, introducing a concept we call &ldquo;true redundancy&rdquo; in a way that we believe allows for not only better safety, but also faster validation. And we&rsquo;re doing it all with an eye fixed firmly on the level of safety that we believe our technology must achieve if autonomous vehicles are to make the crucial leap from the prototype phase to widespread deployment.\u003C/span>&nbsp;\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: inherit;\">Our CEO \u003C/span>\u003Ca style=\"background-color: inherit;\" href=\"https://www.mobileye.com/blog/tag/amnon-shashua/\" target=\"_blank\" rel=\"noopener\">Prof. Amnon Shashua\u003C/a>\u003Cspan style=\"background-color: inherit;\"> wrote about just this issue in a recent&nbsp;\u003C/span>\u003Ca style=\"background-color: inherit; color: #0563c1;\" href=\"https://medium.com/@amnon.shashua/the-challenge-of-supporting-av-at-scale-7c06196cced2\" target=\"_blank\" rel=\"noopener noreferrer\">editorial on Medium\u003C/a>\u003Cspan style=\"background-color: inherit;\">. 
Head on over to get the inside scoop, straight from the top, on how Mobileye is approaching &ldquo;The Challenge of Supporting AV at&nbsp;Scale.&rdquo;\u003C/span>\u003C/p>","2020-05-26T07:00:00.000Z","Opinion, Autonomous Driving, From our CEO",{"id":2295,"type":5,"url":2296,"title":2297,"description":2298,"primary_tag":32,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2298,"image":2299,"img_alt":2300,"content":2301,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":2302,"tags":2303},50,"mobileye-ceo-amnon-shashua-explains-why-the-company-is-all-in-on-mobility-at-ecomotion-2020","We're 'All In' on Mobility, Shashua Says at EcoMotion 2020","Interview touches on MaaS, robotaxis, and how startups can weather the Coronavirus crisis.","https://static.mobileye.com/website/us/corporate/post/images/d63b114fe294fc432306732469c1bd39_1597847046807.jpg","Mobileye CEO Prof. Amnon Shashua speaks with Enroute CEO Aviv Frenkel at EcoMotion 2020","\u003Cp>\u003Cspan style=\"background-color: inherit;\">In an&nbsp;\u003C/span>\u003Ca style=\"background-color: inherit; color: #0563c1;\" href=\"https://www.youtube.com/watch?time_continue=1&amp;v=noEkfPQSjpY&amp;feature=emb_logo\" target=\"_blank\" rel=\"noopener noreferrer\">interview\u003C/a>\u003Cspan style=\"background-color: inherit;\">&nbsp;with&nbsp;Enroute CEO Aviv Frenkel, Mobileye CEO \u003C/span>\u003Ca style=\"background-color: inherit;\" href=\"https://www.mobileye.com/blog/tag/amnon-shashua/\" target=\"_blank\" rel=\"noopener\">Amnon Shashua\u003C/a>\u003Cspan style=\"background-color: inherit;\"> spoke about a wide range of subjects including Mobileye&rsquo;s plans to enter the MaaS market, insights into the AV industry overall, and why \u003C/span>\u003Ca style=\"background-color: inherit;\" href=\"https://www.mobileye.com/technology/true-redundancy/\" target=\"_blank\" rel=\"noopener noreferrer\">self-driving 
cars\u003C/a>\u003Cspan style=\"background-color: inherit;\"> are &ldquo;unstoppable.&rdquo;&nbsp;\u003C/span>&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/noEkfPQSjpY\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: inherit;\">As part of Mobileye&rsquo;s entry into&nbsp;MaaS, the company is going &ldquo;all in&rdquo; on introducing a fully autonomous&nbsp;robotaxi&nbsp;service in 2022 in Tel Aviv. The recent acquisition of Moovit will help&nbsp;with&nbsp;this&nbsp;aim, providing transportation technologies such as mobility intelligence and vehicle optimization&nbsp;that will complement Mobileye&rsquo;s self-driving technology. \u003C/span>\u003Cem style=\"background-color: inherit;\">Forbes\u003C/em>\u003Cspan style=\"background-color: inherit;\">, in its coverage of the interview, \u003C/span>\u003Ca style=\"background-color: inherit;\" href=\"https://www.forbes.com/sites/bradtempleton/2020/05/21/intelmobileye-promises-self-driving-robotaxi-service-in-2022-while-others-back-off/#15495c87b8ea\" target=\"_blank\" rel=\"noopener noreferrer\">noted\u003C/a>\u003Cspan style=\"background-color: inherit;\"> that Shashua&rsquo;s statement came &ldquo;while many other companies, particularly car OEMs, are scaling back their plans and timelines on full robocar service.&rdquo;&nbsp;\u003C/span>&nbsp;\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: inherit;\">Shashua also spoke about consolidation in the AV industry, noting that companies are discovering that developing these vehicles involves a wide range of technologies which must be developed in tandem. 
Reuters, which also \u003C/span>\u003Ca style=\"background-color: inherit;\" href=\"https://www.reuters.com/article/us-intel-autonomous/mobileye-ceo-sees-great-consolidation-ahead-in-autonomous-car-sector-idUSKBN22V1U5\" target=\"_blank\" rel=\"noopener noreferrer\">covered the interview\u003C/a>\u003Cspan style=\"background-color: inherit;\">, quoted Shashua as saying, &ldquo;It&rsquo;s a formidable task, and there are going to be very, very few actors who can go from silicon (chips) to self-driving systems.&rdquo;&nbsp;\u003C/span>&nbsp;\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: inherit;\">As for the effect of the coronavirus crisis on the development of AV technology, Shashua pointed out that there are two fundamental drivers pushing the industry forward: the rise of automation and the need for mobility. Neither of these&nbsp;has been changed by the crisis.\u003C/span>\u003C/p>","2020-05-19T07:00:00.000Z","Driverless MaaS, Autonomous Driving, Events, Video, From our CEO",{"id":2305,"type":5,"url":2306,"title":2307,"description":2308,"primary_tag":140,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2308,"image":2309,"img_alt":2310,"content":2311,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":2312,"tags":2147},34,"mobileye-intel-commit-to-making-the-world-a-safer-place","Mobileye & Intel Commit to Making the World a Safer Place","Our Responsibility-Sensitive Safety model is a key component in Intel’s RISE strategy for 2030 and the drive to reduce deaths from traffic accidents worldwide. 
","https://static.mobileye.com/website/us/corporate/post/images/e5eb6c467d68dccab6b0629b3befca34_1597832718655.jpg","Responsibility-Sensitive Safety: A Mathematical Model for Autonomous Vehicle Safety","\u003Cp>Today our parent company Intel released&nbsp;its&nbsp;\u003Ca style=\"color: #0563c1;\" href=\"http://csrreportbuilder.intel.com/pdfbuilder/pdfs/CSR-2019-20-Full-Report.pdf\" target=\"_blank\" rel=\"noopener noreferrer\">2019-2020 corporate social responsibility (CSR) report\u003C/a>, and with it has launched its&nbsp;\u003Ca style=\"color: #0563c1;\" href=\"https://www.intel.com/content/www/us/en/corporate-responsibility/2030-goals.html\" target=\"_blank\" rel=\"noopener noreferrer\">Responsible Inclusive Sustainable Enabling (RISE) strategy\u003C/a>&nbsp;for 2030.&nbsp;We at Mobileye, an Intel company, are proud to play our part in mapping out and achieving Intel&rsquo;s ambitious goals to help create a better world for us all.&nbsp;\u003C/p>\n\u003Cp>&ldquo;Today marks the beginning of a new era for corporate responsibility at Intel,&rdquo;&nbsp;notes&nbsp;Suzanne&nbsp;Fallender, Director of Corporate Responsibility at Intel Corporation.&nbsp;&ldquo;We&rsquo;ll drive to even higher levels of integration and collaboration to build more value for our stakeholders and deliver on our purpose to create world-changing technology that enriches the lives of every person on Earth.&rdquo;&nbsp;\u003C/p>\n\u003Cp>A key&nbsp;component&nbsp;to that commitment is&nbsp;Mobileye&rsquo;s&nbsp;\u003Ca href=\"https://www.mobileye.com/technology/responsibility-sensitive-safety/\">Responsibility-Sensitive Safety&nbsp;(RSS)\u003C/a> initiative, a mathematical model for ensuring that autonomous vehicles operate in as safe a manner as possible. 
Beyond simply engineering autonomous vehicles to be safer than human drivers, RSS&nbsp;provides specific and measurable parameters for the human concepts of responsibility and caution and defines a &ldquo;Safe State&rdquo; designed to prevent the AV from being the cause of an accident, no matter what action is taken by other vehicles.&nbsp;&nbsp;\u003C/p>\n\u003Cp>We&rsquo;ve created RSS not only to guide the development and deployment of our own autonomous vehicle technology, but as a rallying point for the entire industry.&nbsp;We hope that our partners and colleagues will answer the call to help solve this industry-wide challenge.&nbsp;\u003C/p>\n\u003Cp>&nbsp;&ldquo;1.35 million&nbsp;people die each year as a result of road traffic crashes,&rdquo; Intel&rsquo;s latest CSR&nbsp;report notes,&nbsp;citing statistics from the World Health Organization,&nbsp;&ldquo;and 93% (of those deaths) occur in low- and middle-income countries.&rdquo;&nbsp;Our technologies aim to drastically reduce those numbers, and Intel and Mobileye stand&nbsp;committed &ldquo;to make these technologies broadly accessible and affordable, in an effort to save and improve lives.&rdquo;&nbsp;\u003C/p>\n\u003Cp>Watch the video below to learn more about RSS, and head on over to our parent company Intel to read more about the&nbsp;\u003Ca style=\"color: #0563c1;\" href=\"https://www.intel.com/content/www/us/en/corporate-responsibility/2030-goals.html\" target=\"_blank\" rel=\"noopener noreferrer\">RISE strategy\u003C/a> and the full 2019-2020 report &ldquo;\u003Ca style=\"color: #0563c1;\" href=\"http://csrreportbuilder.intel.com/pdfbuilder/pdfs/CSR-2019-20-Full-Report.pdf\" target=\"_blank\" rel=\"noopener noreferrer\">Corporate Responsibility at Intel\u003C/a>.&rdquo;&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/766cEzQT6So\" width=\"560\" height=\"315\" frameborder=\"0\" 
allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>","2020-05-14T07:00:00.000Z",{"id":2314,"type":654,"url":2315,"title":2316,"description":2317,"primary_tag":32,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2317,"image":2318,"img_alt":2319,"content":2320,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":2321,"tags":2322},46,"there-is-more-to-our-moovit-acquisition-than-meets-the-eye","There's More to our Moovit Acquisition than Meets the Eye","We’re on a path to revolutionize mobility, save lives, and reduce congestion, says Mobileye CEO Prof. Amnon Shashua.","https://static.mobileye.com/website/us/corporate/post/images/8657188da0fdb0424d3438db4211821c_1597840966008.jpg","Moovit app","\u003Cp>We are thrilled to welcome Moovit to the Intel family. Moovit is the world&rsquo;s best-known transit application that aggregates data from multiple transit partners and customers. With more than 800 million users globally, Moovit collects more than 6 billion data points daily about traffic flow and user demand across more than 3,100 cities in 102 countries and serves more than 7,500 public transit operators. Quite an achievement.\u003C/p>\n\u003Cp>The natural question is why Intel? This is where Mobileye, an Intel company, completes the puzzle.\u003C/p>\n\u003Cp>Mobileye is Intel&rsquo;s arm to foray into the future of transportation. The future of mobility relies on compute and lots of it. We are engaged in designing the most high-density and efficient silicon; cutting-edge algorithms involving artificial intelligence for interpreting sensing data from cameras, radars and lidars; and algorithms for decision-making for autonomous cars all wrapped around unprecedented safety models. 
While doing so, we are engaged in saving lives through our leading position in driver-assistance systems with nearly 60 million cars equipped with our silicon and algorithms preventing and mitigating accidents every day.\u003C/p>\n\u003Cp>Going forward, all this technology will be integrated into a mobility-as-a-service (MaaS) business &ndash; an opportunity estimated to be worth $160 billion by 2030. We have contracts and partnerships around the globe targeting a 2022 debut of various manifestations of MaaS.\u003C/p>\n\u003Cp>We have dedicated significant efforts to study the nature of the business: the value chain formations, the societal and economic pain points of urban mobility systems,&nbsp;and the path to weave driverless capabilities into the existing urban transportation fabric.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/us/corporate/images/1ee50dcbbf3d217e52c5c160c5143761_1730619549514.jpg\" alt=\"\" width=\"1650\" height=\"1070\" />\u003C/p>\n\u003Cp>We developed a multimodal XaaS strategy that will enable Intel through Mobileye, and now including Moovit, to create a value proposition out of every layer of the solution stack &ndash; from the self-driving system as standalone, all the way up to the robotaxi service and experience. This strategy is very nuanced and differs from that of every other company in this space.\u003C/p>\n\u003Cp>The first critical asset is Mobility Intelligence based on data-driven real-time demand and supply insights allowing for driverless technology to be surgically introduced through various service models such as origin-to-destination, first/last mile and dynamically routed shuttles.\u003C/p>\n\u003Cp>The second critical asset is the transit operators&rsquo; operational expertise. This is a core asset to be harnessed through collaborative go-to-market models. Two such canonical models include vehicle-as-a-service (VaaS) and ride-as-a-service (RaaS). 
In particular, RaaS enables an existing service operator to &ldquo;summon&rdquo; an automated mobility solution to cater to otherwise unaddressable demand under its service umbrella. VaaS, on the other hand, is a more deeply integrated model in which we offer a dedicated fleet of autonomous vehicles/shuttles to be acquired and assimilated into the transit operator&rsquo;s backbone and control center together with the Mobility Intelligence software, which assures efficient use of that capital.\u003C/p>\n\u003Cp>With today&rsquo;s acquisition of Moovit, we have added another critical piece to our mobility stack and accelerated our way towards becoming a complete mobility provider. Beyond the obvious value of Moovit&rsquo;s data and user base, the company owns underlying assets, capabilities and a partner network that will enable us to turn on affordable and demand-optimized driverless mobility services almost anywhere in the world.\u003C/p>\n\u003Cp>As in Mobileye&rsquo;s case, Moovit will remain an independent subsidiary propelling its core business forward. 
Being a part of Intel has empowered Mobileye to dream beyond computer vision and into the driverless future, and now we intend to empower Moovit to dream bigger, reach higher and, together, make an impact on the future of transportation.\u003C/p>","2020-05-05T07:00:00.000Z","Driverless MaaS, Opinion, From our CEO",{"id":2324,"type":24,"url":2325,"title":2326,"description":2327,"primary_tag":40,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2327,"image":2328,"img_alt":2329,"content":2330,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":2331,"tags":1897},26,"welcoming-moovit-to-the-fold","Welcoming Moovit to the Fold","Self-driving Mobility-as-a-Service comes closer to fruition through the marriage of Mobileye’s autonomous vehicle tech and Moovit’s mobility platform.","https://static.mobileye.com/website/us/corporate/post/images/f7be78bef31519be20a048988e58f7aa_1597830239944.jpg","Mobileye & Moovit: the Future of Mobility","\u003Cp>The road to the \u003Ca href=\"https://www.mobileye.com/technology/true-redundancy/\" target=\"_blank\" rel=\"noopener noreferrer\">autonomous-vehicle\u003C/a> future, we firmly believe at Mobileye, is through robotaxis. That&rsquo;s why we&rsquo;re very excited to welcome Moovit into the fold as the \u003Ca href=\"https://www.mobileye.com/news/intel-acquires-moovit-to-accelerate-mobileyes-mobility-as-a-service-offering/\" target=\"_blank\" rel=\"noopener noreferrer\">latest acquisition\u003C/a> by our parent company Intel and a key addition to our business.\u003C/p>\n\u003Cp>For those unacquainted, Moovit is a comprehensive mobility platform that allows users to plan and execute their travel using a variety of modes of transportation, including public transit, shared bicycles and scooters, ride-hailing, and car-sharing services. 
Based in Tel Aviv, Moovit has partnered with over 7,500 transit agencies and operators, and serves over 800 million users in 3,100 cities across 102 countries around the world. Those are some pretty impressive numbers considering that Moovit was only founded in 2012, its growth fueled by a seven-fold increase in users in the past two years alone.\u003C/p>\n\u003Cp>&ldquo;Intel&rsquo;s purpose is to create world-changing technology that enriches the lives of every person on Earth, and our Mobileye team delivers on that purpose every day,&rdquo; said Intel CEO Bob Swan. &ldquo;Mobileye&rsquo;s ADAS technology is already improving the safety of millions of cars on the road, and Moovit accelerates their ability to truly revolutionize transportation &ndash; reducing congestion and saving lives &ndash; as a full-stack mobility provider.&rdquo;\u003C/p>\n\u003Cp>The Moovit acquisition represents a key investment that we anticipate will help us realize enormous potential. The combination of Mobileye&rsquo;s technology and Moovit&rsquo;s platform will enable us to grow together into a complete mobility-as-a-service (MaaS) provider and tap into a market projected to be worth $160 billion within the next ten years &ndash; or more than $230 billion when combined with the markets for \u003Ca href=\"https://www.mobileye.com/solutions/super-vision/\" target=\"_blank\" rel=\"noopener noreferrer\">ADAS\u003C/a> and \u003Ca href=\"https://www.mobileye.com/en/data/\" target=\"_blank\" rel=\"noopener noreferrer\">data technologies\u003C/a> that are among Mobileye&rsquo;s key areas of expertise.\u003C/p>\n\u003Cp>&ldquo;Moovit&rsquo;s massive global user base, proprietary transportation data, global editors community, strong partnerships with key transit and mobility ecosystem partners, and highly skilled team is what makes them a great investment,&rdquo; noted \u003Ca href=\"https://www.mobileye.com/blog/tag/amnon-shashua/\" target=\"_blank\" rel=\"noopener\">Professor Amnon 
Shashua\u003C/a>, CEO of Mobileye and Senior Vice President at Intel. &ldquo;Moovit is a strong brand trusted by hundreds of millions of people globally. Together, with Mobileye&rsquo;s extensive capabilities in mapping and self-driving technology, we will be able to accelerate our timeline to transform the future of mobility.&rdquo;\u003C/p>\n\u003Cp>&ldquo;We are excited to join forces with Mobileye and lead the future revolution of new mobility services,&rdquo; added Moovit CEO Nir Erez. &ldquo;Mobility is a basic human right, and as cities become more crowded, urban mobility becomes more difficult. Combining the daily mobility habits and needs of millions of Moovit users with the state-of-the-art, safe, affordable and eco-friendly transportation enabled by self-driving vehicles, we will be able to make cities better places to live in. We share this vision and look forward to making it a reality as part of Mobileye.&rdquo;\u003C/p>","2020-05-04T07:00:00.000Z",{"id":2333,"type":24,"url":2334,"title":2335,"description":2335,"primary_tag":28,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2335,"image":2336,"img_alt":2335,"content":2337,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":2338,"tags":2339},163,"intel-acquires-moovit-to-accelerate-mobileyes-mobility-as-a-service-offering","Intel Acquires Moovit to Accelerate Mobileye’s Mobility-as-a-Service Offering","https://static.mobileye.com/website/us/corporate/images/ac4d37773481934ba061b94e9888200c_1666085753261.png","\u003Cp>\u003Cem>This news content was originally published on the Intel Corporation Newsroom.\u003C/em>\u003C/p>\n\u003Ch2>Moovit is a MaaS Solutions Company Known for Its Popular Urban Mobility App\u003C/h2>\n\u003Cp>\u003Cspan style=\"color: #555555; background-color: #ffffff;\">SANTA CLARA, Calif., May 4, 2020 &ndash; Intel Corporation today announced it has acquired Moovit, a mobility-as-a-service 
(MaaS) solutions company, for approximately $900 million ($840 million net of Intel Capital equity gain). Moovit is known for its urban mobility application that offers travelers around the world the best multimodal trip planning by combining public transportation, bicycle and scooter services, ride-hailing, and car-sharing. The addition of Moovit brings Intel&rsquo;s Mobileye closer to achieving its plan to become a complete mobility provider, including robotaxi services, which is forecast to be an estimated $160 billion opportunity by 2030.\u003C/span>\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/efb0a6e7e88f63fc7f9ac923d7755d3e_1663239896710.jpg\" alt=\"Moovit app\" width=\"1650\" height=\"869\" />\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #555555; background-color: #ffffff;\">&ldquo;Intel&rsquo;s purpose is to create world-changing technology that enriches the lives of every person on Earth, and our Mobileye team delivers on that purpose every day,&rdquo; said Bob Swan, Intel CEO. &ldquo;Mobileye&rsquo;s ADAS technology is already improving the safety of millions of cars on the road, and Moovit accelerates their ability to truly revolutionize transportation &ndash; reducing congestion and saving lives &ndash; as a full-stack mobility provider.&rdquo;\u003C/span>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #555555; background-color: #ffffff;\">Moovit has established its leadership in the MaaS space with more than 800 million users and services in 3,100 cities across 102 countries. 
Today, Mobileye is the leading automotive solutions partner that enables advanced driver-assistance systems (ADAS) deployed on nearly 60 million vehicles with more than 25 automaker partners.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #555555; background-color: #ffffff;\">Mobileye is a growth engine for Intel as the company transforms for a world where the exponential growth of data fuels demand for technology solutions that can process, move and store more data faster. Intel is investing and expanding to serve new data-rich market opportunities, including the fast-growing market for ADAS, data and MaaS technologies, which together represent an opportunity totaling more than&nbsp;\u003C/span>\u003Cspan style=\"color: #0071c5; background-color: #ffffff;\">$230 billion by 2030\u003C/span>\u003Cspan style=\"color: #555555; background-color: #ffffff;\">. Upon close, Moovit will join the Mobileye business while retaining its brand and existing partnerships.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #555555; background-color: #ffffff;\">Moovit was founded in 2012 and is based in Tel Aviv, with approximately 200 employees. Moovit combines information from public transit operators and authorities with live information from the user community to offer travelers a real-time picture of the best route for their journey. In the past 24 months, Moovit has achieved a seven-times increase in users. Moovit has also signed strategic partnership agreements with major ride-sharing operators and mobility ecosystem companies for analytics, routing, optimization and operations for MaaS. 
With this acquisition, Mobileye will be able to use Moovit&rsquo;s large proprietary transportation dataset to optimize predictive technologies based on customer demand and traffic patterns, as well as tap into Moovit&rsquo;s transit data repository of more than 7,500 key transit agencies and operators, and improve the consumer experience for more than 800 million users worldwide. Moovit&rsquo;s consumer applications and user experience will continue under its own brand.\u003C/span>\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/05eff582a1b075feceb492fcf30fb9d7_1663239996999.jpg\" alt=\"Nir Erez, Moovit co-founder and CEO\" width=\"1605\" height=\"1070\" />\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #555555; background-color: #ffffff;\">&ldquo;We are excited&nbsp;to join forces with Mobileye and lead the future revolution of new mobility services,&rdquo; said Nir Erez, Moovit co-founder and CEO. &ldquo;Mobility is a basic human right, and as cities become more crowded, urban mobility becomes more difficult. Combining the daily mobility habits and needs of millions of Moovit users with the state-of-the-art, safe, affordable and eco-friendly transportation enabled by self-driving vehicles, we will be able to make cities better places to live in. We share this vision and look forward to making it a reality as part of Mobileye.&rdquo;\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #555555; background-color: #ffffff;\">Intel acquired Mobileye in 2017. Since then, Mobileye revenues have more than doubled on the increased adoption of ADAS based on Mobileye&rsquo;s industry-leading technology. Mobileye&rsquo;s vision-safety technology aims to make roads safer, reduce traffic congestion and save lives. 
Mobileye provides a complete autonomous vehicle solution stack that is technically advanced, provides unmatched agility and safety, and copes with a wide variety of driving complexities.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #555555; background-color: #ffffff;\">Mobileye&rsquo;s business model encompasses the entire automated driving value chain, including the front-facing camera that powers most of today&rsquo;s ADAS, conditional autonomy &ndash; also known as level 2+ &ndash; and the self-driving system (SDS) for self-driving shuttles and robotaxis as well as consumer autonomous vehicles (AVs). Mobileye has strong performance in every one of these categories with advanced vision sensing technology, crowd-sourced mapping capability (REM) and the Responsibility-Sensitive Safety (RSS) driving policy.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #555555; background-color: #ffffff;\">By working together as part of Intel and Mobileye, Moovit will advance the company&rsquo;s MaaS strategy and the global adoption of autonomous transportation.\u003C/span>\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/93d3861db825407d6b3cca28f46ab53e_1663240063979.jpg\" alt=\"Intel acquires Moovit\" width=\"608\" height=\"1070\" />\u003C/p>","2020-05-03T07:00:00.000Z","News, Driverless MaaS",{"id":2341,"type":24,"url":2342,"title":2343,"description":2344,"primary_tag":9,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2344,"image":2345,"img_alt":2346,"content":2347,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":2348,"tags":2349},29,"mobileye-wins-prestigious-2020-pace-award-for-rem-mapping-tech","REM™ Mapping Tech Wins Prestigious 2020 PACE Award","Our Road Experience Management™ system has been recognized among the most innovative new technologies released by automotive suppliers this 
year.","https://static.mobileye.com/website/us/corporate/post/images/fa48c8a18bd9abc91d87a9c953c4a48b_1597830698352.jpg","Mobileye REM 2020 PACE Award","\u003Cp>The well-known PACE \u003Ca href=\"https://www.mobileye.com/news/mobileye-wins-prestigious-2020-pace-award-for-rem-mapping-tech/\" target=\"_blank\" rel=\"noopener\">award\u003C/a> carries special weight because our innovation has been singled out for recognition by our peers in the industry &ndash; those who have been watching the evolution of the automotive industry up close for decades. We are honored to receive a 2020 Automotive News PACE Award for one of our most valued contributions to the industry.\u003C/p>\n\u003Cp>This year \u003Ca href=\"https://www.autonews.com/awards/2020-mobileye-rem-road-experience-management\" target=\"_blank\" rel=\"noopener noreferrer\">the judges named Mobileye&rsquo;s Road Experience Management&trade; (REM) technology\u003C/a> among the 13 most innovative new technologies developed by automotive suppliers. \u003Ca href=\"https://www.mobileye.com/technology/rem/\" target=\"_blank\" rel=\"noopener noreferrer\">REM\u003C/a> automatically draws data from the proliferation of Mobileye camera systems installed in millions of vehicles already on the road to create our Roadbook &ndash; a highly accurate map of public roadways to inform both \u003Ca href=\"https://www.mobileye.com/solutions/super-vision/\" target=\"_blank\" rel=\"noopener noreferrer\">Advanced Driver Assistance System\u003C/a> technologies and future autonomous vehicles. The resulting high-definition maps are accurate down to 10 centimeters (less than four inches), which, as the citation notes, represents &ldquo;a big improvement over GPS.&rdquo;\u003C/p>\n\u003Cp>The Automotive News PACE Awards are selected by an independent panel of experts made up of past and current industry executives, analysts, and academics, under the direction of J. 
Ferron (formerly of PricewaterhouseCoopers and J.D. Power &amp; Associates). The evaluation process included an extensive review by the judges, a comprehensive written application, a site visit by two judges, and independent verification with car manufacturers.\u003C/p>\n\u003Cp>The awards are traditionally handed out in Detroit prior to the annual convention of the Society of Automotive Engineers (SAE), but the ceremony was held online this year due to the COVID-19 pandemic. This is the first time Mobileye has been honored in the award&rsquo;s 26-year history. (CogniTens, also founded by our CEO \u003Ca href=\"https://www.mobileye.com/blog/tag/amnon-shashua/\" target=\"_blank\" rel=\"noopener\">Professor Amnon Shashua\u003C/a>, won the award in 2006 for its OptiCell industrial measuring technology.)\u003C/p>\n\u003Cp>REM won out over other automotive innovations developed by suppliers including Bosch, Hella, and WL Gore, whose technologies were shortlisted as finalists for this year&rsquo;s awards. We extend our congratulations to our colleagues at American Axle, Continental, Delphi, Ejot, Gentex, Lear, Magna, Marelli, Schaeffler, Tenneco, and Valeo who also received PACE Awards this year, alongside PACE Innovation Partnership Award recipients General Motors and Shape, and Jaguar Land Rover and American Axle.\u003C/p>\n\u003Cp>&ldquo;Mobileye is very proud to win the PACE Award for our REM activity,&rdquo; said Tal Babaioff, Mobileye&rsquo;s Vice President of Mapping and Localization and Co-General Manager of REM, in accepting the award. &ldquo;In REM, we harvest data from millions of vehicles around the world, and we create high-definition maps from this data. Our solution is based on a small footprint for the harvested data &ndash; less than 10 kilobytes per kilometer &ndash; yet our localization process provides centimeter accuracy for the vehicle itself and all the objects around it. 
Our crowdsourced mapping, which is fully automated, provides a cost-effective solution for a high refresh rate and always-up-to-date maps.&rdquo;\u003C/p>\n\u003Cp>Watch the highlight reel for this year&rsquo;s safety-tech finalists below, and the \u003Ca href=\"https://www.autonews.com/suppliers/pace-awards-go-13-suppliers\" target=\"_blank\" rel=\"noopener noreferrer\">complete virtual award ceremony\u003C/a> at bottom (the Mobileye part starts at 33:04).\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"//players.brightcove.net/716708064/59xVcV4Zd_default/index.html?videoId=6152411660001\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://players.brightcove.net/716708064/default_default/index.html?videoId=6152069649001\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>","2020-04-28T07:00:00.000Z","ADAS, Awards, Mapping & REM, News",{"id":2351,"type":24,"url":2352,"title":2353,"description":2354,"primary_tag":32,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2354,"image":2355,"img_alt":2356,"content":2357,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":2358,"tags":236},31,"prof-amnon-shashua-wins-the-dan-david-prize","Prof. Amnon Shashua Wins the Dan David Prize","The prestigious award recognizes our CEO’s leadership in the field of artificial intelligence.","https://static.mobileye.com/website/us/corporate/post/images/4da402cb89c4a357642e0412d02708ca_1597831892839.jpg","Mobileye CEO Prof. 
Amnon Shashua wins the 2020 Dan David Prize","\u003Cp>While artificial intelligence seems to be one of the major buzzwords of our times, popping up nearly everywhere you look, the Dan David Prize dedicated its future category this year to leaders who are applying AI in ways that truly advance society. And this year, our founder and chief executive has earned that distinction.\u003C/p>\n\u003Cp>Named after the late business leader and philanthropist, the Dan David Prize is bestowed upon &ldquo;those who have made a lasting impact on society and to help young students and entrepreneurs become the scholars and leaders of the future.&rdquo; The highly prestigious award has been granted in three categories &ndash; focusing on the past, present, and future &ndash; every year since 2002.\u003C/p>\n\u003Cp>This year the &ldquo;future&rdquo; category put the spotlight on artificial intelligence, a field in which \u003Ca href=\"https://www.mobileye.com/blog/tag/amnon-shashua/\" target=\"_blank\" rel=\"noopener\">Professor Amnon Shashua\u003C/a> has emerged as a leading thinker, researcher, and pioneer. In addition to having co-founded Mobileye and serving as its CEO, Shashua is a Senior Vice President at our parent company Intel, the Sachs Professor of Computer Science at the Hebrew University of Jerusalem, and the co-founder of additional enterprises in the fields of computer vision, natural language processing, and digital banking. He has also authored over 120 papers and holds 45 patents. He shares this year&rsquo;s Dan David Prize with Dr. Demis Hassabis, co-founder and CEO of DeepMind.\u003C/p>\n\u003Cp>&ldquo;It is with great honor and gratitude that I \u003Ca href=\"https://dandavidprize.org/laureates/prof-amnon-shashua/\" target=\"_blank\" rel=\"noopener noreferrer\">receive the 2020 Dan David Prize\u003C/a> in the field of AI, together with Dr. Hassabis,&rdquo; remarked Shashua. &ldquo;I am very fortunate to be affiliated with its distinguished list of laureates. 
Transforming modern life for the better, using artificial intelligence, has always been my primary motivation, and this prize holds an excellent opportunity to increase awareness to the great promise of AI for the benefit of humanity.&rdquo;\u003C/p>\n\u003Cp>&ldquo;It&rsquo;s such an honor to have been chosen to receive the 2020 Dan David Prize alongside my colleague, Professor Shashua, and so many other luminaries over the years,&rdquo; added Hassabis. &ldquo;At DeepMind, we believe AI could be one of humanity&rsquo;s most useful inventions &ndash; acting as a multiplier for human ingenuity and ushering in a new renaissance of scientific discovery. This award is a great recognition of the work we have done so far and hopefully a sign of the impact we aspire to achieve in the future.&rdquo;\u003C/p>\n\u003Cp>The &ldquo;past&rdquo; prize this year was shared by Professor Barbara Kirshenblatt-Gimblett and Lonnie G. Bunch III for their work in cultural preservation and revival. The &ldquo;present&rdquo; prize recognized Professors Gita Sen and Debora Diniz for their contributions to gender equality. 
Each award category comes with a million-dollar prize, with ten percent of each donated as scholarships for graduate and post-graduate researchers in their respective fields.\u003C/p>\n\u003Cp>In receiving the prize, Shashua joins such notable past laureates as celebrated cellist Yo-Yo Ma, former US Vice President Al Gore, novelist Margaret Atwood, and filmmakers Ethan and Joel Coen.\u003C/p>","2020-02-13T08:00:00.000Z",{"id":2360,"type":5,"url":2361,"title":2362,"description":2363,"primary_tag":397,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2363,"image":2364,"img_alt":2365,"content":2366,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":2367,"tags":2368},53,"mobileye-makes-big-waves-at-ces-2020","Mobileye Makes Big Waves at CES 2020","If you didn’t make it to Las Vegas for the show this year, you can still catch the highlights of Mobileye’s presence right here.","https://static.mobileye.com/website/us/corporate/post/images/b734f44cc257102d31a6c357079a3a3f_1598508025045.jpg","Mobileye booth at CES 2020","\u003Cp>This year&rsquo;s CES was a big event, to say the least &ndash; for the entire consumer electronics industry, of course, but also for Mobileye. At the Las Vegas expo last week, we announced two major new partnerships, released compelling new footage of our technology in action, hosted an in-depth presentation by our CEO, and showcased what we&rsquo;re up to at our booth. For those who missed it, here&rsquo;s a brief recap of our presence at the biggest tech gathering of the year.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>At the Booth\u003C/strong>\u003C/p>\n\u003Cp>Our physical corporate presence at CES 2020 welcomed visitors to place any of nine miniature cars on a glass tabletop to trigger an interactive display. 
Two of these you can see in the video below: one highlighting our fully \u003Ca href=\"https://www.mobileye.com/technology/true-redundancy/\" target=\"_blank\" rel=\"noopener noreferrer\">autonomous driving\u003C/a> development platform, and the other, our \u003Ca href=\"https://www.mobileye.com/solutions/super-vision/\" target=\"_blank\" rel=\"noopener noreferrer\">advanced driver-assistance system\u003C/a> &ndash; representing the first time we&rsquo;ve showcased our ADAS partnerships with major automakers at CES.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/xKaAujDSs1w\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>The Mobileye Show\u003C/strong>\u003C/p>\n\u003Cp>Twice every hour during the expo, we also hosted the Mobileye Show, an engaging, live multimedia stage presentation of our innovations. The seven-minute presentation took audiences through vital aspects of our business, including our \u003Ca href=\"https://www.mobileye.com/technology/eyeq-chip/\" target=\"_blank\" rel=\"noopener noreferrer\">EyeQ\u003C/a>-powered ADAS, Mobility-as-a-Service plans, true-redundancy AV platform, crowdsourced \u003Ca href=\"https://www.mobileye.com/technology/rem/\" target=\"_blank\" rel=\"noopener noreferrer\">mapping technology\u003C/a>, and thought-leadership in AV safety. 
Watch the recording right here:\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/aGeXYz7Ghoc\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>An Hour with Amnon\u003C/strong>\u003C/p>\n\u003Cp>The highlight of our presence at CES this year, as in years past, was &ldquo;An Hour with Amnon&rdquo; &ndash; an in-depth address by our CEO on the latest technologies and developments from Mobileye. Several hundred people attended the press conference, with hundreds more watching the live webcast remotely. \u003Ca href=\"https://www.mobileye.com/blog/tag/amnon-shashua/\" target=\"_blank\" rel=\"noopener\">Professor Shashua\u003C/a> touched on the company&rsquo;s leadership in ADAS, the development of autonomous vehicles, the technologies driving our advancement, and more &ndash; including a narrated screening of never-before-released footage of Mobileye&rsquo;s AV driving itself through the streets of Jerusalem.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/HPWGFzqd7pI\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>You can watch the full press conference in the video above, or view below just the unedited 20-minute drive video in our Ford Fusion-based prototype, which \u003Ca href=\"https://www.eetimes.com/ces-2020-deconstructed-10-lessons/2/\" target=\"_blank\" rel=\"noopener noreferrer\">\u003Cem>EE Times\u003C/em> aptly (and much to our amusement) characterized\u003C/a> as an AV &ldquo;with chutzpah.&rdquo;\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/hCWL0XF_f8Y\" width=\"560\" height=\"315\" frameborder=\"0\" 
allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>Beyond Handshakes\u003C/strong>\u003C/p>\n\u003Cp>Aside from what we showed at CES, we also made \u003Ca href=\"https://www.mobileye.com/news/mobileyes-global-ambitions-take-shape-new-deals-china-south-korea/\" target=\"_blank\" rel=\"noopener noreferrer\">two major announcements\u003C/a> at the show. Shanghai-based automaker SAIC has signed on to use Mobileye&rsquo;s Road Experience Management system to digitally map China&rsquo;s roadways in high-definition to be used in ADAS and AVs. We also announced another agreement with the municipal government of Daegu City to deploy a robotaxi service in the South Korean metropolis, which was represented personally by Mayor Kwon Young-jin. We were also pleased to welcome US Transportation Secretary Elaine Chao and other senior officials from the US Department of Transportation among the honored guests who stopped by to see what Mobileye has to offer.\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/hBAARZ1pruk\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>","2020-01-14T08:00:00.000Z","Events, Video, From our CEO",{"id":2370,"type":24,"url":2371,"title":2372,"description":2373,"primary_tag":397,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2373,"image":2374,"img_alt":2373,"content":2375,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":16,"publish_date":2376,"tags":2377},160,"2020-ces-mobileye-news-livestream-replay","2020 CES: Mobileye's Computer Vision (Replay)","Watch Prof. 
Amnon Shashua’s CES 2020 address highlighting the progress and purpose of Mobileye’s drive to full autonomy.","https://static.mobileye.com/website/us/corporate/images/693fb6d5540bec0b40e030332e90a37c_1666086495871.png","\u003Cp>\u003Cem>This news content was originally published on the Intel Corporation Newsroom.\u003C/em>\u003C/p>\n\u003Cp>Watch Prof. Amnon Shashua&rsquo;s CES 2020 address highlighting the progress and purpose of Mobileye&rsquo;s drive to full autonomy. He showcased new sensing technologies that culminate in a 23-minute drive on the congested streets of Jerusalem, which is the basis for Mobileye&rsquo;s MaaS service. Shashua presented the uninterrupted drive as an example of the transparency he feels the industry should provide to get to full autonomy. | \u003Ca href=\"https://static.mobileye.com/website/common/files/Mobileye-Shashua-CES-2020-presentation-1-compressed.pdf\" target=\"_blank\" rel=\"noopener noreferrer\">Amnon Shashua&rsquo;s Speaker Notes\u003C/a>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cdiv style=\"padding: 56.25% 0 0 0; position: relative;\">\u003Ciframe style=\"position: absolute; top: 0; left: 0; width: 100%; height: 100%;\" title=\"CES 2020: Amnon Shashua's 'Under the Hood of Mobileye's Computer Vision' (Event Replay)\" src=\"https://player.vimeo.com/video/772999614?h=e1c9b731fb&amp;badge=0&amp;autopause=0&amp;player_id=0&amp;app_id=58479\" width=\"640\" height=\"360\" frameborder=\"0\" data-mce-fragment=\"1\">\u003C/iframe>\u003C/div>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cstrong>When:\u003C/strong>&nbsp;11:30 a.m. PST Tuesday, Jan. 
7, 2020\u003C/p>","2020-01-07T08:00:00.000Z","News, Video, Events, From our CEO",{"id":2379,"type":24,"url":2380,"title":2381,"description":2382,"primary_tag":28,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2382,"image":2383,"img_alt":2382,"content":2384,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":16,"publish_date":2376,"tags":2385},161,"mobileyes-global-ambitions-take-shape-new-deals-china-south-korea","2020 CES: Mobileye's Global Ambitions Take Shape with New Deals in China, South Korea","Mobileye’s ambitions in advanced driver-assistance systems and autonomous mobility-as-a-service comes into sharper focus with two agreements.","https://static.mobileye.com/website/us/corporate/images/136ea282b009e6437062584b816673d3_1666085214241.png","\u003Cp>\u003Cem>This news content was originally published on the Intel Corporation Newsroom.\u003C/em>\u003C/p>\n\u003Cp>\u003Cstrong>What&rsquo;s New:&nbsp;\u003C/strong>With sales close to $1 billion in 2019 and expected to rise double-digits this year, Mobileye&rsquo;s global ambitions in advanced driver-assistance systems (ADAS) and autonomous mobility-as-a-service (MaaS) came into sharper focus with two agreements announced today. SAIC, a leading Chinese OEM, plans to use Mobileye&rsquo;s&nbsp;\u003Ca href=\"https://www.mobileye.com/technology/rem/\" target=\"_blank\" rel=\"noopener noreferrer\">REM mapping\u003C/a>&nbsp;technology to map China for&nbsp;L2+&nbsp;ADAS deployment while paving the way for autonomous vehicles in the country. And the leaders of Daegu Metropolitan City, South Korea, agreed to establish a long-term cooperation to deploy MaaS based on Mobileye&rsquo;s self-driving system.\u003C/p>\n\u003Cp>&ldquo;These two new agreements build our global footprint in both MaaS and ADAS and demonstrate our commitment to true global leadership toward full autonomy.&rdquo; &ndash;Prof. 
Amnon Shashua, Mobileye president &amp; CEO and Intel senior vice president\u003C/p>\n\u003Cp>\u003Cstrong>Why It Matters:&nbsp;\u003C/strong>The two deals show how Mobileye, an Intel Company, is executing on its multiprong strategy toward full autonomy, which includes mapping, ADAS, MaaS and consumer AVs. The agreements build on other recent announcements, including: an agreement with RATP in partnership with the city of Paris to bring robotaxis to France; a collaboration with NIO to manufacture Mobileye&rsquo;s self-driving system and sell consumer AVs based on that system, and to supply robotaxis exclusively to Mobileye for China and other markets; a joint venture with UniGroup in China for use of map data; and a joint venture with Volkswagen and Champion Motors to operate an autonomous ride-hailing fleet in Jerusalem.\u003C/p>\n\u003Cp>Based on third-party data, Mobileye estimates the autonomous MaaS total addressable market (TAM) at $160 billion by 2030. Mobileye&rsquo;s ADAS leadership, uniquely scalable mapping tools and global robotaxi-based mobility ambitions have been designed to address this massive opportunity.\u003C/p>\n\u003Cp>China is the first country to benefit from the four Mobileye strategic product categories. With the addition of the SAIC agreement, Mobileye&rsquo;s China footprint now includes L2+ ADAS, mapping (a first for China), MaaS and consumer AVs.\u003C/p>\n\u003Cp>\u003Cstrong>How the SAIC Agreement Works:&nbsp;\u003C/strong>SAIC and Mobileye have signed an agreement to use Mobileye&rsquo;s Road Experience Management&trade; (REM&trade;) mapping technology on SAIC vehicles via SAIC&rsquo;s licensed map subsidiary (Heading). 
SAIC vehicles will contribute to Mobileye&rsquo;s RoadBook by gathering information on China&rsquo;s roadways, creating a high-definition map of the country that can be used by vehicles with&nbsp;\u003Ca href=\"https://static.mobileye.com/dev/website/us/corporate/images/be013ffc23e3d75babbda0ed4a5019ea_1663241548611.jpg\" target=\"_blank\" rel=\"noopener noreferrer\">L2+ and higher levels of autonomy\u003C/a>. The deployment of the mapping solution in China presents opportunities for additional OEM partners to enter the Chinese market with map-related features.\u003C/p>\n\u003Cp>The SAIC agreement marks Mobileye&rsquo;s first design win with a major Chinese automaker to harvest road data while also utilizing Mobileye&rsquo;s REM mapping technology to enable L2+ in passenger vehicles.\u003C/p>\n\u003Cp>SAIC joins other Mobileye OEM partners around the world in collecting road data to enable a global real-time high-definition map. It is the first Chinese OEM to use Mobileye&rsquo;s REM technology to offer sharper ADAS capabilities and accelerate the development of autonomous driving in China.\u003C/p>\n\u003Cp>\u003Cstrong>How the Daegu Metropolitan City Agreement Works:&nbsp;\u003C/strong>Mobileye and Daegu City will collaborate to test and deploy robotaxi-based mobility solutions powered by Mobileye&rsquo;s autonomous vehicle technology. Mobileye will integrate its industry-leading self-driving system into vehicles to enable a driverless MaaS operation. Daegu Metropolitan City partners will ensure the regulatory framework supports the establishment of robotaxi fleet operation.\u003C/p>\n\u003Cp>The agreement with Daegu City, one of South Korea&rsquo;s largest metropolitan areas, extends Mobileye&rsquo;s global MaaS footprint. Combined with Mobileye&rsquo;s previously announced robotaxi-based mobility services agreements, the new deal shows how Mobileye is quickly scaling its autonomous MaaS ambitions globally. 
No other MaaS provider has declared a global MaaS footprint that rivals Mobileye&rsquo;s strategy and go-to-market plan.\u003C/p>\n\u003Cp>\u003Cstrong>How Mobileye&rsquo;s Strategy Differs:&nbsp;\u003C/strong>Leaders on the road to full autonomy must successfully navigate the phases of both ADAS and MaaS before the consumer AV industry can take shape. Doing this requires a simple, scalable mapping solution, such as Mobileye&rsquo;s REM. With its eye toward full autonomy, Mobileye addresses these critical aspects of the autonomous revolution.\u003C/p>\n\u003Cp>\u003Cstrong>REM technology:\u003C/strong>&nbsp;Because it relies on crowd sourcing and low-bandwidth uploads, Mobileye REM technology is a fast and cost-effective way to create high-definition maps that can be utilized for enhanced ADAS such as L2+, as well as higher levels of autonomy for future self-driving cars. Mobileye&rsquo;s REM map data has significant value beyond the automotive industry and can bring insights to businesses in new market segments, such as smart cities. SAIC is the latest OEM to turn passenger cars into harvesting vehicles that will contribute to the global RoadBook.\u003C/p>\n\u003Cp>\u003Cstrong>Robotaxis:\u003C/strong>&nbsp;Mobileye&rsquo;s strategy for deploying robotaxis covers the specification, development and integration of all five value layers of the robotaxi market including: self-driving systems, self-driving vehicles, fleet operations, mobility intelligence, and rider experience and services. Mobileye&rsquo;s approach is cost-effective, allowing the company to scale global operations more quickly than competitors and thereby capture a greater share of the aforementioned $160 billion global robotaxi opportunity, which is a significant step on the way to the fully autonomous future. 
Mobileye&rsquo;s unique approach of scaling globally with a more economical solution, coupled with its superior technology, enables the company to lead MaaS and consumer-AV development at scale well ahead of the market.\u003C/p>","Autonomous Driving, News, Events, ADAS",{"id":2387,"type":5,"url":2388,"title":2389,"description":2390,"primary_tag":397,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2390,"image":2391,"img_alt":2392,"content":2393,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":2394,"tags":2395},38,"mobileye-hosts-its-first-investor-summit-since-the-intel-acquisition","Mobileye Hosts First Summit Since Intel Acquisition ","Investors, journalists, and industry leaders briefed on Mobileye’s latest strategies.\t","https://static.mobileye.com/website/us/corporate/post/images/bd59a29338bd884eddb7d8406d7a33fc_1597844193156.jpg","Prof. Amnon Shashua speaks at the 2019 Mobileye DRIVES Investor Summit in Jerusalem","\u003Cp>On November 5, 2019, Mobileye and Intel hosted an \u003Ca href=\"https://newsroom.intel.com/press-kits/2019-mobileye-investor-summit/\" target=\"_blank\" rel=\"noopener noreferrer\">investor summit\u003C/a> at the Mobileye headquarters in Jerusalem. 
The summit, named DRIVES (Data, Robofleet, Innovation, Vision, Economics, Safety), was Mobileye&rsquo;s first investor summit since its 2017 acquisition by Intel.\u003C/p>\n\u003Cp>The event was attended by 30 influential guests, including senior analysts, investors, and journalists, and featured demos, in-depth technical discussions, business presentations, and executive networking.\u003C/p>\n\u003Cp>In the course of the summit, visitors learned about Mobileye&rsquo;s \u003Ca href=\"https://www.mobileye.com/solutions/super-vision/\" target=\"_blank\" rel=\"noopener noreferrer\">ADAS \u003C/a>work with vehicle manufacturers, its new data services and exciting new developments in the field of \u003Ca href=\"https://www.mobileye.com/technology/true-redundancy/\" target=\"_blank\" rel=\"noopener noreferrer\">autonomous vehicles\u003C/a>. There was also an in-depth look at the company&rsquo;s business strategy and growth plan as well as details regarding Mobileye&rsquo;s role as a major growth engine for Intel.\u003C/p>\n\u003Cp>During their visit to the Mobileye garage, investors and analysts participated in four demos. 
These included:\u003C/p>\n\u003Cul>\n\u003Cli>A look at how Mobileye is mapping roads worldwide in preparation for autonomous vehicles with Tal Babaioff, VP of Mapping and Localization and Co-General Manager \u003Ca href=\"https://www.mobileye.com/technology/rem/\" target=\"_blank\" rel=\"noopener noreferrer\">REM\u003C/a>\u003C/li>\n\u003Cli>An explanation of Mobileye&rsquo;s proposed safety standard for autonomous vehicles - \u003Ca href=\"https://www.mobileye.com/technology/responsibility-sensitive-safety/\" target=\"_blank\" rel=\"noopener noreferrer\">Responsibility-Sensitive Safety (RSS)\u003C/a> - with Jack Weast, VP of Autonomous Vehicle Standards\u003C/li>\n\u003Cli>A description of Mobileye&rsquo;s sensing technology with Gaby Hayon, EVP of Research and Development\u003C/li>\n\u003Cli>Finally, visitors got to see how all this technology comes together when they took a ride through Jerusalem in an autonomous vehicle.\u003C/li>\n\u003C/ul>\n\u003Cp>The day concluded with presentations by George Davis, CFO of Intel, \u003Ca href=\"https://www.mobileye.com/opinion/digitizing-the-social-contract-for-safer-roads/\" target=\"_blank\" rel=\"noopener\">Erez Dagan\u003C/a>, Mobileye EVP of Product and Strategy and Mobileye&rsquo;s CEO, \u003Ca href=\"https://www.mobileye.com/blog/tag/amnon-shashua/\" target=\"_blank\" rel=\"noopener\">Prof. 
Amnon Shashua\u003C/a>.\u003C/p>\n\u003Cp>Two major announcements which Shashua made during the event included:\u003C/p>\n\u003Cul>\n\u003Cli>A collaboration between Mobileye and France&rsquo;s RATP Group (R&eacute;gie Autonome des Transports Parisiens); Mobileye and RATP are working to deploy autonomous transportation solutions based on Mobileye technology, exploring the possibility of a joint offering for robotaxi shuttle fleets globally.\u003C/li>\n\u003Cli>The shipment of more than 50 million \u003Ca href=\"https://www.mobileye.com/technology/eyeq-chip/\" target=\"_blank\" rel=\"noopener noreferrer\">EyeQ chips\u003C/a> since 2008.\u003C/li>\n\u003C/ul>\n\u003Cp>Watch the highlights from the 2019 Mobileye DRIVES Investor Summit in the video below.\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/ZMqpUlzXtQI\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>","2019-11-06T08:00:00.000Z","Events, Autonomous Driving",{"id":2397,"type":5,"url":2398,"title":2399,"description":2399,"primary_tag":140,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2399,"image":2400,"img_alt":2399,"content":2401,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":2402,"tags":2147},164,"rss-explained-the-five-rules-for-autonomous-vehicle-safety","RSS Explained: the Five Rules for Autonomous Vehicle Safety","https://static.mobileye.com/website/us/corporate/images/d8e5c674786e7e297eb29a26b32d8839_1666084689790.png","\u003Cp>\u003Cem>This news content was originally published on the Intel Corporation Newsroom.\u003C/em>\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: #ffffff; color: #555555;\">When we learn to drive, we&rsquo;re taught the rules of the road. 
These rules typically take two forms:\u003C/span>\u003C/p>\n\u003Col>\n\u003Cli>\u003Cstrong>Explicit rules\u003C/strong>, like speed limits or what to do at a stop sign.\u003C/li>\n\u003Cli>\u003Cstrong>Implicit rules\u003C/strong>, which are often cultural and lean heavily on common sense, like how to maintain a safe following distance and drive safely for the conditions.\u003C/li>\n\u003C/ol>\n\u003Cp>\u003Cspan style=\"background-color: #ffffff; color: #555555;\">For an autonomous vehicle (AV), following the explicit rules is easy. The AV will never exceed the speed limit and will always stop at a stop sign. Understanding of and adherence to implicit rules, however, are more difficult. The very definition of the rules is part of an implied understanding of acceptable driving practices that ensures the transportation system functions safely.\u003C/span>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: #ffffff; color: #555555;\">How, then, can a machine precisely interpret these subjective implicit rules of the road?\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: #ffffff; color: #555555;\">In 2017, Mobileye published an academic paper that proposed Responsibility-Sensitive Safety (RSS). The math-based AV safety model provides a framework for digitization of these implicit rules so self-driving cars can successfully integrate with human drivers on the road.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: #ffffff; color: #555555;\">While we&rsquo;ve accepted that human drivers with varying abilities (and degrees of common sense-based safe driving practices) can still be granted licenses to drive, there&rsquo;s a far higher burden of proof for AVs. Without a verifiable way to demonstrate their ability to drive safely, AVs will never get a license to drive. 
Technology leaders, automakers, governmental bodies and society must collaboratively define a standard for what it means to drive safely, along with a metric that can be used for assessment and verification of autonomous vehicle safety. To this end, Intel contributed RSS&rsquo;s technology-neutral framework as a starting point for the industry to align on what it means for an AV to drive safely.\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong style=\"background-color: #ffffff; color: #555555;\">What is Responsibility-Sensitive Safety?\u003C/strong>\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: #ffffff; color: #555555;\">RSS formalizes human notions of safe driving, using a set of mathematical formulas and logical rules that are transparent and verifiable. These rules define the common-sense behavioral characteristics that humans would characterize as leading to safe driving. The goal is that the AV should drive carefully enough so that it will not be the cause of an accident, and cautiously enough so that it can compensate for the mistakes of others.\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong style=\"background-color: #ffffff; color: #555555;\">RSS is compatible with other AV systems\u003C/strong>\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: #ffffff; color: #555555;\">RSS is technology-neutral in that it is compatible with any automated driving system, allowing for consistency in safety. To establish an AV safety standard that can be adopted around the world, Intel is committed to collaborating with all stakeholders across industry, governments, nongovernmental organizations, standards bodies, and academia.\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong style=\"background-color: #ffffff; color: #555555;\">RSS can enable AVs that can drive successfully alongside humans\u003C/strong>\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: #ffffff; color: #555555;\">RSS operates as a separate layer from artificial intelligence-based decision-makers. 
It deterministically defines decisions that are safe, enabling AVs to make cautious but assertive maneuvers that are within a precisely defined safety envelope &ndash; some that would otherwise have been thrown out by AI-based decision-makers that are often too conservative. If the industry is not able to deliver an AV that drives naturalistically alongside human drivers, then society will not likely accept the annoying, super-conservative AVs that are the alternative.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: #ffffff; color: #555555;\">To accomplish this, RSS adheres to five safety principles:\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong style=\"background-color: #ffffff; color: #252525;\">RULE 1. Do not hit the car in front (longitudinal distance)\u003C/strong>\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/1f0c6d024a3e1bd43d0c723ccfc78ec5_1663240368326.jpg\" alt=\"Define safe longitudinal distance\" width=\"1650\" height=\"1001\" />\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: #ffffff; color: #555555;\">New human drivers are taught to &ldquo;leave 2 to 3 seconds&rsquo; worth of distance&rdquo; between themselves and the car in front to provide the time and space to react. This simple guide works without requiring the driver to understand the math and physics behind the velocities of both cars, driver reaction times, and the front vehicle&rsquo;s braking capability.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: #ffffff; color: #555555;\">In RSS, we&rsquo;ve formalized this rule into a mathematical calculation (pictured above). It means the moment the distance between the two cars is less than dmin, the automated vehicle will perform the proper response: braking until a safe following distance is restored or until the vehicle comes to a complete stop.\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong style=\"background-color: #ffffff; color: #252525;\">RULE 2. 
Do not cut in recklessly (lateral distance)\u003C/strong>\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/d7c2bb31f4ecb8b752ddcf4984921c70_1663240419726.jpg\" alt=\"Define safe lateral distance\" width=\"1650\" height=\"1001\" />\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: #ffffff; color: #555555;\">The safest human drivers maintain position in their designated lanes and avoid unsafe cut-ins when merging into other lanes. Rule 2 formalizes safe lateral distance, which enables AVs to be aware when their lateral safety may be compromised by unsafe drivers turning into their lanes.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: #ffffff; color: #555555;\">This rule is formalized in a formula (pictured directly above) that also accounts for the natural lateral movement within a lane that is performed by human drivers.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: #ffffff; color: #555555;\">For example, if another car moves into or occupies that space, a human first steers to avoid a collision, stopping the lateral velocity relative to the other car, then continues to move away laterally until a safe distance is restored. Similarly, this is the proper response for an AV if a violation is made in the safe lateral distance defined by RSS.\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong style=\"background-color: #ffffff; color: #252525;\">RULE 3. Right of way is given, not taken\u003C/strong>\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/5599386110b9de961ec8c64920b110db_1663240474208.jpg\" alt=\"Right of way\" width=\"1650\" height=\"888\" />\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: #ffffff; color: #555555;\">On well-marked roads, the right of way is clear. Lane lines, signs, and traffic lights establish priorities for routes as they intersect one another. 
However, there are other times when the right of way is less clear, and human drivers must negotiate with one another.&nbsp;For AVs, this negotiation must be formalized so that machines can make that same negotiation and be sure to arrive at the same conclusion.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: #ffffff; color: #555555;\">For example, the figure above shows a T-junction without a stop sign. If a stop sign existed, drivers would be expected to give the right of way to vehicles without a stop sign. Sometimes they don&rsquo;t. And if a car ran the stop sign and created a dangerous situation, the AV must still respond accordingly. It has right of way on paper but should not let a crash happen just because the rules give it the right of way.\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong style=\"background-color: #ffffff; color: #252525;\">RULE 4. Be cautious in areas with limited visibility\u003C/strong>\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/bcd7385c0ce69f0d05a16e1668851fbe_1663240525449.jpg\" alt=\"Areas with limited visibility\" width=\"1650\" height=\"1053\" />\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: #ffffff; color: #555555;\">Many factors can affect visibility while driving. Aside from the weather, factors such as road topography, buildings, and even other cars can obstruct views of the road and of other road users. Depending on the surroundings, humans naturally put bounds on their behavior to avoid unforeseen dangers.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: #ffffff; color: #555555;\">On a main thoroughfare, it is not reasonable to expect that pedestrians will suddenly jump into the road. However, on a street near a school or neighborhood, it is much more likely. There, it is reasonable for&nbsp;drivers to expect pedestrians to suddenly step into the road. 
Drivers must proceed cautiously, especially as they approach crosswalks or pass cars parked along the street. To ensure safety,&nbsp;AVs will have to make similar assumptions and exhibit caution in areas of occlusion.\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong style=\"background-color: #ffffff; color: #252525;\">RULE 5. If the vehicle can avoid a crash without causing another one, it must\u003C/strong>\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/94f3f5fd50b9a3065b6805a4362d6f4b_1663240585215.jpg\" alt=\"Avoid a collision without causing another\" width=\"1368\" height=\"1070\" />\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: #ffffff; color: #555555;\">Rules 1-4 create formal definitions to identify what a dangerous situation is and the proper response for the AV. Rule 5 covers scenarios where a dangerous situation may have been imposed so suddenly that a collision cannot be avoided unless more evasive action is taken. Rule 5 states that if the AV can safely and legally avoid a crash without causing another one, it must do so.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: #ffffff; color: #555555;\">For example, if a front car suddenly swerves into the next lane, exposing an object in the road, the following car may not have enough time to stop. However, if the next lane is free, the following car can follow the front car and take evasive action to avoid the accident.\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong style=\"background-color: #ffffff; color: #252525;\">Going beyond miles-driven\u003C/strong>\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: #ffffff; color: #555555;\">RSS enables safety testing that can be verified without millions of miles of driving. Statistical argumentation is a last resort to claim the safety of an AV when its creators have no ability to formally verify the safety of the design. 
Because RSS is a formal mathematical model, it can be proven correct, so testing is needed only to ensure implementation matches specification, significantly reducing the validation burden.\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong style=\"background-color: #ffffff; color: #252525;\">Improving road safety today with RSS\u003C/strong>\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/9b6453b2b80ee91ce2fe97024b178db0_1663240646995.jpg\" alt=\"Response time vs. braking distance\" width=\"1650\" height=\"487\" />\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: #ffffff; color: #555555;\">RSS is a framework for how cars can safely drive themselves, but these formalized concepts can also be used to keep human drivers within the safety envelope. For example, using the same safety principles, RSS can be a proactive safety mechanism that improves automatic emergency braking (AEB). Called automatic preventive braking (APB), the application of RSS to traditional AEB systems would use formulas to determine the moment when a vehicle enters a dangerous situation. It would then use comfortable, subtle braking to help return the vehicle to a safer position without waiting for an imminent collision to engage maximum braking force. This preventive approach would provide a stopping distance buffer that could prevent a chain reaction of braking and swerving should an emergency stop occur.\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong style=\"background-color: #ffffff; color: #252525;\">RSS gaining support\u003C/strong>\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: #ffffff; color: #555555;\">Since RSS was proposed in 2017, Intel has engaged with governmental regulatory agencies and technology pioneers around the world to gather feedback on the model. Its real-world effectiveness has been demonstrated through Intel&rsquo;s AV development fleet on the busy streets of Jerusalem. 
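The Rule 1 calculation referenced above (the dmin threshold) comes from Mobileye's published RSS paper. As an illustration only, here is a minimal sketch of that safe-longitudinal-distance check; the function name and all parameter defaults are illustrative assumptions, not Mobileye's implementation or production settings:

```python
def rss_safe_longitudinal_distance(
    v_rear: float,              # speed of the rear (following) car, m/s
    v_front: float,             # speed of the front car, m/s
    rho: float = 1.0,           # response time of the rear car, s (assumed)
    a_accel_max: float = 3.0,   # max acceleration during response time, m/s^2 (assumed)
    a_brake_min: float = 4.0,   # minimum braking the rear car commits to, m/s^2 (assumed)
    a_brake_max: float = 8.0,   # maximum braking the front car may apply, m/s^2 (assumed)
) -> float:
    """Minimum safe following distance (dmin) per the published RSS model.

    The moment the actual gap drops below this value, the proper response is
    to brake (with at least a_brake_min) until a safe distance is restored
    or the vehicle comes to a complete stop.
    """
    # Worst case: the rear car accelerates for rho seconds before reacting,
    # then brakes only gently, while the front car brakes as hard as possible.
    v_rear_after_rho = v_rear + rho * a_accel_max
    d_min = (
        v_rear * rho
        + 0.5 * a_accel_max * rho**2
        + v_rear_after_rho**2 / (2 * a_brake_min)
        - v_front**2 / (2 * a_brake_max)
    )
    return max(0.0, d_min)

# Both cars travelling at 30 m/s (~108 km/h): the safe gap is ~111 m.
gap = rss_safe_longitudinal_distance(30.0, 30.0)
```

Note how the formula captures the "leave 2 to 3 seconds' worth of distance" intuition: the required gap grows with the rear car's speed and the response time, and is clamped at zero when the front car is moving away fast enough.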
Support for RSS is gaining global acceptance among industry peers and standards organizations that have applauded Intel for taking the first step toward a verifiable safety framework.\u003C/span>\u003C/p>\n\u003Cul>\n\u003Cli>\u003Cstrong>Baidu\u003C/strong>, the Chinese technology leader, adopted RSS as part of its Apollo autonomous driving platform, and in 2019 incorporated the world&rsquo;s first open-source implementation of RSS.\u003C/li>\n\u003Cli>\u003Cstrong>Valeo\u003C/strong>, the European-based automotive supplier, contributes to research of RSS as it collaborates on policies and technologies to enhance the adoption of AV safety standards across Europe, the U.S., and China.\u003C/li>\n\u003Cli>\u003Cstrong>China ITS Alliance\u003C/strong>, the standards body under the China Ministry of Transportation, approved a proposal to use RSS as the framework for its forthcoming AV safety standard.\u003C/li>\n\u003Cli>\u003Cstrong>RAND Corp.\u003C/strong>, a leading think tank, cited RSS in a recent report as a leading measure that defines safety as an &ldquo;envelope&rdquo; around the AV &ndash; an important aspect for AVs to achieve &ldquo;roadmanship.&rdquo;\u003C/li>\n\u003Cli>\u003Cstrong>Arizona Institute for Automated Mobility\u003C/strong>, a group established to explore and deliver AV safety, uses RSS as the foundation for its research and testing.\u003C/li>\n\u003Cli>\u003Cstrong>Collaborative Research Institutes\u003C/strong>, in partnership with Intel Labs, have been set up in China and Europe to explore AV safety and reliability questions using RSS as a foundation for the research.\u003C/li>\n\u003C/ul>\n\u003Cp>\u003Cstrong style=\"background-color: #ffffff; color: #555555;\">Want to go deeper into RSS?\u003C/strong>\u003Cspan style=\"background-color: #ffffff; color: #555555;\">&nbsp;Read the full academic paper from Mobileye:&nbsp;\u003C/span>\u003Ca style=\"background-color: #ffffff; color: #0071c5;\" 
href=\"https://www.mobileye.com/responsibility-sensitive-safety/vision_zero_with_map.pdf\" target=\"_blank\" rel=\"noopener noreferrer\">Vision Zero: Can Roadway Accidents be Eliminated without Compromising Traffic Throughput?\u003C/a>\u003C/p>","2019-10-31T07:00:00.000Z",{"id":2404,"type":5,"url":2405,"title":2406,"description":2406,"primary_tag":934,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2406,"image":2407,"img_alt":2406,"content":2408,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":2402,"tags":1998},165,"the-why-and-how-of-making-hd-maps-for-automated-vehicles","The Why and How of Making HD Maps for Automated Vehicles","https://static.mobileye.com/website/us/corporate/images/cc532d152c5d52a9cd1ecd05968babb2_1666084512983.png","\u003Cp>\u003Cem>This news content was originally published on the Intel Corporation Newsroom.\u003C/em>\u003C/p>\n\u003Ch2>Mobileye&rsquo;s Road Experience Management Technology Enables Real-Time, Low-Cost Updating of the Maps Needed for Autonomous and Semi-Automated Driving\u003C/h2>\n\u003Cp>\u003Cspan style=\"color: #555555; background-color: #ffffff;\">One of the least-understood challenges of automated driving is mapping. This backgrounder explains how and why machines need something different than the GPS maps that humans use. It also explains how Mobileye&rsquo;s approach to mapping is cost-effective and scalable &ndash; two of the critical tests necessary to ensure automated vehicles (AV) can proliferate broadly.\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong style=\"color: #252525; background-color: #ffffff;\">Why are HD maps so important?\u003C/strong>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #555555; background-color: #ffffff;\">The suite of sensors on modern cars &ndash; cameras, radars and lidars &ndash; serve as the &ldquo;eyes&rdquo; to see other vehicles, pedestrians, road signs, traffic lights and landmarks. 
But human drivers benefit from what machines lack: context and understanding. With those skills, humans know how to adjust in the moment and drive more confidently even when they come upon a situation for the first time.\u003C/span>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #555555; background-color: #ffffff;\">Maps fill that gap for the automated car. Maps provide additional layers of data to provide human-like context to what the car&rsquo;s sensors are &ldquo;seeing.&rdquo; This three-dimensional view of their environment needs to be up-to-date and accurate to within centimeters.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #555555; background-color: #ffffff;\">Working to its potential, HD mapping:\u003C/span>\u003C/p>\n\u003Cul>\n\u003Cli>Helps the AV overcome sensing limitations by providing information beyond the range of camera, lidar, radar (including in inclement conditions).\u003C/li>\n\u003Cli>Provides another layer of support to sensor redundancy in instances where lane markings may be unclear and road signs affected by outside influences (e.g., a stop sign knocked over or a sign obstructed by a parked truck).\u003C/li>\n\u003Cli>Gives the AV understanding of detailed driving semantics &ndash; such as which traffic light corresponds to which lane, the proper stopping point when entering a junction or unwritten local driving laws (when turning right on a red light is permissible).\u003C/li>\n\u003C/ul>\n\u003Cp>&nbsp;\u003C/p>\n\u003Ch3 class=\"videoWrap\">\u003Ciframe class=\"ql-video-embed\" src=\"https://player.vimeo.com/video/370199658?h\" width=\"640\" height=\"360\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/h3>\n\u003Cp>\u003Cstrong style=\"color: #252525; background-color: #ffffff;\">What about the maps I have today? 
(Google, Waze, OSM)\u003C/strong>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #555555; background-color: #ffffff;\">The maps humans rely on for navigation seem pretty sophisticated versus the paper versions we followed a few years ago. But the real problem is localization &ndash; knowing where you are on the map within a matter of centimeters at any particular moment. Popular mapping programs locate you within a few meters, but for an AV that&rsquo;s the difference between being in the correct lane and in the middle of oncoming traffic.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #555555; background-color: #ffffff;\">Maps like these are updated infrequently &ndash; on a yearly basis, in many cases. But the world changes constantly.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #555555; background-color: #ffffff;\">The HD maps that most companies are creating today are built through manual driving and are expensive. The process requires dedicated vehicles equipped with expensive sensing technology to gather the map data. And precise GPS is not the answer. It is expensive and cannot work everywhere &ndash; tunnels, urban canyons &ndash; or all the time.\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong style=\"color: #252525; background-color: #ffffff;\">How is Mobileye Making HD Maps?\u003C/strong>\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/742dc3fe2d56f3eda8bb888c79a9b23c_1663240997659.png\" alt=\"How is Mobileye Making HD Maps?\" width=\"1650\" height=\"536\" />\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #555555; background-color: #ffffff;\">Mobileye&rsquo;s map creation process is called Road Experience Management&trade; &ndash; REM&trade; for short. Using a crowd-sourced approach, Mobileye gathers data for its maps through consumer vehicles already on the road and equipped with Mobileye&rsquo;s EyeQ4 driving assistance system. 
It turns mapmaking into a byproduct of human driving without the need to add any expensive hardware or additional cars on the road.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #555555; background-color: #ffffff;\">EyeQ4 camera systems identify and process lane markings, curbs, landmarks, traffic signs, telephone poles and other infrastructure. The detections are translated into coordinates and compressed into a compact format &ndash; about 10 kilobytes per kilometer captured. (In a year of typical North American driving, this equates to approximately 200 megabytes per vehicle.) The process relies on algorithms resident in the EyeQ4 system to optimize data bandwidth.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #555555; background-color: #ffffff;\">Mobileye has automated the entire mapmaking process, so that creation, updating, validation and distribution of the maps happen without any human intervention.\u003C/span>\u003C/p>\n\u003Ch3 class=\"videoWrap\">\u003Ciframe class=\"ql-video-embed\" src=\"https://player.vimeo.com/video/370204261?h\" width=\"640\" height=\"254\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/h3>\n\u003Cp>\u003Cstrong style=\"color: #252525; background-color: #ffffff;\">What about my privacy?\u003C/strong>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #555555; background-color: #ffffff;\">With REM, Mobileye is building the global Roadbook&trade; using completely anonymized data. The camera system discards all data within a large random range of a trip&rsquo;s origin and destination, so exact start and stop locations are never uploaded to the cloud. 
Furthermore, the system is not constantly uploading data to the cloud.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #555555; background-color: #ffffff;\">To avoid being flooded with data from thousands of cars all driving the same stretch of highway, Mobileye operates what it calls a &ldquo;challenge request&rdquo; instruction. In short, the vehicle tells the system that it drove from point A to B and asks if the system wants the data. Mobileye can then respond yes or no depending on how much data was recently collected from each specific road.\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong style=\"color: #252525; background-color: #ffffff;\">How much of the Roadbook exists today?\u003C/strong>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #555555; background-color: #ffffff;\">In late 2018, the first series-production vehicles capable of harvesting map data for the Roadbook began shipping to consumers. Nearly 2.5 million kilometers of roads are covered daily and nearly 500 million kilometers were covered in one six-month period this year. 
As the number of cars collecting data grows, these metrics will increase as well.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #555555; background-color: #ffffff;\">Areas where maps have been created include all of Japan, as well as both highway and urban areas in Europe and the United States &ndash; notably Berlin, Munich, Paris, Rome, Los Angeles and New York City.\u003C/span>\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/4d65d9aa0f3d4e1ea5e79359de3875af_1663241121093.png\" alt=\"How much of the Roadbook exists today?\" width=\"1650\" height=\"839\" />\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #555555; background-color: #ffffff;\">The geographic areas being mapped will grow exponentially in the coming months, and the time to collect data to create maps everywhere will drop to days in the coming years as new models from BMW, Nissan, VW and SAIC &ndash; plus other automakers &ndash; arrive with the EyeQ4 and REM technology capable of building the Roadbook.\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong style=\"color: #252525; background-color: #ffffff;\">REM technology used for more than just driving\u003C/strong>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #555555; background-color: #ffffff;\">REM gathers more data than is needed for automated vehicle maps. 
This other data can be used by city planners to make life better for everyone, including pedestrians and bicyclists.\u003C/span>\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/e304192d79ec0d5ab5046e82222ad4c8_1663241199628.png\" alt=\"REM technology used for more than just driving\" width=\"1608\" height=\"1070\" />\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #959595; background-color: #ffffff;\">A vehicle retrofitted with Mobileye 8 Connect detects a construction area on its path, capturing the data as part of trials launched by Mobileye and Ordnance Survey.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #555555; background-color: #ffffff;\">One example is Mobileye&rsquo;s project with Ordnance Survey (OS), the U.K.&rsquo;s national mapping agency. Mobileye technology is installed in the OS fleet, adding a valuable layer of 3D HD data that can be made available to utility companies and city planners. OS&rsquo;s camera-equipped vehicles are gathering information about manholes, drain covers, telco cabinets, signposts, overhead cables and more. This 3D road data is overlaid onto OS&rsquo;s existing 2D local map data, enabling utility companies to manage infrastructure.\u003C/span>\u003C/p>\n\u003Cp>\u003Cstrong style=\"color: #252525; background-color: #ffffff;\">Scale is the key\u003C/strong>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #555555; background-color: #ffffff;\">Mobileye REM technology revolutionizes mapping for the automotive industry. 
REM maps are created automatically, reflect road changes in a timely manner, and provide centimeter-scale accuracy for cars driving on the road.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #555555; background-color: #ffffff;\">Instead of doing this with designated mapping vehicles in every city globally, Mobileye uses&nbsp;\u003C/span>\u003Cem style=\"color: #555555; background-color: #ffffff;\">existing&nbsp;\u003C/em>\u003Cspan style=\"color: #555555; background-color: #ffffff;\">cars equipped with EyeQ4 driver assistance systems. This ability to deploy at mass scale is a vast improvement over traditional methods of mapping and unmatched in the industry.\u003C/span>\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #555555; background-color: #ffffff;\">Mobileye believes mapping has a positive feedback loop; the more you map, the more partners will want to join, allowing the mapping grid to expand exponentially. Scale is everything in mapping: the organization with the largest reach will drive the autonomous vehicle industry to reality.\u003C/span>\u003C/p>",{"id":2410,"type":654,"url":2411,"title":2412,"description":2413,"primary_tag":32,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2413,"image":2414,"img_alt":2415,"content":2416,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":2417,"tags":2322},47,"navigating-the-winding-road-toward-driverless-mobility","Navigating the Winding Road Toward Driverless Mobility","Why your first autonomous ride will likely be in a robotaxi, according to Mobileye CEO Prof. Amnon Shashua.","https://static.mobileye.com/website/us/corporate/post/images/83ffe5d88983175808b8f04458972e7a_1597841143652.jpg","Mobileye CEO Prof. Amnon Shashua at CES","\u003Cp>As we all watch automakers and autonomous tech companies team up in various alliances, it&rsquo;s natural to wonder about their significance and what the future will bring. 
Are we realizing that autonomous driving technology and its acceptance by society could take longer than expected? Is the cost of investing in such technology proving more than any single organization can sustain? Are these alliances driven by a need for regulation that will be accepted by governments and the public or for developing standards on which manufacturers can agree?\u003C/p>\n\u003Cp>The answers are likely a bit of each, which makes it a timely opportunity to review the big picture and share our view of where Intel and Mobileye stand in this landscape.\u003C/p>\n\u003Cp>\u003Cstrong>Three Aspects to Auto-Tech-AI\u003C/strong>\u003C/p>\n\u003Cp>There are three aspects to automotive-technology-artificial intelligence (auto-tech-AI) that are unfolding:\u003C/p>\n\u003Col>\n\u003Cli>Advanced driver-assistance systems (ADAS)\u003C/li>\n\u003Cli>Robotaxi ride hailing as the future of mobility-as-a-service (MaaS)\u003C/li>\n\u003Cli>Series-production passenger car autonomy\u003C/li>\n\u003C/ol>\n\u003Cp>With ADAS technologies, the driver remains in control while the system intervenes when necessary to prevent accidents. This is especially important as distracted driving grows unabated. Known as Levels 0-2 as defined by the&nbsp;\u003Ca href=\"https://www.sae.org/news/press-room/2018/12/sae-international-releases-updated-visual-chart-for-its-%E2%80%9Clevels-of-driving-automation%E2%80%9D-standard-for-self-driving-vehicles\">Society of Automotive Engineers (SAE)\u003C/a>, ADAS promises to reduce the probability of an accident to infinitesimal levels. This critical phase of auto-tech-AI is well underway, with today&rsquo;s penetration around 22%, a number expected to climb sharply to 75% by 2025.\u003Csup>1\u003C/sup>\u003C/p>\n\u003Cp>Meanwhile, the autonomous driving aspect of auto-tech-AI is coming in two phases: robotaxi MaaS and series-production passenger car autonomy. 
What has changed in the mindset of many companies, including much of the auto industry, is the realization that those two phases cannot proceed in parallel.\u003C/p>\n\u003Cp>Series-production passenger car autonomy (SAE Levels 4-5) must wait until the robotaxi industry deploys and matures. This is due to three factors: cost, regulation and geographic scale. Getting all factors optimized simultaneously has proven too difficult to achieve in a single leap, and it is why many in the industry are contemplating the best path to achieve volume production. Many industry leaders are realizing it is possible to stagger the challenges if the deployment of fully autonomous vehicles (AVs) aims first at the robotaxi opportunity.\u003C/p>\n\u003Cp>\u003Cstrong>Cost:\u003C/strong>&nbsp;The cost of a self-driving system (SDS) with its cameras, radars, lidars and high-performance computing is in the tens of thousands of dollars and will remain so for the foreseeable future. This cost level is acceptable for a driverless ride-hailing service, but is simply too expensive for series-production passenger cars. The cost of an SDS should be no more than a few thousand dollars &ndash; an order of magnitude lower than today&rsquo;s costs &ndash; before such capability can find its way to series-production passenger cars.\u003C/p>\n\u003Cp>\u003Cstrong>Regulation:\u003C/strong>&nbsp;Regulation is an area that receives too little attention. Companies deep in the making of SDSs know that it is the stickiest issue. Besides the fact that laws for granting a license to drive are geared toward human drivers, there is the serious issue of how to balance safety and usefulness in a manner that is acceptable to society.\u003C/p>\n\u003Cp>It will be easier to develop laws and regulations governing a fleet of robotaxis than for privately owned vehicles. 
A fleet operator will receive a limited license per use case and per geographic region and will be subject to extensive reporting and back-office remote operation. In contrast, licensing such cars to private citizens will require a complete overhaul of the complex laws and regulations that currently govern vehicles and drivers.\u003C/p>\n\u003Cp>The auto industry is gradually realizing that autonomy must wait until regulation and technology reach equilibrium, and the best place to get this done is through the robotaxi phase.\u003C/p>\n\u003Cp>\u003Cstrong>Scale:\u003C/strong>&nbsp;The third factor, geographic scale, is mostly a challenge of creating high-definition maps with great detail and accuracy, and of keeping those maps continuously updated. Geographic scale is crucial for series-production driverless cars because they must operate &ldquo;everywhere&rdquo; to fulfill the promise of the self-driving revolution. Robotaxis can be confined to geo-fenced areas, which makes it possible to postpone the issue of scale until the maturity of the robotaxi industry.\u003C/p>\n\u003Cp>When the factors of cost, regulation and scale are taken together, it is understandable why series-production passenger cars will not become possible until after the robotaxi phase.\u003C/p>\n\u003Cp>As is increasingly apparent, the auto industry is gravitating towards greater emphasis on its Level 2 offerings. Enhanced ADAS &ndash; with drivers still in charge of the vehicle at all times &ndash; helps achieve many of the expected safety benefits of AVs without bumping into the regulatory, cost and scale challenges.\u003C/p>\n\u003Cp>At the same time, automakers are solving for the regulatory, cost and scale challenges by embracing the emerging robotaxi MaaS industry. 
Once MaaS via robotaxi achieves traction and maturity, automakers will be ready for the next (and most transformative) phase of passenger car autonomy.\u003C/p>\n\u003Cp>\u003Cstrong>The Strategy for Autonomy\u003C/strong>\u003C/p>\n\u003Cp>With all of this in mind, Intel and Mobileye are focused on the most efficient path to reach passenger car autonomy. It requires long-term planning, and for those who can sustain the large investments ahead, the rewards will be great. Our path forward relies on four focus areas:\u003C/p>\n\u003Cul>\n\u003Cli>Continue at the forefront of ADAS development. Beyond the fact that ADAS is the core of life-saving technology, it allows us to validate the technological building blocks of autonomous vehicles via tens of new production programs a year with automakers that submit our technology to the most stringent safety testing. Our ADAS programs &ndash; more than 34 million vehicles on roads today &ndash; provide the financial &ldquo;fuel&rdquo; to sustain autonomous development activity for the long run.\u003C/li>\n\u003Cli>Design an SDS with a camera-centric configuration as its backbone. Building a robust system that can drive solely based on cameras allows us to pinpoint the critical safety segments for which we truly need redundancy from radars and lidars. This effort to avoid unnecessary over-engineering or &ldquo;sensor overload&rdquo; is key to keeping the cost low.\u003C/li>\n\u003Cli>Build on our&nbsp;\u003Ca href=\"https://www.mobileye.com/technology/rem/\">Road Experience Management\u003C/a>&nbsp;(REM)&trade; crowdsourced automatic high-definition map-making to address the scale issue. 
Through existing contracts with automakers, we at Mobileye expect to have more than 25 million cars sending road data by 2022.\u003C/li>\n\u003Cli>Tackle the regulatory issue through our&nbsp;\u003Ca href=\"https://www.mobileye.com/technology/responsibility-sensitive-safety/\">Responsibility-Sensitive Safety\u003C/a>&nbsp;(RSS) formal model of safe driving, which balances the usefulness and agility of the robotic driver with a safety model that complies with societal norms of careful driving.\u003C/li>\n\u003C/ul>\n\u003Cp>At Intel and Mobileye, we are all-in on the global robotaxi opportunity. We are developing technology for the entire robotaxi experience &ndash; from hailing the ride on your phone to powering the vehicle and monitoring the fleet. Our hands-on approach to as much of the process as possible enables us to maximize learnings from the robotaxi phase and be ready with the right solutions for automakers when the time is right for series-production passenger cars.\u003C/p>\n\u003Cp>On the way, we will help our partners deliver on the life-saving safety revolution of ADAS. 
We are convinced this will be a powerful and historic example of the greatest value being realized on the journey.\u003C/p>\n\u003Cp>\u003Csup>1\u003C/sup>&nbsp;\u003Ca href=\"http://www.wolferesearch.com/\">Wolfe Research 2019.\u003C/a>\u003C/p>","2019-07-09T07:00:00.000Z",{"id":2419,"type":654,"url":2420,"title":2421,"description":2422,"primary_tag":658,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2422,"image":2423,"img_alt":2424,"content":2425,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":2426,"tags":2427},36,"digitizing-the-social-contract-for-safer-roads","Digitizing the Social Contract for Safer Roads","Mobileye EVP for Products and Strategy, Erez Dagan, describes how designing our AV safety model for the future enables better safety solutions for human drivers today.","https://static.mobileye.com/website/us/corporate/post/images/cef5f37c764e99a92f5822bdff4bf59b_1597844339982.jpg","Erez Dagan, Executive Vice President for Products and Strategy at Mobileye and a Vice President at Intel Corporation","\u003Cp>When Mobileye set out to design a safety concept for autonomous vehicles (AVs), we first had to examine the concepts and mechanisms that humans use to maintain road safety. We needed a framework fully compliant with the human road safety system so that AVs could share the same roads. We also needed something demonstrably safer, by design, for society to accept them on the roads.\u003C/p>\n\u003Cp>During development of this system, we discovered the same framework that solves this challenge for AVs is also capable of dramatically improving the safety of the road today via advanced driver assistance systems (ADAS). The solution digitizes the mostly informal, hard-to-enforce social contract that governs road safety today. 
How this works was the subject of&nbsp;my keynote address today at SAE World Congress:\u003C/p>\n\u003Cp>\u003Ciframe class=\"ql-video\" src=\"https://www.youtube.com/embed/-tELRcl-Gec\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\">\u003C/iframe>\u003C/p>\n\u003Cp>\u003Cstrong>The Gap in Our Traffic Rules\u003C/strong>\u003C/p>\n\u003Cp>The foundation of the existing road safety system is traffic rules: explicit, unequivocal instructions to the driver, coded through on-road and road-side signs and indicators such as traffic lights, stop signs, lane dividers, etc.\u003C/p>\n\u003Cp>Still, traffic rules are an under-defined system. Even if all agents rigorously follow them, there is still a risk of road accidents. This is because the alternative &ndash; to over-define, with traffic lights at every junction (no roundabouts) and every lane line always solid &ndash; would be costly and degrade traffic flow to impractical levels.\u003C/p>\n\u003Cp>Dashed lane lines and yield signs allow for more efficiency, but also leave points of potential conflict in which road users must negotiate with one another (for example, when changing lanes or at a four-way stop). Had these negotiations been left completely unregulated, the outcomes would be a wild function of the different agents&rsquo; time-utility and risk-averseness.\u003C/p>\n\u003Cp>This is where the social contract comes in.\u003C/p>\n\u003Cp>The social contract governing careful driving is meant to compensate for the safety gap left by the fact that the traffic rules are under-defined. It minimizes the occurrence of time-critical conflicts and regulates negotiations between road users by directing agents to keep a safe distance from the car ahead, to proceed with caution when visibility is compromised, to give up the right of way if others claim it, and so on. 
It is a social contract in the sense that we all uphold this unspoken set of rules because we are all better off if we do.\u003C/p>\n\u003Cp>The social contract supersedes traffic rules and can therefore remedy the consequences of traffic rule violations. For example,&nbsp;the social contract would allow an agent to cross a solid lane line if a vehicle in the opposite lane has crossed it right in front of him (as long as it does not lead to a different social contract violation).\u003C/p>\n\u003Cp>Despite its critical role in the human road safety system, the social contract for cautious driving has shortcomings. It is broad, without specific definitions of what is safe or appropriate, leaving the correct application up to real-time human judgments. Hence, a lapse of judgment is a leading cause of accidents. The social contract is also nearly impossible to enforce, since detecting a violation requires detailed analysis of a traffic situation.\u003C/p>\n\u003Cp>\u003Cstrong>Digitizing the Social Contract for AVs\u003C/strong>\u003C/p>\n\u003Cp>Humans must interpret this implicit, non-metric system as they go. But for AVs &ndash; which are necessarily explicit and quantitative in their decision-making &ndash; we need a more accessible interpretation of this contract. This is the exact premise of Mobileye&rsquo;s&nbsp;\u003Ca href=\"https://static.mobileye.com/website/corporate/media/intel_pdf/rss-fact-sheet2.pdf\">Responsibility-Sensitive Safety (RSS) framework\u003C/a>: a digital interpretation of the social contract that is explicit, concise, (para)metric, efficiently applicable in real time and retrospectively traceable.\u003C/p>\n\u003Cp>RSS has several additional contributions critical to road safety. First, it is a formally proven contract, meaning that it is mathematically proven that if all agents implement RSS, the vehicle will not cause an accident resulting from a decision-making process, assuming all other vehicle-relevant factors function appropriately. 
Second, by being completely explicit and quantitative, it helps post-accident investigators assess different agents&rsquo; compliance with the digital social contract.\u003C/p>\n\u003Cp>\u003Cstrong>From Humans to AVs and Back\u003C/strong>\u003C/p>\n\u003Cp>What started with the AV&rsquo;s duty to comprehend the human road safety system evolved into an undeniable opportunity to dramatically improve it.\u003C/p>\n\u003Cp>With a safety model that is fully measurable, interpretable and enforceable, we wondered: Why wait for AVs to experience the life-saving benefits of this new reality? Let&rsquo;s find a way to allow human drivers to benefit from RSS &ndash; the digital version of the social contract.\u003C/p>\n\u003Cp>To that end, we have designed the&nbsp;\u003Ca href=\"https://arxiv.org/pdf/1901.05022.pdf\">Vision Zero\u003C/a>&nbsp;driver-assistance system, which is purpose-built to meet mass-market deployment and economics. This system uses preventive techniques to help humans avoid emergency responses. It leverages a set of surround cameras and harnesses the RSS framework to provide preventive micro-interventions in accordance with principles of cautious driving. It further benefits from lean, crowd-sourced foresight of upcoming negotiation points and insights into dynamic road-usage patterns and road-network safety vulnerabilities.\u003C/p>\n\u003Cp>This is a very different approach from the current Vision Zero tactics that focus on &ldquo;road diets&rdquo; being adopted by cities all over the world. This movement has chosen to deepen the traffic rules with static, pre-set measures like speed limits, speed bumps and physical barriers. 
These road restrictions are making traffic rules more invasive, and society is paying a high price in efficiency with&nbsp;\u003Ca href=\"https://www.wsj.com/articles/vision-zero-a-road-diet-fad-is-proving-to-be-deadly-11547853472\">questionable outcomes\u003C/a>.\u003C/p>\n\u003Cp>Digitizing the social contract will help make our roads safer, with huge upside potential for traffic flow. With proper regulatory endorsement, these digitized principles of cautious driving may ultimately become a formal, enforceable and binding contract, thereby mitigating the weaknesses in the informal social contract today.\u003C/p>\n\u003Cp>The RSS framework is the digital solution for the social contract that tackles these inherent shortcomings. It also avoids the restrictions of road diets. While RSS was originally envisioned for AVs, we can apply it to ADAS solutions now with immediate impact. This is what I believe is the next revolution in ADAS. It&rsquo;s a very human concept come full circle.\u003C/p>\n\u003Cp>\u003Cem>Erez Dagan is the executive vice president for Products and Strategy at Mobileye and a vice president at Intel Corporation.\u003C/em>\u003C/p>","2019-04-10T07:00:00.000Z","Opinion, AV Safety",{"id":2429,"type":654,"url":2430,"title":2431,"description":2432,"primary_tag":32,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2432,"image":2433,"img_alt":2434,"content":2435,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":2436,"tags":2437},48,"innovation-requires-originality","Innovation Requires Originality","Despite Uncanny Similarities, Nvidia’s SFF Fails to Match Mobileye’s RSS, which is the Leading AV Safety Model","https://static.mobileye.com/website/us/corporate/post/images/c3c3c0a45b6569b51a2c86b493832738_1597841357364.jpg","Inside Mobileye Labs","\u003Cp>As we march towards a driverless future, we at Mobileye have continued to lead the industry with new 
innovations that will not only enable fully autonomous vehicles (AVs), but will also make human-driven cars safer than they have ever been. I am proud that, over the years, we have achieved many industry firsts: camera and radar fusion in 2007, pedestrian-detection warning in 2010, camera-only forward-collision warning in 2011, camera-only automatic cruise control (ACC) in 2013, hands-free assist in 2015, crowd-sourced HD-mapping in 2016, the Responsibility-Sensitive Safety (RSS) safety model in 2017 and, most recently,&nbsp;\u003Ca href=\"https://arxiv.org/pdf/1901.05022.pdf\">a &ldquo;vision zero&rdquo; horizon\u003C/a>&nbsp;through a novel preventive system using RSS.\u003C/p>\n\u003Cp>It is said that imitation is the sincerest form of flattery, and our innovations have not gone unnoticed, with many embracing the same concepts that we pioneered. One industry player in particular habitually follows our lead, and today I would like to set the record straight on its latest imitation.\u003C/p>\n\u003Cp>Let us consider the recent past. After Mobileye announced the world&rsquo;s first crowdsourced mapping technology &ndash; Road Experience Management&trade; (REM) &ndash; in 2016, Nvidia announced a solution the following year that claimed to do the same. When Mobileye coined and introduced L2+ in 2017 as a new category of driving automation that uniquely applied our REM technology to driver assistance systems, again Nvidia followed suit and announced its L2+ offering in 2019.\u003C/p>\n\u003Cp>Our most recent innovation, RSS, was&nbsp;\u003Ca href=\"https://arxiv.org/pdf/1708.06374.pdf\">published in an academic paper in 2017\u003C/a>. 
We openly shared all the technical details and mathematics behind RSS because we believe that the safety of automated vehicles should not be proprietary, and that the industry should collaborate with governments on what it means for an AV to drive safely.\u003C/p>\n\u003Cp>The response to and support of RSS have been tremendous.&nbsp;\u003Ca href=\"https://www.intc.com/news-events/press-releases/detail/145/baidu-to-integrate-mobileyes-responsibility-sensitive\">Baidu\u003C/a>&nbsp;and&nbsp;\u003Ca href=\"https://www.valeo.com/en/valeo-signs-an-agreement-with-mobileye-to-develop-a-new-autonomous-vehicle-safety-standard/\">Valeo\u003C/a>&nbsp;have publicly signed on. China ITS has approved a work group tasked with standardizing RSS for the China market. And we have engaged with governments and standards organizations around the world on RSS. What&rsquo;s more, dozens of research papers have cited RSS, contributing to the public discourse on this important topic.\u003C/p>\n\u003Cp>We&rsquo;ve always said that we believe RSS is an excellent starting point for verifiable safety assurance of automated vehicle decision-making. We&rsquo;ve openly invited the entire industry to contribute their ideas on how to improve RSS. We were pleased when Nvidia heeded this call and reached out to us in 2018 about a collaboration on AV safety. We were puzzled when Nvidia backed out of the proposed partnership.\u003C/p>\n\u003Cp>Imagine our surprise last week when Jensen Huang, CEO of Nvidia, announced a &ldquo;first-of-its-kind&rdquo; safety model for AVs. Curious to see what &ldquo;first-of-its-kind&rdquo; innovation Nvidia had created, we eagerly read the publicly released white paper about Safety Force Field (SFF), only to have the eerie feeling that we were looking in the mirror.\u003C/p>\n\u003Cp>If imitation is the sincerest form of flattery, then Nvidia must think very highly of us. 
Based on the information that has been made available, it is clear Nvidia&rsquo;s leaders have continued their pattern of imitation as their so-called &ldquo;first-of-its-kind&rdquo; safety concept is a close replica of the RSS model we published nearly two years ago. In our opinion, SFF is simply an inferior version of RSS dressed in green and black. To the extent there is any innovation there, it appears to be primarily of the linguistic variety.\u003C/p>\n\u003Cp>To illustrate Nvidia&rsquo;s latest attempt to emulate Mobileye&rsquo;s technology leadership, let me first describe RSS using the technical terms (in Italics) that are defined in&nbsp;\u003Ca href=\"https://arxiv.org/pdf/1708.06374.pdf\">our publicly published materials\u003C/a>.\u003C/p>\n\u003Cp>RSS defines a&nbsp;\u003Cem>safe longitudinal\u003C/em>&nbsp;and a&nbsp;\u003Cem>safe lateral&nbsp;\u003C/em>distance around the vehicle. When those safe distances are compromised, we say that the vehicle is in a&nbsp;\u003Cem>Dangerous Situation\u003C/em>&nbsp;and must perform a&nbsp;\u003Cem>Proper Response\u003C/em>. 
The specific moment when the vehicle must perform the Proper Response is called the&nbsp;\u003Cem>Danger Threshold\u003C/em>.\u003C/p>\n\u003Cp>SFF defines identical concepts with slightly modified terminology.&nbsp;\u003Cem>Safe longitudinal distance\u003C/em>&nbsp;is instead called &ldquo;the SFF in One Dimension;&rdquo;&nbsp;\u003Cem>safe lateral distance\u003C/em>&nbsp;is described as &ldquo;the SFF in Higher Dimensions.&rdquo; Instead of&nbsp;\u003Cem>Proper Response\u003C/em>, SFF uses &ldquo;Safety Procedure.&rdquo; Instead of&nbsp;\u003Cem>Dangerous Situation\u003C/em>, SFF uses &ldquo;Unsafe Situation.&rdquo; And, just to be complete, SFF also recognizes the existence of a&nbsp;\u003Cem>Danger Threshold\u003C/em>, instead calling it a &ldquo;Critical Moment.&rdquo;\u003C/p>\n\u003Cp>RSS covers occluded agents such as vehicles hidden behind buildings and pedestrians hidden behind cars. So does SFF.\u003C/p>\n\u003Cp>RSS defines reasonable expectations on the behavior of other agents, such as the expected maximum braking. So does SFF.\u003C/p>\n\u003Cp>RSS defines Proper Responses for both structured and unstructured roads. So does SFF.\u003C/p>\n\u003Cp>RSS defines a &ldquo;response time&rdquo; &ndash; how long it will take the AV to respond to a change in situation. So does SFF, though it is renamed &ldquo;reaction time.&rdquo;\u003C/p>\n\u003Cp>In fact, every one of the five &ldquo;common sense&rdquo; rules defined by RSS is replicated in SFF:\u003C/p>\n\u003Col>\n\u003Cli>Do not hit someone from behind.\u003C/li>\n\u003Cli>Do not cut in recklessly.\u003C/li>\n\u003Cli>Right-of-way is given, not taken.\u003C/li>\n\u003Cli>Be careful of areas with limited visibility.\u003C/li>\n\u003Cli>If you can avoid an accident without causing another one, you must do it.\u003C/li>\n\u003C/ol>\n\u003Cp>Before we spell out the similarities in more detail below, let&rsquo;s talk about what an AV safety model is really about. 
At the heart of the construction lies the concept of a &ldquo;Proper Response&rdquo; in RSS terminology (&ldquo;Safety Procedure&rdquo; in SFF terminology). In our opinion, to really implement an AV safety model, one must explicitly specify this Proper Response &ndash; which Mobileye has done. We invested a great deal of thought to make RSS comply with three crucial properties:\u003C/p>\n\u003Col>\n\u003Cli>Soundness: The Proper Response of RSS complies with the common sense of human judgment.\u003C/li>\n\u003Cli>Usefulness: RSS enables agile driving policy.\u003C/li>\n\u003Cli>Efficient verifiability: It is possible to comply with RSS rules in a computationally efficient manner.\u003C/li>\n\u003C/ol>\n\u003Cp>Without an explicitly defined Proper Response, no practical implementation is possible. We attempted to evaluate SFF&rsquo;s so-called &ldquo;Safety Procedure,&rdquo; but we were unable to find it in the SFF paper, except for one simple example that, in our opinion, is too simplistic and yields a non-useful driving policy. The SFF paper appears to be non-transparent: the most important ingredient is missing from the model. We encourage Nvidia to be more transparent about the details of its Safety Procedure so that the industry is better able to evaluate its safety and efficacy. And although we enjoy flattery as much as anyone else, we hope that SFF&rsquo;s Safety Procedure is based on independent innovation as opposed to imitation of Mobileye&rsquo;s technology leadership. 
But don&rsquo;t take my word for it; let&rsquo;s let the text speak for itself.\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/website/corporate/rss/img/SFFvsRSS-chart-1.jpg\" /> \u003Cimg src=\"https://static.mobileye.com/website/corporate/rss/img/SFFvsRSS-chart-2.jpg\" /> \u003Cimg src=\"https://static.mobileye.com/website/corporate/rss/img/SFFvsRSS-chart-3.jpg\" /> \u003Cimg src=\"https://static.mobileye.com/website/corporate/rss/img/SFFvsRSS-chart-4.jpg\" /> \u003Cimg src=\"https://static.mobileye.com/website/corporate/rss/img/SFFvsRSS-chart-5.jpg\" /> \u003Cimg src=\"https://static.mobileye.com/website/corporate/rss/img/SFFvsRSS-chart-6.jpg\" /> \u003Cimg src=\"https://static.mobileye.com/website/corporate/rss/img/SFFvsRSS-chart-7.jpg\" /> \u003Cimg src=\"https://static.mobileye.com/website/corporate/rss/img/SFFvsRSS-chart-8.jpg\" />\u003C/p>\n\u003Cp>This analysis clearly shows that SFF is a close replica of RSS. It is indisputable that RSS was, in fact, the &ldquo;first-of-its-kind&rdquo; safety model for AVs when we announced it in 2017, and we believe it is still the leading solution today.\u003C/p>\n\u003Cp>At Mobileye, we believe in technology innovation, not linguistic innovation. We have openly invited, and are enjoying, active collaboration with industry and government partners around the globe. It is unfortunate that rather than collaborate with us, Nvidia felt it necessary to follow us yet again, creating confusion where there could have been cohesion. 
Mobileye has invested enormous resources to develop RSS, and Mobileye has obtained intellectual property rights to protect these investments.\u003C/p>","2019-03-25T07:00:00.000Z","Opinion, AV Safety, From our CEO",{"id":2439,"type":24,"url":2440,"title":2441,"description":2441,"primary_tag":190,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2441,"image":2442,"img_alt":2441,"content":2443,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":2444,"tags":928},166,"autonomous-driving-hands-on-the-wheel-or-no-wheel-at-all","Autonomous Driving – Hands on the Wheel or No Wheel at All","https://static.mobileye.com/website/us/corporate/images/2b6fc1059defc53ba8961052c971e42f_1666086262848.png","\u003Cp>\u003Cem>This news content was originally published on the Intel Corporation Newsroom.\u003C/em>\u003C/p>\n\u003Cp>\u003Cstrong>Intel Explainer:\u003C/strong>&nbsp;6 Levels of Autonomous Driving\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: #ffffff; color: #555555;\">Vehicles on the road today are getting smarter, safer and more capable. But even the newest vehicles vary widely in their advanced driver assistance systems (ADAS), which aim to enhance safety and make driving more comfortable. 
Add to that the global race to fully self-driving vehicles, which will take the driver out of the equation completely.\u003C/span>\u003C/p>\n\u003Cp>&nbsp;\u003C/p>\n\u003Cp>\u003Cspan style=\"background-color: #ffffff; color: #555555;\">Vehicles can be categorized according to the ADAS features they offer, and the Society of Automotive Engineers defines six levels of automotive automation, explained here.\u003C/span>\u003C/p>\n\u003Cp>\u003Cimg src=\"https://static.mobileye.com/dev/website/us/corporate/images/be013ffc23e3d75babbda0ed4a5019ea_1663241548611.jpg\" alt=\"The 6 levels of autonomous driving\" width=\"1173\" height=\"1070\" />\u003C/p>\n\u003Cp>\u003Cstrong style=\"color: #252525;\">Level 0: No Automation&nbsp;\u003C/strong>&mdash; Zero autonomy; the driver performs all the driving, but the vehicle can aid with blind spot detection, forward collision warnings and lane departure warnings.\u003C/p>\n\u003Cp>\u003Cstrong style=\"color: #252525;\">Level 1: Driver Assistance&nbsp;\u003C/strong>&mdash; The vehicle may have some active driving assist features, but the driver is still in charge. Such assist features available in today&rsquo;s vehicles include adaptive cruise control, automatic emergency braking and lane keeping.\u003C/p>\n\u003Cp>\u003Cstrong style=\"color: #252525;\">Level 2: Partial Automation&nbsp;\u003C/strong>&mdash; The driver must still be alert and monitor the environment at all times, but driving assist features that control acceleration, braking and steering may work in unison so the driver does not need to provide any input in certain situations. 
Such automated functions available today include self-parking and traffic jam assist (stop-and-go traffic driving).\u003C/p>\n\u003Cp>\u003Cstrong style=\"color: #252525;\">Level 3: Conditional Automation\u003C/strong>&nbsp;&mdash; The vehicle can itself perform all aspects of the driving task under some circumstances, but the human driver must be ready to take control at all times, within a specified notice period. In all other circumstances, the human performs the driving.\u003C/p>\n\u003Cp>\u003Cstrong style=\"color: #252525;\">Level 4: High Automation&nbsp;\u003C/strong>&mdash; This is a self-driving vehicle. But it still has a driver&rsquo;s seat and all the regular controls. Though the vehicle can drive and &ldquo;see&rdquo; all on its own, circumstances such as geographic area, road conditions or local laws might require the person in the driver&rsquo;s seat to take over.\u003C/p>\n\u003Cp>\u003Cstrong style=\"color: #252525;\">Level 5: Full Automation\u003C/strong>&nbsp;&mdash; The vehicle is capable of performing all driving functions under all environmental conditions and can operate without humans inside. The human occupants are passengers and need never be involved in driving. 
A steering wheel is optional in this vehicle.\u003C/p>\n\u003Cp>\u003Cspan style=\"color: #8c8c8c;\">Sources: Society of Automotive Engineers (SAE); National Highway Traffic Safety Administration (NHTSA)\u003C/span>\u003C/p>","2018-04-10T07:00:00.000Z",{"id":2446,"type":654,"url":2447,"title":2448,"description":2449,"primary_tag":32,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2449,"image":2450,"img_alt":2451,"content":2452,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":2453,"tags":2454},49,"when-it-comes-to-av-safety-experience-counts","When It Comes to AV Safety, Experience Counts","Now is the time for substantive conversations about safety for autonomous vehicles, says Mobileye CEO Prof. Amnon Shashua.","https://static.mobileye.com/website/us/corporate/post/images/2696be82504217843064c23b6fa24277_1597841473520.jpg","Mobileye CEO Prof. Amnon Shashua at the podium","\u003Cp>Society expects autonomous vehicles&nbsp;to be held to a higher standard than human drivers.&nbsp;Following the tragic death of Elaine Herzberg, who was hit last week by a self-driving Uber car operating in autonomous mode in Arizona, it feels like the right moment to make a few observations about the meaning of safety with respect to sensing and decision-making.\u003C/p>\n\u003Cp>First, the challenge of interpreting sensor information. The video released by the police seems to demonstrate that even the most basic building block of an autonomous vehicle system, the ability to detect and classify objects, is a challenging task. 
Yet this capability is at the core of today&rsquo;s advanced driver assistance systems (ADAS), which include features such as automatic emergency braking (AEB) and lane keeping support.&nbsp;It is the high-accuracy sensing systems inside ADAS that are saving lives today, proven over billions of miles driven.&nbsp;It is this same technology that is required, before tackling even tougher challenges, as a foundational element of fully autonomous vehicles of the future.\u003C/p>\n\u003Cp>To demonstrate the power and sophistication of today&rsquo;s ADAS technology, we ran our software on a video feed coming from a TV monitor running the police video of the incident. Despite the suboptimal conditions, where much of the high dynamic range data that would be present in the actual scene was likely lost, clear detection was achieved approximately one second before impact. The images below show three snapshots with bounding box detections on the bicycle and Ms. Herzberg. The detections come from two separate sources: pattern recognition, which generates the bounding boxes, and a &ldquo;free-space&rdquo; detection module, which generates the horizontal graph in which a red section indicates that a &ldquo;road user&rdquo; is present above the line. A third module separates objects from the roadway using structure from motion &ndash; in technical terms: &ldquo;plane + parallax.&rdquo; This validates the 3D presence of the detected object, which had low confidence, as indicated by &ldquo;fcvValid: Low&rdquo; in the upper left side of the screen. 
This low confidence occurred because of the missing information normally available in a production vehicle and the low-quality imaging setup that resulted from filming a video of a dash-cam video, which had been subjected to unknown downsampling.\u003C/p>\n\u003Cp>The software used for this experiment is the same software included in today&rsquo;s ADAS-equipped vehicles, which have been proven over billions of miles in the hands of consumers.\u003C/p>\n\u003Cp>Recent developments in artificial intelligence, like deep neural networks, have led many to believe that it is now easy to develop a highly accurate object detection system and that the decade-plus experience of incumbent computer vision experts should be discounted. This dynamic has led to many new entrants in the field.&nbsp;While these techniques are helpful, the legacy of identifying and closing hundreds of corner cases, annotating data sets of tens of millions of miles, and going through challenging preproduction validation tests on dozens of production ADAS programs cannot be skipped. Experience counts, particularly in safety-critical areas.\u003C/p>\n\u003Cp>The second observation is about transparency. Everyone says that &ldquo;safety is our most important consideration,&rdquo; but we believe that to gain public trust, we must be more transparent about the meaning of this statement. As I stated in October, when Mobileye released the&nbsp;\u003Ca href=\"https://www.mobileye.com/technology/responsibility-sensitive-safety/\">formal model of Responsibility-Sensitive Safety (RSS)\u003C/a>, decision-making must comply with the common sense of human judgment. We laid out a mathematical formalism of common sense notions such as &ldquo;dangerous situation&rdquo; and &ldquo;proper response&rdquo; and built a system to mathematically guarantee compliance with these definitions.\u003C/p>\n\u003Cp>The third observation is about redundancy. 
True redundancy of the perception system must rely on independent sources of information: camera, radar and LIDAR. Fusing them together is good for driving comfort&nbsp;but bad for safety. At Mobileye, to demonstrate that we obtain true redundancy, we build a separate end-to-end camera-only system and a separate LIDAR and radar-only system.\u003C/p>\n\u003Cp>More incidents like the one last week could do further harm to already fragile consumer trust and spur reactive regulation that could stifle this important work. As I stated during the introduction of RSS, I firmly believe the time to have a meaningful discussion on a safety validation framework for fully autonomous vehicles is now. We invite automakers, technology companies in the field, regulators and other interested parties to convene so we can solve these important issues together.\u003C/p>","2018-03-26T07:00:00.000Z","Autonomous Driving, AV Safety, Opinion, From our CEO",{"id":2456,"type":654,"url":2457,"title":2458,"description":2459,"primary_tag":658,"author_name":16,"is_hidden":11,"lang":12,"meta_description":2459,"image":2460,"img_alt":2461,"content":2462,"download_doc":16,"download_title":16,"download_btn":16,"webinar_video":16,"thumbnail":16,"is_gated":16,"featured":11,"publish_date":2463,"tags":2437},35,"paving-the-way-toward-safer-roads-for-all","Paving the Way Toward Safer Roads for All","The Responsibility-Sensitive Safety Model Illustrates How Standards for Accident Fault and Vehicle Safety are Required to Advance the Autonomous Vehicle Industry, writes Mobileye CEO Prof. Amnon Shashua. ","https://static.mobileye.com/website/us/corporate/post/images/094a2673fa5ba0b11ba0a8aeab12e595_1597833269208.jpg","Mobileye's autonomous vehicle on the road in Jerusalem","\u003Cp>We architected the Responsibility-Sensitive Safety (RSS) model as a catalyst to drive cross-industry discussion among industry groups, car manufacturers and regulatory bodies. 
Since its publication, my co-authors and I have received many positive affirmations, but the model has also raised some very important questions, which has been our goal since the beginning of this project.\u003C/p>\n\u003Cp>One critical line of questioning centers on the idea that human judgment involves legal, safety and&nbsp;cultural considerations, while the RSS model seems to be focused only on the legal aspect. The notion that RSS is designed to make manufacturers immune to liability is a misunderstanding that demands further explanation.\u003C/p>\n\u003Cp>\u003Cstrong>Math Not Morals: How RSS Formalizes Driving Dilemmas\u003C/strong>\u003C/p>\n\u003Cp>Let&rsquo;s start out by reaffirming what RSS is. RSS formalizes the common sense of human judgment under a comprehensive set of road situations.&nbsp;It sets clear definitions for what it means to drive safely versus to drive recklessly.&nbsp;With human drivers, the interpretation of responsibility for collisions and other incidents is fluid. Driver error or, quite simply, blame is applied based on imperfect information and other factors interpreted after the fact. With machines, the definitions can be formal and mathematical. Machines have highly accurate information about the environment around them, always know their reaction time and braking power, and are never distracted or impaired. With machines, we do not need to interpret their actions after the fact. 
Instead, we can program them to follow a determined pattern &ndash; as long as we have the means to formalize that pattern.\u003C/p>\n\u003Cp>At its core, the RSS model is designed to formalize and contextualize today&rsquo;s driving dilemmas, like notions of safe distance and safe gaps when merging and cutting in, which agent cuts in and thus assumes responsibility to maintain a safe distance, how the right of way enters into the model, how to define safe driving with limited sensing (for example, when road users are hidden behind buildings or parked cars and might suddenly appear), and more. Clearly, human judgment includes avoiding accidents and not merely avoiding blame. RSS attempts to build a formal foundation that sets all aspects of human judgment in the context of driving with the goal of setting a formal &ldquo;seal of safety&rdquo; for autonomous vehicles.\u003C/p>\n\u003Cp>\u003Cstrong>RSS = Fewer Accidents on Roads\u003C/strong>\u003C/p>\n\u003Cp>Let&rsquo;s follow by stating what RSS is not. RSS does not allow the autonomous vehicle (AV) to make judgments that would cause a collision &ndash; even if the AV has the right of way. On the other hand, RSS&nbsp;\u003Cem>does\u003C/em>&nbsp;allow an AV to perform an illegal maneuver, say crossing a solid line to escape a collision, or proceeding around a double-parked vehicle to avoid danger. What it does not allow is for an AV to take non-cautious actions, which would put it at risk of causing a separate collision.\u003C/p>\n\u003Cp>The RSS model does not allow an AV to mitigate one accident with another presumably less severe one. In other words, in desiring to escape a collision caused by a human driver, the RSS model allows the AV to take any action (including violating traffic laws) if those actions do not cause a separate accident. 
This constraint is appropriate because the judgment of accident severity is subjective and might miss hidden, critical variables, such as a baby in the back seat of the seemingly &ldquo;less severe&rdquo; accident.\u003C/p>\n\u003Cp>Nevertheless, if society desires to allow mitigating one collision with another, under certain conditions it can be added to the RSS formula under a notion of &ldquo;blame transitivity,&rdquo; where responsibility for the complete set of incidents would be assigned to the agent that started the chain of events. We chose not to include this possibility in our model, but it can be done.\u003C/p>\n\u003Cp>The common-sense notion that the &ldquo;right of way is given, not taken&rdquo; is also part of the formalism of RSS.&nbsp;Consider the example of a car crossing an intersection: A green light provides a legal right of way for the vehicle crossing, but there is another vehicle blocking the junction (say the other vehicle ran a red light). In this case, RSS does not give the AV the right to hit the vehicle blocking its way. The AV would be at fault under RSS.\u003C/p>\n\u003Cp>\u003Cstrong>A Software State of Mind\u003C/strong>\u003C/p>\n\u003Cp>Logically, to be in a position to criticize the RSS model, you would need to find an accident scenario where the determination of blame through RSS disagrees with &ldquo;common sense&rdquo; human judgment. We haven&rsquo;t found such a scenario, even after going through the National Highway Traffic Safety Administration&rsquo;s crash typology study of 6 million crashes grouped into 37 scenarios covering 99.4 percent of all those crashes. 
They all&nbsp;fit into the RSS model, and we will publish the analysis as part of the continued open sourcing of RSS.\u003C/p>\n\u003Cp>Over time, as we collaborate with industry peers, standards bodies and regulators, we will surely discover more scenarios, match them to RSS and, if necessary, update the model &ndash; just like human judgment sometimes needs an update.\u003C/p>\n\u003Cp>Bottom line, we must convince the industry that software can always make the safest decisions. At its core, for a model to be useful, one must show that it is possible to program software that never causes accidents but at the same time maintains a normal flow of traffic.\u003C/p>\n\u003Cp>This is hardly trivial. One needs to prove that the model will not suffer from&nbsp;the &ldquo;butterfly effect,&rdquo; where a seemingly innocent action in the present will unfold through a chain of actions into a&nbsp;catastrophic event. For example, imagine a scenario where an aggressive merge causes the car behind to brake and swerve into another lane and cause a collision.\u003C/p>\n\u003Cp>\u003Cstrong>The Devil is in the Details\u003C/strong>\u003C/p>\n\u003Cp>We published the RSS model to evoke debate, discussion and exploration &ndash; all vital pathways to the right solution. The sad reality is there are no alternatives to the RSS model right now. So, in the&nbsp;absence of a clear model, what is the industry to do? Simply resort to a&nbsp;&ldquo;best practice&rdquo; position?&nbsp;That would devolve to a &ldquo;my AV has more sensors than yours&rdquo; or a &ldquo;my testing program included more miles than yours&rdquo; argument. 
These quantitatively driven statements may protect AV developers in a world with no clear model to evaluate safety, but they do not guarantee safety.&nbsp;Worse, this approach will lead to AVs that are over-engineered and too expensive to deliver flexible, affordable, ultra-safe, on-demand transportation to the general population and underserved communities &ndash; the elderly and the disabled, for example &ndash; who will benefit the most.\u003C/p>\n\u003Cp>It is not enough to adopt the RSS model into our own AV technology alone. For true safety assurance, we will require transparency and society&rsquo;s acceptance of how human judgment will be incorporated into AV decision-making. Our belief is that safety, in terms of collisions caused by a properly engineered AV, can be improved 1,000-fold compared to human-driven vehicles.\u003C/p>\n\u003Cp>To prepare the landscape for successful deployment of AVs, many issues need clarification &ndash; issues that go far beyond technological innovation or comparisons of one company&rsquo;s products versus another&rsquo;s. We are putting a stake in the ground in an attempt to drive the industry to agree there is a definitive need to formalize the rules of judgment, responsibility and fault to realize the massive benefits to society.&nbsp;Far from a system designed to avoid liability, RSS is an innovative model intended to enable AVs to perform to the highest safety standards.\u003C/p>","2017-11-28T08:00:00.000Z",1776359635669]