Mobileye® has developed a family of chips called Mobileye EyeQ® to fulfill the need for low-power, inexpensive computing platforms that can support computationally intensive vision applications while meeting automotive cabin qualification requirements.
The first-generation Mobileye EyeQ® vision system-on-a-chip has been selected for a number of automotive OEM production platforms and is in commercial use in Mobileye’s AWS consumer products, as well as in advanced development programs of several OEMs and Tier 1 suppliers. The second-generation Mobileye EyeQ2® debuted in the third quarter of 2007.
The Mobileye EyeQ® family consists of highly integrated chips that support intensive processing using specially designed processing modules, general-purpose CPUs for control, and I/O capabilities for video input and the car interface (including high-speed CAN). The Mobileye EyeQ® architecture and computing engines were designed after careful analysis of the computational requirements of computer vision and of the bottom-up and top-down approaches used. The result is efficient processing of images to extract interesting regions and features, combined with support for powerful classification and tracking engines.
The Mobileye EyeQ® processor design includes multiple processing units, each with its own local memory and instruction stream, allowing for efficient parallel applications.
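The Mobileye EyeQ® instruction set and software interfaces are proprietary, but the multi-unit, local-memory style of parallelism described above can be sketched generically. The following Python is a hypothetical illustration only (the names `tile_feature` and `process_frame` are illustrative, not Mobileye's API): each "processing unit" computes a simple feature over its own tile of the frame using only that tile's data, and the per-tile results are merged afterwards.

```python
# Hypothetical sketch of EyeQ-style parallelism: each processing unit
# works on its own tile of the frame (its "local memory"), and results
# are combined at the end. The feature computed here is illustrative.
from concurrent.futures import ThreadPoolExecutor

def tile_feature(tile):
    """Per-unit work: horizontal-gradient energy over one tile,
    touching only the tile's own rows."""
    energy = 0
    for row in tile:
        for a, b in zip(row, row[1:]):
            energy += abs(b - a)
    return energy

def process_frame(frame, n_units=4):
    """Split the frame into horizontal strips, one per 'processing
    unit', run the per-tile feature in parallel, and sum the results."""
    rows_per_tile = max(1, len(frame) // n_units)
    tiles = [frame[i:i + rows_per_tile]
             for i in range(0, len(frame), rows_per_tile)]
    with ThreadPoolExecutor(max_workers=n_units) as pool:
        return sum(pool.map(tile_feature, tiles))

if __name__ == "__main__":
    # Tiny synthetic 8x8 "frame" with a vertical edge down the middle.
    frame = [[0] * 4 + [255] * 4 for _ in range(8)]
    print(process_frame(frame))  # total gradient energy: 255 per row * 8 rows = 2040
```

On real hardware each unit would be a separate core with physically local memory; a thread pool merely mimics the division of labor on a general-purpose machine.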
The Mobileye EyeQ® supports vehicle detection (forward collision warning, adaptive headlight control) and lane detection (lane departure warning, headway monitoring and warning). Alternatively, it can support pedestrian recognition with visible-light or infrared (IR) images.
The second-generation Mobileye EyeQ2® is more powerful by a factor of 6. It supports all the above algorithms and more on a single platform, accepts video input from two high-resolution image sensors, and provides video output with graphic overlay. It has been in serial production since 2010.
The third-generation Mobileye EyeQ3® is in turn more powerful than the Mobileye EyeQ2® by a factor of 6 and will allow processing of multiple high-resolution sensors in parallel, resulting in extended range and enhanced features for the end customer.