
"Choosing the Optimum Mix of Sensors for Driver Assistance and Autonomous Vehicles," a Presentation from NXP Semiconductors

Ali Osman Ors, Director of Automotive Microcontrollers and Processors at NXP Semiconductors, presents the "Choosing the Optimum Mix of Sensors for Driver Assistance and Autonomous Vehicles" tutorial at the May 2017 Embedded Vision Summit.

A diverse set of sensor technologies is available, and more are emerging, to enable vehicle autonomy and driver assistance. These technologies often have overlapping capabilities, but each has its own strengths and limitations. Drawing on design experience from deployed real-world applications, Ors explores trade-offs among sensor technologies such as radar, lidar, and 2D and 3D camera sensors, which are making inroads into vehicles for safety applications. He also examines how these sensor technologies are progressing to meet the needs of autonomous vehicles.