
"The Role of the Cloud in Autonomous Vehicle Vision Processing: A View from the Edge," a Presentation from NXP Semiconductors


Ali Osman Ors, Director of Automotive Microcontrollers and Processors at NXP Semiconductors, presents the "Role of the Cloud in Autonomous Vehicle Vision Processing: A View from the Edge" tutorial at the May 2018 Embedded Vision Summit.

Regardless of the processing topology—distributed, centralized or hybrid—sensor processing in automotive is an edge compute problem. However, as connectivity technologies improve, vehicles are transforming into connection hubs, which creates opportunities to augment the edge compute capabilities of today's vehicles. At the same time, the amount of data created at the sensors is growing rapidly, creating a bandwidth shortage. Drawing on design experience from deployed applications and emerging technologies, Ors explores the role and impact cloud-based services will have on applications that remain predominantly reliant on processing at the edge.
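To see why raw sensor data cannot simply be streamed to the cloud, a back-of-envelope estimate helps. The figures below (camera resolution, bit depth, frame rate, camera count, and uplink bandwidth) are illustrative assumptions for this sketch, not values from the presentation:

```python
# Illustrative estimate: raw automotive camera data rate vs. a cellular uplink.
# All parameter values are assumptions chosen for this sketch.

def camera_rate_mbps(width, height, bits_per_pixel, fps):
    """Raw (uncompressed) sensor output in megabits per second."""
    return width * height * bits_per_pixel * fps / 1e6

per_camera = camera_rate_mbps(1920, 1080, 12, 30)  # one 2 MP camera, ~746 Mbit/s
vehicle_total = 8 * per_camera                     # assume 8 cameras per vehicle

uplink_mbps = 50  # assumed (optimistic) cellular uplink
shortfall = vehicle_total / uplink_mbps

print(f"Per camera:    {per_camera:,.0f} Mbit/s")
print(f"Vehicle total: {vehicle_total:,.0f} Mbit/s")
print(f"Roughly {shortfall:,.0f}x the assumed uplink capacity")
```

Even with aggressive compression, the gap of two orders of magnitude illustrates why perception must run at the edge, with the cloud serving complementary roles such as map updates, model distribution, and fleet learning.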