Intel Demonstration of a Multi-Camera Neural Network Acceleration Platform

Paulo Borges, Autonomous Driving Strategic Business Manager in the Programmable Solutions Group at Intel, demonstrates the company's latest embedded vision technologies and products at the January 2017 Consumer Electronics Show. Specifically, Borges demonstrates a hardware and software platform that showcases what current technology makes possible in autonomous driving, focusing on multi-camera visual understanding of the environment. The demonstration highlights the capabilities of state-of-the-art deep learning technology, combining Intel automotive hardware (a Xeon CPU plus dual Arria 10 FPGAs), the Wind River operating system, and the recognition engine from partner (and fellow Embedded Vision Alliance member) AImotive. With the Intel FPGA-based neural network acceleration IP core running on this Intel architecture (Intel CPU, Wind River Linux, and an AImotive application), the Intel Programmable Solutions Group enables customers to reuse the IP to productize deep learning solutions on Intel hardware.
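
The software stack described above follows a common pattern for heterogeneous vision systems: the application captures frames from several cameras on the CPU, offloads convolutional network inference to the FPGA-resident acceleration IP, and fuses the per-camera results back on the host. The C++ sketch below illustrates only that control flow; the FpgaCnnAccelerator class, its infer method, and the four-camera loop are hypothetical stand-ins for illustration, not the actual Intel IP interface or AImotive engine.

```cpp
// Hypothetical sketch of a multi-camera inference pipeline: frames from each
// camera are preprocessed on the CPU and dispatched to an FPGA-resident CNN
// accelerator, with per-camera results fused on the host. The accelerator
// class is a placeholder, not an Intel or AImotive API.
#include <cstdint>
#include <iostream>
#include <string>
#include <vector>

struct Frame {
    int camera_id;
    std::vector<uint8_t> pixels;   // e.g. RGB, resized to the network input size
};

struct Detection {
    int camera_id;
    std::string label;
    float confidence;
};

// Stand-in for an FPGA-based CNN acceleration IP exposed to the host application.
class FpgaCnnAccelerator {
public:
    // A real interface would enqueue the frame to the accelerator and return
    // asynchronously; here it simply produces a placeholder result.
    Detection infer(const Frame& frame) const {
        return Detection{frame.camera_id, "vehicle", 0.9f};
    }
};

int main() {
    const int num_cameras = 4;
    FpgaCnnAccelerator accelerator;   // one accelerator shared across cameras

    // One pipeline iteration: grab a frame per camera, run inference on the
    // accelerator, then fuse the per-camera detections on the CPU.
    std::vector<Detection> detections;
    for (int cam = 0; cam < num_cameras; ++cam) {
        Frame frame{cam, std::vector<uint8_t>(224 * 224 * 3, 0)};  // dummy frame
        detections.push_back(accelerator.infer(frame));
    }

    for (const auto& d : detections) {
        std::cout << "camera " << d.camera_id << ": " << d.label
                  << " (" << d.confidence << ")\n";
    }
    return 0;
}
```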
