Achieving 15 TOPS/s Equivalent Performance in Less Than 10 W Using Neural Network Pruning on Xilinx Zynq

Wednesday, May 23, 2:50 PM - 3:20 PM
Summit Track: Enabling Technologies
Location: Exhibit Hall A-2

Machine learning algorithms, such as convolutional neural networks (CNNs), are fast becoming a critical part of image perception in embedded vision applications in the automotive, drone, surveillance and industrial vision markets. Applications include multi-object detection, semantic segmentation and image classification. However, when scaling these networks to modern image resolutions such as HD and 4K, the computational requirements for real-time systems can easily exceed 10 TOPS/s and consume hundreds of watts of power, which is simply unacceptable for most edge applications. In this talk, we will describe a network/weight pruning methodology that achieves a performance gain of over 10 times on Zynq UltraScale+ SoCs with very small accuracy loss. Network inference running on Zynq UltraScale+ achieves performance equivalent to 20 TOPS/s on the original SSD network while consuming less than 10 W.
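
To give a flavor of the kind of weight pruning the abstract refers to, below is a minimal NumPy sketch of one common variant, magnitude-based pruning, which zeroes out the smallest-magnitude weights to reach a target sparsity. This is an illustrative assumption, not the methodology presented in the talk; the function name, sparsity target, and tensor shapes are made up for the example, and a real flow would typically also retrain the pruned network to recover accuracy.

# Illustrative sketch only: magnitude-based weight pruning.
# Not the speaker's methodology; parameters are assumptions for the example.
import numpy as np

def prune_weights(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Zero out the smallest-magnitude weights so roughly `sparsity` of them are removed."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return weights.copy()
    threshold = np.partition(flat, k - 1)[k - 1]  # k-th smallest magnitude
    mask = np.abs(weights) > threshold             # keep only weights above the threshold
    return weights * mask

# Example: prune a hypothetical bank of 3x3 convolution kernels to ~80% sparsity.
rng = np.random.default_rng(0)
conv_kernels = rng.standard_normal((64, 32, 3, 3))
pruned = prune_weights(conv_kernels, sparsity=0.8)
print("sparsity:", 1.0 - np.count_nonzero(pruned) / pruned.size)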

Speaker(s):

Nick Ni

Director of Product Marketing, AI and Edge Computing, Xilinx

Nick Ni is the Director of Product Marketing, AI and Edge Computing. His team's responsibilities include product planning, business development and outbound marketing for Xilinx's artificial intelligence products and software-defined development environment for embedded systems.

Ni joined Xilinx in 2014. Prior to Xilinx, he held multiple R&D and applications roles focused on embedded system design and high-level synthesis at ATI, AMD, Qualcomm, and Altera.

Ni earned a master's degree in Computer Engineering from the University of Toronto and has more than 10 patents and publications to his name.

See you at the Summit! May 20-23 in Santa Clara, California!
Register today and reserve your hotel room!