"Machine Learning Inference In Under 5 mW with a Binarized Neural Network on an FPGA," a Presentation from Lattice Semiconductor

Abdullah Raouf, Senior Marketing Manager at Lattice Semiconductor, presents the "Machine Learning Inference In Under 5 mW with a Binarized Neural Network on an FPGA" tutorial at the May 2018 Embedded Vision Summit.

The demand for always-on intelligence is rapidly increasing across a range of applications. Cameras now continuously watch for anomalies on manufacturing lines, monitor vehicle speeds on roads, or look for a specific gesture or person. Because these cameras must be always on, security and power consumption become major concerns. Users don't want captured images sent to the cloud, where they could be exposed to attackers, so object and anomaly detection must happen locally rather than in the cloud. This raises local computational requirements, which in turn can increase power consumption, a critical issue for battery-powered products.

In this presentation, Raouf provides an overview of how FPGAs such as Lattice's iCE40 UltraPlus can implement multiple binarized neural networks in a single 2 mm x 2 mm package, delivering always-on intelligence without relying on cloud computation.
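To see why binarized neural networks suit tiny, low-power FPGAs, consider the core arithmetic. With weights and activations constrained to {-1, +1} and encoded as single bits, a dot product collapses to an XNOR followed by a popcount, replacing power-hungry multiply-accumulate hardware with simple logic gates. The sketch below is purely illustrative (it is not Lattice's implementation and all names are hypothetical); it shows the XNOR-popcount trick in plain Python:

```python
# Illustrative BNN arithmetic sketch (hypothetical, not Lattice's design).
# Encoding assumption: bit 1 represents +1, bit 0 represents -1.

def binarize(values):
    """Map real values to bits: 1 for values >= 0 (+1), 0 otherwise (-1)."""
    return [1 if v >= 0 else 0 for v in values]

def bnn_dot(a_bits, w_bits):
    """Dot product of two {-1, +1} vectors via XNOR + popcount.

    XNOR of two bits is 1 exactly when the underlying signs agree
    ((+1)*(+1) or (-1)*(-1)). If p of the n positions agree, the
    dot product is p - (n - p) = 2p - n.
    """
    n = len(a_bits)
    matches = sum(1 for a, w in zip(a_bits, w_bits) if not (a ^ w))
    return 2 * matches - n

# Example: hypothetical activations and weights; only their signs matter.
acts = [0.3, -1.2, 0.7, -0.1]
wts = [-0.5, -0.8, 0.9, 0.2]
result = bnn_dot(binarize(acts), binarize(wts))
print(result)  # → 0, matching sum(sign(a) * sign(w)) for these vectors
```

In FPGA fabric, the XNOR and popcount map directly onto lookup tables and adder trees, which is one reason a BNN can fit an always-on inference budget of a few milliwatts.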