"An Ultra-low-power Multi-core Engine for Inference on Encrypted DNNs," a Presentation from Xperi

Petronel Bigioi, CTO for Imaging at Xperi, presents the "An Ultra-low-power Multi-core Engine for Inference on Encrypted DNNs" tutorial at the May 2019 Embedded Vision Summit.

Neural network encryption is a useful method for securing a company’s IP. This presentation focuses on the design details of an ultra-low-power, scalable neural network core capable of performing inference on encrypted neural networks. Decryption of the neural network weights and topology takes place inside the core, so decrypted networks never need to be present in main memory. Bigioi also discusses solutions for clustering multiple neural network cores together to meet the neural inference processing requirements of a target SoC platform.
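The decrypt-inside-the-core idea can be illustrated with a minimal sketch. This is not Xperi's design: the cipher here is a hypothetical SHA-256-based keystream standing in for whatever hardware cipher the real core uses, and the single dot-product "layer" is a toy stand-in for the actual inference engine. The point it demonstrates is purely architectural: main memory holds only the encrypted weight blob, and plaintext weights exist only transiently inside the inference routine.

```python
import hashlib
import struct

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Hypothetical stand-in for the core's hardware cipher:
    # a counter-mode keystream built from SHA-256.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + struct.pack(">Q", counter)).digest()
        counter += 1
    return out[:length]

def encrypt_weights(weights, key, nonce):
    # Done once, offline: the SoC only ever ships/stores this blob.
    raw = struct.pack(f">{len(weights)}f", *weights)
    ks = keystream(key, nonce, len(raw))
    return bytes(a ^ b for a, b in zip(raw, ks))

def core_inference(encrypted_blob, inputs, key, nonce):
    # Decryption happens only here, "inside the core"; the decrypted
    # weights are a local value that never lands in main memory.
    ks = keystream(key, nonce, len(encrypted_blob))
    raw = bytes(a ^ b for a, b in zip(encrypted_blob, ks))
    weights = struct.unpack(f">{len(raw) // 4}f", raw)
    # Toy "layer": a single dot product over the decrypted weights.
    return sum(w * x for w, x in zip(weights, inputs))
```

A caller sees only `encrypt_weights` output and `core_inference` results; for example, encrypting weights `[1.0, 2.0, 3.0]` and running inference on inputs `[1.0, 1.0, 1.0]` returns `6.0` without the plaintext weights ever being exposed outside the core routine.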