Sensory Fusion for Scalable Indoor Navigation

Tuesday, May 21, 4:55 PM - 5:25 PM
Summit Track: Technical Insights II
Mission City M1-M3

Indoor autonomous navigation relies on a variety of sensors spanning multiple modalities. Achieving autonomous operation means fusing RGB, depth, lidar, and odometry data streams into a coherent picture of the environment. In this talk, we describe our sensor-pack-agnostic fusion approach, which lets us take advantage of the latest sensor technology to deliver robust, safe, and performant perception across a large fleet of industrial robots. We explain how we addressed key sensor-fusion challenges, including robust and safe obstacle detection, fusing geometric and semantic information, and handling moving people and sensory blind spots.
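One of the challenges mentioned above, fusing geometric and semantic information, can be illustrated with a minimal sketch. This is not the speaker's implementation; it is a hypothetical example in which 2D lidar points are rasterized into an occupancy grid and per-point semantic class labels (e.g., from a camera segmentation model) are attached to occupied cells. The grid size, cell resolution, and function names are all assumptions for illustration.

```python
import numpy as np

CELL = 0.05  # assumed grid resolution in meters per cell


def lidar_to_grid(points_xy, size=200):
    """Rasterize 2D lidar points (meters, robot-centered) into a
    boolean occupancy grid of shape (size, size)."""
    grid = np.zeros((size, size), dtype=bool)
    idx = np.floor(points_xy / CELL).astype(int) + size // 2
    valid = (idx >= 0).all(axis=1) & (idx < size).all(axis=1)
    grid[idx[valid, 1], idx[valid, 0]] = True  # row = y, col = x
    return grid


def fuse_semantics(grid, sem_points_xy, sem_labels, size=200):
    """Attach a semantic class to each occupied cell.

    sem_labels holds one integer class per point (0 = unknown).
    Cells that are unoccupied, or occupied but unlabeled, stay 0.
    """
    labels = np.zeros(grid.shape, dtype=np.int8)
    idx = np.floor(sem_points_xy / CELL).astype(int) + size // 2
    valid = (idx >= 0).all(axis=1) & (idx < size).all(axis=1)
    labels[idx[valid, 1], idx[valid, 0]] = sem_labels[valid]
    return np.where(grid, labels, 0)


# Two example returns: one at (1.0, 0.5) m labeled class 1,
# one at (-2.0, 1.0) m labeled class 2.
pts = np.array([[1.0, 0.5], [-2.0, 1.0]])
grid = lidar_to_grid(pts)
sem = fuse_semantics(grid, pts, np.array([1, 2], dtype=np.int8))
```

A production system would, of course, handle timestamp alignment between sensors, extrinsic calibration, and probabilistic occupancy updates rather than this single-shot rasterization.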


Oleg Sinyavskiy

Director of Research & Development, Brain Corp

Oleg Sinyavskiy earned a PhD in computational neuroscience from the robotics department of Moscow Power Engineering Institute. His research aims at bridging the gap between neuroscience research and commercialized robotic products. Oleg joined Brain Corp in 2011 and leads the company's R&D efforts on next-generation mobile robotics navigation systems and applications.

See you at the Summit! May 20-23 in Santa Clara, California!
Register today and reserve your hotel room!