
"Understanding Real-World Imaging Challenges for ADAS and Autonomous Vision Systems – IEEE P2020," a Presentation from Algolux

Felix Heide, CTO and Co-founder of Algolux, presents the "Understanding Real-World Imaging Challenges for ADAS and Autonomous Vision Systems – IEEE P2020" tutorial at the May 2018 Embedded Vision Summit.

ADAS and autonomous driving systems rely on sophisticated sensor, image processing and neural-network-based perception technologies. These have enabled effective driver assistance capabilities and are paving the path to full autonomy, as demonstrated at industry events and within controlled operational domains. But current systems are significantly challenged by real-world operating conditions, such as darkness, poor weather and lens issues.

In this talk, Heide examines the difficult use cases that hamper effective ADAS for drivers and cause autonomous vision systems to fail. He explores a range of challenging scenarios and explains the key reasons for vision system failure in these situations. He also introduces industry initiatives that are tackling these challenges, such as IEEE P2020.