Blog

Will the iPhone 7 incorporate 3D vision using dual rear-facing cameras and depth-sensing capabilities for mapping and tracking applications?

There is no doubt that cars will sooner or later simply become robots that take us places, but what will happen between now and then?

Advanced driver assistance systems (ADAS) are becoming the new sweetheart of the automotive industry, driving most of the innovation within this enormous ecosystem.

Neural network algorithms have gained prominence in computer vision and other fields. The time may be ripe for neural network processors.

Embedded vision in robotics has the potential to drastically improve everything from mundane tasks to life’s most critical functions.

Alliance executive director Vin Ratford discusses educational resources for meeting the challenge of programming heterogeneous processors.

For demanding applications such as embedded vision, heterogeneous multicore architectures often yield the best bang for the buck (or watt).

One great example of embedded vision's use in our lives is the vision-based safety features in advanced driver assistance systems (ADAS).

Face recognition is admittedly not yet perfect. However, other face analysis technologies are more mature and enable amazing applications.

In the consumer market, one of the most interesting uses of new vision technologies is the creation of more natural user interfaces.