
Embedded Vision Insights: November 15, 2011 Edition

Dear Colleague,

Welcome to the third edition of Embedded Vision Insights, the newsletter of the Embedded Vision Alliance.

The past few weeks have been particularly newsworthy for camera-inclusive smartphones and tablets. Consider, for example, handsets such as the HTC MyTouch Slide 4G and its plethora of "power user" snapshot settings, the 1080p video capture capabilities of the Apple iPhone 4S, the stitch-free panorama mode supported by the Samsung Galaxy Nexus, and the high-quality Carl Zeiss optics built into the Nokia Lumia 800. Key to new capabilities such as these are the systems' microprocessors: now-sampling CPUs built from Qualcomm's latest Krait and ARM's latest Cortex-A15 microarchitectures, for example, along with Nvidia's in-production quad-core (or, more accurately, penta-core) Tegra 3 and Apple's dual-core A5.

To be clear, these systems (and the SoCs they're derived from) are useful for a diversity of embedded vision functions, not just for picture-snapping and videography purposes. Take a look, for example, at the Kinect-reminiscent gesture interfaces supported by Kinectimals for Windows Phone 7, included in latest-generation Pantech handsets, documented in both filed and granted patents from Apple, and suggested by recent Qualcomm acquisitions. Ponder the facial recognition-based unlock capabilities built into Google's "Ice Cream Sandwich" Android v4 and Nokia's Symbian O/S. And appraise the fresh perspectives represented by embryonic applications such as television program identification, augmented reality, and traffic flow optimization.

Cellular handsets and tablet computers are compelling platforms for implementing embedded vision, by virtue of the prevalence of both front- and rear-mounted image sensors of sufficient resolution, the substantial available memory and processing resources, the systems' application-enabling portability, and (perhaps most importantly) the often-subsidized prices at which they're sold and their consequent large installed user base. How do you hope to harness mobile electronics' potential in actualizing your embedded...