
Embedded Vision Insights: July 31, 2014 Edition


In this edition of Embedded Vision Insights:

LETTER FROM THE EDITOR

Dear Colleague,

Amazon's Fire Phone, rumors of which I passed along to you in mid-April, went on sale last Friday subsequent to a mid-June public unveiling. The overall reviews thus far have been somewhat lukewarm. However, the Fire Phone is chock-full of vision-based features, which have held up quite well to review scrutiny.

First off, there are the computational photography capabilities, for still image and video capture, enabled both by the Fire Phone's Google Android foundation and Amazon-developed enhancements (not to mention its 13 Mpixel rear and 2.1 Mpixel front cameras). Next is Firefly technology, which uses object and text recognition algorithms to identify whatever you point the handset's camera at, including (of course) items you might want to price-match and potentially buy from Amazon; television shows and movies shown on a screen in front of you; and web addresses, email addresses, and phone numbers.

Finally, there's Dynamic Perspective, which leverages infrared transmitters and sensors at each of the phone's four front corners to track your head's location and orientation, presenting you with parallax-adjusted 3D representations of on-screen objects and enabling sophisticated but intuitive one-handed user interface gestures. And, as Qualcomm is happy to point out, a notable percentage of the vision processing takes place on the Hexagon DSP core integrated within the company's Snapdragon application processor.
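To make the geometry concrete, here is a minimal Python sketch of head-tracked parallax. It is illustrative only, not Amazon's Dynamic Perspective implementation; the HeadPose fields, layer depths, and pixel scale are all assumptions. The idea: a UI layer drawn "behind" the display shifts on screen in the same direction as the tracked head, and deeper layers shift more.

# Hypothetical sketch of head-tracked parallax; not Amazon's Dynamic
# Perspective code. All names and constants are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class HeadPose:
    x_cm: float  # horizontal head offset from screen center
    y_cm: float  # vertical head offset from screen center
    z_cm: float  # head distance from the screen plane

def layer_offset_px(head: HeadPose, layer_depth_cm: float, px_per_cm: float) -> tuple[float, float]:
    """Screen-space shift for a UI layer drawn 'behind' the display.

    Projecting a point at depth D behind the screen toward an eye at distance z
    shifts its on-screen position by head_offset * D / (D + z): deeper layers
    track the head more strongly, which is what creates the depth illusion.
    """
    scale = layer_depth_cm / (layer_depth_cm + max(head.z_cm, 1e-3))
    return (head.x_cm * scale * px_per_cm, head.y_cm * scale * px_per_cm)

if __name__ == "__main__":
    head = HeadPose(x_cm=-4.0, y_cm=1.0, z_cm=30.0)  # head 4 cm left, 1 cm up
    for depth_cm in (1.0, 3.0, 8.0):                 # virtual depths of three UI layers
        dx, dy = layer_offset_px(head, depth_cm, px_per_cm=40.0)
        print(f"layer at {depth_cm:.0f} cm -> shift ({dx:+.1f}, {dy:+.1f}) px")

A production pipeline would, of course, estimate the head pose by fusing the four infrared camera feeds and apply the offsets inside the GPU rendering path rather than in application code.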

I encourage you to check out the recently published analysis of the Fire Phone’s prospects by John Feland of Argus Insights, followed by iFixit's product teardown. Then head to a nearby AT&T store (if you're in the United States, that is) to try out a Fire Phone for yourself. While you're on the Alliance website, please also peruse the other great content that's appeared there over the past two weeks, including multiple product demonstration videos from May's Embedded Vision Summit, two article reprints (one on vision applications in industrial automation, the other on computational photography), and several press releases from Alliance member companies.

And speaking of Summits, mark your calendars now for next spring's event, currently scheduled to take place on April 30, 2015 at the Santa Clara (California) Convention Center. Thanks for your support of the Embedded Vision Alliance, and for your interest in and contributions to embedded vision technologies, products and applications. As always, I welcome your suggestions on what the Alliance can do to better serve your needs.

Brian Dipert
Editor-In-Chief, Embedded Vision Alliance

FEATURED VIDEOS

Embedded Vision Summit Technical Presentation: "Using Synthetic Image Generation to Reduce the Cost of Vision Algorithm Development," Clark Dorman, Next Century Corporation
Clark Dorman, Chief Engineer at Next Century Corporation, presents the "Using Synthetic Image Generation to Reduce the Cost of Vision Algorithm Development" tutorial within the "Algorithm Development Techniques and Tools" technical session at the October 2013 Embedded Vision Summit East. One of the greatest challenges in developing computer vision applications is the development and maintenance of high-quality training and testing data. Annotated data that covers the range of object variations, poses, and environmental situations is needed to ensure that a system will perform successfully in operational situations. However, obtaining sufficient data is time-consuming and expensive. The Synthetic Image Generation Harness for Training and Testing (SIGHTT) project creates annotated images by combining rendered 3D objects with real backgrounds. This talk discusses the generation of synthetic data and its use, in combination with live data, to alleviate the data problem.
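The basic recipe is straightforward to sketch. The Python snippet below is illustrative only, not the SIGHTT codebase; the file names, label, and bounding-box format are placeholders. It alpha-composites a rendered RGBA object onto a real background photograph using Pillow and derives an annotation automatically from the paste location.

# Illustrative sketch of synthetic training-image generation; not the SIGHTT
# codebase. File names, the label, and the bounding-box format are placeholders.

from PIL import Image

def composite_with_annotation(background_path: str, object_path: str,
                              top_left: tuple[int, int]) -> tuple[Image.Image, dict]:
    """Paste a rendered RGBA object onto a real background photo and return
    the composite plus an automatically derived bounding-box annotation."""
    background = Image.open(background_path).convert("RGBA")
    rendered = Image.open(object_path).convert("RGBA")  # rendered 3D object with alpha

    # The object's alpha channel doubles as the paste mask, so only the rendered
    # pixels (not their transparent surroundings) land on the photograph.
    background.paste(rendered, top_left, mask=rendered)

    x, y = top_left
    annotation = {
        "label": "target_object",
        "bbox": [x, y, x + rendered.width, y + rendered.height],  # [x_min, y_min, x_max, y_max]
    }
    return background.convert("RGB"), annotation

# Example usage (paths are placeholders):
# image, ann = composite_with_annotation("street.jpg", "rendered_car.png", (120, 80))
# image.save("synthetic_000.jpg")

Because the object's pose, scale, and position are chosen programmatically, the annotation comes for free, which is exactly the cost advantage over hand-labeling live imagery.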

Consumer Electronics Show Product Demonstration: Inuitive
Yossi Zilberfarb, Vice President of Business Development at Inuitive, demonstrates the company's latest embedded vision technologies and products at the January 2014 Consumer Electronics Show.

More Videos

FEATURED ARTICLES

Industrial Automation and Embedded Vision: A Powerful Combination
For manufacturing robots and other industrial automation systems to meaningfully interact with the objects they're assembling, as well as to move deftly and safely about their environments, they must be able to see and understand their surroundings. In this contributed article, originally published in ISA's InTech Magazine, the Alliance and several member companies explain how cost-effective and capable vision processors, fed by depth-discerning image sensors and running robust software algorithms, are transforming longstanding autonomous and adaptive industrial automation aspirations into reality. More

It’s Tegra K1 Everywhere at Google I/O
You couldn’t get very far at Google I/O’s dazzling kickoff without bumping into NVIDIA's new Tegra K1 mobile processor. The keynote showed off the gaming capabilities of Google’s new Android L operating system on a Tegra K1 reference device. Spanking-new Android TV is available to developers on a Tegra-powered devkit. The Open Automotive Alliance’s just-announced Android Auto features NVIDIA. And Tegra K1 is at the heart of the Project Tango tablet devkit, which is on display on the show floor and which opens the door to computer vision and computational photography. More

More Articles

FEATURED NEWS

Altera Joins the Embedded Vision Alliance

Imagination Technologies to Showcase Latest Graphics and GPU Compute Technologies at SIGGRAPH 2014

MakerBot Expands 3D Printing & Scanning Ecosystem with Exclusive Partnership with Innovative 3D Vision Company SoftKinetic

Qualcomm Announces New Ultra HD Processor for TVs and Set-Top-Boxes

SoftKinetic Teams With NVIDIA to Bring 3D Depth Sensing to Mobile

More News

 
