Products Versus Patents: Does Apple Also Plan Gesture-Enhanced Handsets?
Whatever Microsoft pioneers, Apple sooner or later seemingly reproduces (and vice versa). Earlier today, I mentioned that Microsoft's just-released Kinectimals game title for Windows Phone leverages front-facing image sensors on smartphones for gesture-interface purposes. Judging from a recently filed patent, Apple has similar aspirations...or at least wants to use its intellectual property as leverage in future license negotiations. And additional recent patents, both filed and granted, further fill in the pieces of Apple's potential gesture-interface puzzle.
- Back in mid-January, SlashGear came across a proximity-sensing (i.e., "hover") patent filed by Apple. Later that same month, AppleInsider reported that the patent had been granted. The approach, like the one championed by iDENT Technology (which I discussed back in mid-September), appears to be an extrapolation of capacitive touch technology: it senses the change in capacitance not only when a finger or other conductive object touches a panel but also when one approaches it at close range. These discussions all occurred at roughly the same time as a (subsequently unrealized) rumor forecasting that Apple would replace the iPad's mechanical "home" button with a capacitive touch sensor "button" on the iPad 2.
- In mid-September, Apple was granted a patent for a 3-D display with an integrated imaging system that responds to user (viewer?) gestures. A number of online sites picked up this news: 9to5Mac, Cult of Mac, Engadget, Gizmodo, Patently Apple, and SlashGear.
- And today's news involves a recently published patent application, filed in April 2010 and entitled "Real Time Video Process Control Using Gestures," which describes gesture-based remote control and editing of video recordings on a mobile device. So far, I've seen coverage at AppleInsider, Cult of Mac, and Wired.
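The capacitive-proximity idea in the first bullet can be sketched in a few lines of code: a hover-capable controller compares the measured capacitance against a calibrated baseline, and the size of the deviation distinguishes contact from close-range approach. This is purely an illustrative sketch, not Apple's (or iDENT's) implementation; the function name and threshold values are hypothetical placeholders.

```python
# Illustrative sketch of capacitive touch-vs-hover classification.
# Threshold values are arbitrary placeholders, not from any real device.

TOUCH_THRESHOLD = 100.0  # large capacitance change: object in contact
HOVER_THRESHOLD = 20.0   # smaller change: object approaching at close range


def classify_reading(delta_capacitance: float) -> str:
    """Map a change in measured capacitance (relative to a calibrated
    baseline) to an interaction state: 'touch', 'hover', or 'idle'."""
    if delta_capacitance >= TOUCH_THRESHOLD:
        return "touch"
    if delta_capacitance >= HOVER_THRESHOLD:
        return "hover"  # finger nearby but not yet touching the panel
    return "idle"


# A hover-capable "button" could then fire distinct UI events per state:
for reading in (5.0, 35.0, 150.0):
    print(reading, classify_reading(reading))
```

In practice a controller would also filter noise and periodically re-baseline to compensate for temperature and humidity drift, but the core idea is exactly this: one sensing mechanism, two interaction distances.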
Patent applications, even if they're subsequently approved, don't necessarily translate into implementations in future products. Still, it's interesting to see how Apple's usability scientists are leveraging embedded vision and related technologies in their ideas.