
Embedded Vision: Assisting Those With Vision Limitations

Speaking of Apple's attempt to compensate for people's imperfections, the last paragraph of my previous writeup:

Parziale also notes that the feature is optimized for the visually impaired thanks to VoiceOver in OS X. “VoiceOver helps positioning the card in front of the camera and the very fast image processing algorithm generates very quickly the result,” according to Parziale. “The user experience is amazing.”

reminded me of some other relevant coverage I'd recently seen about VoiceOver and related technologies in the Sydney Morning Herald, thanks to an Apple website Hot News highlight. The article, entitled "Apple puts eye into iDevices," talks about how (in Apple's words):

Apple accessibility technology has enabled disabled users to thrive both personally and professionally. Barker [reporter Garry Barker] tells the story of David Woodbridge, a technology manager for a large non-profit who has been blind since childhood. Woodbridge uses VoiceOver and other assistive technologies in Apple’s iOS devices daily.

The majority of Barker's writeup discusses voice-enabled applications that leverage Woodbridge's fully functional hearing to compensate for his lack of sight. However, the iOS devices' built-in image sensors also have a role to play:

"We all have our own favourite apps, movies, music and so on, but for me one of the apps I really like is Light Detector, which allows me to make sure I have turned off all the lights before I go to bed."

I wasn't familiar with Light Detector, so I Googled it:

Light Detector transforms any natural or artificial light source it encounters into sound. Light Detector is easy to use! Just run the application and point your iPhone camera in any direction. You will hear a higher or lower sound depending on the intensity of the light.
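The core idea behind the app, mapping a camera frame's brightness to an audible pitch, can be sketched in a few lines. The linear mapping and the specific frequency range below are my own illustrative assumptions, not the developer's published algorithm:

```python
def brightness_to_frequency(pixels, f_min=220.0, f_max=1760.0):
    """Map the mean brightness of a grayscale frame (pixel values 0-255)
    to a tone frequency in Hz: darker scenes yield a lower pitch,
    brighter scenes a higher one.

    The A3-to-A6 frequency range and the linear mapping are assumptions
    made for illustration; Light Detector's actual algorithm is not
    publicly documented.
    """
    mean = sum(pixels) / len(pixels)            # average luminance, 0..255
    return f_min + (mean / 255.0) * (f_max - f_min)

# A dark frame maps near the low end of the range; a bright frame near the top.
dark_tone = brightness_to_frequency([10] * 100)
bright_tone = brightness_to_frequency([245] * 100)
```

On an actual iPhone, the per-frame pixel data would come from the camera capture pipeline and the resulting frequency would drive a tone generator; this sketch just shows the sensor-to-sound mapping at the heart of the app.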

As far as embedded vision applications on mobile electronic devices go, Light Detector is admittedly pretty simplistic from a functionality standpoint. But clearly, as Woodbridge's case study suggests, there's no consistent correlation between an app's complexity and its value to the end user.