Doctor of Philosophy (PhD)
At least as early as 1945, researchers have sought to use electronic devices to communicate spatial and environmental information to the blind. Despite significant research and development efforts since then, the number of electronic sensory aids (ESAs) actively used by the blind community remains small. A major barrier to adoption is the steep and protracted learning curve generally associated with such devices, a problem aggravated by impractical or non-intuitive man-machine interfaces. Existing ESAs lack a natural, hands-free method of direct user manipulation of the input stream, such as sighted persons enjoy by moving their eyes. In audio devices, a secondary effect of this shortcoming can be the adoption of obscure audio codes in an attempt to disambiguate positional elements of the data stream. Such deficiencies limit the usability of an ESA. In this work I propose the fusion of eye tracking with spatialized audio feedback as a means of increasing ESA usability by enabling direct user control over synthetic sensory feedback. To this end, I submit AuralEyes, a novel man-machine interface designed for use in ESAs for the visually impaired. Experimental results show that users of AuralEyes are able to perform simple range-disparity tasks on simulated input with only a few minutes of training. There is evidence that user preference favors an AuralEyes implementation that employs spatialized audio feedback over a similar implementation with non-spatialized feedback. Finally, I present a fully functional implementation of the AuralEyes framework.
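To make the fusion idea concrete, the sketch below shows one hypothetical way gaze direction could select a sample from a simulated range map and drive a spatialized tone: pitch encodes range (nearer is higher) and constant-power stereo panning places the tone at the gaze azimuth. All function names, ranges, and mappings here are illustrative assumptions for exposition, not the mappings used in the dissertation.

```python
import math

def range_to_pitch(range_m, min_m=0.5, max_m=5.0, f_low=220.0, f_high=880.0):
    """Map a range reading (meters) to a tone frequency (Hz).
    Nearer objects map to higher pitch; an assumed, illustrative encoding."""
    r = max(min_m, min(max_m, range_m))
    t = (max_m - r) / (max_m - min_m)          # 0.0 = far, 1.0 = near
    return f_low * (f_high / f_low) ** t       # geometric interpolation

def gaze_to_pan(azimuth_deg):
    """Constant-power stereo pan gains from gaze azimuth.
    -90 deg = hard left, +90 deg = hard right."""
    theta = (azimuth_deg + 90.0) / 180.0 * (math.pi / 2.0)
    return math.cos(theta), math.sin(theta)    # (left gain, right gain)

def auralize(range_map, azimuth_deg):
    """Pick the range sample the user is 'looking at' and return the
    tone frequency plus stereo gains that place it at the gaze angle."""
    idx = round((azimuth_deg + 90.0) / 180.0 * (len(range_map) - 1))
    return range_to_pitch(range_map[idx]), gaze_to_pan(azimuth_deg)
```

A real implementation would render the tone through HRTF-based spatialization rather than simple stereo panning, but the control flow is the same: the eye tracker, not a manual control, selects which part of the input stream is sonified.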
Jones, Frank, "AuralEyes: Investigating the Fusion of Eye-Tracking and Spatial Audio in Electronic Sensory Aids for the Blind" (2013). Computer Science Graduate Theses & Dissertations. 59.