Date of Award

Spring 1-1-2013

Document Type

Dissertation

Degree Name

Doctor of Philosophy (PhD)

Department

Computer Science

First Advisor

Dirk Grunwald

Second Advisor

Qin Lv

Third Advisor

Nicolaus Correll

Fourth Advisor

Lewis Harvey

Fifth Advisor

Michael Lightner

Abstract

At least as early as 1945, researchers have sought to use electronic devices to communicate spatial and environmental information to the blind [9]. Despite significant research and development efforts since then, the number of electronic sensory aids (ESAs) actively used by the blind community remains small. A major barrier to the adoption of ESAs is the steep and protracted learning curve generally associated with such devices [20]. Impractical and/or non-intuitive man-machine interfaces contribute to this problem. Existing ESAs lack a natural, hands-free method of direct user manipulation of the input stream, such as sighted persons have through the movement of their eyes. In the case of audio devices, a secondary effect of this shortcoming can be the adoption of obscure audio codes in an attempt to disambiguate positional elements of the data stream. Such deficiencies limit the usability of an ESA. In this work I propose the fusion of eye-tracking with spatialized audio feedback as a means of increasing ESA usability by enabling direct user control over synthetic sensory feedback. To this end, I submit AuralEyes, a novel man-machine interface designed for use in ESAs for the visually impaired. Experimental results show that users of AuralEyes are able to perform simple range disparity tasks on simulated input with only a few minutes of training. There is evidence that user preference favors an AuralEyes implementation that employs spatialized audio feedback over a similar implementation with non-spatialized feedback. Finally, I present a fully functional implementation of the AuralEyes framework.
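To make the abstract's core mechanism concrete, the sketch below shows one way a gaze direction could be turned into a spatialized audio cue that encodes range, the interaction loop AuralEyes describes. The dissertation page does not include source code, so everything here is a hypothetical illustration: the function names, the near-high/far-low pitch mapping, and the crude interaural level/time-difference panning used in place of full HRTF rendering are all assumptions, not the author's implementation.

# Illustrative sketch only: not the AuralEyes source. Maps a gaze azimuth to
# a stereo cue whose pitch encodes range (assumed mapping) and whose
# interaural level/time differences encode direction (stand-in for HRTFs).
import math
import numpy as np

SAMPLE_RATE = 44_100  # Hz

def range_to_pitch(range_m: float, near: float = 0.5, far: float = 5.0) -> float:
    """Assumed encoding: nearer objects sound higher (880 Hz near, 220 Hz far)."""
    r = min(max(range_m, near), far)
    t = (r - near) / (far - near)
    return 880.0 * (1 - t) + 220.0 * t

def spatialize(azimuth_deg: float, freq: float, dur: float = 0.15) -> np.ndarray:
    """Render a stereo tone panned toward the gaze azimuth."""
    n = int(SAMPLE_RATE * dur)
    t = np.arange(n) / SAMPLE_RATE
    tone = np.sin(2 * math.pi * freq * t) * np.hanning(n)  # windowed beep

    az = math.radians(max(-90.0, min(90.0, azimuth_deg)))  # negative = left
    # Constant-power panning for the interaural level difference.
    left_gain = math.cos((az + math.pi / 2) / 2)
    right_gain = math.sin((az + math.pi / 2) / 2)
    # Interaural time difference: up to ~0.6 ms at a fully lateral angle;
    # the ear farther from the source hears the tone slightly later.
    itd = int(abs(math.sin(az)) * 0.0006 * SAMPLE_RATE)
    left = np.pad(tone * left_gain, (itd if az > 0 else 0, 0))
    right = np.pad(tone * right_gain, (itd if az < 0 else 0, 0))
    m = max(len(left), len(right))
    left = np.pad(left, (0, m - len(left)))
    right = np.pad(right, (0, m - len(right)))
    return np.stack([left, right], axis=1)  # (samples, 2) stereo buffer

def gaze_to_cue(gaze_azimuth_deg: float, range_map) -> np.ndarray:
    """Sample the range at the gazed direction and emit the matching cue."""
    range_m = range_map(gaze_azimuth_deg)
    return spatialize(gaze_azimuth_deg, range_to_pitch(range_m))

if __name__ == "__main__":
    # Simulated scene: a wall 1 m away on the left, open space on the right.
    scene = lambda az: 1.0 if az < 0 else 4.0
    cue = gaze_to_cue(-30.0, scene)  # user looks 30 degrees left
    print(cue.shape)                 # stereo sample buffer ready for playback

A real system would presumably convolve each cue with head-related transfer functions rather than using simple panning, but the control loop is the point: the user's gaze selects what part of the scene is sonified, which is the direct-manipulation property the abstract argues existing ESAs lack.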
