Augmented acoustic reality as assistive device for spatial awareness in rooms
Abstract:
Orientation and navigation in familiar and unfamiliar environments are typically guided by vision. For blind and visually impaired individuals, however, as well as in low-light conditions, fog, or smoke, these tasks become challenging, and spatial auditory cues and perception can be used instead. Augmented acoustic reality (AAR) may help to provide assistive devices for acoustic navigation and echolocation training. In this way, AAR provides cues for the localization of external sound sources as well as for the perception of sound reflections and reverberation, including echolocation. In this contribution, we investigated the identification of and orientation within invisible corridor junctions based on acoustic cues only. The underlying real-time acoustic simulation enabled modification of the source directivity and included edge diffraction at the corridor corners. Three prototype AAR mobility aids were tested, which might be referred to as “hand-held acoustic pointers”. Depending on the pointing direction, they either placed a virtual external sound source at the nearest wall, or they emitted virtual clicks with directivity in either the audio-frequency or the ultrasonic range, resulting in virtual room reflections for echolocation. A hand-held (virtual) laser pointer served as a visual reference. For nearby walls, performance similar to that with the laser pointer was achievable with the suggested AAR device without extensive training.
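The "place a virtual source at the nearest wall" behaviour of such an acoustic pointer can be illustrated with a simple geometric sketch. The following Python snippet is a hypothetical minimal model (not the authors' implementation): it represents walls as 2D line segments, casts a ray along the pointing direction, and returns the closest intersection point, where the virtual source would be rendered.

```python
import math

# Hypothetical sketch of an "acoustic pointer": cast a ray from the
# device along its pointing direction and find the nearest wall hit,
# where a virtual sound source could then be placed.

def ray_segment_hit(origin, angle, p1, p2):
    """Return ((x, y), distance) where the ray meets segment p1-p2, or None."""
    ox, oy = origin
    dx, dy = math.cos(angle), math.sin(angle)
    x1, y1 = p1
    x2, y2 = p2
    ex, ey = x2 - x1, y2 - y1
    denom = dx * ey - dy * ex
    if abs(denom) < 1e-12:                      # ray parallel to the wall
        return None
    # Solve origin + t*d = p1 + u*e for t (along ray) and u (along segment)
    t = ((x1 - ox) * ey - (y1 - oy) * ex) / denom
    u = ((x1 - ox) * dy - (y1 - oy) * dx) / denom
    if t >= 0 and 0 <= u <= 1:
        return (ox + t * dx, oy + t * dy), t
    return None

def nearest_wall_point(origin, angle, walls):
    """Closest wall intersection along the pointing direction, or None."""
    hits = [h for w in walls if (h := ray_segment_hit(origin, angle, *w))]
    return min(hits, key=lambda h: h[1])[0] if hits else None

# A simple straight corridor: two parallel walls 2 m apart
walls = [((0, 0), (10, 0)), ((0, 2), (10, 2))]
# Pointing straight "up" from the corridor centre hits the y = 2 m wall
print(nearest_wall_point((1.0, 1.0), math.pi / 2, walls))
```

In a real device this geometric step would be followed by real-time auralization of the virtual source (or of the click reflections), including the directivity and edge-diffraction modelling mentioned above, which this sketch deliberately omits.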