Assessment of auditory function and abilities in life-like listening scenarios
Abstract:
We live and interact with sound throughout the day, often without even noticing: we turn toward sound sources, move to hear better, or listen from a distance. During communication we also integrate visual information from gestures and facial movements. Here, we studied how positions and gestures in two-way communication are affected by interfering sound sources. Two participants were surrounded by a video projection of an underground station while their own voices and interfering noises were auralized via the loudspeakers of the real-time Simulated Open Field Environment (rtSOFE). Participants were simply asked to talk to each other. Results show that with increasing interfering noise level: a) the communication distance decreases, b) nodding and other gestures occur more frequently, and c) the speed of hand gestures increases.

While the simulation of early reflections with the image source method is standard, the perceptual requirements for spatially rendering late reverberation are less clear. We therefore investigated the number of loudspeakers required: in a second experiment, participants had to detect spatial gaps in otherwise diffuse reverberation. Results show that gaps of 35° can be identified, but the gap must be considerably larger if it lies in the direction of the direct sound. Sensitivity to anisotropy in the reverberation was not affected by subject movement.
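The image source method referenced above can be illustrated with a minimal sketch. This is not the rtSOFE implementation, only a generic first-order version for a shoebox room with an assumed uniform reflection coefficient `beta`; the function names and parameters are hypothetical.

```python
import math

def first_order_image_sources(src, room, beta=0.8):
    """Mirror a point source across each of the six walls of a shoebox
    room (one corner at the origin, dimensions room = (Lx, Ly, Lz)).
    Returns a list of (image_position, reflection_coefficient)."""
    images = []
    for axis, length in enumerate(room):
        for wall in (0.0, length):
            img = list(src)
            img[axis] = 2.0 * wall - src[axis]  # mirror across the wall plane
            images.append((tuple(img), beta))
    return images

def impulse_contributions(src, rcv, room, c=343.0, beta=0.8):
    """Delay (s) and amplitude (1/r spherical spreading times wall loss)
    for the direct path and each first-order reflection."""
    paths = [(src, 1.0)] + first_order_image_sources(src, room, beta)
    contributions = []
    for pos, refl in paths:
        r = math.dist(pos, rcv)          # path length via the image source
        contributions.append((r / c, refl / r))
    return contributions
```

For a source and receiver inside the room this yields seven contributions (the direct sound plus one reflection per wall); higher orders follow by mirroring the image sources recursively.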