Source localization in interactive virtual acoustic environments with dynamic binaural rendering
Abstract:
Interactive virtual audio-visual environments make it possible to investigate various factors contributing to sound source localization that have not yet been fully explored. Here, we present our perceptually motivated real-time room acoustics simulator liveRAZR together with experiments that investigated the ability to perceive the distance and azimuthal direction of sound sources. A simulated visual environment was presented on a stereoscopic head-mounted display, while the acoustic simulation was presented over headphones. liveRAZR generates highly plausible renderings of simulated rooms with six-degrees-of-freedom (6-DoF) movement of receiver and source, using measured source directivities and head-related impulse responses. First, we investigated to what extent auditory distance perception requires externalized sources, i.e., sources perceived outside of the head. To this end, the dynamic binaural rendering was modified, including an extended spherical head model and static diotic rendering, to create a more internalized sound image. Interestingly, the results suggest that externalization is not a strict requirement for estimating source distance. In a second experiment, sound-source localization (distance and azimuth) was measured under conditions in which the observer was asked either to perform translational movements or to move only the head. Here, translational movements improved distance perception based on parallax cues, while the accuracy of azimuth perception was not improved.
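The parallax cue mentioned above can be sketched geometrically: a lateral listener translation shifts the perceived azimuth of a nearby source more than that of a distant one, so the observed azimuth change carries distance information. The following minimal Python sketch illustrates this relation; the function names and numerical values are illustrative assumptions and are not part of liveRAZR or the experiments described here.

```python
import numpy as np

# Illustrative sketch of the auditory parallax cue (assumed geometry, not
# the study's method): a lateral translation dx shifts the azimuth of a
# source straight ahead at distance d by roughly arctan(dx / d).

def azimuth_shift(distance_m: float, translation_m: float) -> float:
    """Azimuth change (degrees) of a frontal source at `distance_m`
    after a lateral listener translation of `translation_m`."""
    return np.degrees(np.arctan2(translation_m, distance_m))

def distance_from_parallax(translation_m: float, shift_deg: float) -> float:
    """Invert the parallax relation to recover the source distance."""
    return translation_m / np.tan(np.radians(shift_deg))

if __name__ == "__main__":
    # Example: a 0.3 m sideways step produces clearly different azimuth
    # shifts for near versus far sources (values are hypothetical).
    for d in (0.5, 1.0, 2.0, 4.0):
        shift = azimuth_shift(d, translation_m=0.3)
        print(f"distance {d:4.1f} m -> azimuth shift {shift:5.1f} deg "
              f"(recovered: {distance_from_parallax(0.3, shift):.2f} m)")
```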