Influence of (non-)intelligible background speech on memory and listening effort in conversational situations
Abstract:
Verbal communication depends on a listener’s ability to accurately comprehend and recall information conveyed in a conversation. Background noise, such as speech, can significantly impair speech processing. While previous research has explored the effects of native (intelligible) and foreign (unintelligible) background speech on memory, the tasks used in these studies were rather simple, e.g., serial recall. Recent advancements in cognitive research have led to the development of the heard-text recall (HTR) paradigm, which can be used in a dual-task design to assess both listening effort and memory performance. In contrast to traditional tasks such as serial recall, this paradigm uses running speech to simulate a conversation between two talkers. It also allows the talkers to be visualized in virtual reality (VR), effectively conveying co-verbal visual cues such as lip movements, turn-taking signals, and gaze behavior. While this paradigm has been investigated under pink noise, the impact of more realistic maskers, such as speech, remains unexplored. In this study, we administered the HTR task in VR under three noise conditions: silence, intelligible speech, and unintelligible pseudo-speech. The results are expected to deepen our understanding of speech processing in noise under close-to-real-life conditions.