When face-to-face communication becomes effortful due to background noise or interfering talkers, visual cues become increasingly important for communication success. While previous research has selectively examined head or hand movements, here we explore movements of the whole body under acoustically adverse conditions. We hypothesized that increasing background noise during conversation would raise the frequency of the hand, head, trunk, and leg movements typical of conversation. Increased use of hand movements should support the speaker's role, while increased head and trunk movements may help the listener. We conducted a free dyadic conversation experiment with normal-hearing participants (n=8) in a virtual acoustic environment. Conversational movements were described with a newly developed labeling system for typical conversational actions, and the frequency of each movement type was analyzed. In addition, we analyzed gesture quality by assessing hand-speech synchrony, with the hypothesis that higher levels of background noise would lead to a loss of synchrony according to an interactive coupling model. Higher noise levels led to increased hand-gesture complexity during both speaking and listening, more pronounced up-down head movements, and, contrary to expectations, a general decrease in head movements during listening relative to speaking. Synchrony and peak velocity were unaffected by noise, while gesture quality scaled only modestly. The results support previous findings on gesturing frequency, but we found only limited evidence for changes in speech-gesture synchrony. This work reveals communication patterns of the whole body and illustrates multimodal adaptation to communication demands.
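The abstract does not specify how hand-speech synchrony was quantified. As an illustrative sketch only, and not the authors' method, one common approach cross-correlates hand speed with the speech amplitude envelope and reads off the peak lag and correlation. The function below assumes both signals have already been resampled to a common rate `fs` and trimmed to equal length; all names are hypothetical.

```python
import numpy as np
from scipy.signal import hilbert, correlate, correlation_lags


def hand_speech_synchrony(hand_speed, speech, fs, max_lag_s=1.0):
    """Return (lag in seconds, correlation) at the peak of the normalized
    cross-correlation between hand speed and the speech amplitude envelope."""
    # Amplitude envelope of the speech signal via the Hilbert transform
    envelope = np.abs(hilbert(speech))

    # Z-score both signals so the cross-correlation behaves like a
    # Pearson correlation at each lag
    a = (hand_speed - hand_speed.mean()) / hand_speed.std()
    b = (envelope - envelope.mean()) / envelope.std()

    # Full cross-correlation, normalized by signal length,
    # restricted to lags within +/- max_lag_s seconds
    xcorr = correlate(a, b, mode="full") / len(a)
    lags = correlation_lags(len(a), len(b), mode="full") / fs
    keep = np.abs(lags) <= max_lag_s

    peak = np.argmax(xcorr[keep])
    return lags[keep][peak], xcorr[keep][peak]
```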