As online video and streaming platforms continue to grow, affective computing research has shifted towards more complex studies involving multiple modalities. However, there is still a lack of readily available datasets with high-quality audiovisual stimuli. In this paper, we present GameVibe, a novel affect corpus that consists of multimodal audiovisual stimuli, including in-game behavioural observations and third-person affect traces for viewer engagement. The corpus consists of videos from a diverse set of publicly available gameplay sessions across 30 games, with particular attention to ensuring high-quality stimuli with good audiovisual and gameplay diversity. Furthermore, we present an analysis of annotator reliability in terms of inter-annotator agreement.
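As an illustration of inter-annotator agreement for continuous engagement traces, one common proxy (not necessarily the metric used in this paper) is the mean pairwise Pearson correlation across annotators. A minimal sketch, assuming each annotator's trace is a list of engagement values sampled at the same time steps:

```python
from itertools import combinations
from math import sqrt

def pearson(x, y):
    """Pearson correlation between two equal-length traces."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def mean_pairwise_agreement(traces):
    """Average Pearson correlation over all annotator pairs."""
    pairs = list(combinations(traces, 2))
    return sum(pearson(a, b) for a, b in pairs) / len(pairs)

# Hypothetical example: two annotators with identical engagement traces
traces = [[0.1, 0.4, 0.8, 0.6],
          [0.1, 0.4, 0.8, 0.6]]
print(mean_pairwise_agreement(traces))  # → 1.0
```

Values near 1 indicate high agreement; values near 0 or below indicate that annotators' traces diverge or move in opposite directions.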