Eye movements provide a window into human behaviour, attention, and interaction dynamics. The challenges of real-world, multi-person environments have, however, largely confined eye-tracking research to single-person, in-lab settings. We developed a system to stream, record, and analyse synchronised data from multiple mobile eye-tracking devices during collective viewing experiences (e.g., concerts, films, lectures). We implemented lightweight operator interfaces for real-time monitoring, remote troubleshooting, and gaze projection from individual egocentric perspectives to a common coordinate space for shared gaze analysis. We tested the system at two public events, a live concert and a film screening, each with 30 simultaneous viewers (N=60). We observed precise time synchronisation between devices, measured through recorded clock offsets, and accurate gaze projection in challenging dynamic scenes. Our novel analysis metrics and visualisations illustrate the potential of collective eye-tracking data for understanding collaborative behaviour and social interaction. This advancement promotes ecological validity in eye-tracking research and paves the way for innovative interactive tools.
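To make the idea of projecting gaze from individual egocentric views into a common coordinate space concrete, the sketch below shows one conventional way this mapping can be done: registering each wearer's scene-camera frame to a shared reference view with feature matching and a RANSAC homography, then transforming the gaze point with that homography. This is a minimal illustration under those assumptions, not the authors' implementation; the function `project_gaze_to_reference` and all parameter choices are hypothetical.

```python
# Minimal sketch: map a gaze point from one wearer's egocentric scene-camera
# frame into a shared reference view via a planar homography (OpenCV).
# Hypothetical illustration only; not the paper's actual pipeline.
import numpy as np
import cv2


def project_gaze_to_reference(gaze_xy, scene_frame, reference_frame):
    """Project (x, y) gaze coordinates from `scene_frame` into the
    coordinate space of `reference_frame`.

    Both frames are assumed to be grayscale uint8 images; returns the
    projected (x, y) tuple, or None if registration fails.
    """
    # Detect and describe keypoints in both views.
    orb = cv2.ORB_create(nfeatures=2000)
    kp_scene, des_scene = orb.detectAndCompute(scene_frame, None)
    kp_ref, des_ref = orb.detectAndCompute(reference_frame, None)
    if des_scene is None or des_ref is None:
        return None

    # Match descriptors (Hamming distance for binary ORB descriptors).
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_scene, des_ref), key=lambda m: m.distance)
    if len(matches) < 4:
        return None  # a homography needs at least 4 correspondences

    src = np.float32([kp_scene[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_ref[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

    # Robustly estimate the scene-to-reference homography.
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    if H is None:
        return None

    # Transform the gaze point into reference-view coordinates.
    point = np.float32([[gaze_xy]])            # shape (1, 1, 2)
    projected = cv2.perspectiveTransform(point, H)
    return tuple(projected[0, 0])
```

In a multi-viewer setting, a mapping like this could be applied per device and per frame so that every participant's gaze lands in the same reference coordinates, which is what makes shared gaze metrics across viewers comparable.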