Gaze-based interaction enables intuitive, hands-free control in immersive environments but remains susceptible to unintended inputs. We present a real-time error prevention system (EPS) that uses a temporal convolutional network autoencoder (TCNAE) to detect anomalies in gaze dynamics during selection tasks. In a VR visual search task, 41 participants used three gaze-based selection methods (dwell time, gaze and head direction alignment, and nod) with and without EPS. The system reduced erroneous selections by up to 95% for the dwell-time and gaze-and-head methods and was positively received by most users. Performance varied for nodding and across individuals, suggesting the need for adaptive systems. Objective metrics and subjective evaluations show that anomaly-based error prevention can improve gaze interfaces without disrupting interaction, with potential applications in VR, AR, and assistive technologies.
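The core idea behind EPS can be sketched as anomaly detection via autoencoder reconstruction error: a model trained only on normal gaze dynamics reconstructs normal windows well, so windows with high reconstruction error are flagged and the selection is suppressed. The paper's model is a TCNAE; in the minimal sketch below, a linear (PCA-based) autoencoder stands in so the example runs with NumPy alone. The window length, component count, signal model, and percentile threshold are all illustrative assumptions, not the paper's parameters.

```python
import numpy as np

rng = np.random.default_rng(42)
WIN = 30  # samples per gaze-dynamics window (hypothetical)

def make_windows(n):
    """Smooth gaze-velocity-like signals: low-frequency sinusoids plus mild noise."""
    t = np.linspace(0, 1, WIN)
    freq = rng.uniform(0.5, 2.0, size=(n, 1))
    phase = rng.uniform(0, 2 * np.pi, size=(n, 1))
    return np.sin(2 * np.pi * freq * t + phase) + 0.05 * rng.standard_normal((n, WIN))

# "Train": fit a linear autoencoder (top principal components) on normal windows.
# The paper trains a TCNAE instead; the reconstruction-error logic is the same.
train = make_windows(500)
mean = train.mean(axis=0)
_, _, vt = np.linalg.svd(train - mean, full_matrices=False)
basis = vt[:5]  # keep 5 components (assumed model capacity)

def recon_error(w):
    """Distance between a window and its projection onto the learned subspace."""
    centered = w - mean
    recon = centered @ basis.T @ basis
    return np.linalg.norm(centered - recon, axis=-1)

# Calibrate a threshold on normal data (percentile choice is an assumption).
threshold = np.percentile(recon_error(train), 99)

# A window with an abrupt, unintended-input-like jump reconstructs poorly.
normal = make_windows(1)[0]
anomalous = normal.copy()
anomalous[WIN // 2:] += 1.5  # sudden offset, e.g. an involuntary head jerk

suppress_selection = recon_error(anomalous) > threshold  # EPS would block this input
```

In the full system, this decision would run in real time on each candidate selection, letting normal dwell, gaze-and-head, or nod inputs pass while filtering anomalous ones.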