Integrating prior knowledge of neurophysiology into neural network architectures enhances the performance of emotion decoding. While numerous techniques emphasize learning spatial and short-term temporal patterns, limited attention has been paid to capturing the vital long-term contextual information associated with emotional cognitive processes. To address this gap, we introduce a novel transformer model called emotion transformer (EmT). EmT is designed to excel in both generalized cross-subject EEG emotion classification and regression tasks. In EmT, EEG signals are transformed into a temporal graph format, creating a sequence of EEG feature graphs via a temporal graph construction module (TGC). A novel residual multi-view pyramid GCN module (RMPG) is then proposed to learn a dynamic graph representation for each EEG feature graph in the sequence, and the learned representations of each graph are fused into one token. Furthermore, we design a temporal contextual transformer module (TCT) with two types of token mixers to learn temporal contextual information. Finally, a task-specific output module (TSO) generates the desired outputs. Experiments on four publicly available datasets show that EmT outperforms the baseline methods on both EEG emotion classification and regression tasks. The code is available at https://github.com/yi-ding-cs/EmT.
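The pipeline described above (TGC → RMPG → TCT → TSO) can be sketched end to end in a few lines. The sketch below is a minimal numpy illustration under assumed shapes and single-layer stand-ins: the real EmT uses learnable adjacencies, multi-view pyramid GCNs, and two dedicated token mixers, none of which are reproduced here; all tensor sizes and weight initializations are illustrative, not taken from the paper.

```python
import numpy as np

# Hypothetical shapes: illustrative only, not the paper's configuration.
rng = np.random.default_rng(0)
T, C, F = 6, 32, 5   # temporal steps, EEG channels (graph nodes), features per node
d = 16               # token dimension

# TGC (sketch): a sequence of T feature graphs, one node per EEG channel.
graphs = rng.standard_normal((T, C, F))
A = np.abs(rng.standard_normal((C, C)))      # stand-in for a learnable adjacency
A_hat = A / A.sum(axis=1, keepdims=True)     # row-normalized propagation matrix

# RMPG (sketch): one shared GCN layer per graph, mean-pooled into one token each;
# the actual module learns multi-view pyramid GCN representations and fuses them.
W = rng.standard_normal((F, d)) * 0.1
tokens = np.maximum(A_hat @ graphs @ W, 0).mean(axis=1)   # (T, d): one token per graph

# TCT (sketch): a single self-attention token mixer over the T tokens,
# standing in for the module's two token-mixer types.
def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

Wq, Wk, Wv = (rng.standard_normal((d, d)) * 0.1 for _ in range(3))
Q, K, V = tokens @ Wq, tokens @ Wk, tokens @ Wv
mixed = softmax(Q @ K.T / np.sqrt(d)) @ V    # (T, d): temporally contextualized tokens

# TSO (sketch): pool over time, then a linear head (here, 2-class logits;
# a regression head would output a single value instead).
Wo = rng.standard_normal((d, 2)) * 0.1
logits = mixed.mean(axis=0) @ Wo             # shape (2,)
```

Note that `A_hat @ graphs` relies on numpy's batched matmul, applying the same channel-level propagation to every graph in the temporal sequence.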