In recent years, brain-computer interfaces have advanced in decoding various motor-related tasks from electroencephalogram (EEG) data, including gesture recognition and movement classification. These developments lay the groundwork for interpreting neural signals to recognize specific physical actions. This study centers on a written-alphabet classification task, in which we aim to decode EEG signals associated with handwriting. To this end, we incorporate hand kinematics to guide the extraction of consistent embeddings from high-dimensional neural recordings using auxiliary variables (CEBRA). These CEBRA embeddings, along with the raw EEG, are processed by a parallel convolutional neural network that extracts features from both data sources simultaneously. The model classifies nine different handwritten characters, including punctuation symbols such as the exclamation mark and the comma. We evaluate the model quantitatively with five-fold cross-validation and explore the structure of the embedding space through visualizations. Our approach achieves a classification accuracy of 91% on the nine-class task, demonstrating the feasibility of fine-grained handwriting decoding from EEG.
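The two-branch design described above can be sketched minimally in NumPy: one convolutional branch over the EEG channels, a parallel branch over the CEBRA embedding dimensions, with the pooled features concatenated and mapped to nine class scores. All dimensions here (channel count, embedding size, filter widths) and the random, untrained weights are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d_relu(x, kernels):
    # x: (channels, time); kernels: (n_filters, channels, k)
    # valid 1-D convolution across time, followed by ReLU
    n_f, ch, k = kernels.shape
    T = x.shape[1] - k + 1
    out = np.empty((n_f, T))
    for f in range(n_f):
        for t in range(T):
            out[f, t] = np.sum(kernels[f] * x[:, t:t + k])
    return np.maximum(out, 0.0)

def branch(x, kernels):
    # conv -> ReLU -> global average pooling over time
    return conv1d_relu(x, kernels).mean(axis=1)

# hypothetical inputs: 32 EEG channels and an 8-D CEBRA
# embedding, both sampled over a 250-step window
eeg = rng.standard_normal((32, 250))
emb = rng.standard_normal((8, 250))

# hypothetical untrained weights: 16 filters of width 7 per branch
w_eeg = rng.standard_normal((16, 32, 7)) * 0.1
w_emb = rng.standard_normal((16, 8, 7)) * 0.1

# parallel feature extraction, then concatenation
feat = np.concatenate([branch(eeg, w_eeg), branch(emb, w_emb)])  # (32,)

# linear classifier head over the nine character classes, softmax output
W_out = rng.standard_normal((9, 32)) * 0.1
logits = W_out @ feat
probs = np.exp(logits - logits.max())
probs /= probs.sum()  # probs: scores over the nine classes
```

In a trained version of this idea, the two branches would let EEG-specific and embedding-specific filters be learned independently before fusion, which is the motivation for processing the sources in parallel rather than stacking them into one input.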