Aspect-based sentiment analysis (ABSA) has attracted growing research interest in multilingual contexts in recent years. However, most existing studies lack robust feature alignment and fine-grained aspect-level alignment. In this paper, we propose MSMO, a novel framework of Multi-Scale and Multi-Objective optimization for cross-lingual ABSA. In multi-scale alignment, we achieve cross-lingual alignment at both the sentence level and the aspect level, aligning the features of aspect terms across different contextual environments. Specifically, we feed code-switched bilingual sentences into the language discriminator and the consistency training module to enhance the model's robustness. In multi-objective optimization, we design two optimization objectives, supervised training and consistency training, to strengthen cross-lingual semantic alignment. To further improve performance, we distill knowledge of the target language into the model. Results show that MSMO significantly improves cross-lingual ABSA, achieving state-of-the-art performance across multiple languages and models.
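The two optimization objectives described above, supervised training on labeled source-language data and consistency training between a sentence and its code-switched counterpart, can be sketched as a weighted loss combination. This is a minimal illustrative sketch, not the paper's implementation: the KL-based consistency term, the function names, and the balancing weight `lam` are all assumptions.

```python
import numpy as np

def softmax(logits):
    """Row-wise softmax over class logits."""
    e = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def supervised_loss(logits, labels):
    """Cross-entropy on labeled source-language examples."""
    probs = softmax(logits)
    return -np.mean(np.log(probs[np.arange(len(labels)), labels]))

def consistency_loss(logits_orig, logits_cs):
    """KL divergence between predictions on the original sentence
    and on its code-switched counterpart (one direction only;
    a symmetric variant is equally plausible)."""
    p = softmax(logits_orig)
    q = softmax(logits_cs)
    return np.mean(np.sum(p * (np.log(p) - np.log(q)), axis=-1))

def msmo_objective(logits, labels, logits_cs, lam=0.5):
    """Combine the two objectives; lam is a hypothetical
    balancing hyperparameter, not taken from the paper."""
    return supervised_loss(logits, labels) + lam * consistency_loss(logits, logits_cs)
```

When the model's predictions on the original and code-switched inputs agree, the consistency term vanishes and only the supervised term drives training, which matches the intuition of aligning aspect-term features across languages.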