Automated Speaking Assessment (ASA) plays a crucial role in evaluating second-language (L2) learners' proficiency. However, ASA models often suffer from class imbalance, leading to biased predictions. To address this, we introduce a novel training objective for ASA models, dubbed the Balancing Logit Variation (BLV) loss, which perturbs model predictions to improve feature representations for minority classes without modifying the dataset. Evaluations on the ICNALE benchmark dataset show that integrating the BLV loss into a widely used text-based model (BERT) significantly enhances classification accuracy and fairness, making automated speech evaluation more robust for diverse learners.
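To make the core idea concrete, the following is a minimal PyTorch sketch of a BLV-style loss, not the paper's exact formulation: each class logit is perturbed with zero-mean Gaussian noise whose scale grows as the class's training frequency shrinks, so minority classes see larger logit variation during training while the dataset itself is left untouched. The function name `blv_loss`, the `sigma` hyperparameter, and the inverse-square-root frequency scaling are all illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def blv_loss(logits: torch.Tensor,
             targets: torch.Tensor,
             class_counts: torch.Tensor,
             sigma: float = 1.0,
             training: bool = True) -> torch.Tensor:
    """Illustrative BLV-style loss (assumed formulation, not the paper's exact one).

    logits:       (batch, num_classes) raw model outputs
    targets:      (batch,) integer class labels
    class_counts: (num_classes,) training-set frequency of each class
    """
    if training:
        # Rarer classes get a larger noise scale (inverse-sqrt of frequency
        # is one plausible choice; the paper may use a different schedule).
        freqs = class_counts.float() / class_counts.sum()
        scale = sigma / freqs.sqrt()                 # shape (num_classes,)
        # Zero-mean Gaussian perturbation of the logits, broadcast over batch.
        logits = logits + torch.randn_like(logits) * scale
    # Standard cross-entropy on the (perturbed) logits.
    return F.cross_entropy(logits, targets)

# Usage sketch on synthetic, imbalanced data:
logits = torch.randn(8, 4)
targets = torch.randint(0, 4, (8,))
counts = torch.tensor([500, 120, 30, 5])             # long-tailed class counts
loss = blv_loss(logits, targets, counts)
```

At evaluation time the perturbation is disabled (`training=False`), so the loss reduces to ordinary cross-entropy and predictions stay deterministic.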