Transformer-based models dominate NLP tasks like sentiment analysis, machine translation, and claim verification. However, their massive computational demands and lack of interpretability pose challenges for real-world applications requiring efficiency and transparency. In this work, we explore Graph Neural Networks (GNNs) and Hyperbolic Graph Neural Networks (HGNNs) as lightweight yet effective alternatives for Environmental Claim Detection, reframing it as a graph classification problem. We construct dependency parsing graphs to explicitly model syntactic structures, using simple word embeddings (word2vec) for node features with dependency relations encoded as edge features. Our results demonstrate that these graph-based models achieve comparable or superior performance to state-of-the-art transformers while using 30x fewer parameters. This efficiency highlights the potential of structured, interpretable, and computationally efficient graph-based approaches.
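The graph construction described above can be sketched roughly as follows. This is a minimal illustration, not the paper's exact pipeline: the sentence, the hardcoded parse, and the one-hot edge encoding are all assumptions. In practice a dependency parser (e.g. spaCy) would supply tokens, heads, and relations, and pretrained word2vec vectors would serve as node features.

```python
# Hedged sketch: turn a (hypothetical, pre-computed) dependency parse into a
# graph with one node per token and one labeled edge per dependency arc.

tokens = ["The", "company", "reduced", "emissions"]  # toy sentence
# (dependent_index, head_index, relation) triples for the toy parse
deps = [(0, 1, "det"), (1, 2, "nsubj"), (3, 2, "obj")]

# Nodes: one per token. Node features would be word2vec vectors in practice;
# we omit them here since they require pretrained embeddings.
num_nodes = len(tokens)

# Edges: dependent -> head, following the dependency arcs.
edge_index = [(dep, head) for dep, head, _ in deps]

# Edge features: one-hot encode the dependency relation (an illustrative
# choice of encoding, assumed rather than taken from the paper).
rel_vocab = sorted({rel for _, _, rel in deps})
edge_features = [
    [1.0 if rel == r else 0.0 for r in rel_vocab]
    for _, _, rel in deps
]
```

Such a per-sentence graph would then be fed to a GNN (or HGNN) graph classifier, with a pooled graph representation predicting whether the sentence is an environmental claim.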