Explainable Artificial Intelligence and Formal Argumentation have received significant attention in recent years. Argumentation-based systems often lack explainability while supporting decision-making processes. Counterfactual and semifactual explanations are interpretability techniques that provide insights into the outcome of a model by generating alternative hypothetical instances. While there has been important work on counterfactual and semifactual explanations for Machine Learning models, less attention has been devoted to these kinds of problems in argumentation. In this paper, we explore counterfactual and semifactual reasoning in abstract Argumentation Frameworks. We investigate the computational complexity of counterfactual- and semifactual-based reasoning problems, showing that they are generally harder than classical argumentation problems such as credulous and skeptical acceptance. Finally, we show that counterfactual and semifactual queries can be encoded in Argumentation Frameworks with weak constraints, and we provide a computational strategy based on ASP solvers.
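To make the notions mentioned in the abstract concrete, the sketch below is a minimal, hypothetical illustration (not the paper's ASP encoding): it brute-forces the stable extensions of a toy abstract Argumentation Framework, checks credulous and skeptical acceptance of an argument, and then poses a counterfactual-style query, namely which single attack removal would make that argument skeptically accepted. The framework, argument names, and the "single attack removal" change model are assumptions made for illustration only.

```python
from itertools import combinations

def stable_extensions(args, attacks):
    """Enumerate stable extensions: conflict-free sets that attack
    every argument outside the set."""
    exts = []
    for r in range(len(args) + 1):
        for S in combinations(sorted(args), r):
            s = set(S)
            conflict_free = not any((x, y) in attacks for x in s for y in s)
            attacks_rest = all(any((x, y) in attacks for x in s)
                               for y in args - s)
            if conflict_free and attacks_rest:
                exts.append(s)
    return exts

# Toy AF (an assumption for illustration): a and b attack each other,
# and b attacks c.
args = {"a", "b", "c"}
attacks = {("a", "b"), ("b", "a"), ("b", "c")}

exts = stable_extensions(args, attacks)  # [{'b'}, {'a', 'c'}]

# Classical acceptance problems:
# credulous = accepted in SOME extension, skeptical = accepted in ALL.
credulous_c = any("c" in e for e in exts)   # True  (via {'a', 'c'})
skeptical_c = all("c" in e for e in exts)   # False (fails in {'b'})

# Counterfactual-style query (illustrative change model): which single
# attack, when removed, makes "c" skeptically accepted?
flips = [att for att in attacks
         if all("c" in e for e in stable_extensions(args, attacks - {att}))]
```

On this toy instance, removing either (b, a) or (b, c) suffices to make c skeptically accepted, hinting at why such queries quantify over modified frameworks and are computationally harder than plain acceptance. The paper's actual approach, encoding such queries via weak constraints and delegating search to an ASP solver, replaces this exponential enumeration.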