In a world that constantly changes, it is crucial to understand how those changes impact different systems, such as industrial manufacturing or critical infrastructure. Explaining critical changes, referred to as concept drift in machine learning, is the first step toward enabling targeted interventions that avoid or correct model failures, as well as malfunctions and errors in the physical world. In this work, we therefore extend model-based drift explanations to causal explanations, which increases the actionability of the provided explanations. We evaluate our explanation strategy on a number of use cases, demonstrating the practical usefulness of our framework, which isolates the causally relevant features impacted by concept drift and thus allows for targeted intervention.