Temporal causal discovery is a crucial task aimed at uncovering the causal relations within time series data. Recent temporal causal discovery methods typically train deep learning models on prediction tasks and infer causality between time series by analyzing the parameters of certain model components, e.g., attention weights and convolution weights. However, this mapping from model parameters to causality is incomplete, as it fails to investigate the other components, e.g., fully connected layers and activation functions, that are also significant for causal discovery. To facilitate the utilization of whole deep learning models in temporal causal discovery, we propose an interpretable transformer-based causal discovery model termed CausalFormer, which consists of a causality-aware transformer and a decomposition-based causality detector. The causality-aware transformer learns the causal representation of time series data through a prediction task with the designed multi-kernel causal convolution, which aggregates each input time series along the temporal dimension under the temporal priority constraint (a cause must precede its effect). The decomposition-based causality detector then interprets the global structure of the trained causality-aware transformer with the proposed regression relevance propagation to identify potential causal relations and finally construct the causal graph. Experiments on synthetic, simulated, and real datasets demonstrate the state-of-the-art performance of CausalFormer in discovering temporal causality. Our code is available at https://github.com/lingbai-kong/CausalFormer.
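To make the temporal priority constraint concrete, the sketch below shows a minimal causal 1-D convolution: each output step aggregates only current and past inputs (via left zero-padding), so no future value can leak into the representation of an earlier time step. This is an illustrative assumption about the constraint's mechanics, not CausalFormer's actual multi-kernel implementation; the function name `causal_conv1d` and the single-kernel form are hypothetical simplifications.

```python
import numpy as np

def causal_conv1d(x: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Convolve a 1-D series so that output[t] depends only on
    x[t], x[t-1], ..., x[t-K+1], enforcing temporal priority
    (causes precede effects; no future leakage)."""
    K = len(kernel)
    # Left-pad with zeros so the receptive field never reaches the future.
    padded = np.concatenate([np.zeros(K - 1), x])
    # output[t] = sum_k kernel[k] * x[t - k]
    return np.array([padded[t:t + K] @ kernel[::-1] for t in range(len(x))])

# With kernel [0, 1], each output is the previous input (a one-step lag),
# and the first step falls back to the zero padding.
out = causal_conv1d(np.array([1.0, 2.0, 3.0]), np.array([0.0, 1.0]))
```

A non-causal (centered) convolution would instead mix `x[t+1]` into `output[t]`, which would let the model "explain" a series using its own future and invalidate any causal reading of the learned weights.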