With the advancement of neural networks, diverse methods for neural Granger causality have emerged, demonstrating proficiency in handling complex data and nonlinear relationships. However, the existing neural Granger causality framework has several limitations. It requires a separate predictive model for each target variable, and causal discovery depends on the sparsity of the first-layer weights, which makes it difficult to model complex relationships between variables and yields unsatisfactory estimation accuracy of Granger causality. Moreover, most existing methods cannot capture full-time Granger causality. To address these drawbacks, we propose Jacobian Regularizer-based Neural Granger Causality (JRNGC), a straightforward yet highly effective approach that learns multivariate summary Granger causality and full-time Granger causality by constructing a single model for all target variables. Specifically, our method removes the sparsity constraints on weights by leveraging an input-output Jacobian matrix regularizer, which can subsequently be represented as a weighted causal matrix in post-hoc analysis. Extensive experiments show that our approach achieves competitive performance with state-of-the-art methods for learning summary Granger causality and full-time Granger causality while maintaining lower model complexity and high scalability.
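To make the Jacobian-regularizer idea concrete, the following is a minimal NumPy sketch, not the paper's implementation: a one-hidden-layer MLP stands in for the single shared predictor, the analytic input-output Jacobian supplies an L1 penalty added to the prediction loss, and the sample-averaged absolute Jacobian plays the role of the post-hoc weighted causal matrix. All names, sizes, and the network architecture here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy single model mapping all (lagged) input variables to all targets;
# a one-hidden-layer MLP stands in for the shared predictor (assumption).
d_in, d_hid, d_out = 4, 8, 4
W1 = rng.normal(size=(d_hid, d_in))
W2 = rng.normal(size=(d_out, d_hid))

def jacobian(x):
    """Analytic input-output Jacobian of f(x) = W2 @ tanh(W1 @ x)."""
    h = np.tanh(W1 @ x)                          # hidden activations
    return W2 @ (np.diag(1.0 - h**2) @ W1)       # shape (d_out, d_in)

# Batch of inputs; the regularizer is the mean L1 norm of the Jacobian,
# which would be added to the prediction loss during training.
X = rng.normal(size=(32, d_in))
jacs = np.stack([jacobian(x) for x in X])        # (batch, d_out, d_in)
l1_penalty = np.abs(jacs).mean()

# Post-hoc analysis: aggregate |J| over samples into a weighted causal
# matrix; entry (i, j) scores the influence of input j on output i.
causal_matrix = np.abs(jacs).mean(axis=0)
```

In a real model the Jacobian would be obtained by automatic differentiation rather than this closed form, but the structure is the same: one network for all targets, with sparsity encouraged on the Jacobian rather than on the first-layer weights.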