Kolmogorov-Arnold networks (KANs) have recently attracted attention as an alternative to multilayer perceptrons (MLPs) for scientific machine learning. However, KANs can be expensive to train, even for relatively small networks. Inspired by finite basis physics-informed neural networks (FBPINNs), in this work we develop a domain decomposition method for KANs that allows several small KANs to be trained in parallel to give accurate solutions to multiscale problems. We show that finite basis KANs (FBKANs) provide accurate results when training on noisy data and in physics-informed settings.
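The core idea, shared with FBPINNs, is to blend local models with a partition of unity: smooth, overlapping window functions that sum to one weight each subdomain's network, so the global solution is a weighted sum of local predictions. The following is a minimal NumPy sketch of that weighting scheme under illustrative assumptions (cosine-bump windows, trivial stand-in submodels); it is not the paper's implementation, and `window`, `pou_weights`, and the submodels are hypothetical names.

```python
import numpy as np

def window(x, center, width):
    """Smooth cosine bump supported on [center - width, center + width].
    (Illustrative choice; any smooth, compactly supported window works.)"""
    t = np.clip((x - center) / width, -1.0, 1.0)
    return np.cos(0.5 * np.pi * t) ** 2

def pou_weights(x, centers, width):
    """Normalize overlapping windows so they sum to 1 at every point,
    forming a partition of unity over the covered domain."""
    w = np.stack([window(x, c, width) for c in centers])
    return w / w.sum(axis=0, keepdims=True)

# Stand-ins for small per-subdomain networks (e.g. small KANs, each
# trained only on its own subdomain):
submodels = [lambda x: np.sin(2 * np.pi * x) for _ in range(3)]

x = np.linspace(0.0, 1.0, 101)
centers = [0.0, 0.5, 1.0]          # subdomain centers on [0, 1]
weights = pou_weights(x, centers, width=0.5)

# Global prediction: partition-of-unity-weighted sum of local models
u = sum(w_j * f_j(x) for w_j, f_j in zip(weights, submodels))
```

Because the normalized windows sum to one everywhere, identical local models reproduce the global function exactly; in training, each small network only has to fit the solution on its own overlapping subdomain, which is what enables the parallel training of several small KANs.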