In this paper, we propose to use Sinc interpolation in the context of Kolmogorov-Arnold Networks, neural networks with learnable activation functions, which have recently gained attention as alternatives to multilayer perceptrons. Many different function representations have already been tried, but we show that Sinc interpolation offers a viable alternative, since it is known in numerical analysis to represent well both smooth functions and functions with singularities. This is important not only for function approximation but also for solving partial differential equations with physics-informed neural networks. Through a series of experiments, we show that SincKANs provide better results in almost all of the examples we have considered.
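To make the claim about Sinc interpolation concrete, the following is a minimal sketch (not the paper's SincKAN implementation) of the classical Whittaker cardinal series on a uniform grid, approximating a smooth test function; the grid spacing `h`, the truncation range, and the Gaussian test function are illustrative choices, not taken from the paper:

```python
import numpy as np

def sinc_interpolate(x_eval, nodes, values, h):
    # Truncated Whittaker cardinal series:
    #   f(x) ≈ sum_k f(k*h) * sinc((x - k*h) / h),
    # where np.sinc(t) = sin(pi*t) / (pi*t).
    return np.array([np.sum(values * np.sinc((x - nodes) / h))
                     for x in np.atleast_1d(x_eval)])

# Illustrative setup: uniform nodes k*h on [-5, 5], Gaussian test function.
h = 0.25
nodes = np.arange(-20, 21) * h
values = np.exp(-nodes**2)

x_eval = np.linspace(-2.0, 2.0, 101)
approx = sinc_interpolate(x_eval, nodes, values, h)
exact = np.exp(-x_eval**2)
err = np.max(np.abs(approx - exact))
print(err)  # near-spectral accuracy for this rapidly decaying function
```

Because the Gaussian decays rapidly in both space and frequency, the truncated series converges extremely fast here; for functions with endpoint singularities, Sinc methods are typically combined with a variable transformation, which is the regime the abstract alludes to.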