The efficacy of interpolating via Variably Scaled Kernels (VSKs) is known to depend on the definition of a proper scaling function, yet no numerical recipe for constructing one is available. Previous works suggest that such a function should mimic the target function, but without theoretical support. This paper fills both gaps: it proves that a scaling function reflecting the target function can lead to improved approximation accuracy, and it provides a user-independent tool for learning the scaling function by means of Discontinuous Neural Networks ($\delta$NN), i.e., neural networks able to handle possible discontinuities. Numerical evidence supports our claims: the key features of the target function are clearly recovered in the learned scaling function.
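For context, a minimal sketch of the standard VSK construction that these claims refer to (the symbols $\psi$ and $K_\psi$ are illustrative notation, not taken from the text above): given a positive definite kernel $K$ on $\mathbb{R}^{d+1}$ and a scaling function $\psi \colon \Omega \subseteq \mathbb{R}^d \to \mathbb{R}$, the variably scaled kernel on $\Omega$ is defined by augmenting each point with its scale value,
\[
K_\psi(\boldsymbol{x}, \boldsymbol{y}) \,:=\, K\bigl((\boldsymbol{x}, \psi(\boldsymbol{x})),\, (\boldsymbol{y}, \psi(\boldsymbol{y}))\bigr), \qquad \boldsymbol{x}, \boldsymbol{y} \in \Omega,
\]
so that choosing $\psi$ to reflect the target function reshapes the geometry in which the interpolant measures distances, which is the mechanism behind the accuracy gains discussed above.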