Negative distance kernels $K(x,y) := - \|x-y\|$ are used in the definition of maximum mean discrepancies (MMDs) in statistics and lead to favorable numerical results in various applications. In particular, so-called slicing techniques for handling high-dimensional kernel summations profit from the simple, parameter-free structure of the distance kernel. However, due to its non-smoothness at $x=y$, most classical theoretical results, e.g. on Wasserstein gradient flows of the corresponding MMD functional, no longer hold. In this paper, we propose a new kernel that retains the favorable properties of the negative distance kernel, namely being conditionally positive definite of order one with a nearly linear increase towards infinity and a simple slicing structure, while additionally being Lipschitz differentiable. Our construction is based on a simple 1D smoothing of the absolute value function followed by a Riemann-Liouville fractional integral transform. Numerical results demonstrate that the new kernel performs similarly well to the negative distance kernel in gradient descent methods, but now with theoretical guarantees.
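For background, the Riemann-Liouville fractional integral mentioned above has the standard definition below; the smoothing of $|x|$ shown alongside it is an illustrative choice only (the specific smoothing used in the construction is given in the paper body, not here).

```latex
% Riemann-Liouville fractional integral of order \alpha > 0 (standard definition):
\[
  (I^{\alpha} f)(x) \;=\; \frac{1}{\Gamma(\alpha)} \int_0^x (x-t)^{\alpha-1} f(t)\, \mathrm{d}t .
\]
% One common Lipschitz-differentiable smoothing of the absolute value
% (illustrative; not necessarily the one used in the paper):
\[
  |x| \;\approx\; \sqrt{x^2 + \varepsilon^2}, \qquad \varepsilon > 0 .
\]
```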