Laplacian-based methods are popular for dimensionality reduction of data lying in $\mathbb{R}^N$. Several theoretical results for these algorithms depend on the fact that the Euclidean distance approximates the geodesic distance on the underlying submanifold on which the data are assumed to lie. However, for some applications, other metrics, such as the Wasserstein distance, may provide a more appropriate notion of distance than the Euclidean distance. We provide a framework that generalizes the problem of manifold learning to metric spaces and study when a metric satisfies sufficient conditions for the pointwise convergence of the graph Laplacian.
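As a minimal illustration of the kind of construction the abstract refers to, the following sketch builds the standard unnormalized graph Laplacian from a Gaussian kernel on pairwise Euclidean distances (as in Laplacian eigenmaps). The bandwidth `eps`, the circle-shaped toy data, and the function name `graph_laplacian` are illustrative choices, not taken from the paper; the paper's point is precisely that the Euclidean distance inside the kernel could be replaced by another metric, such as the Wasserstein distance.

```python
import numpy as np

def graph_laplacian(X, eps):
    # Pairwise squared Euclidean distances between sample points.
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    W = np.exp(-sq / eps)      # Gaussian kernel weights
    np.fill_diagonal(W, 0.0)   # no self-loops
    D = np.diag(W.sum(axis=1)) # degree matrix
    return D - W               # unnormalized graph Laplacian

# Toy data on the unit circle, a 1-D submanifold of R^2, where
# Euclidean distances locally approximate geodesic (arc-length) distances.
rng = np.random.default_rng(0)
theta = np.sort(rng.uniform(0.0, 2.0 * np.pi, 200))
X = np.column_stack([np.cos(theta), np.sin(theta)])

L = graph_laplacian(X, eps=0.1)
# Eigenvectors of the smallest nonzero eigenvalues give a low-dimensional
# embedding of the data.
vals, vecs = np.linalg.eigh(L)
embedding = vecs[:, 1:3]
```

Swapping in a different metric only changes how the matrix `sq` is computed; the spectral step is unchanged, which is what makes the generalization to metric spaces natural.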