Classifying points in high-dimensional spaces is a fundamental geometric problem in machine learning. In this paper, we address classifying points in the $d$-dimensional Hilbert polygonal metric. The Hilbert metric is a generalization of the Cayley-Klein hyperbolic distance to arbitrary convex bodies and has a diverse range of applications in machine learning and convex geometry. We first present an efficient LP-based algorithm in this metric for the large-margin SVM problem. Our algorithm runs in time polynomial in the number of points, the number of bounding facets, and the dimension. This is a significant improvement over previous work, which either provides no theoretical guarantees on running time or suffers from exponential runtime. We also consider the closely related Funk metric, and we present efficient algorithms for the soft-margin SVM problem and for nearest-neighbor classification in the Hilbert metric.