This paper develops a novel mathematical framework for collaborative learning based on geometrically inspired kernel machines, including bounds on generalisation and approximation errors and on sample complexity. For classification problems, this approach allows us to learn bounded geometric structures around given data points, and hence to solve the global model learning problem efficiently by exploiting convexity properties of the related optimisation problem in a Reproducing Kernel Hilbert Space (RKHS). In this way, classification reduces to determining the bounded geometric structure closest to a given data point. Further advantages of our approach are that clients need not perform multiple epochs of local optimisation using stochastic gradient descent, and that no rounds of client/server communication are required to optimise the global model. Extensive experiments show that the proposed method is a competitive alternative to the state of the art.
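To illustrate the idea of classifying a point by its nearest geometric structure in an RKHS, the following is a minimal sketch (not the paper's actual method) of a nearest-class-mean classifier computed entirely through kernel evaluations. The RBF kernel, the `gamma` parameter, and the use of class means as the "geometric structures" are all illustrative assumptions; the squared RKHS distance expands via the kernel trick as `||phi(x) - mu_c||^2 = k(x,x) - (2/n_c) * sum_i k(x, x_i) + (1/n_c^2) * sum_{i,j} k(x_i, x_j)`.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Pairwise RBF kernel matrix: k(a, b) = exp(-gamma * ||a - b||^2).
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def nearest_class_mean_rkhs(X_train, y_train, X_test, gamma=1.0):
    """Assign each test point to the class whose mean embedding in the
    RKHS is closest, using only kernel evaluations (kernel trick).
    Illustrative sketch, not the method proposed in the paper."""
    classes = np.unique(y_train)
    dists = []
    for c in classes:
        Xc = X_train[y_train == c]
        n = len(Xc)
        cross = rbf_kernel(X_test, Xc, gamma).sum(axis=1)   # sum_i k(x, x_i)
        within = rbf_kernel(Xc, Xc, gamma).sum()            # sum_{i,j} k(x_i, x_j)
        self_k = np.ones(len(X_test))                       # k(x, x) = 1 for RBF
        dists.append(self_k - 2.0 * cross / n + within / n ** 2)
    # Closest class mean in the RKHS wins.
    return classes[np.argmin(np.stack(dists, axis=1), axis=1)]
```

For two well-separated clusters, each test point is assigned to the cluster whose kernel mean it is nearest to; no iterative optimisation or gradient steps are involved, which echoes the communication-free flavour of the abstract.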