We study classification problems with binary estimators whose decision boundary is described by horizon functions, under the assumption that the data distribution satisfies a geometric margin condition. A key novelty of our work is the derivation of lower bounds on the worst-case learning rates over broad classes of functions under such a geometric margin condition -- a setting that is almost universally satisfied in practice but remains theoretically challenging. Moreover, we work in the noiseless setting, where lower bounds are particularly hard to establish. Our general results cover, in particular, classification problems whose decision boundaries belong to several classes of functions: for Barron-regular functions, Hölder-continuous functions, and convex Lipschitz-continuous functions with strong margins, we identify optimal rates close to the fast learning rate of $\mathcal{O}(n^{-1})$ for $n \in \mathbb{N}$ samples.