We study boosting for adversarial online nonparametric regression with general convex losses. We first introduce a parameter-free online gradient boosting (OGB) algorithm and show that its application to chaining trees achieves minimax optimal regret when competing against Lipschitz functions. Although competing with nonparametric function classes can be challenging, these classes often exhibit local structure, such as local Lipschitzness, that online algorithms can exploit to improve performance. By applying OGB over a core tree built from chaining trees, our proposed method effectively competes against all prunings that align with different Lipschitz profiles and achieves optimal dependence on these local regularities. As a result, we obtain the first computationally efficient algorithm with locally adaptive optimal rates for online regression in an adversarial setting.
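For concreteness, the minimax benchmark alluded to above can be stated as follows; the notation ($\ell$ for the convex loss, $\mathcal{F}$ for the comparator class, $\widehat{f}_t$ for the algorithm's prediction at round $t$) is standard but is introduced here for illustration and does not appear in the abstract itself:
\[
\operatorname{Reg}_T(\mathcal{F}) \;=\; \sum_{t=1}^{T} \ell\big(\widehat{f}_t(x_t),\, y_t\big) \;-\; \inf_{f \in \mathcal{F}} \sum_{t=1}^{T} \ell\big(f(x_t),\, y_t\big),
\]
where the pairs $(x_t, y_t)$ are chosen adversarially and $\mathcal{F}$ is, for instance, a class of Lipschitz functions. Local adaptivity then amounts to competing simultaneously against every pruning of the core tree, each pruning encoding a different profile of Lipschitz constants across the input space.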