We propose SGD-exp, a stochastic gradient descent approach for linear and ReLU regression under Massart noise (an adversarial semi-random corruption model) in the fully streaming setting. We show novel nearly linear convergence guarantees of SGD-exp to the true parameter for Massart corruption rates up to $50\%$, and for any corruption rate in the case of symmetric oblivious corruptions. This is the first convergence guarantee for robust ReLU regression in the streaming setting, and it improves on the convergence rates of previous robust methods for $L_1$ linear regression thanks to an exponentially decaying step size, a choice known for its efficiency in practice. Our analysis is based on a drift analysis of a discrete stochastic process, which may be of independent interest.
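To make the setup concrete, the following is a minimal sketch of the kind of method the abstract describes: streaming SGD on the $L_1$ loss for linear regression with an exponentially decaying step size, under a simple symmetric corruption of a fraction of the labels. This is an illustration only, not the authors' exact SGD-exp algorithm; the step-size constants `eta0` and `q`, the corruption rate `beta`, and the specific corruption used are all assumed for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 10, 5000
theta_star = rng.normal(size=d)  # true parameter to recover

# Illustrative hyperparameters (assumed, not from the paper):
beta = 0.3          # corruption rate, below the 50% Massart threshold
eta0, q = 0.5, 0.999  # exponentially decaying step size eta_t = eta0 * q**t

theta = np.zeros(d)
for t in range(n):
    # One streaming sample per step: Gaussian feature, linear label.
    x = rng.normal(size=d)
    y = x @ theta_star
    if rng.random() < beta:
        # One simple symmetric corruption (sign flip); the Massart model
        # allows more general adversarial corruptions at this rate.
        y = -y
    # SGD step on the L1 loss |<x, theta> - y|: the gradient is
    # sign(<x, theta> - y) * x, scaled by the decaying step size.
    eta = eta0 * q**t
    theta -= eta * np.sign(x @ theta - y) * x

err = np.linalg.norm(theta - theta_star)
```

Despite 30% of the labels being corrupted, the sign-based update drifts toward `theta_star` in expectation, and the shrinking step size damps the stochastic fluctuations over time, which is the intuition behind the drift analysis mentioned above.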