Trend Filtering is a nonparametric regression method that exhibits local adaptivity, in contrast to a host of classical linear smoothing methods. However, there seems to be no unanimously agreed-upon definition of local adaptivity in the literature. A question we seek to answer here is: how exactly is the Fused Lasso, or Total Variation Denoising, which is Trend Filtering of order $0$, locally adaptive? To answer this question, we first derive a new pointwise formula for the Fused Lasso estimator in terms of min-max/max-min optimization of penalized local averages. This pointwise representation appears to be new and gives a concrete explanation of the local adaptivity of the Fused Lasso. It shows that the estimation error of the Fused Lasso at any given point is bounded by the best (local) bias-variance tradeoff, where bias and variance carry slightly different meanings than usual. We then propose higher-order polynomial versions of the Fused Lasso, defined pointwise in terms of min-max/max-min optimization of penalized local polynomial regressions. These appear to be new nonparametric regression methods, different from any existing method in the nonparametric regression toolbox. We call these estimators Minmax Trend Filtering. They continue to enjoy local adaptivity in the sense that their estimation error at any given point is bounded by the best (local) bias-variance tradeoff.