This paper introduces two new robust methods for estimating parameters in a given parametric family. The first method is that of `minimum weighted L2', which effectively minimises an estimate of the integrated (and possibly weighted) squared error. The second is `robust Kullback-Leibler', which minimises a robust version of the empirical Kullback-Leibler distance, and can be viewed as a general robust modification of the maximum likelihood procedure. This second method is also related to recent local likelihood ideas for semiparametric density estimation. The methods are described, influence functions are derived, and formulae for asymptotic variances are given. In particular, large-sample efficiencies are computed under the home turf conditions of the underlying parametric model. The methods and formulae are illustrated for the normal model.
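To make the first method concrete, the sketch below fits a normal model by minimising an unbiased estimate (up to an additive constant) of the integrated squared error, namely ∫ f_θ² dx − (2/n) Σᵢ f_θ(xᵢ), with constant weight function. This is a minimal illustration under assumed details, not the paper's exact weighted procedure: the function names, the choice of starting values, and the Nelder-Mead optimiser are all illustrative choices, and the weight is taken to be identically one.

```python
import numpy as np
from scipy.optimize import minimize

def l2_criterion(params, x):
    """Estimate of the integrated squared error between the fitted
    N(mu, sigma^2) density f_theta and the data-generating density:
        int f_theta^2 dx - (2/n) sum_i f_theta(x_i),
    using int f_theta^2 dx = 1 / (2 sigma sqrt(pi)) for the normal density.
    The weight function is taken to be identically one here."""
    mu, log_sigma = params
    sigma = np.exp(log_sigma)  # parameterise on log scale to keep sigma > 0
    dens = np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    return 1.0 / (2.0 * sigma * np.sqrt(np.pi)) - 2.0 * dens.mean()

def minimum_l2_normal(x):
    """Minimise the L2 criterion over (mu, log sigma); robust starting values."""
    start = np.array([np.median(x), np.log(x.std())])
    res = minimize(l2_criterion, start, args=(x,), method="Nelder-Mead")
    return res.x[0], np.exp(res.x[1])

# Contaminated sample: 95% N(0, 1) observations plus 5% gross outliers at 10.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 190), np.full(10, 10.0)])
mu_hat, sigma_hat = minimum_l2_normal(x)
```

Unlike the sample mean and standard deviation, which the outliers would pull toward the contamination, the minimum L2 fit essentially ignores the 5% of points at 10 and recovers location and scale close to those of the clean component.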