Robust inference based on the minimization of statistical divergences has proved to be a useful alternative to classical techniques based on maximum likelihood and related methods. Basu et al. (1998) introduced the density power divergence (DPD) family as a measure of discrepancy between two probability density functions and used this family for robust estimation of model parameters from independent and identically distributed data. Ghosh et al. (2017) proposed a more general class of divergence measures, namely the S-divergence family, and demonstrated its usefulness in robust parametric estimation through several asymptotic properties and numerical illustrations. In this paper, we develop results on the asymptotic breakdown point of the minimum S-divergence estimators (in particular, the minimum DPD estimator) under general model setups. The primary result of this paper provides lower bounds on the asymptotic breakdown point of these estimators that are independent of the dimension of the data, in turn corroborating their usefulness for robust inference with high-dimensional data.
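To illustrate the kind of estimator the abstract refers to, the following is a minimal, hypothetical sketch of minimum DPD estimation (Basu et al., 1998) for the location parameter of a univariate N(mu, 1) model. The function names, the tuning choice alpha = 0.5, and the contamination setup are all assumptions made for this example, not part of the paper; the closed-form integral term used below is specific to the normal model.

```python
# Illustrative sketch (not from the paper): minimum density power
# divergence (DPD) estimation of mu in a N(mu, 1) model.
# Names, alpha, and the contamination scenario are hypothetical.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

def dpd_objective(mu, x, alpha=0.5, sigma=1.0):
    """Empirical DPD objective for the N(mu, sigma^2) model:

    H_n(mu) = int f_mu^{1+alpha} dx
              - (1 + 1/alpha) * mean(f_mu(X_i)^alpha),
    dropping a term free of mu. For the normal model the integral has
    the closed form (2*pi*sigma^2)^(-alpha/2) / sqrt(1 + alpha).
    """
    integral = (2 * np.pi * sigma**2) ** (-alpha / 2) / np.sqrt(1 + alpha)
    fitted = norm.pdf(x, loc=mu, scale=sigma) ** alpha
    return integral - (1 + 1 / alpha) * fitted.mean()

def min_dpd_location(x, alpha=0.5):
    """Minimize the empirical DPD objective over mu."""
    res = minimize_scalar(dpd_objective, bounds=(x.min(), x.max()),
                          args=(x, alpha), method="bounded")
    return res.x

# Clean N(0, 1) data contaminated by roughly 9% gross outliers at 10.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 100), np.full(10, 10.0)])

mu_mle = x.mean()             # sample mean (MLE): pulled toward the outliers
mu_dpd = min_dpd_location(x)  # minimum DPD estimate: stays near 0
```

Here alpha > 0 controls the robustness/efficiency trade-off: observations with small fitted density contribute little to the objective, which is why the DPD estimate resists the outliers that visibly bias the sample mean, consistent with the breakdown-point behavior studied in the paper.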