Classical results in asymptotic statistics show that the Fisher information matrix controls the difficulty of estimating a statistical model from observed data. In this work, we introduce a companion measure of the robustness of an estimation problem: the radius of statistical efficiency (RSE) is the size of the smallest perturbation to the problem data that renders the Fisher information matrix singular. We compute RSE up to numerical constants for a variety of testbed problems, including principal component analysis, generalized linear models, phase retrieval, bilinear sensing, and matrix completion. Interestingly, we observe a precise reciprocal relationship between RSE and the intrinsic complexity/sensitivity of the problem instance, paralleling the classical Eckart-Young theorem in numerical analysis. To establish our results, we develop a theory of spectral functions of measures that extends well-known results from matrix analysis and eigenvalue optimization, a contribution that may be of interest beyond our immediate findings.
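The verbal definition above can be written schematically as follows; the notation ($\theta$ for the problem data, $I_\theta$ for the Fisher information matrix, $\|\cdot\|$ for an unspecified norm on perturbations) is assumed for illustration and is not fixed by the abstract:

$$\mathrm{RSE}(\theta) \;=\; \inf\Bigl\{\, \|\Delta\| \;:\; I_{\theta+\Delta} \text{ is singular} \,\Bigr\}.$$

In this form, the parallel with the Eckart-Young theorem is that the distance from a matrix to the set of singular (rank-deficient) matrices equals its smallest singular value, so a reciprocal relationship between a "radius to degeneracy" and a sensitivity measure is the natural analogue.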