In the value-added literature, it is often claimed that regressing on empirical Bayes shrinkage estimates corrects for the measurement error problem in linear regression. We clarify the conditions needed; we argue that these conditions are stronger than those needed for the classical measurement error correction, which we advocate instead. Moreover, we show that the classical estimator cannot be improved without stronger assumptions. We extend these results to regressions on nonlinear transformations of the latent attribute and find generically slow minimax estimation rates.
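The setting can be illustrated with a minimal simulation sketch. All details here are assumptions for illustration, not the paper's model: Gaussian latent attributes, homoskedastic Gaussian measurement noise, and known variances (hence a known reliability ratio). Under these strong assumptions, regressing the outcome on the linear shrinkage estimate coincides with the classical errors-in-variables correction that rescales the naive slope by the reliability.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
sigma_theta, sigma_eps = 1.0, 1.0  # latent sd and noise sd (assumed known)
beta = 0.5                         # true slope on the latent attribute

theta = rng.normal(0, sigma_theta, n)    # latent attribute (e.g., teacher value-added)
z = theta + rng.normal(0, sigma_eps, n)  # noisy measurement of theta
y = beta * theta + rng.normal(0, 1, n)   # outcome depending on the latent attribute

# Reliability ratio: Var(theta) / Var(z)
lam = sigma_theta**2 / (sigma_theta**2 + sigma_eps**2)

b_naive = np.cov(y, z)[0, 1] / np.var(z)      # attenuated toward zero by factor lam
b_classical = b_naive / lam                   # classical measurement error correction
theta_eb = lam * z                            # linear (EB) shrinkage estimate of theta
b_shrink = np.cov(y, theta_eb)[0, 1] / np.var(theta_eb)  # regress on shrinkage estimate

# In this homoskedastic, known-variance case the two corrections coincide:
# b_classical and b_shrink are both consistent for beta.
```

The algebra behind the coincidence: Cov(y, λz)/Var(λz) = Cov(y, z)/(λ Var(z)), which is exactly the naive slope divided by the reliability. When the noise is heteroskedastic or the prior is misspecified, this equivalence breaks down, which is the gap in conditions the abstract refers to.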