Empirical regression discontinuity (RD) studies often use covariates to increase the precision of their estimates. In this paper, we propose a novel class of estimators that uses such covariate information more efficiently than existing methods and can accommodate many covariates. The approach involves running a standard RD analysis in which a function of the covariates has been subtracted from the original outcome variable. We characterize the function that leads to the estimator with the smallest asymptotic variance, and consider feasible versions of such estimators in which this function is estimated, for example, through modern machine learning techniques.
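The procedure described above can be sketched on simulated data. The snippet below is a minimal illustration, not the paper's estimator: it uses ordinary least squares as a simple stand-in for the flexible (e.g., machine learning) fit of the covariate function, subtracts the fitted covariate contribution from the outcome, and then applies a standard local linear RD estimator with a triangular kernel to both the raw and the adjusted outcome. All variable names, the data-generating process, and the fixed bandwidth are hypothetical choices made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
x = rng.uniform(-1, 1, n)             # running variable, cutoff at 0
z = rng.normal(size=(n, 3))           # covariates
tau = 0.5                             # true treatment effect (assumed for simulation)
d = (x >= 0).astype(float)
y = tau * d + x + z @ np.array([1.0, -0.5, 0.25]) + rng.normal(scale=0.5, size=n)

def llr_rd(y, x, h):
    """Local linear RD estimate at cutoff 0: triangular kernel, bandwidth h."""
    est = {}
    for side, mask in (("+", x >= 0), ("-", x < 0)):
        xs, ys = x[mask], y[mask]
        w = np.maximum(0.0, 1.0 - np.abs(xs) / h)   # triangular kernel weights
        X = np.column_stack([np.ones_like(xs), xs])
        XtW = X.T * w                                # weighted least squares
        beta = np.linalg.solve(XtW @ X, XtW @ ys)
        est[side] = beta[0]                          # intercept = boundary value
    return est["+"] - est["-"]

# Covariate adjustment: estimate the covariate function (here by OLS of y on z,
# a crude stand-in for the ML-based fits the abstract mentions) and subtract it.
Z = np.column_stack([np.ones(n), z])
gamma = np.linalg.lstsq(Z, y, rcond=None)[0]
y_adj = y - z @ gamma[1:]             # subtract the fitted covariate contribution

h = 0.3                               # fixed bandwidth, chosen only for illustration
tau_raw = llr_rd(y, x, h)
tau_adj = llr_rd(y_adj, x, h)
```

Both estimators target the same jump at the cutoff; the adjusted outcome has smaller residual variance, which is the source of the precision gain the abstract describes.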