Empirical regression discontinuity (RD) studies often use covariates to increase the precision of their estimates. In this paper, we propose a novel class of estimators that use such covariate information more efficiently than existing methods and can accommodate many covariates. Our approach involves running a standard RD analysis in which a function of the covariates has been subtracted from the original outcome variable. We characterize the function that leads to the estimator with the smallest asymptotic variance, and consider feasible versions of such estimators in which this function is estimated, for example, through modern machine learning techniques.
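The procedure described above can be sketched in a few lines: first fit an adjustment function mu(X) with any flexible learner (cross-fitted so each observation's fit excludes its own fold), then run a standard local linear RD analysis on the adjusted outcome Y - mu(X). The sketch below is a minimal illustration on synthetic data, not the paper's implementation; the OLS adjustment step, the triangular-kernel local linear estimator, the bandwidth choice, and all variable names are assumptions standing in for whatever learner and RD procedure a practitioner would actually use.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
r = rng.uniform(-1, 1, n)           # running variable, cutoff at 0
x = rng.normal(size=(n, 3))         # covariates
tau = 0.5                           # true discontinuity at the cutoff
y = tau * (r >= 0) + r + x @ np.array([1.0, -0.5, 0.25]) + rng.normal(0, 0.5, n)

def local_linear_rd(y, r, h):
    """Sharp RD estimate: difference of local linear intercepts at cutoff 0,
    triangular kernel, bandwidth h (a hypothetical helper for illustration)."""
    est = {}
    for side, mask in (("+", r >= 0), ("-", r < 0)):
        w = np.clip(1 - np.abs(r[mask]) / h, 0, None)
        keep = w > 0
        X = np.column_stack([np.ones(keep.sum()), r[mask][keep]])
        W = np.diag(w[keep])
        beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y[mask][keep])
        est[side] = beta[0]                  # intercept = limit at the cutoff
    return est["+"] - est["-"]

# Step 1: fit an adjustment function mu(X) by OLS (a stand-in here for any
# machine learning method), cross-fitted over two folds.
mu_hat = np.empty(n)
folds = rng.permutation(n) % 2
for k in (0, 1):
    tr, te = folds != k, folds == k
    Z = np.column_stack([np.ones(tr.sum()), x[tr]])
    coef, *_ = np.linalg.lstsq(Z, y[tr], rcond=None)
    mu_hat[te] = np.column_stack([np.ones(te.sum()), x[te]]) @ coef

# Step 2: run the standard RD analysis on the adjusted outcome Y - mu(X).
h = 0.3
tau_raw = local_linear_rd(y, r, h)           # unadjusted RD estimate
tau_adj = local_linear_rd(y - mu_hat, r, h)  # covariate-adjusted RD estimate
```

Because the covariates are smooth through the cutoff, subtracting mu(X) shifts both one-sided limits equally and leaves the estimand unchanged while removing covariate-driven noise from the outcome, which is what drives the variance reduction.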