The widespread adoption of randomized controlled trials (A/B tests) for decision-making has introduced a pervasive "Winner's Curse": experiments selected for launch often exhibit upwardly biased effect estimates and invalid confidence intervals. This selection bias leads to over-optimistic impact projections and undermines decision-making, particularly in low-power regimes. We propose Bayesian Hybrid Shrinkage (BHS), an empirical Bayes (EB) framework that leverages data-driven priors to mitigate selection bias and provide accurate uncertainty quantification. Unlike traditional EB methods that apply uniform shrinkage, BHS introduces an experiment-specific "local" shrinkage factor that incorporates individual experiment characteristics, improving robustness against prior misspecification. We also derive a closed-form inference strategy designed for high-throughput production environments. Extensive simulations and real-world evaluations at Meta Platforms demonstrate that BHS outperforms existing methods in terms of bias reduction and interval coverage, even under substantial violations of modeling assumptions.
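To make the Winner's Curse and the role of EB shrinkage concrete, the following is a minimal self-contained sketch of the *classical* normal-normal empirical Bayes setup the abstract contrasts BHS against. The simulation parameters, the launch threshold, and the method-of-moments prior fit are all illustrative assumptions; the abstract does not specify BHS's local shrinkage factor, so only the standard global-prior shrinkage is shown.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate K experiments: true effects mu_k ~ N(m, tau^2), observed
# estimates x_k ~ N(mu_k, s_k^2) with known per-experiment noise s_k.
# (All numbers here are illustrative, not taken from the paper.)
K = 2000
m, tau = 0.0, 0.5
s = rng.uniform(0.5, 2.0, K)      # per-experiment standard errors
mu = rng.normal(m, tau, K)        # true effects
x = rng.normal(mu, s)             # observed effect estimates

# Method-of-moments empirical Bayes fit of the prior (m_hat, tau2_hat).
m_hat = x.mean()
tau2_hat = max(x.var() - np.mean(s**2), 1e-8)

# Posterior mean shrinks each estimate toward the prior mean; the
# shrinkage weight already varies with each experiment's noise s_k.
w = tau2_hat / (tau2_hat + s**2)
post_mean = w * x + (1 - w) * m_hat

# Winner's Curse: conditioning on "launch" (a large observed effect)
# biases the raw estimates upward; shrunken estimates largely correct this.
winners = x > 1.0
raw_bias = (x[winners] - mu[winners]).mean()
eb_bias = (post_mean[winners] - mu[winners]).mean()
print(f"raw bias among winners: {raw_bias:.3f}")
print(f"EB  bias among winners: {eb_bias:.3f}")
```

The selection-conditional bias of the raw estimates is large and positive, while the EB posterior means are close to unbiased even among "winners", because the posterior mean is unbiased conditional on any event defined by the observed data when the prior is well specified; BHS, per the abstract, aims to retain this behavior under prior misspecification.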