Adaptive designs are increasingly used in clinical trials and online experiments to improve participant outcomes by dynamically updating treatment allocation as data accumulate. In practice, experimenters often consider multiple candidate designs, each with distinct trade-offs. However, typically only one design is implemented at a time, leaving benefits and costs of alternative designs unobserved and unquantified. To address this, we propose a novel meta-level adaptive design framework that enables real-time, data-driven evaluation and selection among candidate adaptive designs. Specifically, we define a new class of causal estimands to evaluate adaptive designs and propose Targeted Maximum Likelihood Estimators for these estimands. These estimators are asymptotically normal while accommodating dependence in adaptive-design data without parametric assumptions, enabling online selection among candidate designs. We further apply this framework to a motivating example where multiple surrogates of a long-term outcome are considered for updating randomization probabilities in adaptive experiments. Unlike existing surrogate evaluation methods, our approach comprehensively quantifies surrogates' utility to accelerate detection of heterogeneous treatment effects, expedite updates to treatment randomization, and improve participant outcomes, facilitating dynamic selection among surrogate-guided designs. Overall, our framework provides a unified approach for evaluating opportunities and costs of various adaptive designs and guiding real-time decision-making in adaptive experiments.