Accurate dose-response forecasting under sparse sampling is central to precision pharmacotherapy. We present the Amortized In-Context Mixed-Effect Transformer (AICMET), a transformer-based latent-variable framework that unifies mechanistic compartmental priors with amortized in-context Bayesian inference. AICMET is pre-trained on hundreds of thousands of synthetic pharmacokinetic trajectories whose compartment-model parameters are drawn from Ornstein-Uhlenbeck priors, endowing the model with strong inductive biases and enabling zero-shot adaptation to new compounds. At inference time, the decoder conditions on the collective context of previously profiled trial participants and generates calibrated posterior predictions for newly enrolled patients from only a few early drug concentration measurements. This capability collapses traditional model-development cycles from weeks to hours while preserving a role for expert modelling oversight. Experiments across public datasets show that AICMET attains state-of-the-art predictive accuracy and faithfully quantifies inter-patient variability, outperforming both nonlinear mixed-effects baselines and recent neural ODE variants. Our results establish transformer-based, population-aware neural architectures as a viable alternative to bespoke pharmacokinetic modeling pipelines, charting a path toward truly population-aware personalized dosing regimens.
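To make the pre-training setup concrete, the sketch below simulates the kind of synthetic data the abstract describes: concentration-time trajectories from a one-compartment oral-absorption model whose log-parameters are drawn from an Ornstein-Uhlenbeck process. This is a minimal illustrative sketch, not the paper's actual data-generation code; all function names, parameter values, and the specific choice of a one-compartment model with lognormal residual error are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def ou_path(mu, theta, sigma, n, dt, rng):
    """Exact discretization of an Ornstein-Uhlenbeck process,
    initialized from its stationary distribution (assumed setup)."""
    x = np.empty(n)
    x[0] = rng.normal(mu, sigma / np.sqrt(2 * theta))
    decay = np.exp(-theta * dt)
    sd = sigma * np.sqrt((1 - decay**2) / (2 * theta))
    for i in range(1, n):
        x[i] = mu + (x[i - 1] - mu) * decay + sd * rng.normal()
    return x

def one_compartment_oral(t, dose, ka, ke, V, F=1.0):
    """Concentration-time curve: first-order absorption (ka) and
    elimination (ke) with volume of distribution V, bioavailability F."""
    return F * dose * ka / (V * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

# OU paths over the log-parameters induce smoothly correlated
# inter-subject variability across a batch of synthetic subjects.
n_subjects = 8
log_ka = ou_path(np.log(1.2),  theta=1.0, sigma=0.3, n=n_subjects, dt=1.0, rng=rng)
log_ke = ou_path(np.log(0.15), theta=1.0, sigma=0.3, n=n_subjects, dt=1.0, rng=rng)
log_V  = ou_path(np.log(30.0), theta=1.0, sigma=0.2, n=n_subjects, dt=1.0, rng=rng)

t = np.linspace(0.25, 24.0, 12)  # sparse sampling grid (hours)
trajectories = np.stack([
    one_compartment_oral(t, dose=100.0, ka=np.exp(a), ke=np.exp(e), V=np.exp(v))
    for a, e, v in zip(log_ka, log_ke, log_V)
])
noisy = trajectories * np.exp(rng.normal(0.0, 0.1, trajectories.shape))  # lognormal error
```

A pre-training corpus would repeat this sampling across many hypothetical compounds, giving the transformer context sets of (time, concentration) pairs from which to amortize posterior inference over new subjects.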