Foundation models are increasingly being deployed in contexts where understanding the uncertainty of their outputs is critical to responsible deployment. While Bayesian methods offer a principled approach to uncertainty quantification, their computational overhead renders them impractical for training or inference at foundation model scale. State-of-the-art models achieve parameter counts in the trillions through carefully engineered sparsity, including Mixture-of-Experts (MoE) layers. In this work, we demonstrate calibrated uncertainty at scale by introducing Variational Mixture-of-Experts Routing (VMoER), a structured Bayesian approach for modelling uncertainty in MoE layers. VMoER confines Bayesian inference to the expert-selection stage, which is typically performed by a deterministic routing network. We instantiate VMoER with two inference strategies: amortised variational inference over routing logits, and inference of a temperature parameter governing stochastic expert selection. Across the foundation models tested, VMoER improves routing stability under noise by 38\%, reduces calibration error by 94\%, and increases out-of-distribution AUROC by 12\%, while incurring less than 1\% additional FLOPs. These results suggest VMoER offers a scalable path toward robust, uncertainty-aware foundation models.
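The amortised-VI variant described above can be illustrated with a minimal sketch: a router that outputs a diagonal-Gaussian posterior over routing logits, samples via the reparameterisation trick, and selects the top-k experts. All names, shapes, and the per-expert log-variance parameterisation here are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
d_model, n_experts, top_k = 16, 8, 2

# Hypothetical router parameters: a linear mean head over expert logits and a
# learned per-expert log-variance (held fixed here for the sketch).
W_mu = rng.standard_normal((d_model, n_experts)) * 0.1
log_var = np.full(n_experts, -2.0)

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def variational_route(x, temperature=1.0):
    """Sample routing logits from a diagonal-Gaussian posterior, then pick
    the top-k experts and renormalise their gate weights."""
    mu = x @ W_mu
    sigma = np.exp(0.5 * log_var)                        # broadcast over the batch
    logits = mu + sigma * rng.standard_normal(mu.shape)  # z = mu + sigma * eps
    probs = softmax(logits / temperature)
    experts = np.argsort(-probs, axis=-1)[..., :top_k]   # indices of selected experts
    weights = np.take_along_axis(probs, experts, axis=-1)
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return experts, weights

x = rng.standard_normal((4, d_model))  # a batch of 4 token representations
experts, weights = variational_route(x)
print(experts.shape, weights.shape)    # (4, 2) (4, 2)
```

Repeating the sampled forward pass yields a distribution over expert assignments per token, from which routing stability and calibration statistics like those reported above could be estimated.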