Machine learning applications require fast and reliable per-sample uncertainty estimation. A common approach is to use predictive distributions from Bayesian or approximate inference methods and additively decompose uncertainty into aleatoric (i.e., data-related) and epistemic (i.e., model-related) components. However, additive decomposition has recently been questioned, with evidence that it breaks down under finite-ensemble sampling and/or mismatched predictive distributions. This paper introduces Variance-Gated Ensembles (VGE), an intuitive, differentiable framework that injects epistemic sensitivity via a signal-to-noise gate computed from ensemble statistics. VGE provides: (i) a Variance-Gated Margin Uncertainty (VGMU) score that couples decision margins with ensemble predictive variance; and (ii) a Variance-Gated Normalization (VGN) layer that generalizes the variance-gated uncertainty mechanism to training via per-class, learnable normalization of ensemble member probabilities. We derive closed-form vector-Jacobian products enabling end-to-end training through the ensemble sample mean and variance. VGE matches or exceeds state-of-the-art information-theoretic baselines while remaining computationally efficient. As a result, VGE provides a practical and scalable approach to epistemic-aware uncertainty estimation in ensemble models. An open-source implementation is available at: https://github.com/nextdevai/vge.
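To make the gating idea concrete, the following is a minimal illustrative sketch of one plausible form of a variance-gated margin score. It is not the paper's exact VGMU definition (which is not reproduced here); the gate shape, the epsilon constant, and the function name are assumptions for illustration only.

```python
import numpy as np

def vgmu_sketch(member_probs, eps=1e-8):
    """Illustrative variance-gated margin uncertainty for one sample.

    member_probs: (M, C) array of class probabilities from M ensemble members.
    Returns a scalar in [0, 1]; higher means more uncertain. This is a
    hypothetical instantiation of the signal-to-noise gating idea, not the
    paper's published formula.
    """
    mu = member_probs.mean(axis=0)   # ensemble mean prediction, shape (C,)
    var = member_probs.var(axis=0)   # per-class predictive variance, shape (C,)

    # Decision margin of the mean prediction: top-1 minus top-2 probability.
    top2 = np.sort(mu)[-2:]
    margin = top2[1] - top2[0]

    # Signal-to-noise ratio of the predicted class: mean over std deviation.
    k = np.argmax(mu)
    snr = mu[k] / (np.sqrt(var[k]) + eps)

    # Squash the SNR into a (0, 1) gate; high member disagreement -> low gate.
    gate = snr / (1.0 + snr)

    # Gated margin uncertainty: small margin or small gate -> high uncertainty.
    return 1.0 - gate * margin
```

In this sketch, a confident ensemble (large margin, near-zero variance) drives the score toward zero, while disagreeing members inflate the per-class variance, shrink the gate, and push the score up even when the mean prediction alone looks decisive.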