We introduce Mixture-of-Gaussians with Uncertainty-based Gating (MoGU), a novel Mixture-of-Experts (MoE) framework designed for regression tasks. MoGU replaces standard learned gating with an intrinsic routing paradigm where expert-specific uncertainty serves as the native gating signal. By modeling each prediction as a Gaussian distribution, the system utilizes predicted variance to dynamically weight expert contributions. We validate MoGU on multivariate time-series forecasting, a domain defined by high volatility and varying noise patterns. Empirical results across multiple benchmarks, horizon lengths, and backbones demonstrate that MoGU consistently improves forecasting accuracy compared to traditional MoE. Further evaluation via conformal prediction indicates that our approach yields more efficient prediction intervals than existing baselines. These findings highlight MoGU's capacity for providing both competitive performance and reliable, high-fidelity uncertainty quantification. Our code is available at: https://github.com/yolish/moe_unc_tsf
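The gating mechanism described above (each expert emits a Gaussian, and predicted variance weights the mixture) can be sketched as precision-weighted aggregation. This is a minimal illustrative reading, not the paper's exact implementation; the function name `mogu_combine` and the normalization choice are assumptions for illustration only.

```python
import numpy as np

def mogu_combine(means, variances):
    """Combine per-expert Gaussian predictions via uncertainty-based gating.

    Hypothetical sketch: gating weights are normalized inverse variances
    (precisions), so low-uncertainty experts dominate the mixture. The
    actual MoGU gating may differ in detail.

    means, variances: arrays of shape (n_experts, ...), variances > 0.
    """
    precision = 1.0 / np.asarray(variances)
    weights = precision / precision.sum(axis=0, keepdims=True)
    return (weights * np.asarray(means)).sum(axis=0)

# Example: three experts forecasting one value; the third is very uncertain
means = np.array([1.0, 2.0, 10.0])
variances = np.array([0.1, 0.1, 10.0])
combined = mogu_combine(means, variances)  # dominated by the two confident experts
print(combined)
```

With these toy numbers the combined forecast stays near 1.5 (the average of the two confident experts) because the high-variance expert receives a near-zero weight.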