Modern probabilistic regressors often remain overconfident under distribution shift. We present Functional Distribution Networks (FDN), an input-conditioned distribution over network weights that induces predictive mixtures whose dispersion adapts to the input. FDN is trained with a beta-ELBO objective and Monte Carlo sampling. We further propose an evaluation protocol that cleanly separates interpolation from extrapolation and stresses OOD sanity checks (e.g., verifying that predictive likelihood degrades under shift while in-distribution accuracy and calibration are preserved). On standard regression tasks, we benchmark against strong Bayesian, ensemble, dropout, and hypernetwork baselines under matched parameter and update budgets, and assess accuracy, calibration, and shift-awareness with standard diagnostics. Together, the framework and protocol aim to make OOD-aware, well-calibrated neural regression practical and modular.
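To make the mechanism concrete, below is a minimal PyTorch sketch, not the paper's implementation: it assumes FDN conditions a diagonal Gaussian over only the output-layer weights via a small hypernetwork head, draws a few reparameterised weight samples per input to form the predictive mixture, and trains with a Gaussian negative log-likelihood plus a beta-weighted KL to a standard-normal prior as the beta-ELBO. All names (`FDNRegressor`, `beta_elbo`) and architectural choices are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class FDNRegressor(nn.Module):
    """Toy input-conditioned weight distribution over the output layer.

    A shared trunk maps x to features; a hypernetwork head maps x to the mean
    and log-variance of a diagonal Gaussian over the last-layer weights.
    Sampling K weight draws per input yields a K-component predictive mixture.
    """

    def __init__(self, in_dim=1, hidden=64):
        super().__init__()
        self.trunk = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.w_dim = hidden + 1  # last-layer weights plus a bias term
        self.hyper = nn.Linear(hidden, 2 * self.w_dim)
        self.log_noise = nn.Parameter(torch.zeros(1))  # log of observation-noise std

    def forward(self, x, n_samples=8):
        h = self.trunk(x)                                   # (B, hidden)
        mu, log_var = self.hyper(h).chunk(2, dim=-1)        # each (B, w_dim)
        std = (0.5 * log_var).exp()
        w = mu + std * torch.randn(n_samples, *mu.shape)    # reparameterised draws, (K, B, w_dim)
        feats = torch.cat([h, torch.ones_like(h[..., :1])], dim=-1)  # append bias feature, (B, w_dim)
        y_hat = (w * feats).sum(-1)                         # (K, B): mixture of predictive means
        return y_hat, mu, log_var


def beta_elbo(y_hat, y, log_noise, mu, log_var, beta=0.1):
    """Monte Carlo beta-ELBO: Gaussian NLL averaged over weight samples
    plus a beta-weighted KL(q(w|x) || N(0, I))."""
    var = log_noise.exp().pow(2).expand_as(y_hat)
    nll = F.gaussian_nll_loss(y_hat, y.squeeze(-1).expand_as(y_hat), var)
    kl = 0.5 * (mu.pow(2) + log_var.exp() - log_var - 1.0).sum(-1).mean()
    return nll + beta * kl


# Toy 1-D regression run.
model = FDNRegressor()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.linspace(-3, 3, 256).unsqueeze(-1)
y = torch.sin(x) + 0.1 * torch.randn_like(x)
for _ in range(200):
    y_hat, mu, log_var = model(x)
    loss = beta_elbo(y_hat, y, model.log_noise, mu, log_var)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Conditioning only the last layer keeps the sampled weight vector small; in principle the same construction could condition any subset of weights, at the cost of a larger hypernetwork head.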