Predictive uncertainty quantification is crucial for reliable decision-making across applied domains. Bayesian neural networks offer a powerful framework for this task, but defining meaningful priors and ensuring computational efficiency remain significant challenges, especially in complex real-world applications. This paper addresses these challenges with a novel neural adaptive empirical Bayes (NA-EB) framework that leverages a class of implicit generative priors derived from low-dimensional distributions, enabling efficient handling of complex data structures and effective capture of the underlying relationships in real-world datasets. NA-EB combines variational inference with a gradient ascent algorithm, allowing simultaneous hyperparameter selection and posterior approximation and thereby improving computational efficiency. We establish the theoretical foundation of the framework through posterior and classification consistency, and we demonstrate its practical utility through extensive evaluations on a variety of tasks, including the two-spiral problem, regression, 10 UCI datasets, and image classification on the MNIST and CIFAR-10 datasets. The experimental results highlight the superiority of the proposed framework over existing methods, such as sparse variational Bayesian and generative models, in both prediction accuracy and uncertainty quantification.
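The core recipe named in the abstract (an implicit generative prior mapping a low-dimensional latent variable to the weights of a predictive network, paired with a variational posterior optimized by gradient ascent on the evidence lower bound) can be illustrated with a minimal sketch. Everything here is an assumption for illustration, not the paper's implementation: the toy regression data, the 1-8-1 network, the latent dimension, and the generator, which is a fixed random linear map where NA-EB would learn its parameters as hyperparameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data (illustrative assumption)
X = np.linspace(-1.0, 1.0, 20).reshape(-1, 1)
y = np.sin(3 * X) + 0.1 * rng.standard_normal(X.shape)

LATENT_DIM = 4                                     # low-dimensional latent space
HIDDEN = 8                                         # width of the 1-8-1 MLP
N_WEIGHTS = HIDDEN + HIDDEN + HIDDEN + 1           # W1, b1, W2, b2 flattened

# Implicit generative prior: a generator maps latent z to the full weight
# vector. Here it is a fixed random linear map; in NA-EB its parameters
# would be tuned jointly with the variational posterior.
G = rng.standard_normal((N_WEIGHTS, LATENT_DIM)) / np.sqrt(LATENT_DIM)

def unpack(w):
    """Split a flat weight vector into the 1-8-1 MLP's parameters."""
    W1 = w[:HIDDEN].reshape(1, HIDDEN)
    b1 = w[HIDDEN:2 * HIDDEN]
    W2 = w[2 * HIDDEN:3 * HIDDEN].reshape(HIDDEN, 1)
    b2 = w[3 * HIDDEN]
    return W1, b1, W2, b2

def log_lik(w, sigma=0.1):
    """Gaussian log-likelihood of the data under weights w (up to a constant)."""
    W1, b1, W2, b2 = unpack(w)
    pred = np.tanh(X @ W1 + b1) @ W2 + b2
    return -0.5 * np.sum((y - pred) ** 2) / sigma ** 2

def elbo(mu, log_std, n_samples=64):
    """Monte Carlo ELBO for a Gaussian variational posterior q(z)."""
    std = np.exp(log_std)
    total = 0.0
    for _ in range(n_samples):
        eps = rng.standard_normal(LATENT_DIM)
        z = mu + std * eps          # reparameterization trick
        w = G @ z                   # implicit prior: weights = generator(z)
        total += log_lik(w)
    # Closed-form KL(q(z) || N(0, I)), computed in the latent space,
    # which is what makes the low-dimensional prior cheap to work with.
    kl = 0.5 * np.sum(mu ** 2 + std ** 2 - 2 * log_std - 1.0)
    return total / n_samples - kl

mu, log_std = np.zeros(LATENT_DIM), np.zeros(LATENT_DIM)
print(elbo(mu, log_std))
```

Gradient ascent would then update `mu`, `log_std`, and the generator's parameters simultaneously to maximize this objective, which is the "simultaneous hyperparameter selection and posterior approximation" the abstract refers to.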