Variational logistic regression is a popular method for approximate Bayesian inference, seeing widespread use across machine learning, including Bayesian optimization, reinforcement learning, and multi-instance learning. However, because the evidence lower bound (ELBO) is intractable in this setting, practitioners have resorted to Monte Carlo estimation, quadrature, or surrogate bounds to perform inference; these methods are either costly or give poor approximations to the true posterior. In this paper we introduce a new bound on the expectation of the softplus function and show how it can be applied to variational logistic regression and Gaussian process classification. Unlike other bounds, our proposal does not rely on extending the variational family or introducing additional parameters to ensure tightness. In fact, we show that this bound is tighter than the state of the art, and that the resulting variational posterior achieves state-of-the-art performance while being significantly faster to compute than Monte Carlo methods.
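The intractable ELBO term in question is the Gaussian expectation of the softplus, since the logistic log-likelihood satisfies log σ(z) = −softplus(−z). As a point of reference for the Monte Carlo baseline the abstract compares against, here is a minimal sketch (assuming NumPy; function names are illustrative, not from the paper) that estimates E[softplus(x)] for x ~ N(m, s²), alongside the cheap Jensen lower bound that follows from the convexity of softplus:

```python
import numpy as np

def softplus(x):
    """Numerically stable log(1 + exp(x))."""
    return np.maximum(x, 0.0) + np.log1p(np.exp(-np.abs(x)))

def mc_expected_softplus(mean, var, n_samples=200_000, seed=0):
    """Monte Carlo estimate of E[softplus(x)] for x ~ N(mean, var),
    the intractable expectation in the logistic-regression ELBO."""
    rng = np.random.default_rng(seed)
    x = rng.normal(mean, np.sqrt(var), size=n_samples)
    return softplus(x).mean()

# Jensen's inequality gives a cheap but loose lower bound, since
# softplus is convex: E[softplus(x)] >= softplus(E[x]).
mc_estimate = mc_expected_softplus(0.0, 1.0)
jensen_bound = float(softplus(np.array(0.0)))  # = log 2
```

The Monte Carlo estimate converges at the usual O(n^{-1/2}) rate, which is the per-iteration cost the paper's closed-form bound is designed to avoid.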