Contemporary machine learning methods aim to approach the Bayes error, the lowest possible error any model can achieve. This paper postulates that every decision is composed of not one but two Bayesian decisions, making decision-making a double-Bayesian process. The paper shows how this duality implies intrinsic uncertainty in decisions and how it incorporates explainability. In the proposed approach, Bayesian learning is tantamount to finding a base for a logarithmic function that measures uncertainty, with solutions being fixed points. Furthermore, under this approach, the golden ratio describes possible solutions satisfying Bayes' theorem. The double-Bayesian framework suggests using a learning rate and momentum weight with values similar to those reported in the literature for training neural networks with stochastic gradient descent.
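The fixed-point character of the golden ratio mentioned above can be sketched numerically. The snippet below is an illustration only, not the paper's derivation: it computes the golden ratio φ as the fixed point of f(x) = 1 + 1/x (equivalently, φ² = φ + 1) and notes that the derived quantities 1/φ ≈ 0.618 and 1/φ² ≈ 0.382 are close in magnitude to momentum weights and learning-rate scales commonly seen in the stochastic-gradient-descent literature; the specific assignment of these values to hyperparameters is a hypothetical reading of the abstract.

```python
import math

def golden_ratio_fixed_point(x0: float = 1.0, iterations: int = 50) -> float:
    """Iterate f(x) = 1 + 1/x, whose positive fixed point is the golden ratio.

    Convergence holds because |f'(phi)| = 1/phi**2 < 1 near the fixed point.
    """
    x = x0
    for _ in range(iterations):
        x = 1.0 + 1.0 / x
    return x

phi = golden_ratio_fixed_point()

# Sanity check against the closed form phi = (1 + sqrt(5)) / 2.
assert abs(phi - (1.0 + math.sqrt(5.0)) / 2.0) < 1e-9

# Hypothetical hyperparameter values derived from phi (an assumption for
# illustration, not the paper's exact prescription):
momentum = 1.0 / phi          # ~0.618
learning_rate = 1.0 / phi**2  # ~0.382
```

Note that the two derived values sum to one (1/φ + 1/φ² = 1), which follows directly from the fixed-point equation φ² = φ + 1.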