Unifying probabilistic and logical learning is a key challenge in AI. We introduce a Bayesian inductive logic programming approach that learns minimum message length hypotheses from noisy data. Our approach balances hypothesis complexity and data fit through priors, which favour more general programs, and a likelihood, which favours accurate programs. Our experiments on several domains, including game playing and drug design, show that our method significantly outperforms previous methods, notably those that learn minimum description length programs. Our results also show that our approach is data-efficient and insensitive to example balance, including the ability to learn from exclusively positive examples.
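The trade-off described above, a prior that penalises complexity and a likelihood that rewards accuracy, can be sketched as a two-part message-length score: the cost of stating the hypothesis plus the cost of stating the data given the hypothesis. The per-literal cost and the Bernoulli noise model below are illustrative assumptions, not the paper's exact formulation:

```python
import math

def message_length(num_literals, num_misclassified, num_examples, noise=0.1):
    """Two-part MML-style score in bits (lower is better).
    Assumed model: a fixed encoding cost per program literal (prior) and
    Bernoulli-noise labels (likelihood); both are illustrative choices."""
    # Prior cost: longer programs need more bits to encode.
    prior_bits = num_literals * 4.0  # assumed 4 bits per literal
    # Likelihood cost: misclassified examples are expensive under low noise.
    correct = num_examples - num_misclassified
    likelihood_bits = (-num_misclassified * math.log2(noise)
                       - correct * math.log2(1.0 - noise))
    return prior_bits + likelihood_bits

# A small program with a few errors can beat a large program that fits
# the noisy data exactly -- the core of the complexity/fit balance.
small_noisy = message_length(num_literals=5, num_misclassified=2, num_examples=100)
large_exact = message_length(num_literals=40, num_misclassified=0, num_examples=100)
```

Under these assumed costs, `small_noisy` scores lower than `large_exact`, so the search prefers the shorter, slightly inaccurate program, which is how noisy examples are tolerated without overfitting.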