This paper concerns the recent notion of computable probably approximately correct (PAC) learning, which lies between statistical learning theory, where no computational requirement is placed on the learner, and efficient PAC learning, where the learner must be polynomially bounded. Examples have recently been given of hypothesis classes that are PAC learnable but not computably PAC learnable; however, these hypothesis classes are unnatural or non-canonical in the sense that they depend on a numbering of proofs, formulas, or programs. We use the on-a-cone machinery from computability theory to prove that, under mild assumptions such as the hypothesis class being computably listable, any natural hypothesis class which is learnable must be computably learnable. Thus the counterexamples given previously are necessarily unnatural.