Ontologies are among the richest sources of knowledge. Real-world ontologies often contain thousands of axioms and are usually human-made; consequently, they may contain inconsistencies and incomplete information, which can prevent classical reasoners from computing entailments that are considered useful. To overcome these two challenges, we propose FALCON, a Fuzzy Ontology Neural reasoner for approximate reasoning over ALC ontologies. We provide an approximate technique for the model-generation step in classical ALC reasoners. Our approximation is not guaranteed to construct exact logical models, but it can approximate arbitrary models and is notably faster on some large ontologies. Moreover, by sampling multiple approximate logical models, our technique also supports approximate entailment over inconsistent ontologies. Theoretical results show that generating more models yields a closer, i.e., more faithful, approximation of ALC entailment. Experimental results show that FALCON enables approximate reasoning and reasoning in the presence of inconsistency. Our experiments further demonstrate how ontologies can improve knowledge base completion in biomedicine by incorporating knowledge expressed in ALC.
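The core idea of approximate entailment by model sampling can be illustrated with a toy sketch. This is not FALCON's implementation: the finite domain, the two concept names, and the sampling scheme are illustrative assumptions. An axiom is taken to be entailed to the degree it holds in all sampled approximate fuzzy models, so sampling more models tightens the estimate.

```python
# Toy sketch of entailment via sampled approximate fuzzy models.
# All names (DOMAIN, Parent, Person) are illustrative, not from FALCON.
import random

DOMAIN = range(8)  # a small finite domain of individuals

def sample_fuzzy_model():
    """Sample one approximate fuzzy interpretation: each concept maps
    each individual to a membership degree in [0, 1]."""
    concepts = ["Parent", "Person"]
    model = {c: [random.random() for _ in DOMAIN] for c in concepts}
    # Softly enforce the TBox axiom Parent ⊑ Person by capping Parent's
    # degree with Person's, so every sampled model satisfies it.
    model["Parent"] = [min(p, q) for p, q in zip(model["Parent"], model["Person"])]
    return model

def subsumption_degree(model, sub, sup):
    """Degree to which sub ⊑ sup holds in one model: worst case over the
    domain, 1.0 if sub(x) <= sup(x) for every individual x (Goedel implication)."""
    return min(
        1.0 if a <= b else b
        for a, b in zip(model[sub], model[sup])
    )

# Approximate entailment: an axiom is entailed to the degree it holds
# in *all* sampled models; more samples give a tighter estimate.
models = [sample_fuzzy_model() for _ in range(100)]
degree = min(subsumption_degree(m, "Parent", "Person") for m in models)
print(degree)  # prints 1.0: every sampled model satisfies the axiom
```

Because every sampled model is forced to satisfy the axiom here, the estimated degree is exactly 1.0; for an axiom not enforced during sampling, the minimum over models would drop, mirroring how sampling more models drives the estimate toward the true entailment degree.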