Recent progress in semi-supervised learning (SSL) has largely focused on new consistency regularization and entropy minimization approaches, often resulting in models that require complex training strategies to obtain the desired results. In this work, we instead propose a novel approach that explicitly incorporates the underlying clustering assumption of SSL by extending a recently proposed differentiable clustering module. Leveraging annotated data to guide the cluster centroids yields a simple, end-to-end trainable deep SSL approach. We demonstrate that the proposed model improves performance over the supervised-only baseline, and show that our framework can be used in conjunction with other SSL methods to further boost their performance.
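The core idea of guiding cluster centroids with labels can be sketched as follows. This is a minimal illustration, not the paper's actual module: it assumes a soft k-means-style assignment (softmax over negative squared distances, with a temperature `T`), and the function names `soft_assign` and `labeled_centroid_loss` are hypothetical. Because the soft assignment is differentiable, the cross-entropy on labeled points can pull the centroids toward class-consistent positions by gradient descent, end-to-end with the rest of the network.

```python
import numpy as np

def soft_assign(X, C, T=1.0):
    """Soft cluster assignments: softmax over negative squared Euclidean
    distances to the centroids -- a differentiable relaxation of the hard
    k-means assignment. X: (n, d) points, C: (k, d) centroids."""
    d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(axis=-1)  # (n, k)
    logits = -d2 / T
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    p = np.exp(logits)
    return p / p.sum(axis=1, keepdims=True)

def labeled_centroid_loss(X_l, y_l, C):
    """Cross-entropy pulling each labeled point's soft assignment toward
    the centroid matching its class label. Minimizing this (w.r.t. C and
    the features X_l) aligns centroids with the annotated classes."""
    p = soft_assign(X_l, C)
    return -np.log(p[np.arange(len(y_l)), y_l] + 1e-12).mean()

# Two well-separated labeled groups: centroids placed on the correct
# classes give a low loss, swapped centroids give a high loss.
X_l = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
y_l = np.array([0, 0, 1, 1])
C_good = np.array([[0.0, 0.0], [5.0, 5.0]])
C_swapped = C_good[::-1].copy()
loss_good = labeled_centroid_loss(X_l, y_l, C_good)
loss_swapped = labeled_centroid_loss(X_l, y_l, C_swapped)
```

Unlabeled points would contribute a separate clustering objective on their soft assignments; only the labeled term above uses the annotations.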