Hypergraph Neural Networks (HyGNNs) have demonstrated remarkable success in modeling higher-order relationships among entities. However, their performance often degrades on heterophilic hypergraphs, where nodes connected by the same hyperedge tend to have dissimilar semantic representations or belong to different classes. While several HyGNNs, including our prior work BHyGNN, have been proposed to address heterophily, their reliance on labeled data significantly limits their applicability in real-world scenarios where annotations are scarce or costly. To overcome this limitation, we introduce BHyGNN+, a self-supervised learning framework that extends BHyGNN for representation learning on heterophilic hypergraphs without requiring ground-truth labels. The core idea of BHyGNN+ is hypergraph duality, a structural transformation where the roles of nodes and hyperedges are interchanged. By contrasting augmented views of a hypergraph against its dual using cosine similarity, our framework captures essential structural patterns in a fully unsupervised manner. Notably, this duality-based formulation eliminates the need for negative samples, a common requirement in existing hypergraph contrastive learning methods that is often difficult to satisfy in practice. Extensive experiments on eleven benchmark datasets demonstrate that BHyGNN+ consistently outperforms state-of-the-art supervised and self-supervised baselines on both heterophilic and homophilic hypergraphs. Our results validate the effectiveness of leveraging hypergraph duality for self-supervised learning and establish a new paradigm for representation learning on challenging, unlabeled hypergraphs.
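The two core ideas of the abstract, hypergraph duality (swapping the roles of nodes and hyperedges) and a negative-free cosine-similarity objective, can be illustrated concretely. The sketch below is not the BHyGNN+ implementation; it is a minimal illustration assuming an incidence-matrix representation of the hypergraph, with randomly generated stand-ins for the encoder outputs of the two views.

```python
import numpy as np

# Incidence matrix of a toy hypergraph: rows = nodes, columns = hyperedges.
# H[v, e] = 1 means node v belongs to hyperedge e.
H = np.array([
    [1, 0, 1],
    [1, 1, 0],
    [0, 1, 0],
    [0, 1, 1],
])  # 4 nodes, 3 hyperedges

# Hypergraph duality: transposing the incidence matrix interchanges the
# roles of nodes and hyperedges -- each hyperedge of H becomes a node of
# the dual, and each node of H becomes a hyperedge of the dual.
H_dual = H.T  # 3 nodes (former hyperedges), 4 hyperedges (former nodes)

def cosine_alignment_loss(z1, z2, eps=1e-8):
    """Negative-free contrastive objective: maximize the cosine similarity
    between row-wise paired embeddings of two views.
    Returns 1 - mean cosine similarity, so identical views give 0."""
    z1n = z1 / (np.linalg.norm(z1, axis=1, keepdims=True) + eps)
    z2n = z2 / (np.linalg.norm(z2, axis=1, keepdims=True) + eps)
    return 1.0 - np.mean(np.sum(z1n * z2n, axis=1))

# Hypothetical stand-ins for encoder outputs: embeddings of the three
# hyperedges of H, paired with embeddings of the corresponding three
# nodes of the dual (here simulated as a slightly perturbed copy).
rng = np.random.default_rng(0)
z_edges = rng.normal(size=(3, 8))
z_dual_nodes = z_edges + 0.1 * rng.normal(size=(3, 8))
loss = cosine_alignment_loss(z_edges, z_dual_nodes)
```

Because the objective only pulls paired views together, no negative samples are needed, which is the property the abstract highlights; the actual BHyGNN+ encoders and augmentations are, of course, more involved than this stand-in.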