Deep hashing, owing to its low storage cost and efficient retrieval, is widely used in cross-modal retrieval. However, existing cross-modal hashing methods either explore the relationships between data points, which inevitably leads to intra-class dispersion, or explore the relationships between data points and categories while neglecting to preserve inter-class structural relationships, resulting in suboptimal hash codes. To address the problem of maintaining both intra-class aggregation and inter-class structural relationships, this paper proposes a method called DCGH. Specifically, we use a proxy loss as the mainstay to maintain intra-class aggregation of the data, combine it with a pairwise loss to preserve inter-class structural relationships, and on this basis further introduce a variance constraint to address the semantic bias caused by this combination. Extensive comparative experiments on three benchmark datasets show that DCGH achieves comparable or even better performance than existing cross-modal retrieval methods. The implementation of our DCGH framework is available at https://github.com/donnotnormal/DCGH.
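The abstract combines three terms: a proxy loss for intra-class aggregation, a pairwise loss for inter-class structure, and a variance constraint. The following is a minimal NumPy sketch of one plausible form of that combined objective; the specific formulations (softmax over cosine similarities to proxies, a squared-error pairwise term, a per-bit variance hinge) and the weights `0.5` and `0.1` are illustrative assumptions, not the paper's actual losses.

```python
import numpy as np

rng = np.random.default_rng(0)

def l2_normalize(x, axis=-1, eps=1e-8):
    return x / (np.linalg.norm(x, axis=axis, keepdims=True) + eps)

def proxy_loss(codes, proxies, labels):
    """Softmax cross-entropy over cosine similarities to class proxies:
    pulls each code toward its own class proxy (intra-class aggregation).
    One common proxy-loss form; assumed here, not taken from the paper."""
    sims = l2_normalize(codes) @ l2_normalize(proxies).T      # (N, C)
    logits = sims - sims.max(axis=1, keepdims=True)           # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(labels)), labels].mean()

def pairwise_loss(codes, labels):
    """Pushes cosine similarity toward +1 for same-class pairs and -1 for
    different-class pairs (inter-class structural relationships)."""
    sims = l2_normalize(codes) @ l2_normalize(codes).T        # (N, N)
    targets = np.where(labels[:, None] == labels[None, :], 1.0, -1.0)
    return ((sims - targets) ** 2).mean()

def variance_constraint(codes):
    """Hinge penalty on low per-bit standard deviation; a guess at the kind
    of variance constraint used against the combination's semantic bias."""
    return np.clip(1.0 - codes.std(axis=0), 0.0, None).mean()

# toy batch: 8 samples, 16-dim real-valued codes (pre-quantization), 3 classes
codes = rng.normal(size=(8, 16))
proxies = rng.normal(size=(3, 16))
labels = rng.integers(0, 3, size=8)

total = (proxy_loss(codes, proxies, labels)
         + 0.5 * pairwise_loss(codes, labels)
         + 0.1 * variance_constraint(codes))
print(np.isfinite(total))
```

In a real training loop, `codes` would come from modality-specific encoders and the proxies would be learned jointly; this sketch only shows how the three terms compose into one scalar objective.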