Approximate Nearest Neighbor Search (ANNS) is a fundamental and critical component in many applications, including recommendation systems and large language model-based applications. With the advancement of multimodal neural models, which transform data from different modalities into a shared high-dimensional space as feature vectors, cross-modal ANNS aims to use a vector from one modality (e.g., text) as the query to retrieve the most similar items from another (e.g., images or videos). However, there is an inherent distribution gap between embeddings from different modalities, so cross-modal queries are Out-of-Distribution (OOD) with respect to the base data. Consequently, state-of-the-art ANNS approaches perform poorly on OOD workloads. In this paper, we quantitatively analyze the properties of OOD workloads to understand their impact on ANNS efficiency. Unlike single-modal workloads, we reveal that OOD queries spatially deviate from the base data, and that the k-nearest neighbors of an OOD query are distant from each other in the embedding space. This property violates the assumptions of existing ANNS approaches and mismatches their designs for efficient search. Guided by these insights into OOD workloads, we propose pRojected bipartite Graph (RoarGraph), an efficient ANNS graph index built under the guidance of the query distribution. Extensive experiments show that RoarGraph significantly outperforms state-of-the-art approaches on modern cross-modal datasets, achieving up to 3.56x faster search at a 90% recall rate for OOD queries.
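The OOD property described above can be probed directly: for a given query, find its k nearest neighbors by brute force and measure the average pairwise distance among them. A minimal NumPy sketch (not the paper's code; the synthetic data and function names are illustrative) of this measurement:

```python
# Sketch: measure how mutually distant the k nearest neighbors of a query are.
# Synthetic standard-normal "base" vectors stand in for real embeddings.
import numpy as np

rng = np.random.default_rng(0)
base = rng.normal(size=(1000, 32))                    # base data (e.g., image embeddings)
in_dist_query = base[0] + 0.01 * rng.normal(size=32)  # query drawn near the base data
ood_query = rng.normal(loc=3.0, size=32)              # query shifted away from the base data

def knn(query, data, k=10):
    """Indices of the k nearest neighbors of `query` under Euclidean distance."""
    dists = np.linalg.norm(data - query, axis=1)
    return np.argsort(dists)[:k]

def neighbor_spread(query, data, k=10):
    """Mean pairwise distance among the k nearest neighbors of `query`."""
    nbrs = data[knn(query, data, k)]
    pair = np.linalg.norm(nbrs[:, None, :] - nbrs[None, :, :], axis=-1)
    return pair.sum() / (k * (k - 1))  # average over the k*(k-1) off-diagonal pairs

print(neighbor_spread(in_dist_query, base))
print(neighbor_spread(ood_query, base))
```

On real cross-modal embeddings, the paper's finding is that this spread statistic is markedly larger for OOD queries than for in-distribution ones; the synthetic Gaussians here only illustrate the computation itself.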