Some recently developed code large language models (Code LLMs) have been pre-trained on repository-level code data (Repo-Code LLMs), enabling these models to recognize repository structures and utilize cross-file information for code completion. However, in real-world development scenarios, simply concatenating the entire code repository often exceeds the context window limits of these Repo-Code LLMs, leading to significant performance degradation. In this study, we conducted extensive preliminary experiments and analyses on six Repo-Code LLMs. The results indicate that maintaining the topological dependencies of files and increasing the amount of code file content in the completion prompts can improve completion accuracy, while pruning the specific implementations of functions in all dependent files does not significantly reduce completion accuracy. Based on these findings, we propose a strategy named Hierarchical Context Pruning (HCP) to construct completion prompts with high informational density. HCP models the code repository at the function level, maintaining the topological dependencies between code files while removing a large amount of irrelevant code content, thereby significantly reducing the input length for repository-level code completion. We applied the HCP strategy in experiments with six Repo-Code LLMs, and the results demonstrate that our proposed method can significantly enhance completion accuracy while substantially reducing the input length. Our code and data are available at https://github.com/Hambaobao/HCP-Coder.
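The core pruning idea described above can be illustrated with a minimal sketch: strip the bodies of functions in dependency files while preserving their signatures and class structure, so the prompt retains the repository's interface-level topology at a fraction of the token cost. The helper name below is illustrative and is not the paper's actual implementation.

```python
import ast


def prune_function_bodies(source: str) -> str:
    """Replace every function body in a Python source file with `...`,
    keeping signatures and class structure intact. This sketches the
    kind of implementation pruning that HCP applies to dependent files."""
    tree = ast.parse(source)
    for node in ast.walk(tree):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            # Keep the signature; drop the implementation details.
            node.body = [ast.Expr(value=ast.Constant(value=...))]
    return ast.unparse(tree)
```

Applied to a dependent file, `def add(a, b): return a + b` becomes `def add(a, b): ...`, preserving the information a model needs to call the function without spending context on its implementation.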