Large language models (LLMs) such as ChatGPT have significantly advanced the field of Natural Language Processing (NLP). This trend has led to the development of code-based large language models such as StarCoder, WizardCoder, and CodeLlama, which are trained extensively on vast repositories of code and programming languages. While the generic abilities of these code LLMs are useful for many programmers in tasks like code generation, the area of high-performance computing (HPC) has a narrower set of requirements that make a smaller, more domain-specific model a smarter choice. This paper presents OMPGPT, a novel domain-specific model meticulously designed to harness the inherent strengths of language models for OpenMP pragma generation. Furthermore, we leverage prompt engineering techniques from the NLP domain to create Chain-of-OMP, an innovative strategy designed to enhance OMPGPT's effectiveness. Our extensive evaluations demonstrate that OMPGPT outperforms existing large language models specialized in OpenMP tasks while maintaining a notably smaller size, aligning it more closely with the typical hardware constraints of HPC environments. We consider our contribution a pivotal bridge connecting the advantages of language models with the specific demands of HPC tasks.