In the rapidly evolving field of artificial intelligence, deploying large language models (LLMs) poses increasingly pressing computational and environmental challenges. This paper introduces MELODI (Monitoring Energy Levels and Optimization for Data-driven Inference), a framework designed to monitor and analyze the energy consumed during LLM inference. MELODI enables detailed observation of power-consumption dynamics and supports the creation of a comprehensive dataset reflecting energy efficiency across varied deployment scenarios. The dataset generated with MELODI spans a broad range of LLM deployment frameworks, multiple language models, and extensive prompt datasets, enabling comparative analysis of energy use. Using this dataset, we investigate how prompt attributes, such as length and complexity, correlate with energy expenditure. Our findings reveal substantial disparities in energy efficiency, indicating ample scope for optimization and for the adoption of sustainable practices in LLM deployment. Our contribution lies not only in the MELODI framework but also in the novel dataset, a resource that other researchers can extend. MELODI thus serves as a foundational tool and dataset for advancing research into energy-conscious LLM deployment, steering the field toward a more sustainable future.