Large language models (LLMs) have attracted considerable attention across many fields for offering cost-effective solutions to diverse challenges, especially with advances in instruction tuning and quantization. E-commerce, with its complex tasks and extensive product-user interactions, presents a promising application area for LLMs. However, the domain-specific concepts and knowledge inherent to e-commerce pose significant challenges for adapting general LLMs. To address this issue, we developed \href{https://github.com/fzp0424/EC-Guide-KDDUP-2024}{EC-Guide}, a comprehensive e-commerce guide for instruction tuning and quantization of LLMs. We also heuristically integrated Chain-of-Thought (CoT) prompting during inference to enhance arithmetic performance. Our approach achieved 2nd place in Track 2 and 5th place in Track 5 at the \href{https://www.aicrowd.com/challenges/amazon-kdd-cup-2024-multi-task-online-shopping-challenge-for-llms}{Amazon KDD Cup'24}. Additionally, our solution is model-agnostic, enabling it to scale effectively to larger systems.
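To illustrate the idea of heuristically triggering CoT at inference time, the sketch below appends a zero-shot CoT trigger phrase only to questions that look arithmetic. This is a minimal illustration, not the authors' implementation: the keyword heuristic, function names, and trigger phrase are all assumptions for demonstration purposes.

```python
import re

# "Let's think step by step." is a widely used zero-shot CoT trigger phrase.
COT_TRIGGER = "Let's think step by step."

def looks_arithmetic(question: str) -> bool:
    """Hypothetical heuristic: flag questions that contain digits plus
    quantity/price keywords as likely arithmetic problems."""
    has_digit = bool(re.search(r"\d", question))
    has_keyword = bool(re.search(
        r"\b(how many|how much|total|price|cost|sum|per|discount)\b",
        question, re.IGNORECASE))
    return has_digit and has_keyword

def build_prompt(question: str) -> str:
    """Append the CoT trigger only when the arithmetic heuristic fires,
    leaving other questions unchanged."""
    if looks_arithmetic(question):
        return f"{question}\n{COT_TRIGGER}"
    return question

print(build_prompt("If each item costs $3, how many can I buy with $12?"))
print(build_prompt("What material is this jacket made of?"))
```

A rule like this keeps CoT overhead (longer generations) confined to the question types where step-by-step reasoning actually helps, which matches the "heuristic" framing in the abstract.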