The rapid scaling of Large Language Models (LLMs) underscores the need for efficient strategies to meet their ever-expanding computational and data demands. This survey provides a comprehensive analysis of two complementary paradigms: Knowledge Distillation (KD) and Dataset Distillation (DD), which compress the model and its training data, respectively, while preserving advanced reasoning capabilities and linguistic diversity. We first examine key methodologies in KD, such as task-specific alignment, rationale-based training, and multi-teacher frameworks, alongside DD techniques that synthesize compact, high-impact datasets through optimization-based gradient matching, latent-space regularization, and generative synthesis. Building on these foundations, we explore how integrating KD and DD can yield more effective and scalable compression strategies. Together, these approaches address persistent challenges in model scalability, architectural heterogeneity, and the preservation of emergent LLM abilities. We further highlight applications in domains such as healthcare and education, where distillation enables efficient deployment without sacrificing performance. Despite substantial progress, open challenges remain: preserving emergent reasoning and linguistic diversity, adapting efficiently to continually evolving teacher models and datasets, and establishing comprehensive evaluation protocols. By synthesizing methodological innovations, theoretical foundations, and practical insights, this survey charts a path toward sustainable, resource-efficient LLMs through tighter integration of KD and DD principles.
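To make the two paradigms concrete, the sketch below shows the canonical objectives each family builds on: the temperature-scaled soft-target loss from classic knowledge distillation (Hinton et al.), and a gradient-matching term of the kind used in optimization-based dataset distillation (the dataset-condensation line of work). This is a minimal illustration, not this survey's specific method; all model, tensor, and hyperparameter names (`teacher_logits`, `syn_x`, `temperature`, `alpha`, etc.) are illustrative assumptions.

```python
# Minimal PyTorch sketch of the two canonical objectives referenced above.
# Assumptions: classification logits of shape (batch, num_classes); the synthetic
# batch `syn_x` is a learnable leaf tensor (requires_grad=True); names are illustrative.
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, temperature=2.0, alpha=0.5):
    """Classic soft-target KD: blend teacher->student KL with hard-label CE."""
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature**2  # rescale so gradients match the hard-label term's magnitude
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

def gradient_matching_loss(model, criterion, real_x, real_y, syn_x, syn_y):
    """Optimization-based DD: make the gradients induced by a synthetic batch
    mimic those induced by a real batch, layer by layer (cosine distance)."""
    g_real = torch.autograd.grad(criterion(model(real_x), real_y), model.parameters())
    g_real = [g.detach() for g in g_real]  # targets: no gradient flows into real data
    g_syn = torch.autograd.grad(
        criterion(model(syn_x), syn_y), model.parameters(), create_graph=True
    )  # create_graph=True lets the distance backpropagate into syn_x
    return sum(
        1.0 - F.cosine_similarity(gs.flatten(), gr.flatten(), dim=0)
        for gs, gr in zip(g_syn, g_real)
    )
```

In the dataset-condensation literature, a term of this form is minimized with respect to the synthetic batch `syn_x` (and optionally soft labels) across many randomly initialized models, so that the small synthetic set reproduces the training signal of the full corpus.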