The transition from monolithic to multi-component neural architectures in advanced neural network controllers poses substantial challenges due to the high computational complexity of the latter. Conventional model compression techniques, such as structured pruning guided by norm-based metrics that estimate the relative importance of distinct parameter groups, often fail to capture functional significance. This paper introduces a component-aware pruning framework that utilizes gradient information to compute three distinct importance metrics during training: Gradient Accumulation, Fisher Information, and Bayesian Uncertainty. Experimental results with an autoencoder and a TD-MPC agent demonstrate that the proposed framework reveals critical structural dependencies and dynamic shifts in importance that static heuristics often miss, supporting more informed compression decisions.
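The three metrics named above can be illustrated with a minimal sketch. The paper does not specify its exact formulas, so the following assumes common definitions: Gradient Accumulation as the summed gradient magnitude per parameter group, Fisher Information as the empirical Fisher diagonal (sum of squared gradients), and Bayesian Uncertainty approximated here by the variance of gradients across training steps. The function name and the (steps × groups) gradient layout are hypothetical.

```python
def importance_scores(grads):
    """Per-group importance from a sequence of gradient snapshots.

    grads: rows = training steps, columns = parameter groups
    (one representative gradient value per group per step --
    a hypothetical layout chosen for illustration).
    """
    steps = len(grads)
    groups = len(grads[0])
    # Gradient Accumulation: summed gradient magnitude per group
    acc = [sum(abs(g[j]) for g in grads) for j in range(groups)]
    # Fisher Information: empirical Fisher diagonal (sum of squared gradients)
    fisher = [sum(g[j] ** 2 for g in grads) for j in range(groups)]
    # Bayesian Uncertainty: gradient variance across steps as a simple proxy
    means = [sum(g[j] for g in grads) / steps for j in range(groups)]
    var = [sum((g[j] - means[j]) ** 2 for g in grads) / steps
           for j in range(groups)]
    return acc, fisher, var

# Toy example: 4 training steps, 3 parameter groups.
# Group 2 receives almost no gradient, so a functional metric
# flags it as a pruning candidate even if its weight norm is large.
g = [[0.1, -0.5, 0.0],
     [0.2, -0.4, 0.0],
     [0.1,  0.6, 0.0],
     [-0.2, 0.5, 0.1]]
acc, fisher, var = importance_scores(g)
```

Note the design point this illustrates: a norm-based heuristic looks only at the weights themselves, whereas all three gradient-based scores rank group 2 lowest because the loss is nearly insensitive to it during training.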