Model compression methods are widely used to deploy deep learning models on edge devices. However, it remains unclear which compression methods are effective for Structured State Space Sequence (S4) models with Diagonal State Space (DSS) layers, which are tailored to processing long-sequence data. In this paper, we propose a novel model compression method that applies balanced truncation, a prevalent model reduction technique in control theory, to the DSS layers of a pre-trained S4 model. Moreover, we propose using the reduced model parameters obtained by balanced truncation as initial parameters of S4 models with DSS layers during the main training process. Numerical experiments demonstrate that models trained with our method surpass conventionally trained models with Skew-HiPPO initialization in accuracy, even with fewer parameters. Furthermore, we observe a positive correlation: higher accuracy in the original model consistently leads to higher accuracy in models trained with our compression method, suggesting that our approach effectively leverages the strengths of the original model.
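As background, the balanced truncation referenced above can be sketched as follows. This is a minimal NumPy/SciPy illustration of the standard square-root algorithm for a small stable continuous-time state-space system (A, B, C), not the paper's implementation; the function name and the toy system are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, cholesky, svd

def balanced_truncation(A, B, C, r):
    """Reduce a stable system (A, B, C) to order r via square-root
    balanced truncation. Returns the reduced (A, B, C) and the
    Hankel singular values of the full system."""
    # Controllability and observability Gramians from the Lyapunov equations
    #   A P + P A^T + B B^T = 0  and  A^T Q + Q A + C^T C = 0
    P = solve_continuous_lyapunov(A, -B @ B.T)
    Q = solve_continuous_lyapunov(A.T, -C.T @ C)
    # Square-root balancing: the singular values of L_Q^T L_P are the
    # Hankel singular values; keep the r largest.
    L_P = cholesky(P, lower=True)
    L_Q = cholesky(Q, lower=True)
    U, s, Vt = svd(L_Q.T @ L_P)
    S_inv_sqrt = np.diag(s[:r] ** -0.5)
    T = L_P @ Vt.T[:, :r] @ S_inv_sqrt   # truncated balancing transform
    Ti = S_inv_sqrt @ U[:, :r].T @ L_Q.T  # its left inverse on the kept states
    return Ti @ A @ T, Ti @ B, C @ T, s

# Illustrative diagonal system (loosely mimicking a DSS-style state matrix)
A = np.diag([-1.0, -2.0, -3.0, -4.0, -5.0, -6.0])
B = np.ones((6, 1))
C = np.ones((1, 6))
Ar, Br, Cr, hsv = balanced_truncation(A, B, C, r=3)
```

The discarded Hankel singular values bound the approximation error, which is what makes states with small Hankel values safe to truncate.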