Layer pruning offers a promising alternative to standard structured pruning, effectively reducing computational costs, latency, and memory footprint. While notable layer-pruning approaches aim to detect unimportant layers for removal, they often rely on a single criterion that may not fully capture the complex, underlying properties of layers. We propose a novel approach that combines multiple similarity metrics into a single expressive measure of low-importance layers, called the Consensus criterion. Our technique delivers a triple-win solution: low accuracy drop, high performance improvement, and increased robustness to adversarial attacks. With up to 78.80% FLOPs reduction and performance on par with state-of-the-art methods across different benchmarks, our approach reduces energy consumption and carbon emissions by up to 66.99% and 68.75%, respectively. Additionally, it avoids shortcut learning and improves robustness by up to 4 percentage points under various adversarial attacks. Overall, the Consensus criterion demonstrates its effectiveness in creating robust, efficient, and environmentally friendly pruned models.
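The core idea of combining multiple similarity metrics into one consensus score can be sketched as follows. This is a minimal illustration under assumptions of our own: the function names (`consensus_scores`, `layers_to_prune`), the min-max normalization, and the simple averaging are hypothetical, not the paper's exact formulation.

```python
import numpy as np

def consensus_scores(metric_scores):
    """Combine several per-layer similarity metrics into a single
    consensus score per layer (illustrative sketch only).

    metric_scores: dict mapping metric name -> per-layer similarity
    values, where higher similarity between a layer's input and output
    suggests the layer is less important.
    """
    n_layers = len(next(iter(metric_scores.values())))
    combined = np.zeros(n_layers, dtype=float)
    for scores in metric_scores.values():
        s = np.asarray(scores, dtype=float)
        # Min-max normalize each metric so no single criterion dominates.
        rng = s.max() - s.min()
        s = (s - s.min()) / rng if rng > 0 else np.zeros_like(s)
        combined += s
    # Average across metrics: the "consensus" importance signal.
    return combined / len(metric_scores)

def layers_to_prune(metric_scores, k):
    # Prune the k layers with the highest consensus similarity,
    # i.e. those the combined criteria agree are least important.
    return np.argsort(consensus_scores(metric_scores))[::-1][:k].tolist()
```

For example, with two metrics that both rank layer 0 as most redundant, `layers_to_prune({"cka": [0.9, 0.2, 0.8], "cos": [0.8, 0.1, 0.9]}, 1)` selects layer 0 even though neither metric alone is trusted outright.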