Model merging combines multiple fine-tuned models into a single model by adding their weight updates, providing a lightweight alternative to retraining. Existing methods primarily focus on resolving conflicts between task updates, leaving a complementary failure mode, the over-counting of shared knowledge, unaddressed. We show that when tasks share aligned spectral directions (i.e., overlapping singular vectors), a simple linear combination repeatedly accumulates these directions, inflating the singular values and biasing the merged model toward shared subspaces. To mitigate this issue, we propose Singular Value Calibration (SVC), a training-free and data-free post-processing method that quantifies subspace overlap and rescales inflated singular values to restore a balanced spectrum. Across vision and language benchmarks, SVC consistently improves strong merging baselines and achieves state-of-the-art performance. Furthermore, by modifying only the singular values, SVC improves the performance of Task Arithmetic by 13.0%. Code is available at: https://github.com/lyymuwu/SVC.
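The inflation mechanism can be seen in a minimal numerical sketch. This is an illustration of the phenomenon, not the authors' implementation: two toy task updates share one singular direction, so their sum counts that direction twice, and a simple clipping rule (a stand-in for SVC's overlap-aware rescaling) restores a balanced spectrum.

```python
import numpy as np

rng = np.random.default_rng(0)

# Orthonormal left/right singular directions shared across the toy tasks.
U, _ = np.linalg.qr(rng.standard_normal((8, 3)))
V, _ = np.linalg.qr(rng.standard_normal((8, 3)))

shared = np.outer(U[:, 0], V[:, 0])                  # direction common to both tasks
task_a = 1.0 * shared + 0.5 * np.outer(U[:, 1], V[:, 1])
task_b = 1.0 * shared + 0.5 * np.outer(U[:, 2], V[:, 2])

merged = task_a + task_b                             # simple linear combination
s = np.linalg.svd(merged, compute_uv=False)
print(np.round(s[:3], 2))                            # shared direction double-counted: [2.0, 0.5, 0.5]

# Calibration in the spirit of SVC (toy rule, not the paper's estimator):
# rescale inflated singular values back toward the per-task scale.
Um, sm, Vmt = np.linalg.svd(merged, full_matrices=False)
calibrated = (Um * np.minimum(sm, 1.0)) @ Vmt
print(np.round(np.linalg.svd(calibrated, compute_uv=False)[:3], 2))  # [1.0, 0.5, 0.5]
```

Only the singular values change; the singular vectors of the merged update are left untouched, mirroring the abstract's claim that SVC modifies the spectrum alone.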