ROME and MEMIT are widely regarded as two distinct model editing algorithms, the major difference between them being MEMIT's ability to perform batched edits. In this paper, we unify the two algorithms under a single conceptual umbrella, showing that both optimize the same goal, which we call the preservation-memorization objective. ROME optimizes this objective under an equality constraint and performs one edit at a time, whereas MEMIT employs a more flexible least-squares constraint that allows batched edits. We generalize ROME to batched editing under an equality constraint in the form of EMMET (Equality-constrained Mass Model Editing algorithm for Transformers), a new batched memory-editing algorithm. EMMET can perform batched edits up to a batch size of 10,000, with performance very similar to MEMIT across multiple dimensions. With the introduction of EMMET, we truly unify ROME and MEMIT and show that the two algorithms are equivalent in terms of their optimization objective, their abilities (singular and batched editing), their model-editing performance, and their limitations.
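To make the equality-constrained form of the preservation-memorization objective concrete, the following is a minimal NumPy sketch, under assumptions of our own: the edited layer is a linear map `W`, `K0` holds keys whose outputs should be preserved, and the batch of edits is given by keys `K_E` with target values `V_E`. Solving `min ||Δ K0||` subject to `(W + Δ) K_E = V_E` by Lagrange multipliers yields a closed-form batched update; the variable names and dimensions here are illustrative, not taken from any released implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

d_k, d_v = 8, 6          # key and value dimensions of the edited layer
n_preserve, batch = 32, 4  # preserved keys and batch of edits

W = rng.standard_normal((d_v, d_k))     # original layer weights
K0 = rng.standard_normal((d_k, n_preserve))  # keys whose outputs we preserve
K_E = rng.standard_normal((d_k, batch))      # keys for the batched edits
V_E = rng.standard_normal((d_v, batch))      # desired values for those keys

# Covariance of preserved keys; assumed invertible (n_preserve > d_k).
C = K0 @ K0.T
C_inv = np.linalg.inv(C)

# Residual between desired and current outputs on the edit keys.
R = V_E - W @ K_E

# Equality-constrained closed-form update:
#   Delta = R (K_E^T C^{-1} K_E)^{-1} K_E^T C^{-1}
Delta = R @ np.linalg.inv(K_E.T @ C_inv @ K_E) @ K_E.T @ C_inv
W_new = W + Delta

# The equality constraint is satisfied exactly (up to float precision):
# W_new @ K_E == V_E for the whole batch at once.
```

With `batch = 1` this reduces to a rank-one, ROME-style update; swapping the hard constraint for a least-squares penalty would instead give a MEMIT-style update that only approximately matches `V_E`.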