21 May 2024 | Baolong Bi, Shenghua Liu, Lingrui Mei, Yiwei Wang, Pengliang Ji, Xueqi Cheng
This paper introduces Decoding by Contrasting Knowledge (DeCK), a novel decoding strategy that enhances in-context editing (ICE) in large language models (LLMs) for editing stubborn knowledge. Stubborn knowledge refers to facts with high pre-training confidence that are difficult to edit. DeCK raises the LLM's confidence in edited facts by contrasting the logits assigned to new knowledge with those derived from parametric knowledge. It consists of two components: an editing enhancement module that improves attention to the new knowledge, and a contrastive decoding strategy that predicts the next token by contrasting the logit distribution obtained after in-context editing with the model's original parametric logit distribution. Experiments show that DeCK significantly improves ICE performance, particularly on stubborn knowledge; for example, it boosts LLAMA3-8B-INSTRUCT on MQUAKE by up to 219%. DeCK can be integrated into any ICE method as a drop-in decoding component to strengthen editing capabilities. The work contributes to developing effective and accountable knowledge editing methods for LLMs.
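To make the contrastive step concrete, here is a minimal sketch of contrasting edited and parametric logit distributions, assuming a HuggingFace-style causal LM. The prompts, the contrast weight `alpha`, and the plausibility threshold `tau` are illustrative choices, not DeCK's exact formulation or hyperparameters.

```python
# Sketch: next-token prediction by contrasting logits with vs. without the
# in-context edit. Assumes a HuggingFace causal LM; any such model works.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "meta-llama/Meta-Llama-3-8B-Instruct"  # illustrative choice
tok = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
model.eval()

new_fact = "New fact: The capital of France is Marseille.\n"  # in-context edit
question = "Q: What is the capital of France?\nA:"

@torch.no_grad()
def contrastive_next_token(alpha: float = 1.0, tau: float = 0.1) -> str:
    # Logit distribution conditioned on the in-context edit (new knowledge).
    edited_ids = tok(new_fact + question, return_tensors="pt").input_ids
    edited_logp = torch.log_softmax(model(edited_ids).logits[0, -1], dim=-1)

    # Logit distribution from the question alone (parametric knowledge).
    plain_ids = tok(question, return_tensors="pt").input_ids
    plain_logp = torch.log_softmax(model(plain_ids).logits[0, -1], dim=-1)

    # Contrast the two distributions, restricted to tokens the edited
    # distribution itself finds plausible, so rare tokens are not amplified
    # by the subtraction (a common safeguard in contrastive decoding).
    plausible = edited_logp >= edited_logp.max() + torch.log(torch.tensor(tau))
    score = edited_logp - alpha * plain_logp
    score[~plausible] = float("-inf")
    return tok.decode(score.argmax().item())

print(contrastive_next_token())  # favors the edited answer over "Paris"
```

In this sketch, tokens supported by the edited context but weakly supported by the parametric distribution receive the largest contrastive scores, which is how the decoding step shifts probability mass toward the new knowledge; DeCK additionally applies its editing enhancement module before this contrast.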