Copy that! Editing Sequences by Copying Spans

Neural sequence-to-sequence models are finding increasing use in editing documents, for example in correcting a text document or repairing source code. In this paper, we argue that common seq2seq models (with a facility to copy single tokens) are not a natural fit for such tasks, as they have to explicitly copy each unchanged token...
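To illustrate the argument, the following sketch contrasts the two action spaces: a token-level copy mechanism must emit one action per target token, whereas a span-copy mechanism can reproduce an unchanged region in a single action. This is a hypothetical illustration with a simple greedy matcher, not the paper's model; function names and the matching strategy are assumptions for the example.

```python
def token_level_actions(src, tgt):
    # Baseline behaviour: one action per target token, copying it from the
    # source when possible and generating it from the vocabulary otherwise.
    return [("copy", t) if t in src else ("gen", t) for t in tgt]

def span_copy_actions(src, tgt):
    # Illustrative greedy matcher (an assumption, not the paper's decoder):
    # at each position, copy the longest source span matching the remaining
    # target, falling back to generating a single token.
    actions, i = [], 0
    while i < len(tgt):
        best_len = 0
        for j in range(len(src)):
            k = 0
            while j + k < len(src) and i + k < len(tgt) and src[j + k] == tgt[i + k]:
                k += 1
            best_len = max(best_len, k)
        if best_len > 0:
            actions.append(("copy_span", tuple(tgt[i:i + best_len])))
            i += best_len
        else:
            actions.append(("gen", tgt[i]))
            i += 1
    return actions

# A one-token code edit: "+" becomes "-"; everything else is unchanged.
src = "def add ( a , b ) : return a + b".split()
tgt = "def add ( a , b ) : return a - b".split()
print(len(token_level_actions(src, tgt)))  # → 12 (one action per target token)
print(len(span_copy_actions(src, tgt)))    # → 3  (copy span, generate "-", copy "b")
```

The gap grows with the length of the unchanged regions: for a realistic edit that touches a few tokens in a long file, the token-level decoder still pays for every unchanged token, while the span-copy decoder's action count scales with the number of edited regions.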
