Papers With Code 2


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


May the Force Be with Your Copy Mechanism: Enhanced Supervised-Copy Method for Natural Language Generation

Sanghyuk Choi, Jeong-in Hwang, Hyungjong Noh, Yeonsoo Lee

Published 2021-12-20 · arXiv 2021 · Tasks: Data-to-Text Generation, Text Generation, Abstractive Text Summarization

Abstract

Recent neural sequence-to-sequence models with a copy mechanism have achieved remarkable progress in various text generation tasks. These models address the out-of-vocabulary problem and facilitate the generation of rare words. However, identifying which words need to be copied is difficult, as observed in prior copy models, which suffer from incorrect generation and a lack of abstractness. In this paper, we propose a novel supervised approach to a copy network that helps the model decide which words need to be copied and which need to be generated. Specifically, we re-define the objective function, which leverages source sequences and target vocabularies as guidance for copying. The experimental results on data-to-text generation and abstractive summarization tasks verify that our approach enhances copying quality and improves the degree of abstractness.
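The supervised objective relies on knowing, per target token, whether it should be copied from the source or generated from the vocabulary. Below is a minimal sketch of one plausible labeling heuristic consistent with the abstract's description (the function name and the exact rule are assumptions, not the paper's actual objective): mark a target token as "copy" when it appears in the source sequence but not in the target-side generation vocabulary.

```python
# Hypothetical sketch of deriving supervised copy labels; NOT the paper's
# exact objective. Rule assumed here: a target token is labeled "copy" (1)
# when it occurs in the source sequence but is absent from the target-side
# generation vocabulary; otherwise it is labeled "generate" (0).

def copy_labels(source_tokens, target_tokens, target_vocab):
    """Return a 0/1 copy label for each target token."""
    source_set = set(source_tokens)
    return [
        1 if tok in source_set and tok not in target_vocab else 0
        for tok in target_tokens
    ]

# Toy data-to-text example: the player name and the score are copied from
# the source record, while common words are generated from the vocabulary.
source = ["LeBron", "scored", "32", "points"]
target = ["LeBron", "had", "32", "points"]
vocab = {"had", "scored", "points"}
print(copy_labels(source, target, vocab))  # [1, 0, 1, 0]
```

Such labels could then supervise the model's copy gate, e.g. by adding a binary cross-entropy term on the gate probability to the standard generation loss; how the paper actually combines the terms is defined by its re-formulated objective.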

Results

Task | Dataset | Metric | Value | Model
Text Generation | RotoWire (Relation Generation) | count | 27.37 | Force-Copy
Text Generation | MLB Dataset (Content Selection) | Precision | 49.39 | Force-Copy
Text Generation | MLB Dataset (Content Selection) | Recall | 50.89 | Force-Copy
Text Generation | MLB Dataset (Content Ordering) | DLD | 21.16 | Force-Copy
Text Generation | MLB Dataset | BLEU | 10.5 | Force-Copy
Text Generation | RotoWire (Content Ordering) | BLEU | 15.8 | Force-Copy
Text Generation | RotoWire | BLEU | 17.26 | Force-Copy
Text Generation | MLB Dataset (Relation Generation) | Precision | 84.5 | Force-Copy
Text Generation | MLB Dataset (Relation Generation) | count | 21.05 | Force-Copy
Data-to-Text Generation | RotoWire (Relation Generation) | count | 27.37 | Force-Copy
Data-to-Text Generation | MLB Dataset (Content Selection) | Precision | 49.39 | Force-Copy
Data-to-Text Generation | MLB Dataset (Content Selection) | Recall | 50.89 | Force-Copy
Data-to-Text Generation | MLB Dataset (Content Ordering) | DLD | 21.16 | Force-Copy
Data-to-Text Generation | MLB Dataset | BLEU | 10.5 | Force-Copy
Data-to-Text Generation | RotoWire (Content Ordering) | BLEU | 15.8 | Force-Copy
Data-to-Text Generation | RotoWire | BLEU | 17.26 | Force-Copy
Data-to-Text Generation | MLB Dataset (Relation Generation) | Precision | 84.5 | Force-Copy
Data-to-Text Generation | MLB Dataset (Relation Generation) | count | 21.05 | Force-Copy

Related Papers

Making Language Model a Hierarchical Classifier and Generator (2025-07-17)
Mitigating Object Hallucinations via Sentence-Level Early Intervention (2025-07-16)
The Devil behind the mask: An emergent safety vulnerability of Diffusion LLMs (2025-07-15)
Seq vs Seq: An Open Suite of Paired Encoders and Decoders (2025-07-15)
Hashed Watermark as a Filter: Defeating Forging and Overwriting Attacks in Weight-based Neural Network Watermarking (2025-07-15)
Exploiting Leaderboards for Large-Scale Distribution of Malicious Models (2025-07-11)
CLI-RAG: A Retrieval-Augmented Framework for Clinically Structured and Context Aware Text Generation with LLMs (2025-07-09)
FIFA: Unified Faithfulness Evaluation Framework for Text-to-Video and Video-to-Text Generation (2025-07-09)