Papers With Code

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Pragmatically Informative Text Generation

Sheng Shen, Daniel Fried, Jacob Andreas, Dan Klein

2019-04-02 · NAACL 2019

Tasks: Data-to-Text Generation · Text Generation · Abstractive Text Summarization · Informativeness · Grounded Language Learning · Conditional Text Generation

Abstract

We improve the informativeness of models for conditional text generation using techniques from computational pragmatics. These techniques formulate language production as a game between speakers and listeners, in which a speaker should generate output text that a listener can use to correctly identify the original input that the text describes. While such approaches are widely used in cognitive science and grounded language learning, they have received less attention for more standard language generation tasks. We consider two pragmatic modeling methods for text generation: one where pragmatics is imposed by information preservation, and another where pragmatics is imposed by explicit modeling of distractors. We find that these methods improve the performance of strong existing systems for abstractive summarization and generation from structured meaning representations.
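The two pragmatic methods combine a base speaker model with a listener: a reconstructor that scores how well a candidate output identifies its original input. A minimal sketch of this kind of listener-aware reranking is shown below; the function name, the toy log-probability tables, and the interpolation weight `lam` are illustrative assumptions, not details taken from the paper.

```python
def pragmatic_rerank(candidates, speaker_logp, listener_logp, lam=0.5):
    """Pick the candidate output o that maximizes an RSA-style score:
    lam * log S_0(o | i) + (1 - lam) * log L(i | o), i.e. interpolating
    the base speaker's fluency score with a listener/reconstructor score
    of how recoverable the input i is from o. (Illustrative sketch.)"""
    best, best_score = None, float("-inf")
    for o in candidates:
        score = lam * speaker_logp[o] + (1 - lam) * listener_logp[o]
        if score > best_score:
            best, best_score = o, score
    return best

# Toy example: the base speaker slightly prefers "a", but "a" is far
# less informative to the listener, so the pragmatic score picks "b".
speaker_logp = {"a": -1.0, "b": -1.5}   # log S_0(o | i), hypothetical
listener_logp = {"a": -3.0, "b": -0.5}  # log L(i | o), hypothetical
choice = pragmatic_rerank(["a", "b"], speaker_logp, listener_logp)
```

With `lam=1.0` the score reduces to the base speaker alone, so the weight controls how strongly listener informativeness overrides fluency.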

Results

Task                     | Dataset           | Metric  | Value | Model
------------------------ | ----------------- | ------- | ----- | -----
Text Generation          | E2E NLG Challenge | BLEU    | 68.6  | S_1^R
Text Generation          | E2E NLG Challenge | CIDEr   | 2.37  | S_1^R
Text Generation          | E2E NLG Challenge | METEOR  | 45.25 | S_1^R
Text Generation          | E2E NLG Challenge | NIST    | 8.73  | S_1^R
Text Generation          | E2E NLG Challenge | ROUGE-L | 70.82 | S_1^R
Data-to-Text Generation  | E2E NLG Challenge | BLEU    | 68.6  | S_1^R
Data-to-Text Generation  | E2E NLG Challenge | CIDEr   | 2.37  | S_1^R
Data-to-Text Generation  | E2E NLG Challenge | METEOR  | 45.25 | S_1^R
Data-to-Text Generation  | E2E NLG Challenge | NIST    | 8.73  | S_1^R
Data-to-Text Generation  | E2E NLG Challenge | ROUGE-L | 70.82 | S_1^R

Related Papers

Making Language Model a Hierarchical Classifier and Generator (2025-07-17)
Mitigating Object Hallucinations via Sentence-Level Early Intervention (2025-07-16)
The Devil behind the mask: An emergent safety vulnerability of Diffusion LLMs (2025-07-15)
Seq vs Seq: An Open Suite of Paired Encoders and Decoders (2025-07-15)
Hashed Watermark as a Filter: Defeating Forging and Overwriting Attacks in Weight-based Neural Network Watermarking (2025-07-15)
Exploiting Leaderboards for Large-Scale Distribution of Malicious Models (2025-07-11)
CLI-RAG: A Retrieval-Augmented Framework for Clinically Structured and Context Aware Text Generation with LLMs (2025-07-09)
FIFA: Unified Faithfulness Evaluation Framework for Text-to-Video and Video-to-Text Generation (2025-07-09)