Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Salience Allocation as Guidance for Abstractive Summarization

Fei Wang, Kaiqiang Song, Hongming Zhang, Lifeng Jin, Sangwoo Cho, Wenlin Yao, Xiaoyang Wang, Muhao Chen, Dong Yu

2022-10-22 · Abstractive Text Summarization
Paper · PDF · Code (official)

Abstract

Abstractive summarization models typically learn to capture salient information implicitly, from scratch. Recent work adds extractive summaries as guidance for abstractive summarization models, providing hints about salient content and achieving better performance. However, extractive summaries as guidance can be overly strict, leading to information loss or noisy signals, and they do not easily adapt to documents with varying degrees of abstractiveness. Because the number and allocation of salient content pieces vary, it is hard to find a fixed threshold for deciding which content should be included in the guidance. In this paper, we propose a novel summarization approach with flexible and reliable salience guidance, namely SEASON (SaliencE Allocation as Guidance for Abstractive SummarizatiON). SEASON uses the allocation of salience expectation to guide abstractive summarization and adapts well to articles with different degrees of abstractiveness. Automatic and human evaluations on two benchmark datasets show that the proposed method is effective and reliable. Empirical results on more than one million news articles reveal a natural fifteen-fifty salience split for news-article sentences, a useful insight for composing news articles.
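To make the idea of salience allocation concrete, here is a minimal sketch of one way to assign discrete salience degrees to document sentences. This is an illustrative simplification, not the paper's actual method: the unigram-recall scoring proxy, the equal-width bucketing scheme, and the function names (`rouge1_recall`, `allocate_salience`, `num_degrees`) are all assumptions introduced for this example; SEASON itself works with model-predicted salience expectations.

```python
from collections import Counter

def rouge1_recall(sentence_tokens, summary_tokens):
    """Fraction of reference-summary unigrams covered by a sentence.

    A crude salience proxy: sentences that cover more of the reference
    summary's vocabulary are treated as more salient.
    """
    sent_counts = Counter(sentence_tokens)
    summ_counts = Counter(summary_tokens)
    overlap = sum(min(count, sent_counts[tok]) for tok, count in summ_counts.items())
    return overlap / max(len(summary_tokens), 1)

def allocate_salience(doc_sentences, reference_summary, num_degrees=4):
    """Map each sentence to a discrete salience degree in [0, num_degrees - 1].

    Continuous scores are bucketed into equal-width degrees relative to the
    highest-scoring sentence, so the guidance is graded rather than a hard
    include/exclude split.
    """
    summary_tokens = reference_summary.lower().split()
    scores = [rouge1_recall(s.lower().split(), summary_tokens) for s in doc_sentences]
    max_score = max(scores) or 1.0  # guard against all-zero scores
    return [min(int(num_degrees * sc / max_score), num_degrees - 1) for sc in scores]
```

For example, given a three-sentence document and a short reference summary, the sentence sharing the most summary vocabulary receives the top degree, an unrelated sentence receives degree 0, and a partially overlapping sentence lands in between.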

Results

Task                           | Dataset          | Metric  | Value | Model
Text Summarization             | CNN / Daily Mail | ROUGE-1 | 46.27 | SEASON
Text Summarization             | CNN / Daily Mail | ROUGE-2 | 22.64 | SEASON
Text Summarization             | CNN / Daily Mail | ROUGE-L | 43.08 | SEASON
Abstractive Text Summarization | CNN / Daily Mail | ROUGE-1 | 46.27 | SEASON
Abstractive Text Summarization | CNN / Daily Mail | ROUGE-2 | 22.64 | SEASON
Abstractive Text Summarization | CNN / Daily Mail | ROUGE-L | 43.08 | SEASON
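The ROUGE scores above measure n-gram overlap between generated and reference summaries. As a rough illustration of what ROUGE-1 computes, here is a minimal unigram-F1 sketch; reported benchmark numbers come from the official ROUGE tooling (with stemming, multiple references, and ROUGE-2/ROUGE-L variants), so this simplified function will not reproduce the table's values exactly.

```python
from collections import Counter

def rouge1_f1(candidate, reference):
    """Unigram-overlap F1 between a candidate and a reference summary."""
    cand = candidate.lower().split()
    ref = reference.lower().split()
    # Multiset intersection counts each shared token at most as often
    # as it appears in both texts.
    overlap = sum((Counter(cand) & Counter(ref)).values())
    if overlap == 0:
        return 0.0
    precision = overlap / len(cand)
    recall = overlap / len(ref)
    return 2 * precision * recall / (precision + recall)
```

An identical candidate and reference score 1.0, disjoint texts score 0.0, and partial overlap falls in between.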

Related Papers

- Advancing Decoding Strategies: Enhancements in Locally Typical Sampling for LLMs (2025-06-03)
- ARC: Argument Representation and Coverage Analysis for Zero-Shot Long Document Summarization with Instruction Following LLMs (2025-05-29)
- Power-Law Decay Loss for Large Language Model Finetuning: Focusing on Information Sparsity to Enhance Generation Quality (2025-05-22)
- Enhancing Abstractive Summarization of Scientific Papers Using Structure Information (2025-05-20)
- Low-Resource Language Processing: An OCR-Driven Summarization and Translation Pipeline (2025-05-16)
- ProdRev: A DNN framework for empowering customers using generative pre-trained transformers (2025-05-14)
- A Split-then-Join Approach to Abstractive Summarization for Very Long Documents in a Low Resource Setting (2025-05-11)
- GASCADE: Grouped Summarization of Adverse Drug Event for Enhanced Cancer Pharmacovigilance (2025-05-07)