Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


PEGASUS

Natural Language Processing · Introduced 2020 · 53 papers
Source Paper

Description

PEGASUS is a transformer-based model for abstractive summarization. It is pre-trained with a self-supervised objective called gap-sentences generation (GSG): whole sentences are removed from the input document and the model must generate them, which closely matches the downstream summarization task. As the paper describes its pre-training example: "both GSG and MLM are applied simultaneously to this example as pre-training objectives. Originally there are three sentences. One sentence is masked with [MASK1] and used as target generation text (GSG). The other two sentences remain in the input, but some tokens are randomly masked by [MASK2]."
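The quoted setup can be sketched in a few lines of Python. This is a toy illustration, not the actual PEGASUS implementation: the real model scores sentence importance with ROUGE against the rest of the document, while the `overlap` helper below uses a crude unigram-overlap stand-in, and the 15% token-masking rate is an assumption for the MLM part.

```python
import random


def make_gsg_example(sentences, mask_rate=0.15, seed=0):
    """Toy sketch of a PEGASUS-style pre-training example.

    Picks one "principal" gap sentence (scored here by unigram overlap
    with the rest of the document, a rough stand-in for the paper's
    ROUGE-based selection), replaces it with [MASK1] to form the GSG
    target, and randomly masks tokens in the remaining sentences with
    [MASK2] for the MLM objective.
    """
    rng = random.Random(seed)

    def overlap(i):
        # Fraction of this sentence's words that also appear elsewhere.
        words = set(sentences[i].lower().split())
        rest = {w for j, s in enumerate(sentences) if j != i
                for w in s.lower().split()}
        return len(words & rest) / max(len(words), 1)

    gap = max(range(len(sentences)), key=overlap)
    target = sentences[gap]  # GSG generation target

    input_sents = []
    for i, sent in enumerate(sentences):
        if i == gap:
            input_sents.append("[MASK1]")
        else:
            toks = [t if rng.random() > mask_rate else "[MASK2]"
                    for t in sent.split()]
            input_sents.append(" ".join(toks))
    return " ".join(input_sents), target


doc = ["PEGASUS is a summarization model.",
       "It was introduced by Google Research.",
       "The model masks whole sentences during pre-training."]
model_input, gsg_target = make_gsg_example(doc)
```

Here `model_input` contains a `[MASK1]` slot where the gap sentence was, plus possible `[MASK2]` tokens elsewhere, and `gsg_target` is the removed sentence the decoder must generate.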

Papers Using This Method

- Pegasus: A Universal Framework for Scalable Deep Learning Inference on the Dataplane (2025-06-06)
- QUAD-LLM-MLTC: Large Language Models Ensemble Learning for Healthcare Text Multi-Label Classification (2025-02-20)
- Implementing Large Quantum Boltzmann Machines as Generative AI Models for Dataset Balancing (2025-02-05)
- Extract-and-Abstract: Unifying Extractive and Abstractive Summarization within Single Encoder-Decoder Framework (2024-09-18)
- GLIMMER: Incorporating Graph and Lexical Features in Unsupervised Multi-Document Summarization (2024-08-19)
- BioLay_AK_SS at BioLaySumm: Domain Adaptation by Two-Stage Fine-Tuning of Large Language Models used for Biomedical Lay Summary Generation (2024-08-16)
- Factual Dialogue Summarization via Learning from Large Language Models (2024-06-20)
- Comparing Quantum Annealing and Spiking Neuromorphic Computing for Sampling Binary Sparse Coding QUBO Problems (2024-05-30)
- Evaluating Text Summaries Generated by Large Language Models Using OpenAI's GPT (2024-05-07)
- MEDVOC: Vocabulary Adaptation for Fine-tuning Pre-trained Language Models on Medical Text Summarization (2024-05-07)
- Analysis of Multidomain Abstractive Summarization Using Salience Allocation (2024-02-19)
- PEGASUS: Personalized Generative 3D Avatars with Composable Attributes (2024-02-16)
- Source Identification in Abstractive Summarization (2024-02-07)
- PEGASUS: Physically Enhanced Gaussian Splatting Simulation System for 6DoF Object Pose Dataset Generation (2024-01-04)
- Revisiting Zero-Shot Abstractive Summarization in the Era of Large Language Models from the Perspective of Position Bias (2024-01-03)
- Harnessing the Power of Prompt-based Techniques for Generating School-Level Questions using Large Language Models (2023-12-02)
- FaMeSumm: Investigating and Improving Faithfulness of Medical Summarization (2023-11-03)
- Abstractive Summarization of Large Document Collections Using GPT (2023-10-09)
- Minimum-length chain embedding for the phase unwrapping problem on D-Wave's advantage architecture (2023-09-19)
- Automatic Personalized Impression Generation for PET Reports Using Large Language Models (2023-09-18)