Description
ProphetNet is a sequence-to-sequence pre-training model that introduces a novel self-supervised objective, future n-gram prediction, together with an n-stream self-attention mechanism. Instead of optimizing one-step-ahead prediction as in traditional sequence-to-sequence models, ProphetNet is optimized by n-step-ahead prediction, which predicts the next n tokens simultaneously based on the previous context tokens at each time step. The future n-gram prediction objective explicitly encourages the model to plan for future tokens and helps it predict multiple future tokens.
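The objective can be illustrated with a short sketch. The following PyTorch snippet is an assumption-laden illustration, not ProphetNet's actual code: the names `future_ngram_loss` and `stream_logits`, and the assumption that the decoder exposes one logit stream per look-ahead distance (as the n-stream self-attention would produce), are hypothetical, and the real model additionally weights the streams rather than averaging them uniformly.

```python
import torch
import torch.nn.functional as F

# Sketch of the future n-gram prediction objective (illustrative only).
# We assume stream k of the decoder output predicts the token k+1 steps
# ahead of the current position, mirroring the n-stream self-attention.

def future_ngram_loss(stream_logits, targets, pad_id=0):
    """stream_logits: list of n tensors, each (batch, seq_len, vocab);
    stream k holds predictions for the token at position t + k + 1.
    targets: (batch, seq_len) gold token ids."""
    n = len(stream_logits)
    total = 0.0
    for k, logits in enumerate(stream_logits):
        # Shift targets so position t is supervised by the token k+1 ahead.
        shifted = targets[:, k + 1:]             # (batch, seq_len - k - 1)
        preds = logits[:, : shifted.size(1), :]  # align prediction length
        total = total + F.cross_entropy(
            preds.reshape(-1, preds.size(-1)),
            shifted.reshape(-1),
            ignore_index=pad_id,
        )
    return total / n  # uniform average over the n streams (a simplification)

# Toy usage with n = 2 streams over random "decoder outputs".
batch, seq_len, vocab, n = 2, 8, 100, 2
streams = [torch.randn(batch, seq_len, vocab) for _ in range(n)]
targets = torch.randint(1, vocab, (batch, seq_len))
print(future_ngram_loss(streams, targets))
```

With n = 1 this reduces to the standard next-token language modeling loss; the extra streams are what force the model to plan beyond the immediately next token.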
Papers Using This Method
SATS: simplification aware text summarization of scientific documents (2024-07-10)
Evaluating Text Summaries Generated by Large Language Models Using OpenAI's GPT (2024-05-07)
Analysis of Multidomain Abstractive Summarization Using Salience Allocation (2024-02-19)
Predicting Temperature of Major Cities Using Machine Learning and Deep Learning (2023-09-23)
Enhancing Pre-trained Models with Text Structure Knowledge for Question Generation (2022-09-09)
ProphetNet-X: Large-Scale Pre-training Models for English, Chinese, Multi-lingual, Dialog, and Code Generation (2021-04-16)
A Survey of Recent Abstract Summarization Techniques (2021-04-15)
GLGE: A New General Language Generation Evaluation Benchmark (2020-11-24)
ProphetNet: Predicting Future N-gram for Sequence-to-Sequence Pre-training (2020-11-01)
Topic-Guided Abstractive Text Summarization: a Joint Learning Approach (2020-10-20)
ProphetNet: Predicting Future N-gram for Sequence-to-Sequence Pre-training (2020-01-13)