Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.

Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.

Graph Pre-training for AMR Parsing and Generation

Xuefeng Bai, Yulong Chen, Yue Zhang

2022-03-15 · ACL 2022
Tasks: Text Generation · AMR-to-Text Generation · AMR Parsing
Links: Paper · PDF · Code (official)

Abstract

Abstract meaning representation (AMR) highlights the core semantic information of text in a graph structure. Recently, pre-trained language models (PLMs) have advanced the tasks of AMR parsing and AMR-to-text generation. However, PLMs are typically pre-trained on textual data and are thus sub-optimal for modeling structural knowledge. To this end, we investigate graph self-supervised training to improve the structure awareness of PLMs over AMR graphs. In particular, we introduce two graph auto-encoding strategies for graph-to-graph pre-training and four tasks to integrate text and graph information during pre-training. We further design a unified framework to bridge the gap between pre-training and fine-tuning tasks. Experiments on both AMR parsing and AMR-to-text generation show the superiority of our model. To our knowledge, we are the first to consider pre-training on semantic graphs.
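The graph auto-encoding idea can be illustrated with a toy sketch: linearize an AMR graph into a bracketed token sequence, corrupt it by masking concept nodes, and train a seq2seq model to reconstruct the original sequence. The linearization and masking scheme below are simplified assumptions for illustration, not the paper's exact implementation.

```python
import random

def linearize_amr(triples, root):
    """Depth-first, PENMAN-style linearization of an AMR graph into tokens.
    `triples` maps a concept to its list of (relation, child-concept) pairs."""
    tokens = ["(", root]
    for rel, child in triples.get(root, []):
        tokens += [rel] + linearize_amr(triples, child)
    tokens.append(")")
    return tokens

def mask_nodes(tokens, ratio=0.3, seed=1):
    """Graph auto-encoding corruption: replace concept tokens with <mask>,
    keeping brackets and :role tokens so the graph skeleton survives.
    A seq2seq model is trained to recover the uncorrupted sequence."""
    rng = random.Random(seed)
    out = []
    for tok in tokens:
        is_concept = tok not in ("(", ")") and not tok.startswith(":")
        out.append("<mask>" if is_concept and rng.random() < ratio else tok)
    return out

# Toy AMR for "The boy wants to go": (want-01 :ARG0 boy :ARG1 go-02)
graph = {"want-01": [(":ARG0", "boy"), (":ARG1", "go-02")]}
target = linearize_amr(graph, "want-01")   # reconstruction target
source = mask_nodes(target)                # corrupted encoder input
```

In the paper's terms, such (source, target) pairs drive graph-to-graph denoising; the text-and-graph tasks additionally pair the sequence with the sentence it represents.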

Results

Task              Dataset            Metric  Value  Model
Semantic Parsing  The Little Prince  Smatch  79.8   AMRBART large
Semantic Parsing  LDC2017T10         Smatch  85.4   AMRBART large
Semantic Parsing  LDC2020T02         Smatch  84.2   AMRBART large
Semantic Parsing  New3               Smatch  76.9   AMRBART large
Semantic Parsing  Bio                Smatch  63.2   AMRBART large
AMR Parsing       The Little Prince  Smatch  79.8   AMRBART large
AMR Parsing       LDC2017T10         Smatch  85.4   AMRBART large
AMR Parsing       LDC2020T02         Smatch  84.2   AMRBART large
AMR Parsing       New3               Smatch  76.9   AMRBART large
AMR Parsing       Bio                Smatch  63.2   AMRBART large
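Smatch, the metric reported above, scores a predicted AMR against the gold graph as the F1 of matched (source, relation, target) triples under the best variable alignment. The sketch below computes triple F1 for a fixed alignment, omitting the full metric's hill-climbing search over variable mappings (a simplifying assumption).

```python
def triple_f1(pred_triples, gold_triples):
    """Simplified Smatch-style score: F1 over exactly matching triples.
    Full Smatch additionally searches for the variable mapping that
    maximizes the match count; here variables are assumed pre-aligned."""
    matched = len(set(pred_triples) & set(gold_triples))
    precision = matched / len(pred_triples) if pred_triples else 0.0
    recall = matched / len(gold_triples) if gold_triples else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

gold = [("w", "instance", "want-01"), ("b", "instance", "boy"),
        ("w", "ARG0", "b")]
pred = [("w", "instance", "want-01"), ("b", "instance", "boy"),
        ("w", "ARG1", "b")]  # wrong role label on one edge
score = triple_f1(pred, gold)  # 2 of 3 triples match
```

With two of three triples matched, precision and recall are both 2/3, giving an F1 of about 0.667; a table value of 85.4 corresponds to this F1 scaled to 0-100.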

Related Papers

Making Language Model a Hierarchical Classifier and Generator (2025-07-17)
Mitigating Object Hallucinations via Sentence-Level Early Intervention (2025-07-16)
The Devil behind the mask: An emergent safety vulnerability of Diffusion LLMs (2025-07-15)
Seq vs Seq: An Open Suite of Paired Encoders and Decoders (2025-07-15)
Hashed Watermark as a Filter: Defeating Forging and Overwriting Attacks in Weight-based Neural Network Watermarking (2025-07-15)
Exploiting Leaderboards for Large-Scale Distribution of Malicious Models (2025-07-11)
CLI-RAG: A Retrieval-Augmented Framework for Clinically Structured and Context Aware Text Generation with LLMs (2025-07-09)
FIFA: Unified Faithfulness Evaluation Framework for Text-to-Video and Video-to-Text Generation (2025-07-09)