Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Graph Convolution over Pruned Dependency Trees Improves Relation Extraction

Yuhao Zhang, Peng Qi, Christopher D. Manning

2018-09-26 · EMNLP 2018 · Relation Extraction · Negation · Relation Classification

Paper · PDF · Code (official)

Abstract

Dependency trees help relation extraction models capture long-range relations between words. However, existing dependency-based models either neglect crucial information (e.g., negation) by pruning the dependency trees too aggressively, or are computationally inefficient because it is difficult to parallelize over different tree structures. We propose an extension of graph convolutional networks that is tailored for relation extraction, which pools information over arbitrary dependency structures efficiently in parallel. To incorporate relevant information while maximally removing irrelevant content, we further apply a novel pruning strategy to the input trees by keeping words immediately around the shortest path between the two entities among which a relation might hold. The resulting model achieves state-of-the-art performance on the large-scale TACRED dataset, outperforming existing sequence and dependency-based neural models. We also show through detailed analysis that this model has complementary strengths to sequence models, and combining them further improves the state of the art.
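The two ideas in the abstract — path-centered pruning of the dependency tree, then graph convolution over the pruned structure — can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the function names (`path_prune`, `gcn_layer`), the BFS-based distance computation, and the plain NumPy layer are all assumptions made for clarity, and the real model stacks several layers (with an LSTM underneath in the contextualized C-GCN variant) and pools token vectors into sentence and entity representations.

```python
import numpy as np
from collections import deque

def path_prune(adj, e1, e2, k=1):
    """Keep only tokens within distance k of the shortest dependency path
    between entity tokens e1 and e2 (the paper's path-centric pruning).
    `adj` is the undirected adjacency matrix of the dependency tree."""
    n = adj.shape[0]

    def bfs(src):
        dist = np.full(n, np.inf)
        dist[src] = 0
        q = deque([src])
        while q:
            u = q.popleft()
            for v in np.nonzero(adj[u])[0]:
                if dist[v] == np.inf:
                    dist[v] = dist[u] + 1
                    q.append(v)
        return dist

    d1, d2 = bfs(e1), bfs(e2)
    # In a tree, a token's distance to the e1-e2 path is
    # (d1 + d2 - d1[e2]) / 2, so this keeps tokens within k hops of it.
    keep = (d1 + d2 - d1[e2]) / 2 <= k
    pruned = adj * np.outer(keep, keep)  # drop edges touching pruned tokens
    return keep, pruned

def gcn_layer(adj, H, W, b):
    """One graph-convolution layer over the (pruned) tree: each token
    averages its own and its neighbours' transformed features, then ReLU."""
    A = adj + np.eye(adj.shape[0])    # self-loops so a token keeps its own state
    d = A.sum(axis=1, keepdims=True)  # degree normalization
    return np.maximum(0.0, A @ H @ W / d + b)
```

Because every token's update is a matrix product over the whole adjacency matrix, a batch of sentences with different tree shapes can be processed in parallel — the efficiency advantage the abstract claims over models that recurse over each tree structure separately.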

Results

Task                    | Dataset   | Metric | Value | Model
------------------------|-----------|--------|-------|----------------
Relation Extraction     | TACRED    | F1     | 68.2  | C-GCN + PA-LSTM
Relation Extraction     | TACRED    | F1     | 67.1  | GCN + PA-LSTM
Relation Extraction     | TACRED    | F1     | 66.4  | C-GCN
Relation Extraction     | TACRED    | F1     | 64.0  | GCN
Relation Extraction     | Re-TACRED | F1     | 80.3  | C-GCN
Relation Classification | TACRED    | F1     | 66.4  | C-GCN

Related Papers

DocIE@XLLM25: In-Context Learning for Information Extraction using Fully Synthetic Demonstrations (2025-07-08)
Modeling (Deontic) Modal Operators With the s(CASP) Goal-directed Predicate Answer Set Programming System (2025-07-07)
Step-by-Step Video-to-Audio Synthesis via Negative Audio Guidance (2025-06-26)
Multiple Streams of Relation Extraction: Enriching and Recalling in Transformers (2025-06-25)
Chaining Event Spans for Temporal Relation Grounding (2025-06-17)
Thunder-NUBench: A Benchmark for LLMs' Sentence-Level Negation Understanding (2025-06-17)
Summarization for Generative Relation Extraction in the Microbiome Domain (2025-06-10)
Propositional Logic for Probing Generalization in Neural Networks (2025-06-10)