Connecting Language and Knowledge with Heterogeneous Representations for Neural Relation Extraction
Peng Xu, Denilson Barbosa
Abstract
Knowledge Bases (KBs) require constant updating to reflect changes in the world they represent. For general-purpose KBs, this is often done through Relation Extraction (RE), the task of predicting KB relations expressed in text mentioning entities known to the KB. One way to improve RE is to use KB Embeddings (KBE) for link prediction. However, despite clear connections between RE and KBE, little has been done toward unifying these models systematically. We help close the gap with a framework that unifies the learning of RE and KBE models, leading to significant improvements over the state of the art in RE. The code is available at https://github.com/billy-inn/HRERE.
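The abstract describes jointly learning a text-side RE model and a KB-side embedding model. As a minimal sketch of what such a unified objective could look like (the specific losses and the weighting term `lam` below are illustrative assumptions, not the paper's exact formulation), one can combine a cross-entropy relation-classification loss with a TransE-style margin ranking loss over entity embeddings:

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over a logit vector.
    e = np.exp(z - z.max())
    return e / e.sum()

def re_loss(logits, label):
    """Text side: cross-entropy loss for relation classification."""
    probs = softmax(logits)
    return -np.log(probs[label])

def kbe_loss(head, rel, tail, neg_tail, margin=1.0):
    """KB side: TransE-style margin ranking loss.
    ||h + r - t|| for a true triple should be smaller (by a margin)
    than ||h + r - t'|| for a corrupted tail t'."""
    pos = np.linalg.norm(head + rel - tail)
    neg = np.linalg.norm(head + rel - neg_tail)
    return max(0.0, margin + pos - neg)

def joint_loss(logits, label, head, rel, tail, neg_tail, lam=0.5):
    """Hypothetical unified objective: weighted sum of both losses."""
    return re_loss(logits, label) + lam * kbe_loss(head, rel, tail, neg_tail)
```

In a full model both terms would share information (e.g., through shared relation representations) and be minimized jointly by gradient descent; the sketch only shows the shape of the combined objective.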
Results
| Task | Dataset | Metric | Value | Model |
|---|---|---|---|---|
| Relation Extraction | NYT Corpus | P@10% | 84.9 | HRERE |
| Relation Extraction | NYT Corpus | P@30% | 72.8 | HRERE |
Related Papers
- DocIE@XLLM25: In-Context Learning for Information Extraction using Fully Synthetic Demonstrations (2025-07-08)
- Topic Modeling and Link-Prediction for Material Property Discovery (2025-07-08)
- Graph Collaborative Attention Network for Link Prediction in Knowledge Graphs (2025-07-05)
- Understanding Generalization in Node and Link Prediction (2025-07-01)
- Context-Driven Knowledge Graph Completion with Semantic-Aware Relational Message Passing (2025-06-29)
- Multiple Streams of Relation Extraction: Enriching and Recalling in Transformers (2025-06-25)
- Directed Link Prediction using GNN with Local and Global Feature Fusion (2025-06-25)
- Demystifying Distributed Training of Graph Neural Networks for Link Prediction (2025-06-25)