Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Knowledge Graph Embedding with Iterative Guidance from Soft Rules

Shu Guo, Quan Wang, Lihong Wang, Bin Wang, Li Guo

2017-11-30 · Knowledge Graphs · Knowledge Graph Embedding · Graph Embedding · Link Prediction

Paper · PDF · Code (official)

Abstract

Embedding knowledge graphs (KGs) into continuous vector spaces is a focus of current research. Combining such an embedding model with logic rules has recently attracted increasing attention. Most previous attempts made a one-time injection of logic rules, ignoring the interaction between embedding learning and logical inference. They also focused only on hard rules, which always hold without exception and usually require extensive manual effort to create or validate. In this paper, we propose Rule-Guided Embedding (RUGE), a novel paradigm of KG embedding with iterative guidance from soft rules. RUGE enables an embedding model to learn simultaneously from 1) labeled triples that have been directly observed in a given KG, 2) unlabeled triples whose labels are to be predicted iteratively, and 3) soft rules with various confidence levels extracted automatically from the KG. During learning, RUGE iteratively queries the rules to obtain soft labels for unlabeled triples, and integrates these newly labeled triples to update the embedding model. Through this iterative procedure, knowledge embodied in the logic rules is better transferred into the learned embeddings. We evaluate RUGE on link prediction over Freebase and YAGO. Experimental results show that: 1) with rule knowledge injected iteratively, RUGE achieves significant and consistent improvements over state-of-the-art baselines; and 2) despite their uncertainty, automatically extracted soft rules are highly beneficial to KG embedding, even those with only moderate confidence levels. The code and data used for this paper can be obtained from https://github.com/iieir-km/RUGE.
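The alternating procedure the abstract describes — query soft rules to label unlabeled triples, then update the embeddings on both observed and soft-labeled triples — can be sketched in miniature. The following is a hypothetical simplification, not the paper's actual model or code: it uses a DistMult-style scorer, a single soft rule r0(x, y) ⇒ r1(x, y) with confidence 0.9, and plain SGD; all names and dimensions are illustrative.

```python
import numpy as np

# Toy sketch of RUGE's iterative loop (illustrative assumptions, not the
# authors' implementation). Entities/relations are indices; the scorer is
# DistMult squashed to (0, 1) so scores can serve as soft labels.
rng = np.random.default_rng(0)
dim, n_entities, n_relations = 8, 5, 2
E = rng.normal(scale=0.1, size=(n_entities, dim))   # entity embeddings
R = rng.normal(scale=0.1, size=(n_relations, dim))  # relation embeddings

def score(h, r, t):
    # DistMult triple score mapped through a sigmoid
    return 1.0 / (1.0 + np.exp(-np.sum(E[h] * R[r] * E[t])))

labeled = [(0, 0, 1), (1, 0, 2)]    # observed triples, hard label 1.0
unlabeled = [(0, 1, 1), (1, 1, 2)]  # rule-grounded candidates, labels predicted
rule_conf = 0.9                     # confidence of the soft rule r0 => r1

lr = 1.0
for step in range(500):
    # 1) rule querying: soft label each unlabeled triple from its premise score
    soft = [min(1.0, rule_conf * score(h, 0, t)) for (h, _, t) in unlabeled]
    # 2) embedding update on labeled plus newly soft-labeled triples
    for (h, r, t), y in zip(labeled + unlabeled, [1.0] * len(labeled) + soft):
        p = score(h, r, t)
        g = p - y                    # cross-entropy gradient w.r.t. the raw score
        E_h, R_r, E_t = E[h].copy(), R[r].copy(), E[t].copy()
        E[h] -= lr * g * R_r * E_t
        R[r] -= lr * g * E_h * E_t
        E[t] -= lr * g * E_h * R_r
```

As the observed premise triples approach label 1, the soft labels for the rule-implied triples rise toward the rule's confidence, so the rule knowledge is transferred into the embeddings gradually rather than injected once.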

Results

Task             Dataset  Metric   Value  Model
Link Prediction  FB15k    Hits@1   0.703  Rule-Guided Embedding
Link Prediction  FB15k    Hits@10  0.865  Rule-Guided Embedding
Link Prediction  FB15k    Hits@3   0.815  Rule-Guided Embedding
Link Prediction  FB15k    Hits@5   0.836  Rule-Guided Embedding
Link Prediction  FB15k    MRR      0.768  Rule-Guided Embedding
Link Prediction  YAGO37   Hits@1   0.34   Rule-Guided Embedding
Link Prediction  YAGO37   Hits@10  0.603  Rule-Guided Embedding
Link Prediction  YAGO37   Hits@3   0.482  Rule-Guided Embedding
Link Prediction  YAGO37   Hits@5   0.541  Rule-Guided Embedding
Link Prediction  YAGO37   MRR      0.431  Rule-Guided Embedding

Related Papers

SMART: Relation-Aware Learning of Geometric Representations for Knowledge Graphs (2025-07-17)
Topic Modeling and Link-Prediction for Material Property Discovery (2025-07-08)
Graph Collaborative Attention Network for Link Prediction in Knowledge Graphs (2025-07-05)
Understanding Generalization in Node and Link Prediction (2025-07-01)
Context-Driven Knowledge Graph Completion with Semantic-Aware Relational Message Passing (2025-06-29)
Active Inference AI Systems for Scientific Discovery (2025-06-26)
Enhancing LLM Tool Use with High-quality Instruction Data from Knowledge Graph (2025-06-26)
Generating Reliable Adverse event Profiles for Health through Automated Integrated Data (GRAPH-AID): A Semi-Automated Ontology Building Approach (2025-06-25)