Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


KEPLER: A Unified Model for Knowledge Embedding and Pre-trained Language Representation

Xiaozhi Wang, Tianyu Gao, Zhaocheng Zhu, Zhengyan Zhang, Zhiyuan Liu, Juanzi Li, Jian Tang

Published: 2019-11-13
Tasks: Knowledge Graphs · Relation Extraction · Knowledge Graph Embeddings · Entity Embeddings · Knowledge Graph Completion · Inductive Knowledge Graph Completion · Relation Classification · Entity Typing · Language Modelling · Link Prediction
Links: Paper · PDF · Code (official)

Abstract

Pre-trained language representation models (PLMs) cannot effectively capture factual knowledge from text. In contrast, knowledge embedding (KE) methods can effectively represent the relational facts in knowledge graphs (KGs) with informative entity embeddings, but conventional KE models cannot take full advantage of the abundant textual information. In this paper, we propose a unified model for Knowledge Embedding and Pre-trained LanguagE Representation (KEPLER), which can not only better integrate factual knowledge into PLMs but also produce effective text-enhanced KE with strong PLMs. In KEPLER, we encode textual entity descriptions with a PLM as their embeddings, and then jointly optimize the KE and language modeling objectives. Experimental results show that KEPLER achieves state-of-the-art performance on various NLP tasks, and also works remarkably well as an inductive KE model on KG link prediction. Furthermore, for pre-training and evaluating KEPLER, we construct Wikidata5M, a large-scale KG dataset with aligned entity descriptions, and benchmark state-of-the-art KE methods on it. It can serve as a new KE benchmark and facilitate research on large KGs, inductive KE, and KGs with text. The source code can be obtained from https://github.com/THU-KEG/KEPLER.
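The joint objective the abstract describes (entity embeddings produced by encoding each entity's textual description with a PLM, trained with a KE loss alongside the language modeling loss) can be sketched as follows. This is an illustrative PyTorch sketch, not the official THU-KEG implementation: the tiny `nn.EmbeddingBag` encoder stands in for the RoBERTa text encoder, and the margin-based loss is a simplification of the paper's negative-sampling KE loss.

```python
import torch
import torch.nn as nn

class KeplerSketch(nn.Module):
    """Illustrative sketch of KEPLER's KE objective (not the official code).

    Entity embeddings are the encodings of the entities' textual
    descriptions; the same encoder would also be trained with a masked
    language modeling (MLM) loss, omitted here for brevity.
    """

    def __init__(self, vocab_size=100, dim=32, n_relations=10):
        super().__init__()
        # Stand-in for the RoBERTa description encoder used in the paper.
        self.encoder = nn.EmbeddingBag(vocab_size, dim)
        self.rel_emb = nn.Embedding(n_relations, dim)

    def encode(self, desc_token_ids):
        # Entity embedding = encoding of its textual description,
        # so unseen entities can be embedded from text alone (inductive KE).
        return self.encoder(desc_token_ids)

    def ke_loss(self, head_desc, rel_ids, tail_desc, neg_tail_desc, gamma=1.0):
        # TransE-style score -||h + r - t|| with a margin loss against
        # a corrupted (negative) tail.
        h = self.encode(head_desc)
        r = self.rel_emb(rel_ids)
        t = self.encode(tail_desc)
        t_neg = self.encode(neg_tail_desc)
        pos = (h + r - t).norm(dim=-1)
        neg = (h + r - t_neg).norm(dim=-1)
        return torch.relu(gamma + pos - neg).mean()
```

In the full model, this KE loss and the MLM loss share the same encoder and are summed into one training objective, which is how factual knowledge enters the PLM.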

Results

Task | Dataset | Metric | Value | Model
Link Prediction | Wikidata5M | Hits@1 | 0.252 | SimplE
Link Prediction | Wikidata5M | Hits@10 | 0.377 | SimplE
Link Prediction | Wikidata5M | Hits@3 | 0.317 | SimplE
Link Prediction | Wikidata5M | MRR | 0.296 | SimplE
Link Prediction | Wikidata5M | Hits@1 | 0.234 | RotatE
Link Prediction | Wikidata5M | Hits@10 | 0.39 | RotatE
Link Prediction | Wikidata5M | Hits@3 | 0.322 | RotatE
Link Prediction | Wikidata5M | MRR | 0.29 | RotatE
Link Prediction | Wikidata5M | Hits@1 | 0.228 | ComplEx
Link Prediction | Wikidata5M | Hits@10 | 0.373 | ComplEx
Link Prediction | Wikidata5M | Hits@3 | 0.31 | ComplEx
Link Prediction | Wikidata5M | MRR | 0.281 | ComplEx
Link Prediction | Wikidata5M | Hits@1 | 0.17 | TransE
Link Prediction | Wikidata5M | Hits@10 | 0.392 | TransE
Link Prediction | Wikidata5M | Hits@3 | 0.311 | TransE
Link Prediction | Wikidata5M | MRR | 0.253 | TransE
Link Prediction | Wikidata5M | Hits@1 | 0.208 | DistMult
Link Prediction | Wikidata5M | Hits@10 | 0.334 | DistMult
Link Prediction | Wikidata5M | Hits@3 | 0.278 | DistMult
Link Prediction | Wikidata5M | MRR | 0.253 | DistMult
Link Prediction | Wikidata5M | Hits@1 | 0.173 | KEPLER-Wiki-rel
Link Prediction | Wikidata5M | Hits@10 | 0.277 | KEPLER-Wiki-rel
Link Prediction | Wikidata5M | Hits@3 | 0.224 | KEPLER-Wiki-rel
Link Prediction | Wikidata5M | MRR | 0.21 | KEPLER-Wiki-rel
Relation Extraction | TACRED | F1 | 71.7 | KEPLER
Relation Classification | TACRED | F1 | 71.7 | KEPLER
Inductive Knowledge Graph Completion | Wikidata5M-ind | Hits@1 | 0.222 | KEPLER-Wiki-rel
Inductive Knowledge Graph Completion | Wikidata5M-ind | Hits@10 | 0.73 | KEPLER-Wiki-rel
Inductive Knowledge Graph Completion | Wikidata5M-ind | Hits@3 | 0.514 | KEPLER-Wiki-rel
Inductive Knowledge Graph Completion | Wikidata5M-ind | MRR | 0.402 | KEPLER-Wiki-rel
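For readers unfamiliar with the link-prediction metrics above: MRR is the mean reciprocal rank of the correct entity among all candidate entities, and Hits@k is the fraction of test triples whose correct entity is ranked in the top k. A minimal sketch of how both are computed from per-triple ranks (the function name is ours, not from the paper):

```python
def mrr_and_hits(ranks, ks=(1, 3, 10)):
    """Compute MRR and Hits@k from 1-based ranks of the correct entity.

    ranks: iterable of ints, the rank of the gold entity for each test triple.
    Returns (mrr, {k: hits@k}).
    """
    ranks = list(ranks)
    n = len(ranks)
    mrr = sum(1.0 / r for r in ranks) / n          # mean reciprocal rank
    hits = {k: sum(r <= k for r in ranks) / n for k in ks}  # top-k accuracy
    return mrr, hits
```

For example, ranks of 1, 2, and 10 over three test triples give MRR = (1 + 1/2 + 1/10) / 3 ≈ 0.533, Hits@1 = 1/3, and Hits@10 = 1.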
