Papers With Code 2



Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Sequence-to-Sequence Knowledge Graph Completion and Question Answering

Apoorv Saxena, Adrian Kochsiek, Rainer Gemulla

2022-03-19 · ACL 2022
Tasks: Question Answering · Knowledge Graph Embedding · Knowledge Graph Completion · Prediction · Graph Embedding · Link Prediction
Links: Paper · PDF · Code (official)

Abstract

Knowledge graph embedding (KGE) models represent each entity and relation of a knowledge graph (KG) with low-dimensional embedding vectors. These methods have recently been applied to KG link prediction and question answering over incomplete KGs (KGQA). KGEs typically create an embedding for each entity in the graph, which results in large model sizes on real-world graphs with millions of entities. For downstream tasks these atomic entity representations often need to be integrated into a multi-stage pipeline, limiting their utility. We show that an off-the-shelf encoder-decoder Transformer model can serve as a scalable and versatile KGE model, obtaining state-of-the-art results for KG link prediction and incomplete KG question answering. We achieve this by posing KG link prediction as a sequence-to-sequence task and exchanging the triple scoring approach taken by prior KGE methods for autoregressive decoding. This simple but powerful method reduces the model size by up to 98% compared to conventional KGE models while keeping inference time tractable. After finetuning this model on the task of KGQA over incomplete KGs, our approach outperforms baselines on multiple large-scale datasets without extensive hyperparameter tuning.
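The core idea in the abstract — casting link prediction as sequence-to-sequence generation rather than triple scoring — can be illustrated with a minimal sketch of the input/output formatting. The prompt strings, function names, and example triple below are hypothetical and for illustration only; they are not the exact verbalization scheme used by the paper, which trains a T5-style encoder-decoder on such pairs.

```python
# Hypothetical sketch of the seq2seq framing of KG link prediction.
# A query (head, relation, ?) is verbalized into a text sequence for
# the encoder; the decoder then generates the answer entity's text
# mention token by token (autoregressive decoding), instead of scoring
# every candidate triple as conventional KGE models do.

def verbalize_query(entity: str, relation: str, direction: str = "tail") -> str:
    """Turn a link-prediction query into an encoder input string.

    direction="tail" asks for the object of (entity, relation, ?);
    direction="head" asks for the subject of (?, relation, entity).
    The prompt format here is illustrative, not the paper's exact one.
    """
    if direction == "tail":
        return f"predict tail: {entity} | {relation}"
    return f"predict head: {entity} | {relation}"

def target_sequence(entity: str) -> str:
    """The decoder's training target is simply the entity's textual
    mention — no per-entity embedding table is needed, which is why
    model size no longer grows with the number of entities."""
    return entity

# Illustrative triple: (Douglas Adams, author of, The Hitchhiker's Guide ...)
query = verbalize_query("Douglas Adams", "author of")
target = target_sequence("The Hitchhiker's Guide to the Galaxy")
```

Because the answer is generated as text, the same pretrained encoder-decoder can later be finetuned on KGQA by feeding natural-language questions as the input sequence.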

Results

Task             Dataset     Metric   Value   Model
Link Prediction  Wikidata5M  Hits@1   0.282   KGT5 ComplEx Ensemble
Link Prediction  Wikidata5M  Hits@10  0.426   KGT5 ComplEx Ensemble
Link Prediction  Wikidata5M  Hits@3   0.362   KGT5 ComplEx Ensemble
Link Prediction  Wikidata5M  MRR      0.336   KGT5 ComplEx Ensemble
Link Prediction  Wikidata5M  Hits@1   0.267   KGT5
Link Prediction  Wikidata5M  Hits@10  0.365   KGT5
Link Prediction  Wikidata5M  Hits@3   0.318   KGT5
Link Prediction  Wikidata5M  MRR      0.300   KGT5

Related Papers

- Multi-Strategy Improved Snake Optimizer Accelerated CNN-LSTM-Attention-Adaboost for Trajectory Prediction (2025-07-21)
- From Roots to Rewards: Dynamic Tree Reasoning with RL (2025-07-17)
- Enter the Mind Palace: Reasoning and Planning for Long-term Active Embodied Question Answering (2025-07-17)
- Vision-and-Language Training Helps Deploy Taxonomic Knowledge but Does Not Fundamentally Alter It (2025-07-17)
- City-VLM: Towards Multidomain Perception Scene Understanding via Multimodal Incomplete Learning (2025-07-17)
- SMART: Relation-Aware Learning of Geometric Representations for Knowledge Graphs (2025-07-17)
- Describe Anything Model for Visual Question Answering on Text-rich Images (2025-07-16)
- Is This Just Fantasy? Language Model Representations Reflect Human Judgments of Event Plausibility (2025-07-16)