Sanxing Chen, Xiaodong Liu, Jianfeng Gao, Jian Jiao, Ruofei Zhang, Yangfeng Ji
This paper examines the challenging problem of learning representations of entities and relations in a complex multi-relational knowledge graph. We propose HittER, a Hierarchical Transformer model to jointly learn Entity-relation composition and Relational contextualization based on a source entity's neighborhood. Our proposed model consists of two different Transformer blocks: the bottom block extracts features of each entity-relation pair in the local neighborhood of the source entity, and the top block aggregates the relational information from the outputs of the bottom block. We further design a masked entity prediction task to balance information from the relational context and from the source entity itself. Experimental results show that HittER achieves new state-of-the-art results on multiple link prediction datasets. We additionally propose a simple approach to integrate HittER into BERT and demonstrate its effectiveness on two Freebase factoid question answering datasets.
| Task | Dataset | Metric | Value | Model |
|---|---|---|---|---|
| Link Prediction | WN18RR | MRR | 0.503 | HittER |
| Link Prediction | WN18RR | Hits@1 | 0.462 | HittER |
| Link Prediction | WN18RR | Hits@3 | 0.516 | HittER |
| Link Prediction | WN18RR | Hits@10 | 0.584 | HittER |
| Link Prediction | FB15k-237 | MRR | 0.373 | HittER |
| Link Prediction | FB15k-237 | Hits@1 | 0.279 | HittER |
| Link Prediction | FB15k-237 | Hits@3 | 0.409 | HittER |
| Link Prediction | FB15k-237 | Hits@10 | 0.558 | HittER |
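To illustrate the two-level architecture described in the abstract, here is a minimal, hypothetical NumPy sketch of the hierarchical flow: a bottom attention block composes each entity-relation pair in the source entity's neighborhood into a single feature vector, and a top attention block aggregates those vectors into a contextualized representation used to score candidate target entities. All dimensions, the single-head attention without projections or feed-forward layers, and the embedding tables are simplifying assumptions, not the paper's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 16                                  # toy embedding dimension (assumption)
n_entities, n_relations = 50, 10        # toy graph size (assumption)
E = rng.normal(size=(n_entities, d))    # entity embedding table
R = rng.normal(size=(n_relations, d))   # relation embedding table
cls = rng.normal(size=(d,))             # special [CLS] token vector

def attention_block(X):
    """One toy self-attention layer (no Q/K/V projections or FFN, for brevity)."""
    scores = X @ X.T / np.sqrt(d)
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)
    return w @ X

def hitter_forward(neighborhood):
    """neighborhood: list of (entity_id, relation_id) pairs around the source entity."""
    # Bottom block: compose each entity-relation pair into one feature vector.
    pair_vecs = []
    for e, r in neighborhood:
        seq = np.stack([cls, E[e], R[r]])          # [CLS; entity; relation]
        pair_vecs.append(attention_block(seq)[0])  # [CLS] output = pair feature
    # Top block: aggregate relational information across all pair features.
    seq = np.stack([cls] + pair_vecs)
    ctx = attention_block(seq)[0]
    # Score every entity as the missing link target.
    return ctx @ E.T

scores = hitter_forward([(3, 1), (7, 2), (12, 0)])
print(scores.shape)  # (50,) -- one score per candidate entity
```

In the actual model the masked entity prediction objective would hide the source entity's own embedding from part of the input so the prediction must also rely on the relational context; that training detail is omitted here.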