Matthias Lalisse, Paul Smolensky
Neural models of Knowledge Base data have typically employed compositional representations of graph objects: entity and relation embeddings are systematically combined to evaluate the truth of a candidate Knowledge Base entry. Using a model inspired by Harmonic Grammar, we propose to tokenize triplet embeddings by subjecting them to a process of optimization with respect to learned well-formedness conditions on Knowledge Base triplets. The resulting model, known as Gradient Graphs, yields sizable improvements when implemented as a companion to compositional models. We also show that the "supracompositional" triplet token embeddings it produces have interpretable properties that prove helpful in performing inference on the resulting triplet representations.
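The core step described above, optimizing a compositional triplet embedding against a learned well-formedness (harmony) function, can be sketched as follows. This is a minimal illustration, not the paper's model: the concatenative composition, the quadratic harmony function `H(x) = x·Wx + b·x`, and all dimensions and step counts are assumptions chosen so the sketch is runnable; the paper's actual harmony function and training procedure are not given here.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # embedding dimension (illustrative)

# Compositional triplet embedding: here, concatenation of subject, relation,
# and object vectors (one simple choice; not the paper's exact composition).
s, r, o = (rng.standard_normal(d) for _ in range(3))
x0 = np.concatenate([s, r, o])

# A stand-in "learned" harmony function H(x) = x.Wx + b.x, with W made
# symmetric negative definite so gradient ascent has a unique maximum.
W = rng.standard_normal((3 * d, 3 * d))
W = 0.5 * (W + W.T) - 10.0 * np.eye(3 * d)
b = rng.standard_normal(3 * d)

def harmony(x):
    return x @ W @ x + b @ x

# Gradient ascent on the triplet embedding itself: the "optimization with
# respect to learned well-formedness conditions" step, sketched.
x = x0.copy()
lr = 0.01
for _ in range(300):
    x = x + lr * (2 * W @ x + b)  # dH/dx = 2Wx + b

# The optimized x plays the role of the "supracompositional" triplet token.
```

After optimization, `x` has strictly higher harmony than the purely compositional `x0`, which is the sense in which the token embedding is shaped by the well-formedness conditions rather than by composition alone.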
| Task | Dataset | Metric | Value | Model |
|---|---|---|---|---|
| Link Prediction | FB15k | Hits@1 | 0.727 | HHolE |
| Link Prediction | FB15k | Hits@10 | 0.901 | HHolE |
| Link Prediction | FB15k | Hits@3 | 0.848 | HHolE |
| Link Prediction | FB15k | MR | 21 | HHolE |
| Link Prediction | FB15k | MRR | 0.796 | HHolE |
| Link Prediction | WN18 | Hits@1 | 0.931 | HHolE |
| Link Prediction | WN18 | Hits@10 | 0.951 | HHolE |
| Link Prediction | WN18 | Hits@3 | 0.945 | HHolE |
| Link Prediction | WN18 | MR | 183 | HHolE |
| Link Prediction | WN18 | MRR | 0.939 | HHolE |