Metric: BERT (higher is better)
| # | Model | BERT | Extra Data | Paper | Date | Code |
|---|---|---|---|---|---|---|
| 1 | T5B Baseline | 0.9505 | No | - | - | Code |
| 2 | FactT5B | 0.9505 | No | - | - | Code |
| 3 | JointGT Baseline | 0.9492 | No | - | - | Code |
| 4 | FactJointGT | 0.9492 | No | - | - | Code |
| 5 | HTLM (fine-tuning) | 0.94 | No | HTLM: Hyper-Text Pre-Training and Prompting of L... | 2021-07-14 | - |
| 6 | GPT-2-Large (fine-tuning) | 0.94 | No | HTLM: Hyper-Text Pre-Training and Prompting of L... | 2021-07-14 | - |
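The BERT column most plausibly reports a BERTScore-style F1 (values near 0.95 are typical for that metric). As a minimal sketch, assuming the metric is BERTScore's greedy token matching, the computation on precomputed token embeddings looks like this; the function name and toy embeddings are illustrative, not from the benchmark:

```python
import numpy as np

def bertscore_f1(cand_emb, ref_emb):
    """Greedy-matching F1 in the style of BERTScore, computed on
    precomputed token embeddings (one row per token). Toy sketch only:
    real BERTScore extracts contextual embeddings from a BERT model."""
    # Normalize rows so dot products are cosine similarities.
    c = cand_emb / np.linalg.norm(cand_emb, axis=1, keepdims=True)
    r = ref_emb / np.linalg.norm(ref_emb, axis=1, keepdims=True)
    sim = c @ r.T                       # pairwise cosine similarity matrix
    precision = sim.max(axis=1).mean()  # best reference match per candidate token
    recall = sim.max(axis=0).mean()     # best candidate match per reference token
    return 2 * precision * recall / (precision + recall)

# Identical token embeddings score a perfect 1.0.
x = np.array([[1.0, 0.0], [0.0, 1.0]])
print(round(bertscore_f1(x, x), 4))  # -> 1.0
```

Since the score is an F1 over soft token matches, two baselines with identical outputs (e.g. rows 1-2 and 3-4 above) necessarily tie to four decimal places.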