Metric: BLEU (higher is better)
| # | Model | BLEU | Extra Data | Paper | Date | Code |
|---|---|---|---|---|---|---|
| 1 | Control Prefixes (A1, A2, T5-large) | 62.27 | Yes | Control Prefixes for Parameter-Efficient Text Ge... | 2021-10-15 | Code |
| 2 | Control Prefixes (A1, T5-large) | 61.94 | No | Control Prefixes for Parameter-Efficient Text Ge... | 2021-10-15 | Code |
| 3 | T5-large + Wiki + Position | 60.56 | Yes | Stage-wise Fine-tuning for Graph-to-Text Generat... | 2021-05-17 | Code |
| 4 | T5-large | 59.7 | No | Investigating Pretrained Language Models for Gra... | 2020-07-16 | Code |
| 5 | T5-Large | 57.1 | No | Text-to-Text Pre-Training for Data-to-Text Tasks | 2020-05-21 | Code |
| 6 | HTLM (prefix 0.1%) | 56.3 | Yes | HTLM: Hyper-Text Pre-Training and Prompting of L... | 2021-07-14 | - |
| 7 | DATATUNER_NO_FC | 52.9 | No | Have Your Text and Use It Too! End-to-End Neural... | 2020-04-08 | Code |
| 8 | Transformer (Pipeline) | 51.68 | No | Neural data-to-text generation: A comparison bet... | 2019-08-23 | Code |
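For reference, the scores above are BLEU values. As a rough illustration of how the metric works, below is a minimal single-sentence sketch (clipped n-gram precision up to 4-grams, geometric mean, brevity penalty). This is an assumption-laden simplification, not the multi-reference corpus-level evaluation script these papers actually report with (typically `multi-bleu.perl` or SacreBLEU), so it will not reproduce the table's numbers.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """Count n-grams of length n in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def bleu(candidate, references, max_n=4):
    """Simplified sentence-level BLEU: clipped n-gram precision
    with a brevity penalty. candidate/references are token lists."""
    log_prec_sum = 0.0
    for n in range(1, max_n + 1):
        cand_counts = ngrams(candidate, n)
        if not cand_counts:
            return 0.0  # candidate too short to have any n-grams
        # Clip each n-gram count by its maximum count in any reference.
        max_ref = Counter()
        for ref in references:
            for g, c in ngrams(ref, n).items():
                max_ref[g] = max(max_ref[g], c)
        clipped = sum(min(c, max_ref[g]) for g, c in cand_counts.items())
        if clipped == 0:
            return 0.0  # no overlap at this order; real BLEU smooths instead
        log_prec_sum += math.log(clipped / sum(cand_counts.values()))
    # Brevity penalty against the closest reference length.
    c = len(candidate)
    r = min((abs(len(ref) - c), len(ref)) for ref in references)[1]
    bp = 1.0 if c > r else math.exp(1 - r / c)
    return bp * math.exp(log_prec_sum / max_n)
```

A perfect match scores 1.0 (the table reports BLEU scaled by 100), and scores fall as n-gram overlap with the references drops or the candidate is shorter than the reference.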