Metric: F0.5 (higher is better)
| # | Model | F0.5 | Extra Data | Paper | Date | Code |
|---|---|---|---|---|---|---|
| 1 | Llama + 1M BT + gold | 76.75 | Yes | To Err Is Human, but Llamas Can Learn It Too | 2024-03-08 | Code |
| 2 | mT5-based multimodal MoE | 76.3 | Yes | - | - | - |
| 3 | gT5 xxl | 75.96 | Yes | A Simple Recipe for Multilingual Grammatical Error Correction | 2021-06-07 | Code |
| 4 | Transformer | 73.71 | Yes | Grammatical Error Correction in Low-Resource Scenarios | 2019-10-01 | Code |
| 5 | Transformer - synthetic pretrain only | 51.41 | Yes | Grammatical Error Correction in Low-Resource Scenarios | 2019-10-01 | Code |
| 6 | Multilayer Convolutional Encoder-Decoder | 43.35 | Yes | - | - | Code |
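For reference, F0.5 is the F-beta score with beta = 0.5, which weights precision twice as heavily as recall; in grammatical error correction it is typically computed over edit-level precision and recall. A minimal sketch of the standard F-beta formula (the `precision` and `recall` values here are illustrative, not taken from the table):

```python
def f_beta(precision: float, recall: float, beta: float = 0.5) -> float:
    """F-beta score: beta < 1 favours precision, beta > 1 favours recall."""
    if precision == 0.0 and recall == 0.0:
        return 0.0
    b2 = beta ** 2
    return (1 + b2) * precision * recall / (b2 * precision + recall)

# A system with precision 0.8 and recall 0.5 scores higher on F0.5
# than one with precision 0.5 and recall 0.8, since F0.5 rewards
# conservative, high-precision corrections.
print(round(f_beta(0.8, 0.5), 4))  # → 0.7143
print(round(f_beta(0.5, 0.8), 4))  # → 0.5405
```

The precision bias is deliberate in this benchmark: proposing a wrong "correction" is considered worse than missing an error.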