Classification on Climabench
Metric: Evaluation Macro F1 (higher is better)
Results
| # | Model | Evaluation Macro F1 | Augmentations | Paper | Date | Code |
|---|---|---|---|---|---|---|
| 1 | CliReBERT (P0L3/clirebert_clirevocab_uncased) | 0.6545 | No | - | - | - |
| 2 | ClimateBERT (climatebert/distilroberta-base-climate-f) | 0.6437 | No | - | - | - |
| 3 | BERT (google-bert/bert-base-uncased) | 0.6139 | No | - | - | - |
| 4 | CliSciBERT (P0L3/cliscibert_scivocab_uncased) | 0.6050 | No | - | - | - |
| 5 | SciBERT (allenai/scibert_scivocab_cased) | 0.5931 | No | - | - | - |
| 6 | DistilRoBERTa (distilbert/distilroberta-base) | 0.5835 | No | - | - | - |
| 7 | SciClimateBERT (P0L3/sciclimatebert) | 0.5783 | No | - | - | - |
| 8 | RoBERTa (FacebookAI/roberta-base) | 0.5739 | No | - | - | - |
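For reference, the ranking metric above is macro-averaged F1: F1 is computed per class and then averaged with equal weight for every class, so rare classes count as much as common ones. A minimal sketch in plain Python (the function name and toy labels are illustrative, not part of the benchmark's evaluation code):

```python
def macro_f1(y_true, y_pred):
    """Macro-averaged F1: per-class F1 scores, averaged with equal class weight."""
    labels = sorted(set(y_true) | set(y_pred))
    f1_scores = []
    for c in labels:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        precision = tp / (tp + fp) if (tp + fp) else 0.0
        recall = tp / (tp + fn) if (tp + fn) else 0.0
        f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
        f1_scores.append(f1)
    return sum(f1_scores) / len(f1_scores)

# Toy example: three classes, one confusion ("a" predicted as "b")
y_true = ["a", "a", "b", "b", "c"]
y_pred = ["a", "b", "b", "b", "c"]
print(round(macro_f1(y_true, y_pred), 4))  # → 0.8222
```

Unlike micro-averaged F1, this score drops sharply when a model fails on a small class, which is why it is the preferred metric for imbalanced classification benchmarks.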