Metric: F1 (higher is better)
| # | Model | F1 | Extra Data | Paper | Date | Code |
|---|---|---|---|---|---|---|
| 1 | SciBERT (Finetune) | 83.64 | No | SciBERT: A Pretrained Language Model for Scienti... | 2019-03-26 | Code |
| 2 | BioM-BERT | 80.00 | No | - | - | Code |
| 3 | BioLinkBERT (large) | 79.98 | No | LinkBERT: Pretraining Language Models with Docum... | 2022-03-29 | Code |
| 4 | SciFive Large | 78.00 | No | SciFive: a text-to-text transformer model for bi... | 2021-05-28 | Code |
| 5 | KeBioLM | 77.50 | No | Improving Biomedical Pretrained Language Models ... | 2021-04-21 | Code |
| 6 | BioT5X (base) | 77.40 | No | SciFive: a text-to-text transformer model for bi... | 2021-05-28 | Code |
| 7 | BioMegatron | 77.00 | No | BioMegatron: Larger Biomedical Domain Language M... | 2020-10-12 | Code |
| 8 | BioBERT | 76.46 | No | BioBERT: a pre-trained biomedical language repre... | 2019-01-25 | Code |
| 9 | NCBI_BERT(large) (P) | 74.40 | No | Transfer Learning in Biomedical Natural Language... | 2019-06-13 | Code |
| 10 | SciBERT (Base Vocab) | 73.70 | Yes | SciBERT: A Pretrained Language Model for Scienti... | 2019-03-26 | Code |
| 11 | ELECTRAMed | 72.94 | Yes | ELECTRAMed: a new pre-trained language represent... | 2021-04-19 | Code |
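The F1 scores above are the harmonic mean of precision and recall. A minimal sketch of the computation, assuming binary gold/predicted labels (the benchmark's own evaluation script may differ, e.g. micro- vs. macro-averaging over classes):

```python
def f1_score(gold, pred, positive=1):
    """F1 = harmonic mean of precision and recall for one positive class."""
    tp = sum(1 for g, p in zip(gold, pred) if g == positive and p == positive)
    fp = sum(1 for g, p in zip(gold, pred) if g != positive and p == positive)
    fn = sum(1 for g, p in zip(gold, pred) if g == positive and p != positive)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

# Two correct positives, one spurious positive, one missed positive:
print(round(100 * f1_score([1, 0, 1, 1, 0], [1, 1, 1, 0, 0]), 2))  # → 66.67
```

Scores in the table are reported on a 0-100 scale, hence the `100 *` in the usage line.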