Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Relation Extraction on ChemProt

Metric: F1 (higher is better)
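For readers unfamiliar with the metric, micro-averaged F1 over predicted relations can be sketched as below. This is a minimal illustration, not the official ChemProt scorer; the entity-pair IDs and predictions are hypothetical, though the CPR:* relation classes are the ones ChemProt defines.

```python
def f1_score(gold, pred):
    """Micro-averaged F1 over relation predictions.

    gold and pred are sets of (entity_pair_id, relation_label) tuples;
    pairs with no relation are simply absent from both sets.
    """
    tp = len(gold & pred)                      # correctly predicted relations
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(gold) if gold else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Hypothetical example: 3 gold relations, 3 predictions, 1 exact match.
gold = {(1, "CPR:3"), (2, "CPR:4"), (3, "CPR:9")}
pred = {(1, "CPR:3"), (2, "CPR:9"), (4, "CPR:4")}
print(round(f1_score(gold, pred), 4))  # precision = recall = 1/3, so F1 = 0.3333
```

A prediction counts only if both the entity pair and the relation label match, which is why relabeling pair 2 as CPR:9 scores as a miss.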


Results

| # | Model | F1 | Extra Data | Paper | Date | Code |
|---|-------|-----|-----------|-------|------|------|
| 1 | SciBERT (Finetune) | 83.64 | No | SciBERT: A Pretrained Language Model for Scienti... | 2019-03-26 | Code |
| 2 | BioM-BERT | 80 | No | -- | -- | Code |
| 3 | BioLinkBERT (large) | 79.98 | No | LinkBERT: Pretraining Language Models with Docum... | 2022-03-29 | Code |
| 4 | SciFive Large | 78 | No | SciFive: a text-to-text transformer model for bi... | 2021-05-28 | Code |
| 5 | KeBioLM | 77.5 | No | Improving Biomedical Pretrained Language Models ... | 2021-04-21 | Code |
| 6 | BioT5X (base) | 77.4 | No | SciFive: a text-to-text transformer model for bi... | 2021-05-28 | Code |
| 7 | BioMegatron | 77 | No | BioMegatron: Larger Biomedical Domain Language M... | 2020-10-12 | Code |
| 8 | BioBERT | 76.46 | No | BioBERT: a pre-trained biomedical language repre... | 2019-01-25 | Code |
| 9 | NCBI_BERT (large) (P) | 74.4 | No | Transfer Learning in Biomedical Natural Language... | 2019-06-13 | Code |
| 10 | SciBERT (Base Vocab) | 73.7 | Yes | SciBERT: A Pretrained Language Model for Scienti... | 2019-03-26 | Code |
| 11 | ELECTRAMed | 72.94 | Yes | ELECTRAMed: a new pre-trained language represent... | 2021-04-19 | Code |