CONTaiNER: Few-Shot Named Entity Recognition via Contrastive Learning

Sarkar Snigdha Sarathi Das, Arzoo Katiyar, Rebecca J. Passonneau, Rui Zhang
Named Entity Recognition (NER) in the few-shot setting is imperative for entity tagging in low-resource domains. Existing approaches only learn class-specific semantic features and intermediate representations from source domains. This limits generalizability to unseen target domains, resulting in suboptimal performance. To this end, we present CONTaiNER, a novel contrastive learning technique that optimizes the inter-token distribution distance for few-shot NER. Instead of optimizing class-specific attributes, CONTaiNER optimizes a generalized objective of differentiating between token categories based on their Gaussian-distributed embeddings. This effectively alleviates overfitting issues originating from training domains. Our experiments on several traditional test domains (OntoNotes, CoNLL'03, WNUT '17, GUM) and a new large-scale few-shot NER dataset (Few-NERD) demonstrate that, on average, CONTaiNER outperforms previous methods by 3%-13% absolute F1 points while showing consistent performance trends, even in challenging scenarios where previous approaches could not achieve appreciable performance.
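The core idea, representing each token as a Gaussian distribution and contrasting tokens by a distance between those distributions, can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the helper names (`gaussian_kl`, `token_distance`, `contrastive_loss`) and the toy 2-D embeddings are assumptions; the paper derives means and variances from BERT representations via projection heads, whereas here they are passed in directly.

```python
import math

def gaussian_kl(mu_p, var_p, mu_q, var_q):
    # KL( N(mu_p, diag(var_p)) || N(mu_q, diag(var_q)) ) for diagonal Gaussians
    return 0.5 * sum(
        math.log(vq / vp) + (vp + (mp - mq) ** 2) / vq - 1.0
        for mp, vp, mq, vq in zip(mu_p, var_p, mu_q, var_q)
    )

def token_distance(a, b):
    # Symmetric KL between two tokens, each given as a (mu, var) pair
    return 0.5 * (gaussian_kl(a[0], a[1], b[0], b[1]) +
                  gaussian_kl(b[0], b[1], a[0], a[1]))

def contrastive_loss(tokens, labels):
    """tokens: list of (mu, var) pairs; labels: gold entity tags.
    For each anchor token, same-label tokens act as positives and all
    other tokens as negatives; similarity is exp(-distance)."""
    losses = []
    for i, (tok_i, lab_i) in enumerate(zip(tokens, labels)):
        sims, pos = [], []
        for j, (tok_j, lab_j) in enumerate(zip(tokens, labels)):
            if i == j:
                continue
            s = math.exp(-token_distance(tok_i, tok_j))
            sims.append(s)
            if lab_j == lab_i:
                pos.append(s)
        if pos:  # skip anchors with no in-batch positive
            losses.append(-math.log(sum(pos) / sum(sims)))
    return sum(losses) / len(losses)

# Toy check: tokens whose label clusters are well separated in embedding
# space should incur a lower loss than tokens whose clusters overlap.
separated = [([0.0, 0.0], [1.0, 1.0]), ([0.1, 0.0], [1.0, 1.0]),
             ([5.0, 5.0], [1.0, 1.0]), ([5.1, 5.0], [1.0, 1.0])]
overlapping = [([0.0, 0.0], [1.0, 1.0]), ([0.1, 0.0], [1.0, 1.0]),
               ([0.2, 0.0], [1.0, 1.0]), ([0.3, 0.0], [1.0, 1.0])]
labels = [0, 0, 1, 1]
```

Minimizing this loss pulls same-category token distributions together and pushes different-category ones apart, without ever fitting class-specific decision boundaries, which is what the abstract credits for the improved transfer to unseen target domains.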
| Task | Dataset | Setting | F1 (%) | Model |
|---|---|---|---|---|
| Named Entity Recognition (NER) | Few-NERD (INTRA) | 10 way 1~2 shot | 33.84 | CONTaiNER |
| Named Entity Recognition (NER) | Few-NERD (INTRA) | 10 way 5~10 shot | 47.49 | CONTaiNER |
| Named Entity Recognition (NER) | Few-NERD (INTRA) | 5 way 1~2 shot | 40.43 | CONTaiNER |
| Named Entity Recognition (NER) | Few-NERD (INTRA) | 5 way 5~10 shot | 53.70 | CONTaiNER |
| Named Entity Recognition (NER) | Few-NERD (INTER) | 10 way 1~2 shot | 48.35 | CONTaiNER |
| Named Entity Recognition (NER) | Few-NERD (INTER) | 10 way 5~10 shot | 57.12 | CONTaiNER |
| Named Entity Recognition (NER) | Few-NERD (INTER) | 5 way 1~2 shot | 55.95 | CONTaiNER |
| Named Entity Recognition (NER) | Few-NERD (INTER) | 5 way 5~10 shot | 61.83 | CONTaiNER |