Qianhui Wu, Zijia Lin, Guoxin Wang, Hui Chen, Börje F. Karlsson, Biqing Huang, Chin-Yew Lin
For languages with no annotated resources, transferring knowledge from rich-resource languages is an effective solution for named entity recognition (NER). While all existing methods directly transfer a source-learned model to a target language, in this paper we propose to fine-tune the learned model with a few similar examples given a test case, which benefits prediction by leveraging the structural and semantic information conveyed in such similar examples. To this end, we present a meta-learning algorithm that finds a good model parameter initialization able to fast-adapt to a given test case, and we propose constructing multiple pseudo-NER tasks for meta-training by computing sentence similarities. To further improve the model's generalization ability across languages, we introduce a masking scheme and augment the loss function with an additional maximum term during meta-training. We conduct extensive experiments on cross-lingual NER with minimal resources over five target languages. The results show that our approach significantly outperforms existing state-of-the-art methods across the board.
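The meta-training loop described above can be sketched as a first-order MAML-style procedure: sample pseudo-tasks, take an inner adaptation step per task, and update the meta-initialization with the average post-adaptation gradient plus a maximum term over the worst task. This is a minimal toy sketch on linear regression standing in for NER; all names (`make_task`, `inner_lr`, `lam`, etc.) are illustrative assumptions, not the paper's actual code.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_task():
    """A pseudo-task: noisy samples of a random linear function.
    (Stands in for a pseudo-NER task built from similar sentences.)"""
    w_true = rng.normal()
    x = rng.normal(size=8)
    y = w_true * x + 0.01 * rng.normal(size=8)
    return x, y

def loss_and_grad(w, x, y):
    """Mean-squared error of a scalar linear model and its gradient in w."""
    err = w * x - y
    return float(np.mean(err ** 2)), float(np.mean(2 * err * x))

def meta_train(steps=200, tasks_per_step=4, inner_lr=0.1,
               outer_lr=0.05, lam=0.5):
    """Learn a meta-initialization w that fast-adapts to new tasks.

    lam weights the extra maximum term over the worst pseudo-task,
    mirroring the augmented loss described in the abstract.
    """
    w = 0.0  # meta-initialization being learned
    for _ in range(steps):
        post_losses, post_grads = [], []
        for _ in range(tasks_per_step):
            x, y = make_task()
            # Inner step: adapt the current initialization to this task.
            _, g = loss_and_grad(w, x, y)
            w_task = w - inner_lr * g
            # Outer signal: loss/gradient after adaptation
            # (first-order approximation, ignoring second derivatives).
            l_post, g_post = loss_and_grad(w_task, x, y)
            post_losses.append(l_post)
            post_grads.append(g_post)
        # Average post-adaptation gradient plus a max term on the worst task.
        worst = int(np.argmax(post_losses))
        meta_grad = float(np.mean(post_grads)) + lam * post_grads[worst]
        w -= outer_lr * meta_grad
    return w
```

At test time, the analogue of the paper's procedure would be one or a few inner steps from the learned `w` on examples retrieved by sentence similarity before predicting.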
| Task | Dataset | Metric | Value | Model |
|---|---|---|---|---|
| Cross-Lingual NER | CoNLL Dutch | F1 | 80.44 | Meta-Cross |
| Cross-Lingual NER | CoNLL Dutch | F1 | 79.57 | Base Model |
| Cross-Lingual NER | CoNLL German | F1 | 73.16 | Meta-Cross |
| Cross-Lingual NER | CoNLL German | F1 | 70.79 | Base Model |
| Cross-Lingual NER | CoNLL Spanish | F1 | 76.75 | Meta-Cross |
| Cross-Lingual NER | CoNLL Spanish | F1 | 74.59 | Base Model |
| Cross-Lingual NER | Europeana French | F1 | 55.30 | Meta-Cross |
| Cross-Lingual NER | Europeana French | F1 | 50.89 | Base Model |
| Cross-Lingual NER | MSRA | F1 | 77.89 | Meta-Cross |
| Cross-Lingual NER | MSRA | F1 | 76.42 | Base Model |