Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Enhanced Meta-Learning for Cross-lingual Named Entity Recognition with Minimal Resources

Qianhui Wu, Zijia Lin, Guoxin Wang, Hui Chen, Börje F. Karlsson, Biqing Huang, Chin-Yew Lin

Published: 2019-11-14
Tasks: Meta-Learning · Named Entity Recognition (NER) · Cross-Lingual Transfer · Cross-Lingual NER
Links: Paper · PDF · Code (official)

Abstract

For languages with no annotated resources, transferring knowledge from rich-resource languages is an effective solution for named entity recognition (NER). While all existing methods directly transfer a source-learned model to a target language, in this paper, we propose to fine-tune the learned model with a few similar examples given a test case, which can improve prediction by leveraging the structural and semantic information conveyed in such similar examples. To this end, we present a meta-learning algorithm to find a good model parameter initialization that can quickly adapt to the given test case, and propose to construct multiple pseudo-NER tasks for meta-training by computing sentence similarities. To further improve the model's generalization ability across different languages, we introduce a masking scheme and augment the loss function with an additional maximum term during meta-training. We conduct extensive experiments on cross-lingual named entity recognition with minimal resources over five target languages. The results show that our approach significantly outperforms existing state-of-the-art methods across the board.
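The adaptation scheme described in the abstract can be sketched as a first-order meta-update over pseudo-tasks built by sentence similarity. This is a minimal illustration under stated assumptions: the function names (`build_pseudo_tasks`, `meta_train`), the use of cosine similarity over sentence embeddings, the Reptile-style first-order update, and all learning rates are illustrative choices, not the paper's implementation (which additionally uses a masking scheme and a maximum term in the loss, omitted here).

```python
import numpy as np

def cosine_sim(a, b):
    # Cosine similarity between two sentence embedding vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def build_pseudo_tasks(sent_vecs, k=2):
    """For each source sentence, form a pseudo-NER task from the indices of
    its k most similar sentences (by cosine similarity of embeddings)."""
    tasks = []
    for i, q in enumerate(sent_vecs):
        sims = [(cosine_sim(q, v), j) for j, v in enumerate(sent_vecs) if j != i]
        sims.sort(reverse=True)
        tasks.append([i] + [j for _, j in sims[:k]])
    return tasks

def meta_train(theta, tasks, grad_fn, inner_lr=0.1, meta_lr=0.05):
    """First-order meta-update (Reptile-style): adapt a copy of the
    parameters on each pseudo-task with a few gradient steps, then move
    the shared initialization toward the adapted parameters."""
    for task in tasks:
        phi = theta.copy()
        for _ in range(3):  # a few inner-loop adaptation steps
            phi -= inner_lr * grad_fn(phi, task)
        theta += meta_lr * (phi - theta)
    return theta
```

At test time, the same inner loop would be run once more on the examples most similar to the test case before predicting, which is the "fast adaptation" the initialization is trained for.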

Results

| Task | Dataset | Metric | Value | Model |
|---|---|---|---|---|
| Cross-Lingual Transfer | CoNLL Dutch | F1 | 80.44 | Meta-Cross |
| Cross-Lingual Transfer | CoNLL Dutch | F1 | 79.57 | Base Model |
| Cross-Lingual Transfer | CoNLL German | F1 | 73.16 | Meta-Cross |
| Cross-Lingual Transfer | CoNLL German | F1 | 70.79 | Base Model |
| Cross-Lingual Transfer | CoNLL Spanish | F1 | 76.75 | Meta-Cross |
| Cross-Lingual Transfer | CoNLL Spanish | F1 | 74.59 | Base Model |
| Cross-Lingual Transfer | Europeana French | F1 | 55.3 | Meta-Cross |
| Cross-Lingual Transfer | Europeana French | F1 | 50.89 | Base Model |
| Cross-Lingual Transfer | MSRA | F1 | 77.89 | Meta-Cross |
| Cross-Lingual Transfer | MSRA | F1 | 76.42 | Base Model |

Related Papers

- Enhancing Cross-task Transfer of Large Language Models via Activation Steering (2025-07-17)
- Are encoders able to learn landmarkers for warm-starting of Hyperparameter Optimization? (2025-07-16)
- Imbalanced Regression Pipeline Recommendation (2025-07-16)
- CLID-MU: Cross-Layer Information Divergence Based Meta Update Strategy for Learning with Noisy Labels (2025-07-16)
- Mixture of Experts in Large Language Models (2025-07-15)
- HanjaBridge: Resolving Semantic Ambiguity in Korean LLMs via Hanja-Augmented Pre-Training (2025-07-15)
- Iceberg: Enhancing HLS Modeling with Synthetic Data (2025-07-14)
- Meta-Reinforcement Learning for Fast and Data-Efficient Spectrum Allocation in Dynamic Wireless Networks (2025-07-13)