
Cross-lingual Language Model Pretraining

Guillaume Lample, Alexis Conneau

2019-01-22 · NeurIPS 2019 · Tasks: Machine Translation, Unsupervised Machine Translation, Natural Language Understanding, Translation, Language Modelling

Abstract

Recent studies have demonstrated the efficiency of generative pretraining for English natural language understanding. In this work, we extend this approach to multiple languages and show the effectiveness of cross-lingual pretraining. We propose two methods to learn cross-lingual language models (XLMs): one unsupervised that only relies on monolingual data, and one supervised that leverages parallel data with a new cross-lingual language model objective. We obtain state-of-the-art results on cross-lingual classification, unsupervised and supervised machine translation. On XNLI, our approach pushes the state of the art by an absolute gain of 4.9% accuracy. On unsupervised machine translation, we obtain 34.3 BLEU on WMT'16 German-English, improving the previous state of the art by more than 9 BLEU. On supervised machine translation, we obtain a new state of the art of 38.5 BLEU on WMT'16 Romanian-English, outperforming the previous best approach by more than 4 BLEU. Our code and pretrained models will be made publicly available.
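To make the two objectives concrete, below is a minimal Python sketch of how MLM and TLM training examples are assembled, following the paper's description: TLM concatenates a parallel sentence pair, resets position indices for the target sentence, and marks each half with a language identifier so the model can attend across languages to predict masked words. All function and variable names here are illustrative rather than taken from the authors' released code, and the BERT-style 80/10/10 token-replacement scheme is collapsed to always using [MASK] for brevity.

```python
# Illustrative sketch of MLM / TLM input construction (not the authors' code).
import random

MASK, PAD = "[MASK]", "[PAD]"

def mask_tokens(tokens, mask_prob=0.15):
    """BERT-style masking: hide ~15% of tokens and record the
    originals as prediction targets (loss only on masked positions)."""
    inputs, targets = [], []
    for tok in tokens:
        if random.random() < mask_prob:
            inputs.append(MASK)
            targets.append(tok)   # model must recover this token
        else:
            inputs.append(tok)
            targets.append(PAD)   # no loss on unmasked positions
    return inputs, targets

def tlm_example(src_tokens, tgt_tokens, src_lang="en", tgt_lang="fr"):
    """TLM: concatenate a parallel sentence pair and mask words in both
    languages. Position ids restart at 0 for the target sentence, and
    language ids distinguish the two halves, as described in the paper."""
    src_in, src_y = mask_tokens(src_tokens)
    tgt_in, tgt_y = mask_tokens(tgt_tokens)
    return {
        "tokens":    src_in + tgt_in,
        "targets":   src_y + tgt_y,
        "positions": list(range(len(src_in))) + list(range(len(tgt_in))),
        "langs":     [src_lang] * len(src_in) + [tgt_lang] * len(tgt_in),
    }

if __name__ == "__main__":
    ex = tlm_example("the cat sat".split(), "le chat était assis".split())
    print(ex["tokens"])
    print(ex["targets"])
```

The unsupervised MLM variant is the degenerate case of the above with a single monolingual sentence and no second half; the supervised TLM variant is what lets the model use context from one language to fill in masks in the other.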

Results

| Task | Dataset | Metric | Value | Model |
|------|---------|--------|-------|-------|
| Machine Translation | WMT2016 Romanian-English | BLEU | 35.3 | MLM pretraining |
| Machine Translation | WMT2014 English-French | BLEU | 33.4 | MLM pretraining for encoder and decoder |
| Machine Translation | WMT2014 French-English | BLEU | 33.3 | MLM pretraining for encoder and decoder |
| Machine Translation | WMT2016 English-German | BLEU | 26.4 | MLM pretraining for encoder and decoder |
| Machine Translation | WMT2016 English-Romanian | BLEU | 33.3 | MLM pretraining for encoder and decoder |
| Machine Translation | WMT2016 Romanian-English | BLEU | 31.8 | MLM pretraining for encoder and decoder |
| Machine Translation | WMT2016 German-English | BLEU | 34.3 | MLM pretraining for encoder and decoder |
| Natural Language Inference | XNLI French | Accuracy | 80.2 | XLM (MLM+TLM) |

Related Papers

Visual-Language Model Knowledge Distillation Method for Image Quality Assessment (2025-07-21)
A Translation of Probabilistic Event Calculus into Markov Decision Processes (2025-07-17)
Making Language Model a Hierarchical Classifier and Generator (2025-07-17)
VisionThink: Smart and Efficient Vision Language Model via Reinforcement Learning (2025-07-17)
The Generative Energy Arena (GEA): Incorporating Energy Awareness in Large Language Model (LLM) Human Evaluations (2025-07-17)
Inverse Reinforcement Learning Meets Large Language Model Post-Training: Basics, Advances, and Opportunities (2025-07-17)
Assay2Mol: large language model-based drug design using BioAssay context (2025-07-16)
Describe Anything Model for Visual Question Answering on Text-rich Images (2025-07-16)