Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Machine Translation on WMT2014 English-German

Metric: SacreBLEU (higher is better)
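SacreBLEU reports corpus-level BLEU with a standardized tokenization and signature so scores are comparable across papers. As a rough illustration of what the metric measures, here is a minimal pure-Python sketch of corpus BLEU-4 (modified n-gram precision plus brevity penalty); it is a simplification and omits SacreBLEU's fixed tokenization and smoothing options, and the function names are illustrative, not SacreBLEU's API.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """Multiset of n-grams in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def corpus_bleu(hypotheses, references, max_n=4):
    """Simplified corpus BLEU-4 on whitespace-tokenized strings.

    Accumulates clipped n-gram matches over the whole corpus, takes the
    geometric mean of the n-gram precisions, and applies the brevity
    penalty when hypotheses are shorter than references overall.
    """
    matches = [0] * max_n   # clipped n-gram matches per order
    totals = [0] * max_n    # hypothesis n-gram counts per order
    hyp_len = ref_len = 0
    for hyp, ref in zip(hypotheses, references):
        h, r = hyp.split(), ref.split()
        hyp_len += len(h)
        ref_len += len(r)
        for n in range(1, max_n + 1):
            hc, rc = ngrams(h, n), ngrams(r, n)
            matches[n - 1] += sum((hc & rc).values())  # clipped matches
            totals[n - 1] += max(len(h) - n + 1, 0)
    precisions = [m / t if t else 0.0 for m, t in zip(matches, totals)]
    if min(precisions) == 0:
        return 0.0  # no smoothing: any empty precision zeroes the score
    bp = 1.0 if hyp_len > ref_len else math.exp(1 - ref_len / hyp_len)
    return 100 * bp * math.exp(sum(math.log(p) for p in precisions) / max_n)
```

For example, a hypothesis identical to its reference scores 100.0. For reportable numbers, use the `sacrebleu` package itself, which pins tokenization and emits a reproducibility signature.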


Results

| # | Model | SacreBLEU | Extra Data | Paper | Date | Code |
|---|-------|-----------|------------|-------|------|------|
| 1 | Noisy back-translation | 33.8 | Yes | Understanding Back-Translation at Scale | 2018-08-28 | Code |
| 2 | Transformer Cycle (Rev) | 33.54 | No | Lessons on Parameter Sharing across Layers in Tr... | 2021-04-13 | Code |
| 3 | Transformer+Rep(Uni) | 32.35 | No | Rethinking Perturbations in Encoder-Decoders for... | 2021-04-05 | Code |
| 4 | MAT | 29.9 | No | Multi-branch Attentive Transformer | 2020-06-18 | Code |
| 5 | Transformer (ADMIN init) | 29.5 | No | Very Deep Transformers for Neural Machine Transl... | 2020-08-18 | Code |
| 6 | Evolved Transformer Big | 29.2 | No | The Evolved Transformer | 2019-01-30 | Code |
| 7 | Mega | 27.96 | No | Mega: Moving Average Equipped Gated Attention | 2022-09-21 | Code |