Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Language Models with Transformers

Chenguang Wang, Mu Li, Alexander J. Smola

2019-04-20 · arXiv 2019 · Neural Architecture Search · Language Modelling
Paper · PDF · Code (official)

Abstract

The Transformer architecture is superior to RNN-based models in computational efficiency. Recently, GPT and BERT have demonstrated the efficacy of Transformer models on various NLP tasks using language models pre-trained on large-scale corpora. Surprisingly, these Transformer architectures are suboptimal for language modeling itself: neither self-attention nor the positional encoding in the Transformer is able to efficiently incorporate the word-level sequential context crucial to language modeling. In this paper, we explore effective Transformer architectures for language modeling, including adding additional LSTM layers to better capture the sequential context while still keeping the computation efficient. We propose Coordinate Architecture Search (CAS) to find an effective architecture through iterative refinement of the model. Experimental results on PTB, WikiText-2, and WikiText-103 show that CAS achieves perplexities between 20.42 and 34.11 on all problems, i.e. on average an improvement of 12.0 perplexity units compared to state-of-the-art LSTMs. The source code is publicly available.
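The search procedure named in the abstract can be sketched as a greedy loop: start from a base architecture, generate candidate variants (e.g. by inserting an LSTM layer), and keep a variant whenever it lowers validation perplexity. The following is a minimal illustrative sketch, not the paper's released code: the layer representation, the `modify` transformation, and the toy evaluator are all assumptions made for illustration.

```python
# Illustrative sketch of an iterative coordinate-style architecture search.
# An architecture is a list of layer names; the evaluator is a stand-in for
# measuring validation perplexity of a trained model.

def modify(arch):
    """Generate candidate variants by inserting an LSTM layer at each position."""
    return [arch[:pos] + ["lstm"] + arch[pos:] for pos in range(len(arch) + 1)]

def toy_perplexity(arch):
    """Stand-in evaluator: rewards a single LSTM placed late in the stack."""
    n_lstm = arch.count("lstm")
    if n_lstm == 0:
        return 40.0
    last = max(i for i, layer in enumerate(arch) if layer == "lstm")
    return 30.0 + 5.0 * (n_lstm - 1) + (len(arch) - 1 - last)

def coordinate_search(arch, steps=3):
    """Greedily refine the architecture while candidates keep improving."""
    best, best_ppl = arch, toy_perplexity(arch)
    for _ in range(steps):
        improved = False
        for cand in modify(best):
            ppl = toy_perplexity(cand)
            if ppl < best_ppl:
                best, best_ppl = cand, ppl
                improved = True
        if not improved:
            break
    return best, best_ppl

arch, ppl = coordinate_search(["transformer"] * 4)
```

With the toy evaluator above, the search settles on a stack with one LSTM layer appended after the Transformer blocks, mirroring the abstract's idea of adding LSTM layers to capture sequential context.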

Results

Task | Dataset | Metric | Value | Model
Language Modelling | Penn Treebank (Word Level) | Test perplexity | 31.3 | BERT-Large-CAS
Language Modelling | Penn Treebank (Word Level) | Validation perplexity | 36.1 | BERT-Large-CAS
Language Modelling | WikiText-103 | Test perplexity | 20.4 | BERT-Large-CAS
Language Modelling | WikiText-103 | Validation perplexity | 19.6 | BERT-Large-CAS
Language Modelling | WikiText-2 | Test perplexity | 34.1 | BERT-Large-CAS
Language Modelling | WikiText-2 | Validation perplexity | 37.7 | BERT-Large-CAS
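The perplexity values reported above are the standard language-modeling metric: the exponential of the average negative log-likelihood per token. A small self-contained sketch (the token probabilities are made up for illustration):

```python
import math

def perplexity(token_probs):
    """Perplexity = exp(mean negative log-likelihood over the tokens)."""
    nll = -sum(math.log(p) for p in token_probs) / len(token_probs)
    return math.exp(nll)

# A model that assigns uniform probability 1/4 to each of four tokens
# has perplexity 4 -- it is "as confused as" a 4-way uniform choice.
print(round(perplexity([0.25, 0.25, 0.25, 0.25]), 1))  # prints 4.0
```

Lower is better: a test perplexity of 20.4 on WikiText-103 means the model's average per-token uncertainty is equivalent to choosing uniformly among about 20 words.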

Related Papers

Visual-Language Model Knowledge Distillation Method for Image Quality Assessment (2025-07-21)
DASViT: Differentiable Architecture Search for Vision Transformer (2025-07-17)
Making Language Model a Hierarchical Classifier and Generator (2025-07-17)
VisionThink: Smart and Efficient Vision Language Model via Reinforcement Learning (2025-07-17)
The Generative Energy Arena (GEA): Incorporating Energy Awareness in Large Language Model (LLM) Human Evaluations (2025-07-17)
Inverse Reinforcement Learning Meets Large Language Model Post-Training: Basics, Advances, and Opportunities (2025-07-17)
Assay2Mol: large language model-based drug design using BioAssay context (2025-07-16)
Describe Anything Model for Visual Question Answering on Text-rich Images (2025-07-16)