Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Reformer

Natural Language Processing · Introduced 2020 · 20 papers
Source Paper

Description

Reformer is a Transformer-based architecture that seeks to make efficiency improvements. Dot-product attention is replaced by one that uses locality-sensitive hashing, changing its complexity from O(L^2) to O(L log L), where L is the length of the sequence. Furthermore, Reformers use reversible residual layers instead of the standard residuals, which allows storing activations only once during training instead of N times, where N is the number of layers.
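The two ideas above can be illustrated with a minimal sketch (assumed illustrative code, not the reference implementation): angular LSH assigns each query/key vector to a bucket via random projections, so attention only needs to be computed within a bucket; and a reversible residual block lets the backward pass recompute its inputs from its outputs instead of storing them.

```python
import numpy as np

def lsh_buckets(vectors, n_buckets, seed=0):
    """Angular LSH sketch: project vectors onto random rotations and take
    the argmax over the concatenated [+proj, -proj] scores. Vectors that
    point in similar directions tend to land in the same bucket, so
    attention can be restricted to within-bucket pairs."""
    rng = np.random.default_rng(seed)
    d = vectors.shape[-1]
    rotations = rng.normal(size=(d, n_buckets // 2))
    proj = vectors @ rotations                               # (L, n_buckets/2)
    return np.argmax(np.concatenate([proj, -proj], axis=-1), axis=-1)

def reversible_block(x1, x2, f, g):
    """Forward pass of a reversible residual block (RevNet-style),
    where f and g stand in for attention / feed-forward sublayers."""
    y1 = x1 + f(x2)
    y2 = x2 + g(y1)
    return y1, y2

def invert_reversible_block(y1, y2, f, g):
    """Recover the block's inputs from its outputs, so intermediate
    activations need not be stored for every layer."""
    x2 = y2 - g(y1)
    x1 = y1 - f(x2)
    return x1, x2
```

Because each block is exactly invertible, activation memory stays constant in the number of layers rather than growing linearly, which is what enables training much deeper or longer-sequence models within the same memory budget.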

Papers Using This Method

- Cardioformer: Advancing AI in ECG Analysis with Multi-Granularity Patching and ResNet (2025-05-08)
- Fast-Powerformer: A Memory-Efficient Transformer for Accurate Mid-Term Wind Power Forecasting (2025-04-15)
- ReFormer: Generating Radio Fakes for Data Augmentation (2024-12-31)
- GLMHA: A Guided Low-rank Multi-Head Self-Attention for Efficient Image Restoration and Spectral Reconstruction (2024-10-01)
- Masked Face Recognition with Generative-to-Discriminative Representations (2024-05-27)
- A novel transformer-based approach for soil temperature prediction (2023-11-20)
- LegalRelectra: Mixed-domain Language Modeling for Long-range Legal Text Comprehension (2022-12-16)
- FECAM: Frequency Enhanced Channel Attention Mechanism for Time Series Forecasting (2022-12-02)
- Non-stationary Transformers: Exploring the Stationarity in Time Series Forecasting (2022-05-28)
- Discovering material information using hierarchical Reformer model on financial regulatory filings (2022-03-28)
- Dynamic Token Normalization Improves Vision Transformers (2021-12-05)
- End-to-End User Behavior Retrieval in Click-Through Rate Prediction Model (2021-08-10)
- ReFormer: The Relational Transformer for Image Captioning (2021-07-29)
- A Practical Survey on Faster and Lighter Transformers (2021-03-26)
- ReAssert: Deep Learning for Assert Generation (2020-11-19)
- Learning to Unknot (2020-10-28)
- Robustification of Segmentation Models Against Adversarial Perturbations In Medical Imaging (2020-09-23)
- Efficient Transformers: A Survey (2020-09-14)
- Neural Machine Translation with Joint Representation (2020-02-16)
- Reformer: The Efficient Transformer (2020-01-13)