Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


modReLU

General · Introduced 2000 · 6 papers
Source Paper

Description

modReLU is an activation function that modifies the ReLU. It is a pointwise nonlinearity, $\sigma_{\text{modReLU}}(z) : \mathbb{C} \rightarrow \mathbb{C}$, which affects only the absolute value of a complex number, defined as:

$$\sigma_{\text{modReLU}}(z) = \begin{cases} (|z| + b)\dfrac{z}{|z|} & \text{if } |z| + b \geq 0 \\ 0 & \text{if } |z| + b < 0 \end{cases}$$

where $b \in \mathbb{R}$ is a bias parameter of the nonlinearity. For an $n_h$-dimensional hidden space, we learn $n_h$ nonlinearity bias parameters, one per dimension.
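The definition above can be sketched directly in NumPy. This is a minimal illustrative implementation, not taken from any of the listed papers' codebases; the small `eps` added to the denominator is an assumption to avoid dividing by zero when $|z| = 0$:

```python
import numpy as np

def modrelu(z: np.ndarray, b: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """modReLU: rescales only the magnitude of complex z, preserving its phase.

    z   : complex-valued array
    b   : real-valued bias, broadcastable to z's shape (one per hidden dim)
    eps : small constant guarding against division by zero (an assumption,
          not part of the original definition)
    """
    mag = np.abs(z)
    # (|z| + b) clipped at zero, applied along the unit-phase direction z/|z|
    scale = np.maximum(mag + b, 0.0)
    return scale * z / (mag + eps)

# Hypothetical usage: a 3-dimensional hidden state with per-dimension biases.
z = np.array([1.0 + 1.0j, 0.1 - 0.2j, -2.0 + 0.0j])
b = np.array([-0.5, -0.5, -0.5])
out = modrelu(z, b)
```

In this example the second entry has $|z| \approx 0.22 < 0.5$, so it is zeroed out, while the other entries keep their phase and have their magnitude reduced by $0.5$.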

Papers Using This Method

- SPECTRE: An FFT-Based Efficient Drop-In Replacement to Self-Attention for Long Contexts (2025-02-25)
- Optimal approximation using complex-valued neural networks (2023-03-29)
- Quantitative approximation results for complex-valued neural networks (2021-02-25)
- An AutoML-based Approach to Multimodal Image Sentiment Analysis (2021-02-16)
- Complex Unitary Recurrent Neural Networks using Scaled Cayley Transform (2018-11-09)
- Unitary Evolution Recurrent Neural Networks (2015-11-20)