Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


PELU

Parametric Exponential Linear Unit

General · Introduced 2016 · 2 papers
Source Paper

Description

The Parametric Exponential Linear Unit, or PELU, is an activation function for neural networks. It learns a parameterization of the ELU so that each layer of a CNN can adapt the activation's shape during training.

The PELU has two additional parameters over the ELU:

$$f\left(x\right) = \begin{cases} cx & \text{if } x > 0 \\ \alpha\left(\exp\left(\frac{x}{b}\right) - 1\right) & \text{if } x \leq 0 \end{cases}$$

where $\alpha$, $b$, and $c > 0$. Here $c$ changes the slope in the positive quadrant, $b$ controls the scale of the exponential decay, and $\alpha$ controls the saturation in the negative quadrant.
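The piecewise definition above is straightforward to sketch in code. The following is a minimal, illustrative NumPy implementation (not from the source paper): `alpha`, `b`, and `c` are exposed as plain keyword arguments, whereas in a real network they would be learnable per-layer parameters constrained to stay positive.

```python
import numpy as np

def pelu(x, alpha=1.0, b=1.0, c=1.0):
    """Parametric Exponential Linear Unit (illustrative sketch).

    alpha, b, c correspond to the parameters described above;
    all are assumed positive, as in the original formulation.
    """
    x = np.asarray(x, dtype=float)
    # Positive quadrant: linear with slope c.
    # Negative quadrant: exponential decay scaled by b, saturating at -alpha.
    return np.where(x > 0, c * x, alpha * (np.exp(x / b) - 1.0))
```

Note that for large negative inputs the output saturates at $-\alpha$, and the function is continuous at zero since both branches evaluate to 0 there.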

Source: Activation Functions

Papers Using This Method

- Adaptive Rational Activations to Boost Deep Reinforcement Learning (2021-02-18)
- Parametric Exponential Linear Unit for Deep Convolutional Neural Networks (2016-05-30)