


HardELiSH

General · Introduced 2018 · 1 paper

Source Paper: The Quest for the Golden Activation Function

Description

HardELiSH is an activation function for neural networks. It is the product of HardSigmoid and ELU in the negative part, and the product of the identity (linear) function and HardSigmoid in the positive part:

$$
f(x) =
\begin{cases}
x \max\left(0, \min\left(1, \frac{x+1}{2}\right)\right) & \text{if } x \geq 0 \\
\left(e^{x} - 1\right) \max\left(0, \min\left(1, \frac{x+1}{2}\right)\right) & \text{if } x < 0
\end{cases}
$$

Source: Activation Functions
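
As a concrete illustration, here is a minimal NumPy sketch of the piecewise definition above; the function name `hard_elish` is our own choice for this example, not an official API.

```python
import numpy as np

def hard_elish(x):
    """HardELiSH activation, following the piecewise definition above.

    HardSigmoid(x) = max(0, min(1, (x + 1) / 2)).
    Positive part: x * HardSigmoid(x); negative part: (e^x - 1) * HardSigmoid(x).
    """
    x = np.asarray(x, dtype=float)
    hard_sigmoid = np.clip((x + 1.0) / 2.0, 0.0, 1.0)
    return np.where(x >= 0, x * hard_sigmoid, (np.exp(x) - 1.0) * hard_sigmoid)

# Example usage
x = np.linspace(-3.0, 3.0, 7)
print(hard_elish(x))
```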

Papers Using This Method

The Quest for the Golden Activation Function (2018-08-02)