ShiLU

Shifted Rectified Linear Unit


Description

The Shifted Rectified Linear Unit, or ShiLU, is a modification of the ReLU activation function that adds trainable parameters: a scale and a shift applied to ReLU's output.

$$\text{ShiLU}(x) = \alpha \cdot \text{ReLU}(x) + \beta$$
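A minimal sketch of the formula above, written with NumPy for clarity. In an actual network, `alpha` and `beta` would be registered as trainable parameters (e.g. learned per layer by gradient descent); here they are plain function arguments, and the function name `shilu` is an illustrative choice, not an API from any specific library.

```python
import numpy as np

def shilu(x, alpha=1.0, beta=0.0):
    """ShiLU(x) = alpha * ReLU(x) + beta.

    alpha scales the ReLU output and beta shifts it; both are the
    trainable parameters of the activation. With alpha=1, beta=0
    this reduces to plain ReLU.
    """
    return alpha * np.maximum(x, 0.0) + beta

# Example: negative inputs are zeroed by ReLU, then scaled and shifted.
x = np.array([-2.0, 0.0, 3.0])
print(shilu(x))                      # plain ReLU: [0. 0. 3.]
print(shilu(x, alpha=2.0, beta=1.0)) # scaled/shifted: [1. 1. 7.]
```

Note that a nonzero `beta` lets the activation output negative or nonzero values where ReLU would output exactly zero, which changes the function's behavior for inactive units.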

Papers Using This Method