Description
The Shifted Rectified Linear Unit (ShiLU) is a modification of the ReLU activation function that introduces trainable parameters, allowing the slope and offset of the activation to be learned during training.
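A minimal sketch of the idea, assuming the common formulation ShiLU(x) = α · ReLU(x) + β, where the scalars α and β are the trainable parameters (function names here are illustrative, not a reference implementation):

```python
def relu(x):
    """Standard ReLU: max(x, 0)."""
    return max(x, 0.0)

def shilu(x, alpha=1.0, beta=0.0):
    """ShiLU: alpha * ReLU(x) + beta; alpha and beta are the trainable parameters."""
    return alpha * relu(x) + beta

# With alpha=1 and beta=0, ShiLU reduces to plain ReLU.
print([shilu(v) for v in (-2.0, 0.0, 1.5)])                        # [0.0, 0.0, 1.5]

# Learned alpha rescales the positive part; beta shifts the whole output.
print([shilu(v, alpha=2.0, beta=-1.0) for v in (-2.0, 0.0, 1.5)])  # [-1.0, -1.0, 2.0]
```

In a deep-learning framework, α and β would be registered as parameters of the activation layer so the optimizer updates them alongside the network weights.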