Shifted Softplus is an activation function, $\text{ssp}(x) = \ln(0.5\,e^{x} + 0.5)$, which SchNet employs as the non-linearity throughout the network in order to obtain a smooth potential energy surface. The shift ensures that $\text{ssp}(0) = 0$ and improves the convergence of the network. This activation function is similar to ELUs, while having an infinite order of continuity.
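
Since $\ln(0.5\,e^{x} + 0.5) = \ln(e^{x} + 1) - \ln 2$, shifted softplus is just the standard softplus shifted down by $\ln 2$. A minimal sketch in Python (the function name and NumPy-based formulation here are illustrative, not from the source):

```python
import numpy as np

def shifted_softplus(x):
    """Shifted softplus: ln(0.5 * e^x + 0.5) = softplus(x) - ln(2).

    np.logaddexp(x, 0) computes ln(e^x + e^0) = ln(e^x + 1) in a
    numerically stable way, avoiding overflow for large x.
    """
    return np.logaddexp(x, 0.0) - np.log(2.0)

# The shift makes the activation pass through the origin:
print(shifted_softplus(0.0))                      # 0.0
print(shifted_softplus(np.array([-2.0, 0.0, 2.0])))
```

For large positive $x$ the function approaches $x - \ln 2$, so it is asymptotically linear like softplus, and unlike ELU it is infinitely differentiable everywhere, which is what yields the smooth potential energy surface.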