Hard Swish is an activation function based on Swish that replaces the computationally expensive sigmoid with a piecewise linear analogue:
$$\text{h-swish}\left(x\right) = x\frac{\text{ReLU6}\left(x+3\right)}{6}$$
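The formula above can be sketched directly in NumPy (a minimal illustration, not tied to any particular framework's built-in implementation):

```python
import numpy as np

def relu6(x):
    # ReLU6: standard ReLU clamped to the range [0, 6]
    return np.minimum(np.maximum(x, 0.0), 6.0)

def hard_swish(x):
    # h-swish(x) = x * ReLU6(x + 3) / 6
    return x * relu6(x + 3.0) / 6.0

# Behaves like the identity for large positive inputs and is exactly
# zero for inputs at or below -3.
print(hard_swish(np.array([-4.0, -3.0, 0.0, 3.0, 4.0])))
# → [0. 0. 0. 3. 4.]
```

Because both the clamp and the multiply are cheap elementwise operations, this matches Swish's shape while avoiding the exponential inside the sigmoid, which is why it is favored in mobile architectures such as MobileNetV3.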