ReLU6 is a modification of the rectified linear unit in which the activation is capped at a maximum value of 6, i.e. ReLU6(x) = min(max(0, x), 6). The cap makes the activation more robust when used with low-precision computation.
Image Credit: PyTorch
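As a minimal sketch in PyTorch (the library credited above), ReLU6 can be applied either through the built-in module or expressed directly as a clamp; the sample input values here are illustrative only:

```python
import torch
import torch.nn as nn

x = torch.tensor([-3.0, 0.5, 4.2, 8.7])

# Built-in module: clips activations to the range [0, 6]
relu6 = nn.ReLU6()
print(relu6(x))  # tensor([0.0000, 0.5000, 4.2000, 6.0000])

# Equivalent formulation: min(max(0, x), 6)
print(torch.clamp(x, min=0.0, max=6.0))
```

Both calls produce the same output; the clamp form makes the hard upper bound at 6 explicit.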