ReGLU is an activation function that is a variant of the Gated Linear Unit (GLU), replacing GLU's sigmoid gate with a ReLU. The definition is as follows:
$$\text{ReGLU}\left(x, W, V, b, c\right) = \max\left(0, xW + b\right) \otimes \left(xV + c\right)$$

where $\otimes$ denotes element-wise multiplication.
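A minimal NumPy sketch of this formula (the function name `reglu` and the dimensions used in the example are illustrative, not from the original):

```python
import numpy as np

def reglu(x, W, V, b, c):
    # Gate branch: ReLU(xW + b); value branch: xV + c.
    # The ReGLU output is their element-wise product.
    return np.maximum(0.0, x @ W + b) * (x @ V + c)

# Example: project a batch of 2 inputs from dimension 4 down to 3.
rng = np.random.default_rng(0)
x = rng.standard_normal((2, 4))
W = rng.standard_normal((4, 3))
V = rng.standard_normal((4, 3))
b = np.zeros(3)
c = np.zeros(3)

out = reglu(x, W, V, b, c)
print(out.shape)  # (2, 3)
```

Wherever the gate branch $xW + b$ is negative, the ReLU zeroes it out, so the corresponding output entries are exactly zero regardless of the value branch.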