modReLU is an activation function that modifies the ReLU. It is a pointwise nonlinearity, σ_modReLU(z): ℂ → ℂ, which affects only the absolute value of a complex number (leaving its phase unchanged), defined as:
σ_modReLU(z) = (∣z∣ + b) · z/∣z∣   if ∣z∣ + b ≥ 0
σ_modReLU(z) = 0                   if ∣z∣ + b < 0
where b ∈ ℝ is a bias parameter of the nonlinearity. For an n_h-dimensional hidden space we learn n_h nonlinearity bias parameters, one per dimension.
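The definition above can be sketched in NumPy as follows. The `eps` term guarding the division at z = 0 is an implementation choice of this sketch, not part of the mathematical definition:

```python
import numpy as np

def modrelu(z, b, eps=1e-8):
    """modReLU: shifts the magnitude of a complex input by a real
    bias b and rectifies it, while preserving the phase z/|z|.

    z : complex array; b : real bias, broadcastable to z's shape
        (one learned bias per hidden dimension).
    """
    mag = np.abs(z)
    # (|z| + b) * z/|z| where |z| + b >= 0, else 0; eps avoids 0/0 at z = 0
    scale = np.maximum(mag + b, 0.0) / (mag + eps)
    return scale * z
```

For example, with z = 3 + 4j (so ∣z∣ = 5) and b = −2, the magnitude shrinks to 3 while the phase is kept, giving approximately 1.8 + 2.4j; with b ≤ −5 the output is 0.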