The Randomized Leaky Rectified Linear Unit, or RReLU, is an activation function that randomly samples the negative slope during training. It was first proposed and used in the Kaggle NDSB Competition. During training, the slope a_ji is a random number sampled from a uniform distribution U(l, u). Formally:
y_ji = x_ji if x_ji ≥ 0
y_ji = a_ji x_ji if x_ji < 0
where
a_ji ∼ U(l, u), with l < u and l, u ∈ [0, 1)
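The training-time rule above can be sketched in NumPy. The bounds l = 1/8 and u = 1/3 below are illustrative defaults (the reciprocals of the U(3, 8) range discussed next), not part of the definition:

```python
import numpy as np

def rrelu_train(x, l=1/8, u=1/3, seed=None):
    """Training-phase RReLU: y = x for x >= 0, y = a * x otherwise,
    with an independent slope a_ji ~ U(l, u) sampled per element."""
    rng = np.random.default_rng(seed)
    a = rng.uniform(l, u, size=x.shape)  # one random slope per activation
    return np.where(x >= 0, x, a * x)
```

Positive inputs pass through unchanged; each negative input is scaled by its own freshly sampled slope, so repeated calls on the same input generally give different outputs.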
In the test phase, we take the average of all the a_ji seen in training, similar to dropout, and thus set a_ji to (l + u)/2 to get a deterministic result. As suggested by the NDSB competition winner, the randomness is drawn from U(3, 8): there the negative part is computed as x_ji / a_ji with a_ji ∼ U(3, 8), which corresponds to slopes in [1/8, 1/3].
At test time, we use:
y_ji = ((l + u) / 2) x_ji if x_ji < 0
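A matching test-phase sketch, under the same assumed bounds, simply replaces the random slope with its mean (l + u)/2:

```python
import numpy as np

def rrelu_eval(x, l=1/8, u=1/3):
    """Test-phase RReLU: deterministic, with the slope fixed to the
    mean (l + u) / 2 of the training-time distribution U(l, u)."""
    return np.where(x >= 0, x, (l + u) / 2 * x)
```

For reference, PyTorch ships this scheme as torch.nn.RReLU, whose default bounds are likewise lower = 1/8 and upper = 1/3.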