Additional link functions for neural networks

Link/activation functions: Regression

Absolute value rectification

The function

\(a(z)=|z|\)
Rectified Linear Unit (ReLU)

The function

\(a(z)=\max (0,z)\)

The derivative

Its derivative is \(1\) for \(z>0\) and \(0\) for \(z<0\).

The derivative is undefined at \(z=0\); in practice, however, the input rarely lands exactly on \(0\).


The ReLU activation function induces sparsity, since all negative inputs are mapped to exactly \(0\).
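The function, its derivative, and the sparsity effect can be sketched in a few lines of NumPy (a minimal illustration, not a production implementation):

```python
import numpy as np

def relu(z):
    """ReLU activation: elementwise max(0, z)."""
    return np.maximum(0.0, z)

def relu_grad(z):
    """Derivative of ReLU: 1 for z > 0, 0 for z < 0 (undefined at z = 0)."""
    return (z > 0).astype(float)

z = np.array([-2.0, -0.5, 0.5, 2.0])
print(relu(z))       # negative inputs clamp to 0: [0, 0, 0.5, 2]
print(relu_grad(z))  # 0 below zero, 1 above: [0, 0, 1, 1]

# Sparsity: a zero-mean pre-activation vector ends up roughly half zeros
pre = np.random.default_rng(0).normal(size=1000)
print(np.mean(relu(pre) == 0.0))  # about 0.5
```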

Noisy ReLU adds Gaussian noise to the input before rectification: \(a(z)=\max(0, z+\varepsilon)\) with \(\varepsilon \sim \mathcal{N}(0,\sigma^2)\).

Leaky ReLU allows a small fixed slope \(\alpha\) (e.g. \(0.01\)) for negative inputs: \(a(z)=\max(\alpha z, z)\).

Parametric ReLU (PReLU) has the same form as Leaky ReLU, but \(\alpha\) is learned during training.
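The three variants differ only in how the negative side is treated; a minimal NumPy sketch (the \(\alpha\) and \(\sigma\) values here are illustrative defaults, not prescribed ones):

```python
import numpy as np

def leaky_relu(z, alpha=0.01):
    """Leaky ReLU: small fixed slope alpha for negative inputs."""
    return np.where(z > 0, z, alpha * z)

def parametric_relu(z, alpha):
    """PReLU: identical form, but alpha would be a learned parameter."""
    return np.where(z > 0, z, alpha * z)

def noisy_relu(z, sigma=0.1, rng=np.random.default_rng(0)):
    """Noisy ReLU: Gaussian noise added to the input before rectification."""
    return np.maximum(0.0, z + rng.normal(0.0, sigma, size=np.shape(z)))

z = np.array([-2.0, 1.0])
print(leaky_relu(z))             # negative input scaled by alpha: [-0.02, 1.0]
print(parametric_relu(z, 0.25))  # same, with a different alpha: [-0.5, 1.0]
print(noisy_relu(z))             # stochastic, but never negative
```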


Softplus

The function

\(a(z)=\ln (1+e^z)\)

The derivative

Its derivative is the sigmoid function:

\(a'(z)=\frac{e^z}{1+e^z}=\frac{1}{1+e^{-z}}\)

The softplus function is a smooth approximation of the ReLU function.

Unlike the ReLU function, softplus does not induce sparsity, since its output is strictly positive for every input.
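Both claims about softplus can be checked numerically: its finite-difference derivative matches the sigmoid, and its output never hits exactly zero (a small verification sketch):

```python
import numpy as np

def softplus(z):
    """Softplus: ln(1 + e^z), a smooth approximation of ReLU."""
    return np.log1p(np.exp(z))  # log1p is numerically safer for moderate z

def sigmoid(z):
    """Sigmoid: 1 / (1 + e^-z)."""
    return 1.0 / (1.0 + np.exp(-z))

z = np.linspace(-5.0, 5.0, 11)

# Central-difference derivative of softplus agrees with the sigmoid
h = 1e-6
numeric = (softplus(z + h) - softplus(z - h)) / (2 * h)
print(np.allclose(numeric, sigmoid(z), atol=1e-5))  # True

# No sparsity: softplus output is strictly positive everywhere
print(np.all(softplus(z) > 0.0))  # True
```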

Exponential Linear Unit (ELU)