# Additional link functions for neural networks

## Link/activation functions: Regression

### Absolute value rectification

\(a(z)=|z|\)

### Rectified Linear Unit (ReLU)

#### The function

\(a(z)=\max (0,z)\)

#### The derivative

Its derivative is \(1\) for \(z>0\) and \(0\) for \(z<0\).

The derivative is undefined at \(z=0\); however, this is unlikely to occur in practice.

#### Notes

The ReLU activation function induces sparsity: inputs below zero are mapped to exactly zero, so many units in a layer are inactive at any given time.
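The function and derivative above can be sketched in a few lines of NumPy. The convention of assigning the derivative at \(z=0\) the value \(0\) is an assumption here; frameworks vary on this choice.

```python
import numpy as np

def relu(z):
    """Rectified Linear Unit: max(0, z), applied element-wise."""
    return np.maximum(0.0, z)

def relu_grad(z):
    """Derivative of ReLU: 1 for z > 0, 0 for z < 0.
    At z == 0 the derivative is undefined; we follow the common
    convention of assigning it 0 there."""
    return (z > 0).astype(float)

z = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
relu(z)       # -> [0.0, 0.0, 0.0, 0.5, 2.0]
relu_grad(z)  # -> [0.0, 0.0, 0.0, 1.0, 1.0]
```

Note that three of the five outputs are exactly zero, which illustrates the sparsity-inducing behaviour described above.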

### Noisy ReLU

#### The function

\(a(z)=\max (0,z+\epsilon)\), where \(\epsilon\) is Gaussian noise, \(\epsilon\sim\mathcal{N}(0,\sigma^2)\).

### Leaky ReLU

#### The function

\(a(z)=\max (\alpha z,z)\), where \(\alpha\) is a small fixed constant (e.g. \(0.01\)), so negative inputs receive a small non-zero slope instead of being zeroed out.

### Parametric ReLU

#### The function

The same form as the Leaky ReLU, but the slope \(\alpha\) is learned during training rather than fixed in advance.

### Softplus

#### The function

\(a(z)=\ln (1+e^z)\)

#### The derivative

Its derivative is the sigmoid function:

\(a'(z)=\dfrac{1}{1+e^{-z}}\)

#### Notes

The softplus function is a smooth approximation of the ReLU function.

Unlike the ReLU function, softplus does not induce sparsity: its output is strictly positive, so units are never exactly zero.
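A minimal NumPy sketch of softplus, checking the claim that its derivative is the sigmoid by comparing the analytic form against a central finite difference:

```python
import numpy as np

def softplus(z):
    """Softplus: ln(1 + e^z), a smooth approximation of ReLU."""
    return np.log1p(np.exp(z))

def sigmoid(z):
    """Logistic sigmoid: 1 / (1 + e^(-z))."""
    return 1.0 / (1.0 + np.exp(-z))

# Numerically verify that d/dz softplus(z) = sigmoid(z).
z = np.linspace(-4.0, 4.0, 9)
h = 1e-6
numeric = (softplus(z + h) - softplus(z - h)) / (2 * h)
assert np.allclose(numeric, sigmoid(z), atol=1e-6)
```

The assertion passes because \(\frac{d}{dz}\ln(1+e^z)=\frac{e^z}{1+e^z}=\frac{1}{1+e^{-z}}\). Note also that `softplus` never returns exactly zero, in contrast with ReLU.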

### Exponential Linear Unit (ELU)

#### The function

\(a(z)=z\) for \(z>0\), and \(a(z)=\alpha (e^z-1)\) for \(z\leq 0\), where \(\alpha\) is a positive constant.