# Univariate optimisation

## Unconstrained optimisation

### Introduction to unconstrained optimisation

#### Goals

We want to identify either the maximum or the minimum of a function.

An optimum may be local (the best value within some neighbourhood) or global (the best value over the whole domain).

#### Optimising through limits

If we are minimising a function, we can examine its limits as each variable tends to $$\infty$$ or $$-\infty$$. If the function decreases without bound in some direction, there is no finite minimiser, and we can make the function arbitrarily small by taking sufficiently large or small values of that variable.

The same boundary check applies when maximising a function.

#### Optimisation through stationary points

Stationary points of a function are points where marginal changes in the variables do not affect the value of the function, i.e. where all first derivatives are zero. A stationary point is a local maximum, a local minimum, or a saddle point.

#### Optimisation through algorithms

If we cannot identify stationary points easily, we can instead use algorithms to identify optima.
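As one sketch of such an algorithm, here is a minimal golden-section search for a minimum. It assumes the function is unimodal on the bracket `[a, b]`; the function and interval in the example are my own illustration, not taken from the text.

```python
# Minimal golden-section search, assuming f is unimodal on [a, b].
def golden_section_min(f, a, b, tol=1e-8):
    phi = (5 ** 0.5 - 1) / 2  # inverse golden ratio, ~0.618
    c, d = b - phi * (b - a), a + phi * (b - a)
    while b - a > tol:
        if f(c) < f(d):
            # Minimum lies in [a, d]; shrink the bracket from the right.
            b, d = d, c
            c = b - phi * (b - a)
        else:
            # Minimum lies in [c, b]; shrink the bracket from the left.
            a, c = c, d
            d = a + phi * (b - a)
    return (a + b) / 2

# Example: f(x) = (x - 2)^2 has its minimum at x = 2.
x_star = golden_section_min(lambda x: (x - 2) ** 2, 0.0, 5.0)
```

Each iteration shrinks the bracket by a constant factor, so the search converges linearly without needing any derivatives.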

#### Stationary points of strictly concave and convex functions

If a function is strictly concave, it has at most one stationary point, which is both a local and a global maximum.

If a function is strictly convex, it has at most one stationary point, which is both a local and a global minimum.
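As a small illustration (the function here is my own example), $$f(x)=-(x-1)^2$$ is strictly concave, and

$$f'(x)=-2(x-1)=0 \implies x=1,$$

so its single stationary point $$x=1$$ is the global maximum.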

### Analytic optimisation

#### Convex and concave functions

A convex function has at most one minimum value, and a concave function at most one maximum value; any local optimum is also a global optimum.

If a function is neither concave nor convex, it may have multiple local minima or maxima.

If a function is convex, then any critical point is a global minimum. We can identify it by looking for critical points using the first-order conditions.

Similarly, if a function is concave, then any critical point is a global maximum.

We can identify whether a function is concave or convex by evaluating the Hessian matrix.

#### Evaluating multiple local optima

We can evaluate the function at each local minimum or maximum and compare the resulting values.

We can identify these points by taking the partial derivatives of the function in question and finding where they are all equal to zero.

$$u=f(x)$$

$$u_{x_i}=\dfrac{\partial f}{\partial x_i}=0 \quad \text{for all } i$$

We can then solve this system of equations to find the stationary values of $$x$$.
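A small worked example of solving the first-order conditions (the function is my own choice): for $$u=x_1^2+x_2^2-2x_1$$,

$$u_{x_1}=2x_1-2=0, \qquad u_{x_2}=2x_2=0,$$

so the only stationary point is $$(x_1,x_2)=(1,0)$$.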

After identifying the vector $$x$$ at each stationary point, we can determine whether the point is a minimum or a maximum by examining the second derivative there. If it is positive, the point is a local minimum, and therefore not an optimal point when maximising. We should also check the function's behaviour towards the boundary of its domain, since values there may exceed every local maximum.
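The univariate second-derivative test can be sketched as follows; the example function $$f(x)=x^3-3x$$ is my own, chosen because its stationary points $$x=\pm 1$$ include one of each kind.

```python
# Second-derivative test for a univariate function.
# Example: f(x) = x^3 - 3x, so f'(x) = 3x^2 - 3 = 0 at x = -1 and x = 1,
# and f''(x) = 6x.
def second_derivative_test(f2, x):
    v = f2(x)
    if v > 0:
        return "local minimum"
    if v < 0:
        return "local maximum"
    return "inconclusive"  # f''(x) = 0: the test gives no verdict

f2 = lambda x: 6 * x
kinds = {x: second_derivative_test(f2, x) for x in (-1, 1)}
# x = -1 gives f'' = -6 (local maximum); x = 1 gives f'' = 6 (local minimum)
```

Note that neither stationary point is a global optimum here: $$f(x)=x^3-3x$$ is unbounded in both directions, which is exactly why the boundary behaviour must also be checked.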

### Hessian matrix

We can take a function and make a matrix of its second order partial derivatives. This is the Hessian matrix, and it describes the local curvature of the function.

If the function $$f$$ has $$n$$ parameters, the Hessian matrix is $$n\times n$$, and is defined as:

$$H_{ij}=\dfrac{\partial^2 f}{\partial x_i \partial x_j}$$

If a twice-differentiable function is convex, then its Hessian matrix is positive semi-definite at every point, and vice versa.

Similarly, the function is concave if and only if its Hessian matrix is negative semi-definite at every point.

We can diagnose critical points by evaluating the Hessian matrix at those points.

If the Hessian is positive definite there, the point is a local minimum. If it is negative definite, the point is a local maximum. If it has both positive and negative eigenvalues, the point is a saddle point.
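This classification can be sketched for the $$2\times 2$$ case using the closed-form eigenvalues of a symmetric matrix; the test function $$f(x,y)=x^2-y^2$$, whose Hessian at the origin is $$\mathrm{diag}(2,-2)$$, is my own example.

```python
import math

# Classify a critical point from its 2x2 symmetric Hessian [[h11, h12], [h12, h22]],
# assuming the second-order test is decisive (no zero eigenvalues).
def classify(h11, h12, h22):
    tr = h11 + h22
    det = h11 * h22 - h12 * h12
    # Eigenvalues of a symmetric 2x2 matrix: tr/2 +/- sqrt(tr^2/4 - det).
    # The discriminant equals (h11 - h22)^2/4 + h12^2 >= 0, so sqrt is safe.
    disc = math.sqrt(tr * tr / 4 - det)
    lam1, lam2 = tr / 2 + disc, tr / 2 - disc
    if lam1 > 0 and lam2 > 0:
        return "local minimum"   # positive definite
    if lam1 < 0 and lam2 < 0:
        return "local maximum"   # negative definite
    return "saddle point"        # eigenvalues of mixed sign

# f(x, y) = x^2 - y^2 has Hessian diag(2, -2) at its critical point (0, 0):
kind = classify(2.0, 0.0, -2.0)
```

For larger Hessians the same sign test applies to the full eigenvalue spectrum, which a numerical library's symmetric eigensolver can compute directly.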