Markov models

Markov chains

Estimating the Markov chain stochastic matrix


Given a sequence \(x_1,\dots,x_n\).

The likelihood is:

\(L=\prod_{i=2}^n p_{x_{i-1},x_i}\)

If there are \(k\) states we can rewrite this as:

\(L=\prod_{i=1}^k\prod_{j=1}^k p_{ij}^{n_{ij}}\)

Where \(p_{ij}\) is the probability of moving from state \(i\) to state \(j\), and \(n_{ij}\) is the number of transitions from \(i\) to \(j\) in the sequence.

The log-likelihood is:

\(\ln L=\sum_{i=1}^k\sum_{j=1}^kn_{ij}\ln p_{ij}\)

Constrained optimisation

Not all parameters are free: each row of the transition matrix must sum to \(1\). We enforce this with Lagrange multipliers \(\lambda_i\):

\(\ln L=\sum_{i=1}^k\sum_{j=1}^k n_{ij}\ln p_{ij}-\sum_{i=1}^k\lambda_i(\sum_{j=1}^k p_{ij}-1)\)

Setting the derivative with respect to \(p_{ij}\) to zero gives \(n_{ij}/p_{ij}=\lambda_i\); substituting into the constraint gives \(\lambda_i=\sum_j n_{ij}\), so:

\(\hat p_{ij}=\dfrac{n_{ij}}{\sum_{l=1}^k n_{il}}\)
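The estimator above is just "count and normalise". A minimal sketch (the function name `estimate_transition_matrix` and the toy sequence are illustrative, not from the source):

```python
from collections import defaultdict

def estimate_transition_matrix(sequence, states):
    """MLE of the transition matrix: p_ij = n_ij / sum over l of n_il."""
    counts = {s: defaultdict(int) for s in states}
    for prev, cur in zip(sequence, sequence[1:]):
        counts[prev][cur] += 1          # accumulate n_ij
    P = {}
    for i in states:
        total = sum(counts[i].values())  # row total: sum_l n_il
        P[i] = {j: (counts[i][j] / total if total else 0.0) for j in states}
    return P

seq = ["A", "A", "B", "A", "B", "B", "A", "A"]
P = estimate_transition_matrix(seq, ["A", "B"])
# from A: 2 of 4 transitions go to A; from B: 2 of 3 go to A
```

States never visited as a source get a zero row here; in practice a small pseudocount is often added to avoid zero probabilities.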

Hidden Markov Models (HMMs)

Defining Hidden Markov Models (HMMs)

We don’t observe the states directly.

Each state produces a visible output, drawn from an emission distribution specific to that state.

We observe a sequence of outputs, not states.
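To make the setup concrete, here is a minimal sketch of generating observations from an HMM (the function name `sample_hmm` and the toy weather model are illustrative assumptions, not from the source):

```python
import random

def sample_hmm(transition, emission, start, n, seed=0):
    """Generate n visible outputs from an HMM; the state path stays hidden."""
    rng = random.Random(seed)
    def draw(dist):
        # sample a key with probability proportional to its value
        return rng.choices(list(dist), weights=list(dist.values()))[0]
    state = draw(start)
    outputs = []
    for _ in range(n):
        outputs.append(draw(emission[state]))  # emit from the current state
        state = draw(transition[state])        # then take a Markov step
    return outputs

# toy two-state weather model (numbers are illustrative only)
trans = {"Rainy": {"Rainy": 0.7, "Sunny": 0.3},
         "Sunny": {"Rainy": 0.4, "Sunny": 0.6}}
emit = {"Rainy": {"walk": 0.1, "shop": 0.4, "clean": 0.5},
        "Sunny": {"walk": 0.6, "shop": 0.3, "clean": 0.1}}
obs = sample_hmm(trans, emit, {"Rainy": 0.6, "Sunny": 0.4}, n=10)
```

The caller sees only `obs`; recovering the hidden state path is exactly the problem the algorithms below address.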

Estimating HMMs with the Viterbi algorithm

Assume we know the transition matrix, the emission distributions, and the starting probabilities.

Given an observed sequence of outputs, what is the most likely sequence of hidden states?

The Viterbi algorithm returns this path, using dynamic programming.
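A minimal sketch of Viterbi in log-space (the toy weather model is an illustrative assumption, not from the source):

```python
import math

def viterbi(obs, states, start, trans, emit):
    """Most likely hidden-state path for the observations (log-space DP)."""
    V = [{s: math.log(start[s]) + math.log(emit[s][obs[0]]) for s in states}]
    back = []
    for o in obs[1:]:
        cur, ptr = {}, {}
        for s in states:
            # best predecessor for state s at this step
            prev = max(states, key=lambda p: V[-1][p] + math.log(trans[p][s]))
            cur[s] = V[-1][prev] + math.log(trans[prev][s]) + math.log(emit[s][o])
            ptr[s] = prev
        V.append(cur)
        back.append(ptr)
    last = max(states, key=lambda s: V[-1][s])
    path = [last]
    for ptr in reversed(back):          # follow the back-pointers
        path.append(ptr[path[-1]])
    return list(reversed(path))

states = ["Rainy", "Sunny"]
start = {"Rainy": 0.6, "Sunny": 0.4}
trans = {"Rainy": {"Rainy": 0.7, "Sunny": 0.3},
         "Sunny": {"Rainy": 0.4, "Sunny": 0.6}}
emit = {"Rainy": {"walk": 0.1, "shop": 0.4, "clean": 0.5},
        "Sunny": {"walk": 0.6, "shop": 0.3, "clean": 0.1}}
path = viterbi(["walk", "shop", "clean"], states, start, trans, emit)
# → ['Sunny', 'Rainy', 'Rainy']
```

Working in log-space avoids underflow on long sequences; the back-pointers make the backward reconstruction linear in the sequence length.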

Estimating HMMs with the forward algorithm

Given the outputs observed up to time \(t\), what is the probability of being in each state at time \(t\)?
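A minimal sketch of the forward pass with per-step normalisation (the toy weather model is an illustrative assumption, not from the source):

```python
def forward(obs, states, start, trans, emit):
    """Filtered distributions P(state_t | outputs up to t), normalised each step."""
    alpha = {s: start[s] * emit[s][obs[0]] for s in states}
    z = sum(alpha.values())
    alphas = [{s: a / z for s, a in alpha.items()}]
    for o in obs[1:]:
        # propagate through the transition matrix, then weight by the emission
        alpha = {s: emit[s][o] * sum(alphas[-1][p] * trans[p][s] for p in states)
                 for s in states}
        z = sum(alpha.values())
        alphas.append({s: a / z for s, a in alpha.items()})
    return alphas

states = ["Rainy", "Sunny"]
start = {"Rainy": 0.6, "Sunny": 0.4}
trans = {"Rainy": {"Rainy": 0.7, "Sunny": 0.3},
         "Sunny": {"Rainy": 0.4, "Sunny": 0.6}}
emit = {"Rainy": {"walk": 0.1, "shop": 0.4, "clean": 0.5},
        "Sunny": {"walk": 0.6, "shop": 0.3, "clean": 0.1}}
alphas = forward(["walk", "shop", "clean"], states, start, trans, emit)
# after seeing "walk": P(Sunny) = 0.24 / 0.30 = 0.8
```

Unlike Viterbi, the recursion sums over predecessors rather than maximising, so it yields a distribution over states rather than a single best path.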

Estimating HMMs with the forward-backward algorithm

We calculate the probability of being in state \(x\) at time \(t\) given all observations, including those after time \(t\).
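A minimal sketch combining a forward and a backward pass into smoothed state probabilities (the toy weather model is an illustrative assumption, not from the source):

```python
def forward_backward(obs, states, start, trans, emit):
    """Smoothed distributions P(state_t | all observations)."""
    # forward pass (unnormalised alphas)
    alphas = [{s: start[s] * emit[s][obs[0]] for s in states}]
    for o in obs[1:]:
        alphas.append({s: emit[s][o] * sum(alphas[-1][p] * trans[p][s]
                                           for p in states) for s in states})
    # backward pass: beta_T = 1, then recurse right to left
    betas = [{s: 1.0 for s in states}]
    for o in reversed(obs[1:]):
        betas.insert(0, {s: sum(trans[s][n] * emit[n][o] * betas[0][n]
                                for n in states) for s in states})
    # gamma_t(s) proportional to alpha_t(s) * beta_t(s)
    gammas = []
    for a, b in zip(alphas, betas):
        w = {s: a[s] * b[s] for s in states}
        z = sum(w.values())
        gammas.append({s: v / z for s, v in w.items()})
    return gammas

states = ["Rainy", "Sunny"]
start = {"Rainy": 0.6, "Sunny": 0.4}
trans = {"Rainy": {"Rainy": 0.7, "Sunny": 0.3},
         "Sunny": {"Rainy": 0.4, "Sunny": 0.6}}
emit = {"Rainy": {"walk": 0.1, "shop": 0.4, "clean": 0.5},
        "Sunny": {"walk": 0.6, "shop": 0.3, "clean": 0.1}}
gammas = forward_backward(["walk", "shop", "clean"], states, start, trans, emit)
```

At the final time step the backward term is \(1\), so smoothing agrees with filtering there; at earlier steps the future observations can revise the filtered estimates.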

Baum-Welch algorithm

An EM algorithm for the case where the transition and emission probabilities are unknown: alternate between computing expected transition and emission counts with the forward-backward algorithm, and re-estimating the parameters from those expected counts.

Kalman filters

Kalman filters

Dynamic Bayesian Network

Dynamic Bayesian Network