Abstract
Markov chain Monte Carlo (MCMC) sampling strategies can be used to simulate hidden Markov model (HMM) parameters from their posterior distribution given observed data. Some MCMC methods used in practice (for computing likelihood, conditional probabilities of hidden states, and the most likely sequence of states) can be improved by incorporating established recursive algorithms. The most important of these is a set of forward-backward recursions that calculate conditional distributions of the hidden states given observed data and model parameters. I show how to use the recursive algorithms in an MCMC context and present mathematical and empirical results showing that a Gibbs sampler using the forward-backward recursions mixes more rapidly than another sampler often used for HMMs. I introduce an augmented-variables technique for obtaining unique state labels in HMMs and finite mixture models. I show how recursive computing allows the statistically efficient use of MCMC output when estimating the hidden states. I directly calculate the posterior distribution of the hidden chain's state-space size by MCMC, circumventing the asymptotic arguments underlying the Bayesian information criterion, which is shown to be inappropriate for a frequently analyzed dataset in the HMM literature. The use of log-likelihood for assessing MCMC convergence is illustrated, and posterior predictive checks are used to investigate application-specific questions of model adequacy.
