A minimum discrimination information approach for hidden Markov modeling
24 March 2005
Abstract
A new iterative approach is proposed for hidden Markov modeling of information sources that aims at minimizing the discrimination information (or cross-entropy) between the source and the model. This approach does not require the commonly used assumption that the source to be modeled is itself a hidden Markov process. The algorithm starts from the model estimated by the traditional maximum likelihood (ML) approach and alternately decreases the discrimination information over all probability distributions of the source that agree with the given measurements and over all hidden Markov models. The proposed procedure generalizes the Baum algorithm for ML hidden Markov modeling. The procedure is shown to be a descent algorithm for the discrimination information measure, and its local convergence is proved.
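The alternating scheme described above can be sketched in symbols; note the constraint set \(\mathcal{P}\), the HMM family \(\mathcal{M}\), and the iteration index \(k\) are labels introduced here, not notation taken from the paper. With discrimination information \(D(P\,\|\,Q)=\sum_x P(x)\log\frac{P(x)}{Q(x)}\), the procedure starts from the ML estimate \(Q^{(0)}\) and iterates

```latex
P^{(k+1)} = \arg\min_{P \in \mathcal{P}} D\bigl(P \,\big\|\, Q^{(k)}\bigr),
\qquad
Q^{(k+1)} = \arg\min_{Q \in \mathcal{M}} D\bigl(P^{(k+1)} \,\big\|\, Q\bigr),
```

where \(\mathcal{P}\) is the set of source distributions consistent with the measurements and \(\mathcal{M}\) is the set of hidden Markov models. Since each half-step minimizes \(D\) over one argument with the other held fixed, the sequence of discrimination-information values is non-increasing, which is the descent property claimed in the abstract.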