Learning Mixture Models with the Regularized Latent Maximum Entropy Principle

Title: Learning Mixture Models with the Regularized Latent Maximum Entropy Principle
Publication Type: Journal Article
Year of Publication: 2004
Authors: D. Schuurmans, F. Peng, Y. Zhao, Shaojun Wang
Abstract

We present a new approach to estimating mixture models based on a new inference principle we have proposed: the latent maximum entropy principle (LME). LME is different both from Jaynes' maximum entropy principle and from standard maximum likelihood estimation. We demonstrate the LME principle by deriving new algorithms for mixture model estimation, and show how robust new variants of the EM algorithm can be developed. Our experiments show that estimation based on LME generally yields better results than maximum likelihood estimation, particularly when inferring latent variable models from small amounts of data.
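The abstract contrasts LME with standard maximum likelihood estimation via EM. As a point of reference, here is a minimal sketch of the ordinary maximum-likelihood EM algorithm for a two-component 1-D Gaussian mixture, the baseline the paper's LME variants are compared against. This is illustrative only: all function names and data are assumptions for the example, not the paper's algorithm.

```python
# Minimal sketch: standard ML-EM for a 2-component 1-D Gaussian mixture.
# This is the plain maximum-likelihood baseline, NOT the paper's LME variant.
import numpy as np

def em_gmm_1d(x, n_iter=50):
    """Fit a 2-component 1-D Gaussian mixture by EM.

    Returns (mixing weights, means, variances). Illustrative helper,
    not from the paper.
    """
    w = np.array([0.5, 0.5])            # mixing weights
    mu = np.array([x.min(), x.max()])   # deterministic spread-out init
    var = np.full(2, x.var() + 1e-6)    # shared initial variances
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each point
        dens = np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        resp = w * dens
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: re-estimate parameters from weighted sufficient statistics
        nk = resp.sum(axis=0)
        w = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk + 1e-6
    return w, mu, var

# Example: two well-separated clusters around -3 and +3
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-3, 1, 200), rng.normal(3, 1, 200)])
w, mu, var = em_gmm_1d(x)
```

On small samples this ML estimate can overfit; the paper's point is that LME-based (and regularized LME) estimation tends to behave better in exactly that regime.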

Full Text

S. Wang, D. Schuurmans, F. Peng and Y. Zhao, 'Learning Mixture Models with the Regularized Latent Maximum Entropy Principle,' IEEE Trans. on Neural Networks, Special Issue on Information Theoretic Learning, Vol. 15, No. 4, pp. 903-916, July 2004
Pages: 903-916
Publisher: IEEE Computational Intelligence Society
Year: 2004
Editor: Marios M. Polycarpou
URL: http://knoesis.wright.edu/library/publications/MixModMaxEntropy.pdf
Journal: IEEE Trans. on Neural Networks