Application of the Gibbs distribution to hidden Markov modeling in isolated word recognition

Authors: Yunxin Zhao, L. Atlas, X. Zhuang
Venue: ICASSP-88, International Conference on Acoustics, Speech, and Signal Processing
Published: 1988-04-11
DOI: 10.1109/ICASSP.1988.196501
Citations: 1
Abstract
A new method of formulating hidden Markov models (HMMs) for isolated word recognition is presented. The authors model the probabilities of hidden state sequences as Gibbs distributions (GDs) instead of the conventional products of transition probabilities. This formulation is based on the Hammersley-Clifford theorem, which establishes the equivalence between Markov random fields (MRFs) and GDs; the Markov chains in an HMM are equivalent to one-dimensional, first-order neighborhood MRFs. The observation sequences are modeled by the usual autoregressive Gaussian densities. The flexibility in the choice of energy functions in GDs makes it possible to use only a few parameters while maintaining a powerful model. The authors have developed a learning algorithm that estimates the parameters by maximum likelihood, and an algorithm that efficiently computes 1-D, first-order neighborhood GDs using a lattice structure.
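The core equivalence the abstract invokes can be illustrated concretely. A minimal sketch (not from the paper; the toy initial distribution `pi` and transition matrix `A` are assumptions) showing that the conventional chain prior — a product of transition probabilities — can be rewritten as a Gibbs distribution over a 1-D, first-order neighborhood MRF by choosing the pair clique potentials as negative log transition probabilities:

```python
import math
from itertools import product

# Hypothetical toy HMM prior: 2 hidden states, initial distribution pi
# and transition matrix A (illustrative values only).
pi = [0.6, 0.4]
A = [[0.7, 0.3],
     [0.2, 0.8]]

def chain_prob(states):
    """Conventional HMM state-sequence prior: pi[s_1] * prod_t A[s_t][s_{t+1}]."""
    p = pi[states[0]]
    for s, t in zip(states, states[1:]):
        p *= A[s][t]
    return p

def energy(states):
    """Gibbs energy U(s) as a sum of clique potentials on a 1-D,
    first-order neighborhood MRF: V1(s_1) = -log pi[s_1] and
    V2(s_t, s_{t+1}) = -log A[s_t][s_{t+1}]."""
    u = -math.log(pi[states[0]])
    for s, t in zip(states, states[1:]):
        u += -math.log(A[s][t])
    return u

def gibbs_prob(states, all_seqs):
    """Gibbs distribution P(s) = exp(-U(s)) / Z over all sequences of this length."""
    Z = sum(math.exp(-energy(seq)) for seq in all_seqs)  # partition function
    return math.exp(-energy(states)) / Z

# Enumerate all length-3 state sequences; the two priors agree exactly
# (for these potentials Z = 1, since the chain probabilities sum to 1).
seqs = list(product(range(2), repeat=3))
for s in seqs:
    assert abs(chain_prob(s) - gibbs_prob(s, seqs)) < 1e-12
```

The paper's point is the converse direction of this flexibility: once the prior is written as exp(-U(s))/Z, the energy U need not decompose into per-transition log-probabilities at all, so other low-parameter energy functions become admissible while the model remains a valid distribution over state sequences.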