Xin Wei, Jianxin Chen, Lei Wang, Jingwu Cui, B. Zheng
2014 IEEE/CIC International Conference on Communications in China (ICCC), October 2014. DOI: 10.1109/ICCCHINA.2014.7008278
Variational learning and inference algorithms for extended Gaussian mixture model
In this paper, in order to properly evaluate the relative importance of priors and observed data in the Bayesian framework, we propose an extended Gaussian mixture model (EGMM) and design the corresponding learning and inference algorithms. First, we define the likelihood function of the EGMM; we then derive a variational learning algorithm for it. The proposed model and approach are then applied to speaker recognition. Experimental results demonstrate that the new approach generalizes the traditional GMM and offers more powerful performance.
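The abstract does not spell out the EGMM's likelihood or update equations, but the general idea — variational Bayesian learning of a Gaussian mixture, where priors and observed data jointly shape the posterior — can be illustrated with a standard variational GMM. The sketch below uses scikit-learn's `BayesianGaussianMixture` as a stand-in (the EGMM-specific extensions are not reproduced here), fitting synthetic two-cluster "speaker" features and letting the Dirichlet prior on the mixture weights prune superfluous components.

```python
# Hedged sketch: variational Bayesian GMM learning, standing in for the
# paper's EGMM (whose exact likelihood is not given in the abstract).
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
# Two well-separated synthetic clusters of feature vectors,
# a toy analogue of per-speaker acoustic features.
X = np.vstack([
    rng.normal(loc=-3.0, scale=1.0, size=(200, 2)),
    rng.normal(loc=+3.0, scale=1.0, size=(200, 2)),
])

# Variational inference places a Dirichlet prior on the mixture weights;
# its concentration parameter controls how strongly the prior competes
# with the observed data — the trade-off the paper's EGMM targets.
vgmm = BayesianGaussianMixture(
    n_components=5,                   # deliberately overspecified
    weight_concentration_prior=1e-2,  # small value favours fewer components
    covariance_type="full",
    random_state=0,
).fit(X)

# Components retaining non-negligible posterior weight approximate
# the true clusters; the rest are pruned by the variational posterior.
active = int((vgmm.weights_ > 0.05).sum())
labels = vgmm.predict(X)
```

With a small `weight_concentration_prior`, the posterior typically concentrates the weight mass on as many components as the data support, which is one concrete way variational learning balances prior belief against evidence.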