Estimating Gaussian Mixture Models from Data with Missing Features
D. McMichael
Fourth International Symposium on Signal Processing and Its Applications (ISSPA 1996)
DOI: 10.1109/ISSPA.1996.615761
Maximum likelihood (ML) fitting of Gaussian mixture models (GMMs) to feature data is most efficiently handled by the EM algorithm [1, 2, 3, 4]. The EM algorithm is directly applicable to multivariate data in which all the features are always present and there are no missing values. Unfortunately, missing values are common, caused by either random or systematic effects. This study presents a novel algorithm for estimating the parameters of GMMs when there are random missing values. The approach is Bayesian in the missing values and ML in the GMM parameters. The same model can be applied to heteroscedastic data, and to indirectly observable mixed Gaussian observations.
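The paper's own derivation is not reproduced in this abstract, but the general idea it describes (EM that treats missing values in a Bayesian way while fitting GMM parameters by ML) can be illustrated with a minimal sketch. The sketch below assumes diagonal covariances, missing-completely-at-random entries marked as NaN, and a hypothetical function name `em_gmm_missing`; the published algorithm is more general. In the E-step, responsibilities use the marginal likelihood over the observed coordinates only; missing entries are then replaced by their posterior expectation under each component (which, for diagonal covariances, is simply that component's mean, with the component variance carried into the M-step as the posterior uncertainty).

```python
import numpy as np

def em_gmm_missing(X, K, n_iter=50, seed=0):
    """EM for a diagonal-covariance GMM on data with NaN entries.

    Illustrative sketch only: responsibilities are computed from the
    marginal Gaussian over observed dimensions; missing dimensions are
    imputed per component by their conditional mean, and their
    conditional variance is added back in the variance update."""
    rng = np.random.default_rng(seed)
    N, D = X.shape
    obs = ~np.isnan(X)                          # observed-entry mask
    # Initialisation: means from random rows, global variances, uniform weights.
    mu = X[rng.choice(N, K, replace=False)].copy()
    mu[np.isnan(mu)] = np.nanmean(X)
    var = np.tile(np.nanvar(X, axis=0), (K, 1)) + 1e-6
    pi = np.full(K, 1.0 / K)
    for _ in range(n_iter):
        # E-step: log marginal Gaussian density over observed dims only.
        logr = np.zeros((N, K))
        for k in range(K):
            ll = -0.5 * (np.log(2 * np.pi * var[k]) + (X - mu[k]) ** 2 / var[k])
            logr[:, k] = np.log(pi[k]) + np.where(obs, ll, 0.0).sum(axis=1)
        logr -= logr.max(axis=1, keepdims=True)  # stabilise before exp
        r = np.exp(logr)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: missing entries imputed by each component's current mean.
        Nk = r.sum(axis=0)
        pi = Nk / N
        for k in range(K):
            xhat = np.where(obs, X, mu[k])       # E[x | z=k] fills missing dims
            mu_new = (r[:, [k]] * xhat).sum(axis=0) / Nk[k]
            # Missing dims also contribute their conditional variance var[k].
            sq = (xhat - mu_new) ** 2 + np.where(obs, 0.0, var[k])
            var[k] = (r[:, [k]] * sq).sum(axis=0) / Nk[k] + 1e-6
            mu[k] = mu_new
    return pi, mu, var
```

On two well-separated clusters with around 10% of entries deleted at random, this sketch recovers the component means closely; the key design choice, mirroring the abstract, is that missing values are never filled with a single global imputation but are marginalised or expectation-filled per component inside each EM iteration.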