{"title":"Combining the Likelihood and the Kullback-Leibler Distance in Estimating the Universal Background Model for Speaker Verification Using SVM","authors":"Zhenchun Lei","doi":"10.1109/ICPR.2010.1106","DOIUrl":null,"url":null,"abstract":"The state-of-the-art methods for speaker verification are based on the support vector machine. The Gaussian supervector SVM is a typical method which uses the Gaussian mixture model for creating “feature vectors” for the discriminative SVM. And all GMMs are adapted from the same universal background model, which is got by maximum likelihood estimation on a large number of data sets. So the UBM should cover the feature space widely as possible. We propose a new method to estimate the parameters of the UBM by combining the likelihood and the Kullback-Leibler distances in the UBM. Its aim is to find the model parameters which get the high likelihood value and all Gaussian distributions are dispersed to cover the feature space in a great measuring. Experiments on NIST 2001 task show that our method can improve the performance obviously.","PeriodicalId":309591,"journal":{"name":"2010 20th International Conference on Pattern Recognition","volume":"16 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2010-10-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2010 20th International Conference on Pattern Recognition","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICPR.2010.1106","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
The state-of-the-art methods for speaker verification are based on the support vector machine. The Gaussian supervector SVM is a typical method, which uses the Gaussian mixture model to create "feature vectors" for the discriminative SVM. All GMMs are adapted from the same universal background model (UBM), which is obtained by maximum likelihood estimation on a large amount of data. The UBM should therefore cover the feature space as widely as possible. We propose a new method for estimating the parameters of the UBM by combining the likelihood with the Kullback-Leibler distances between the Gaussian components of the UBM. The aim is to find model parameters that achieve a high likelihood while keeping the Gaussian components dispersed so that they cover the feature space to a large extent. Experiments on the NIST 2001 task show that our method improves performance noticeably.
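The abstract does not give the exact formulation of the combined criterion, so the following is only a minimal sketch of the idea: a GMM log-likelihood term plus a weighted dispersion term built from pairwise KL distances between components. The function names (`kl_diag_gauss`, `combined_objective`), the diagonal-covariance assumption, and the weight `lam` are illustrative assumptions, not the paper's method.

```python
# Illustrative sketch only: combines GMM data log-likelihood with a lambda-weighted
# sum of pairwise (symmetrised) KL distances between Gaussian components, so that
# maximizing it favors both a good data fit and well-dispersed components.
import numpy as np
from scipy.stats import multivariate_normal


def kl_diag_gauss(mu1, var1, mu2, var2):
    """Closed-form KL divergence between two diagonal-covariance Gaussians."""
    return 0.5 * np.sum(
        np.log(var2 / var1) + (var1 + (mu1 - mu2) ** 2) / var2 - 1.0
    )


def combined_objective(X, weights, means, variances, lam=0.1):
    """Data log-likelihood under the GMM plus a dispersion term that rewards
    large KL distances between every pair of components (assumed form)."""
    # Per-sample mixture density with diagonal covariances.
    dens = np.zeros(len(X))
    for w, mu, var in zip(weights, means, variances):
        dens += w * multivariate_normal.pdf(X, mean=mu, cov=np.diag(var))
    log_lik = np.sum(np.log(dens + 1e-300))

    # Pairwise symmetrised KL distances between components.
    dispersion = 0.0
    M = len(weights)
    for i in range(M):
        for j in range(i + 1, M):
            dispersion += (
                kl_diag_gauss(means[i], variances[i], means[j], variances[j])
                + kl_diag_gauss(means[j], variances[j], means[i], variances[i])
            )

    return log_lik + lam * dispersion
```

Under this reading, standard maximum likelihood estimation is the special case `lam = 0`; a positive `lam` trades a small amount of likelihood for components that are pushed apart to cover more of the feature space.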