{"title":"半捆绑协方差训练效率的改进","authors":"Sibao Chen, Yu Hu, B. Luo, Ren-Hua Wang","doi":"10.1109/CHINSL.2008.ECP.62","DOIUrl":null,"url":null,"abstract":"Semi-tied covariance (STC) is applied widely in speech recognition due to its feature de-correlation ability. Solving the transform matrices of STC is a nonlinear optimization problem. Gales proposed an efficient method by iteratively updating a row of transform matrices. However, it needs to solve cofactors of elements of a matrix row in two layers of loops. Directly solving them is very time-consuming. Based on the property that only one row is updated in each iteration, it can be found from algebraic procedures, that the inverse and determinant of transform matrix in current iteration can be obtained by simple multiplications and additions of those in the previous iteration, and the cofactor vector of a row is equal to the corresponding column of multiplication between the inverse and determinant. This clearly improves the training efficiency of STC. Experiments on the RM database show that the proposed iteration method achieves a 33.56% relative reduction of training time over original STC method.","PeriodicalId":291958,"journal":{"name":"2008 6th International Symposium on Chinese Spoken Language Processing","volume":"252 ","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2008-12-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"An Improvement for Training Efficiency of Semi-Tied Covariance\",\"authors\":\"Sibao Chen, Yu Hu, B. Luo, Ren-Hua Wang\",\"doi\":\"10.1109/CHINSL.2008.ECP.62\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Semi-tied covariance (STC) is applied widely in speech recognition due to its feature de-correlation ability. Solving the transform matrices of STC is a nonlinear optimization problem. Gales proposed an efficient method by iteratively updating a row of transform matrices. However, it needs to solve cofactors of elements of a matrix row in two layers of loops. Directly solving them is very time-consuming. Based on the property that only one row is updated in each iteration, it can be found from algebraic procedures, that the inverse and determinant of transform matrix in current iteration can be obtained by simple multiplications and additions of those in the previous iteration, and the cofactor vector of a row is equal to the corresponding column of multiplication between the inverse and determinant. This clearly improves the training efficiency of STC. 
Experiments on the RM database show that the proposed iteration method achieves a 33.56% relative reduction of training time over original STC method.\",\"PeriodicalId\":291958,\"journal\":{\"name\":\"2008 6th International Symposium on Chinese Spoken Language Processing\",\"volume\":\"252 \",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2008-12-30\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2008 6th International Symposium on Chinese Spoken Language Processing\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/CHINSL.2008.ECP.62\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2008 6th International Symposium on Chinese Spoken Language Processing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CHINSL.2008.ECP.62","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
An Improvement for Training Efficiency of Semi-Tied Covariance
Semi-tied covariance (STC) is widely used in speech recognition because of its ability to de-correlate features. Estimating the STC transform matrices is a nonlinear optimization problem. Gales proposed an efficient method that iteratively updates one row of the transform matrix at a time. However, it requires the cofactors of the elements of a matrix row inside two nested loops, and computing them directly is very time-consuming. Because only one row changes in each iteration, simple algebraic manipulation shows that the inverse and determinant of the transform matrix in the current iteration can be obtained from those of the previous iteration with a few multiplications and additions, and that the cofactor vector of a row equals the corresponding column of the inverse scaled by the determinant. This markedly improves the training efficiency of STC. Experiments on the RM database show that the proposed iterative method reduces training time by 33.56% relative to the original STC method.
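The row-update identities described in the abstract can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the function names `replace_row` and `row_cofactors` are hypothetical, and the sketch simply instantiates the standard rank-one update formulas (the matrix determinant lemma and the Sherman-Morrison formula) for replacing a single row, together with the relation that the cofactor vector of row i equals det(A) times the i-th column of A^{-1}.

```python
import numpy as np

def replace_row(A, A_inv, det_A, i, new_row):
    """Replace row i of A, updating the inverse and determinant incrementally
    instead of recomputing them from scratch."""
    delta = new_row - A[i]                       # change in row i
    # Matrix determinant lemma: det(A + e_i delta^T) = det(A) * (1 + delta^T A^{-1} e_i)
    scale = 1.0 + delta @ A_inv[:, i]
    det_new = det_A * scale
    # Sherman-Morrison: (A + e_i delta^T)^{-1}
    #   = A^{-1} - (A^{-1} e_i)(delta^T A^{-1}) / (1 + delta^T A^{-1} e_i)
    A_new_inv = A_inv - np.outer(A_inv[:, i], delta @ A_inv) / scale
    A_new = A.copy()
    A_new[i] = new_row
    return A_new, A_new_inv, det_new

def row_cofactors(A_inv, det_A, i):
    """Cofactor vector of row i: det(A) * (A^{-1})[:, i], i.e. the i-th column
    of the (transposed) adjugate, obtained without any minor expansions."""
    return det_A * A_inv[:, i]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.normal(size=(5, 5))
    A_inv, det_A = np.linalg.inv(A), np.linalg.det(A)

    # Replace one row and check the incremental updates against direct computation.
    new_row = rng.normal(size=5)
    A_new, A_new_inv, det_new = replace_row(A, A_inv, det_A, 2, new_row)
    assert np.allclose(A_new_inv, np.linalg.inv(A_new))
    assert np.isclose(det_new, np.linalg.det(A_new))

    # Cofactor expansion along the updated row reproduces the determinant.
    assert np.isclose(A_new[2] @ row_cofactors(A_new_inv, det_new, 2), det_new)
```

Under these assumptions, each row update costs only a handful of vector-matrix products rather than a fresh inversion and determinant, which is the source of the reported reduction in training time.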