{"title":"Empirical risk optimisation: neural networks and dynamic programming","authors":"X. Driancourt, P. Gallinari","doi":"10.1109/NNSP.1992.253701","DOIUrl":null,"url":null,"abstract":"The authors propose a novel system for speech recognition which makes a multilayer perceptron and a dynamic programming module cooperate. It is trained through a cost function inspired by learning vector quantization which approximates the empirical average risk of misclassification. All the modules of the system are trained simultaneously through gradient backpropagation; this ensures the optimality of the system. This system has achieved very good performance for isolated-word problems and is now trained on continuous speech recognition.<<ETX>>","PeriodicalId":438250,"journal":{"name":"Neural Networks for Signal Processing II Proceedings of the 1992 IEEE Workshop","volume":"20 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1992-08-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neural Networks for Signal Processing II Proceedings of the 1992 IEEE Workshop","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/NNSP.1992.253701","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 3
Abstract
The authors propose a novel speech recognition system in which a multilayer perceptron and a dynamic programming module cooperate. It is trained with a cost function, inspired by learning vector quantization, that approximates the empirical average risk of misclassification. All modules of the system are trained simultaneously through gradient backpropagation, which ensures that the system is optimised as a whole. The system has achieved very good performance on isolated-word tasks and is now being trained on continuous speech recognition.
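
The abstract describes frame-level neural-network scores being accumulated by a dynamic programming module into word-level distances, with an LVQ-inspired misclassification cost backpropagated through both modules at once. The sketch below illustrates that idea only; it is not the authors' code, and the use of PyTorch, the layer sizes, the softplus output, the simple left-to-right DTW recursion, and the sigmoid form of the cost are assumptions made for the example.

import torch
import torch.nn as nn

class FrameMLP(nn.Module):
    """Scores one acoustic frame against the states of a word model (illustrative)."""
    def __init__(self, n_features: int, n_states: int, hidden: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, hidden), nn.Tanh(),
            nn.Linear(hidden, n_states), nn.Softplus(),  # positive local distances
        )

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        # frames: (T, n_features) -> local distances (T, n_states)
        return self.net(frames)

def dp_distance(local: torch.Tensor) -> torch.Tensor:
    """Left-to-right dynamic-programming accumulation of frame distances.
    Gradients flow back along the chosen path (subgradient of the min)."""
    T, S = local.shape
    prev = torch.cat([local[0, :1], local.new_full((S - 1,), float("inf"))])
    for t in range(1, T):
        stay = prev
        advance = torch.cat([prev.new_full((1,), float("inf")), prev[:-1]])
        prev = local[t] + torch.minimum(stay, advance)
    return prev[-1]  # accumulated distance at the final state

def lvq_risk(d_correct: torch.Tensor, d_wrong: torch.Tensor) -> torch.Tensor:
    """Smooth surrogate for the 0/1 misclassification indicator,
    in the spirit of the LVQ-derived cost mentioned in the abstract (assumed form)."""
    return torch.sigmoid((d_correct - d_wrong) / (d_correct + d_wrong))

# Toy usage: two word classes, each with its own frame-scoring MLP.
torch.manual_seed(0)
models = [FrameMLP(n_features=12, n_states=4) for _ in range(2)]
opt = torch.optim.SGD([p for m in models for p in m.parameters()], lr=1e-2)

frames = torch.randn(20, 12)   # one utterance of 20 frames
label = 0                      # index of the correct word class

dists = torch.stack([dp_distance(m(frames)) for m in models])
d_correct = dists[label]
d_wrong = torch.min(torch.cat([dists[:label], dists[label + 1:]]))
loss = lvq_risk(d_correct, d_wrong)
loss.backward()   # gradients reach both MLPs through the DP module
opt.step()

Minimising this surrogate pushes the correct word's accumulated DP distance below that of its best competitor, which is the sense in which such a cost approximates the empirical average risk of misclassification while keeping all modules trainable by gradient backpropagation.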