{"title":"一个简单的单词识别网络,有能力选择自己的决策标准","authors":"K.A. Fischer, H. Strube","doi":"10.1109/NNSP.1991.239496","DOIUrl":null,"url":null,"abstract":"Various reliable algorithms for the word classification problem have been developed. All these models are necessarily based on the classification of certain 'features' that have to be extracted from the presented word. The general problem in speech recognition is: what kind of features are both word dependent as well as speaker independent? The majority of the existing systems requires a feature selection by the designer, so the system cannot choose the features that best fit the above mentioned criterion. Therefore, the authors tried to build a neural network that is able to rank all the features (here: the cells of the input layer) according to their functional relevance. This method reduces both the necessity to preselect the features as well as the numerical effort by a stepwise removal of the cells that proved to be unimportant.<<ETX>>","PeriodicalId":354832,"journal":{"name":"Neural Networks for Signal Processing Proceedings of the 1991 IEEE Workshop","volume":"33 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1991-09-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"A simple word-recognition network with the ability to choose its own decision criteria\",\"authors\":\"K.A. Fischer, H. Strube\",\"doi\":\"10.1109/NNSP.1991.239496\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Various reliable algorithms for the word classification problem have been developed. All these models are necessarily based on the classification of certain 'features' that have to be extracted from the presented word. The general problem in speech recognition is: what kind of features are both word dependent as well as speaker independent? The majority of the existing systems requires a feature selection by the designer, so the system cannot choose the features that best fit the above mentioned criterion. Therefore, the authors tried to build a neural network that is able to rank all the features (here: the cells of the input layer) according to their functional relevance. 
This method reduces both the necessity to preselect the features as well as the numerical effort by a stepwise removal of the cells that proved to be unimportant.<<ETX>>\",\"PeriodicalId\":354832,\"journal\":{\"name\":\"Neural Networks for Signal Processing Proceedings of the 1991 IEEE Workshop\",\"volume\":\"33 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"1991-09-30\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Neural Networks for Signal Processing Proceedings of the 1991 IEEE Workshop\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/NNSP.1991.239496\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neural Networks for Signal Processing Proceedings of the 1991 IEEE Workshop","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/NNSP.1991.239496","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
A simple word-recognition network with the ability to choose its own decision criteria
Various reliable algorithms for the word classification problem have been developed. All of these models are necessarily based on the classification of certain 'features' that have to be extracted from the presented word. The general problem in speech recognition is: what kind of features are both word dependent and speaker independent? The majority of existing systems require feature selection by the designer, so the system cannot choose the features that best fit the above-mentioned criterion. Therefore, the authors tried to build a neural network that is able to rank all the features (here: the cells of the input layer) according to their functional relevance. This method reduces both the need to preselect features and the numerical effort, through stepwise removal of the cells that prove to be unimportant.
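To illustrate the idea of ranking input cells and removing the least relevant ones stepwise, here is a minimal sketch. It is not the authors' implementation: a plain softmax classifier stands in for the word-recognition network, the toy data is random, and "functional relevance" is approximated by the summed absolute outgoing weights of each input cell; the paper's actual network and relevance criterion may differ.

```python
# Hedged sketch of stepwise input-cell pruning by an assumed relevance measure.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 200 "words", 20 input cells (features), 4 word classes.
n_samples, n_cells, n_classes = 200, 20, 4
X = rng.normal(size=(n_samples, n_cells))
y = rng.integers(0, n_classes, size=n_samples)

def train_softmax(X, y, n_classes, lr=0.1, epochs=200):
    """Gradient-descent training of a linear softmax classifier (stand-in network)."""
    W = np.zeros((X.shape[1], n_classes))
    Y = np.eye(n_classes)[y]                      # one-hot targets
    for _ in range(epochs):
        logits = X @ W
        p = np.exp(logits - logits.max(axis=1, keepdims=True))
        p /= p.sum(axis=1, keepdims=True)
        W -= lr * X.T @ (p - Y) / len(X)          # cross-entropy gradient step
    return W

active = list(range(n_cells))                     # indices of surviving input cells
for _ in range(10):                               # remove the 10 least relevant cells
    W = train_softmax(X[:, active], y, n_classes)
    relevance = np.abs(W).sum(axis=1)             # assumed relevance: |outgoing weights|
    drop = int(np.argmin(relevance))              # least relevant active cell
    del active[drop]                              # stepwise removal, then retrain

print("surviving input cells:", active)
```

The retrain-rank-remove loop is the point of the sketch: after each removal the remaining cells are re-ranked, so the surviving set reflects relevance with respect to the reduced input layer rather than a single one-shot ranking.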