{"title":"利用最近邻学习改进Sanger树结构算法","authors":"C.-C. Chen","doi":"10.1109/IJCNN.1991.170503","DOIUrl":null,"url":null,"abstract":"The author identifies several different neural network models which are related to nearest neighbor learning. They include radial basis functions, sparse distributed memory, and localized receptive fields. One way to improve the neural networks' performance is by using the cooperation of different learning algorithms. The prediction of chaotic time series is used as an example to show how nearest neighbor learning can be employed to improve Sanger's tree-structured algorithm which predicts future values of the Mackey-Glass differential delay equation.<<ETX>>","PeriodicalId":211135,"journal":{"name":"[Proceedings] 1991 IEEE International Joint Conference on Neural Networks","volume":"35 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1991-11-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Using nearest neighbor learning to improve Sanger's tree-structured algorithm\",\"authors\":\"C.-C. Chen\",\"doi\":\"10.1109/IJCNN.1991.170503\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The author identifies several different neural network models which are related to nearest neighbor learning. They include radial basis functions, sparse distributed memory, and localized receptive fields. One way to improve the neural networks' performance is by using the cooperation of different learning algorithms. The prediction of chaotic time series is used as an example to show how nearest neighbor learning can be employed to improve Sanger's tree-structured algorithm which predicts future values of the Mackey-Glass differential delay equation.<<ETX>>\",\"PeriodicalId\":211135,\"journal\":{\"name\":\"[Proceedings] 1991 IEEE International Joint Conference on Neural Networks\",\"volume\":\"35 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"1991-11-18\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"[Proceedings] 1991 IEEE International Joint Conference on Neural Networks\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/IJCNN.1991.170503\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"[Proceedings] 1991 IEEE International Joint Conference on Neural Networks","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IJCNN.1991.170503","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Using nearest neighbor learning to improve Sanger's tree-structured algorithm
The author identifies several neural network models related to nearest neighbor learning, including radial basis functions, sparse distributed memory, and localized receptive fields. One way to improve the performance of such networks is through the cooperation of different learning algorithms. The prediction of chaotic time series is used as an example to show how nearest neighbor learning can be employed to improve Sanger's tree-structured algorithm, which predicts future values of the Mackey-Glass differential delay equation.
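The paper itself does not include code. The following is a minimal sketch of the kind of nearest neighbor prediction described in the abstract, applied to a numerically integrated Mackey-Glass series: store delay-embedded past states, then predict a future value by averaging the targets of the k closest stored states. The equation parameters, embedding dimension, lag, prediction horizon, and k below are illustrative assumptions, not values taken from the paper, and this sketch does not reproduce Sanger's tree-structured algorithm.

```python
# Minimal sketch (not the paper's implementation): nearest neighbor prediction
# of the Mackey-Glass delay differential equation. All parameter choices here
# are illustrative assumptions.
import numpy as np

def mackey_glass(n_steps, tau=17, beta=0.2, gamma=0.1, n=10, dt=1.0, x0=1.2):
    """Euler integration of dx/dt = beta*x(t-tau)/(1 + x(t-tau)^n) - gamma*x(t)."""
    history = int(tau / dt)
    x = np.full(n_steps + history, x0)
    for t in range(history, n_steps + history - 1):
        x_tau = x[t - history]
        x[t + 1] = x[t] + dt * (beta * x_tau / (1.0 + x_tau ** n) - gamma * x[t])
    return x[history:]

def delay_embed(series, dim=4, lag=6, horizon=6):
    """Build (delay-embedding vector, future value) pairs from a scalar series."""
    X, y = [], []
    for t in range((dim - 1) * lag, len(series) - horizon):
        X.append(series[t - np.arange(dim) * lag])
        y.append(series[t + horizon])
    return np.array(X), np.array(y)

def knn_predict(X_train, y_train, x_query, k=4):
    """Predict by averaging the targets of the k nearest stored embeddings."""
    dists = np.linalg.norm(X_train - x_query, axis=1)
    nearest = np.argsort(dists)[:k]
    return y_train[nearest].mean()

if __name__ == "__main__":
    series = mackey_glass(2000)
    X, y = delay_embed(series)
    split = 1500
    X_train, y_train = X[:split], y[:split]
    X_test, y_test = X[split:], y[split:]
    preds = np.array([knn_predict(X_train, y_train, xq) for xq in X_test])
    rmse = np.sqrt(np.mean((preds - y_test) ** 2))
    print(f"nearest-neighbor RMSE on held-out points: {rmse:.4f}")
```

In this setup the stored embeddings play the role of local memory: prediction quality depends mainly on how densely the training embeddings cover the attractor, which is why nearest neighbor methods combine naturally with localized models such as radial basis functions and localized receptive fields.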