{"title":"利用双链表实现自开发神经网络","authors":"Tsu-Chang Lee, A. Peterson","doi":"10.1109/CMPSAC.1989.65164","DOIUrl":null,"url":null,"abstract":"A novel algorithm for dynamically adapting the size of neural networks is proposed. According to the measures to be defined, a neuron in the network will generate a new neuron when the variation of its weight vector is high (i.e. when it is not learned) and will be annihilated if it is not active for a long time. This algorithm is tested on a simple but popular neural network model, Self Organization Feature Map (SOFM), and implemented in software using a double linked list. Using this algorithm, one can initially put a set of seed neurons in the network and then let the network grow according to the training patterns. It is observed from the simulation results that the network will eventually grow to a configuration suitable to the class of problems characterized by the training patterns, i.e. the neural network synthesizes itself to fit the problem space.<<ETX>>","PeriodicalId":339677,"journal":{"name":"[1989] Proceedings of the Thirteenth Annual International Computer Software & Applications Conference","volume":"29 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1989-09-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":"{\"title\":\"Implementing a self-development neural network using doubly linked lists\",\"authors\":\"Tsu-Chang Lee, A. Peterson\",\"doi\":\"10.1109/CMPSAC.1989.65164\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"A novel algorithm for dynamically adapting the size of neural networks is proposed. According to the measures to be defined, a neuron in the network will generate a new neuron when the variation of its weight vector is high (i.e. when it is not learned) and will be annihilated if it is not active for a long time. This algorithm is tested on a simple but popular neural network model, Self Organization Feature Map (SOFM), and implemented in software using a double linked list. Using this algorithm, one can initially put a set of seed neurons in the network and then let the network grow according to the training patterns. It is observed from the simulation results that the network will eventually grow to a configuration suitable to the class of problems characterized by the training patterns, i.e. 
the neural network synthesizes itself to fit the problem space.<<ETX>>\",\"PeriodicalId\":339677,\"journal\":{\"name\":\"[1989] Proceedings of the Thirteenth Annual International Computer Software & Applications Conference\",\"volume\":\"29 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"1989-09-20\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"3\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"[1989] Proceedings of the Thirteenth Annual International Computer Software & Applications Conference\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/CMPSAC.1989.65164\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"[1989] Proceedings of the Thirteenth Annual International Computer Software & Applications Conference","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CMPSAC.1989.65164","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Implementing a self-development neural network using doubly linked lists
A novel algorithm for dynamically adapting the size of neural networks is proposed. According to measures defined in the paper, a neuron in the network generates a new neuron when the variation of its weight vector is high (i.e., when it has not yet learned) and is annihilated if it remains inactive for a long time. The algorithm is tested on a simple but popular neural network model, the Self-Organizing Feature Map (SOFM), and implemented in software using a doubly linked list. With this algorithm, one can initially place a set of seed neurons in the network and then let the network grow according to the training patterns. Simulation results show that the network eventually grows into a configuration suited to the class of problems characterized by the training patterns, i.e., the neural network synthesizes itself to fit the problem space.
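
The paper does not include source code; the following is a minimal illustrative sketch of the idea described in the abstract, assuming a 1-D SOFM whose neurons are stored as a doubly linked list. All names, thresholds, and update rules (grow_thresh, idle_limit, the exponential average of weight change) are assumptions for illustration, not the authors' original measures.

```python
# Illustrative sketch (not the authors' code): neurons of a 1-D self-organizing
# map are kept in a doubly linked list, so a "restless" neuron can splice a new
# neighbour in next to itself and long-idle neurons can be unlinked cheaply.

import random

class Neuron:
    def __init__(self, weights):
        self.w = list(weights)   # weight vector
        self.variation = 0.0     # running measure of weight-vector change
        self.idle = 0            # training steps since this neuron last won
        self.prev = None         # doubly linked list pointers
        self.next = None

class GrowingSOFM:
    def __init__(self, seed_weights, lr=0.2, grow_thresh=0.5, idle_limit=200):
        # Build the initial chain from a set of seed neurons.
        self.head = None
        self.lr, self.grow_thresh, self.idle_limit = lr, grow_thresh, idle_limit
        prev = None
        for w in seed_weights:
            n = Neuron(w)
            n.prev = prev
            if prev:
                prev.next = n
            else:
                self.head = n
            prev = n

    def neurons(self):
        n = self.head
        while n:
            yield n
            n = n.next

    def train_step(self, x):
        # 1. Find the best-matching neuron (smallest squared Euclidean distance).
        winner = min(self.neurons(),
                     key=lambda n: sum((wi - xi) ** 2 for wi, xi in zip(n.w, x)))
        # 2. Move the winner toward the input and track how much it moved.
        delta = [self.lr * (xi - wi) for wi, xi in zip(winner.w, x)]
        winner.w = [wi + d for wi, d in zip(winner.w, delta)]
        winner.variation = 0.9 * winner.variation + 0.1 * sum(d * d for d in delta) ** 0.5
        # 3. Update activity counters.
        for n in self.neurons():
            n.idle = 0 if n is winner else n.idle + 1
        # 4. Grow: a winner whose weights are still changing a lot spawns a
        #    slightly perturbed copy, spliced into the list right after it.
        if winner.variation > self.grow_thresh:
            child = Neuron([wi + random.uniform(-0.05, 0.05) for wi in winner.w])
            child.prev, child.next = winner, winner.next
            if winner.next:
                winner.next.prev = child
            winner.next = child
            winner.variation = 0.0
        # 5. Prune: unlink neurons that have been inactive too long
        #    (never removing the last remaining neuron).
        for n in list(self.neurons()):
            if n.idle > self.idle_limit and (n.prev or n.next):
                if n.prev:
                    n.prev.next = n.next
                if n.next:
                    n.next.prev = n.prev
                if n is self.head:
                    self.head = n.next
```

In this sketch one would seed the list with a few neurons and call train_step repeatedly on the training patterns; the chain then lengthens where inputs cluster and sheds neurons that stop winning, mirroring the abstract's description of a network that grows to fit the problem space.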