{"title":"分数阶主从神经网络特性研究","authors":"Yongming Jing, Huaying Dong, Guishu Liang","doi":"10.1109/ISCID.2012.279","DOIUrl":null,"url":null,"abstract":"As a good artificial intelligent method, BP neural network has been applied in many engineering research questions. However, because of some inherent shortages, especially chaotic behaviors in the network learning, it is very difficult or impossible to apply the artificial neural network into complicated engineering tasks. to solve this problem, many methods had been proposed in the past time. One of the typical approaches is Master-Slave Neural Network (MSNN), whose master network is two Hop field networks, and the other slave network is a BP network, respectively. Although this new kind of method has more advantages than the BP network, such as a quick asymptotic convergence rate and the smallest network system errors, we can further enhance its performance. in this paper, based on the non-local property of fractional operator which is more approximate reality than traditional calculus, we extend the two Hop field networks in MSNN to the fractional net in which fractional equations describe its dynamical structure. after introducing the structure of Fractional Master-Slave Neural Network (FMSNN) and the concept of fractional calculus, we take a simulation for the FMSNN, MSNN and BP neural network respectively. the result shows this new kind of neural network has a quicker asymptotic convergence rate and a smaller network system error, which improves the performance of MSNN.","PeriodicalId":246432,"journal":{"name":"2012 Fifth International Symposium on Computational Intelligence and Design","volume":"285 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2012-10-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":"{\"title\":\"Study on Characteristic of Fractional Master-Slave Neural Network\",\"authors\":\"Yongming Jing, Huaying Dong, Guishu Liang\",\"doi\":\"10.1109/ISCID.2012.279\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"As a good artificial intelligent method, BP neural network has been applied in many engineering research questions. However, because of some inherent shortages, especially chaotic behaviors in the network learning, it is very difficult or impossible to apply the artificial neural network into complicated engineering tasks. to solve this problem, many methods had been proposed in the past time. One of the typical approaches is Master-Slave Neural Network (MSNN), whose master network is two Hop field networks, and the other slave network is a BP network, respectively. Although this new kind of method has more advantages than the BP network, such as a quick asymptotic convergence rate and the smallest network system errors, we can further enhance its performance. in this paper, based on the non-local property of fractional operator which is more approximate reality than traditional calculus, we extend the two Hop field networks in MSNN to the fractional net in which fractional equations describe its dynamical structure. after introducing the structure of Fractional Master-Slave Neural Network (FMSNN) and the concept of fractional calculus, we take a simulation for the FMSNN, MSNN and BP neural network respectively. 
the result shows this new kind of neural network has a quicker asymptotic convergence rate and a smaller network system error, which improves the performance of MSNN.\",\"PeriodicalId\":246432,\"journal\":{\"name\":\"2012 Fifth International Symposium on Computational Intelligence and Design\",\"volume\":\"285 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2012-10-28\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"6\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2012 Fifth International Symposium on Computational Intelligence and Design\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ISCID.2012.279\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2012 Fifth International Symposium on Computational Intelligence and Design","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISCID.2012.279","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Study on Characteristic of Fractional Master-Slave Neural Network
As an effective artificial intelligence method, the BP neural network has been applied to many engineering research problems. However, because of some inherent shortcomings, especially chaotic behavior during network learning, it is very difficult or even impossible to apply the artificial neural network to complicated engineering tasks. Many methods have been proposed to solve this problem. One typical approach is the Master-Slave Neural Network (MSNN), whose master network consists of two Hopfield networks and whose slave network is a BP network. Although this method has advantages over the BP network, such as a faster asymptotic convergence rate and smaller network system errors, its performance can be further enhanced. In this paper, based on the non-local property of the fractional operator, which approximates reality more closely than traditional calculus does, we extend the two Hopfield networks in the MSNN to fractional networks whose dynamical structure is described by fractional differential equations. After introducing the structure of the Fractional Master-Slave Neural Network (FMSNN) and the concept of fractional calculus, we run simulations for the FMSNN, the MSNN and the BP neural network respectively. The results show that this new kind of neural network has a faster asymptotic convergence rate and a smaller network system error, which improves on the performance of the MSNN.
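The abstract does not spell out how the fractional Hopfield dynamics are discretized, so the following is a minimal sketch rather than the authors' implementation. It assumes continuous Hopfield dynamics of the form D^alpha x = -x + W·tanh(x) + I and integrates them with the explicit Grünwald-Letnikov scheme; the function names and parameter values (fractional_hopfield, alpha, h) are illustrative. The growing memory sum over all past states is exactly the non-local property of the fractional operator that the abstract refers to.

```python
import numpy as np

def gl_coeffs(alpha, n):
    # Grunwald-Letnikov coefficients c_j = (-1)^j * binom(alpha, j),
    # computed by the recurrence c_j = (1 - (1 + alpha)/j) * c_{j-1}, c_0 = 1.
    c = np.empty(n + 1)
    c[0] = 1.0
    for j in range(1, n + 1):
        c[j] = (1.0 - (1.0 + alpha) / j) * c[j - 1]
    return c

def fractional_hopfield(W, I, x0, alpha=0.9, h=0.01, steps=2000):
    # Integrate D^alpha x = -x + W @ tanh(x) + I with the explicit
    # Grunwald-Letnikov scheme:
    #   x_k = h**alpha * f(x_{k-1}) - sum_{j=1..k} c_j * x_{k-j}
    n = len(x0)
    c = gl_coeffs(alpha, steps)
    X = np.zeros((steps + 1, n))
    X[0] = x0
    h_alpha = h ** alpha
    for k in range(1, steps + 1):
        f = -X[k - 1] + W @ np.tanh(X[k - 1]) + I  # Hopfield right-hand side
        hist = X[k - 1::-1]          # x_{k-1}, x_{k-2}, ..., x_0
        memory = c[1:k + 1] @ hist   # non-local term: weights all past states
        X[k] = h_alpha * f - memory
    return X

# Toy two-neuron run with hypothetical symmetric weights.
W = np.array([[0.0, 1.2],
              [1.2, 0.0]])
I = np.zeros(2)
trajectory = fractional_hopfield(W, I, x0=np.array([0.5, -0.3]), alpha=0.8)
print(trajectory[-1])  # state after the final step
```

As a sanity check, with alpha = 1.0 the coefficients reduce to c_1 = -1 and c_j = 0 for j >= 2, so the scheme collapses to ordinary forward-Euler integration of the integer-order Hopfield network; values of alpha below 1 blend in the whole state history, which is the mechanism the paper credits for the faster convergence and smaller system error.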