{"title":"基于Lyapunov稳定性的离散时间系统自适应反向传播","authors":"Z. Man, Serig Kah Phooi, H. Wu","doi":"10.1109/ISSPA.1999.815759","DOIUrl":null,"url":null,"abstract":"Lyapunov stability-based adaptive backpropagation (LABP) for discrete systems is proposed in this paper. It can be applied to various aspects of adaptive signal processing. A Lyapunov function of the error between the desired and actual outputs of the neural network is first defined. Then the error is backward-propagated based on Lyapunov stability theory so that it can be used to adaptively adjust the weights of the inner layers of the neural networks. Subsequently, this will lead to an error between the desired and actual outputs converging to zero asymptotically. The proposed scheme possesses distinct advantages over the conventional BP by assuring that the system will not get stuck in local minima. Furthermore, this scheme has a faster convergence property and the stability is guaranteed by Lyapunov stability theory. A simulation example is performed to support the proposed scheme.","PeriodicalId":302569,"journal":{"name":"ISSPA '99. Proceedings of the Fifth International Symposium on Signal Processing and its Applications (IEEE Cat. No.99EX359)","volume":"26 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1999-08-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"5","resultStr":"{\"title\":\"Lyapunov stability-based adaptive backpropagation for discrete time system\",\"authors\":\"Z. Man, Serig Kah Phooi, H. Wu\",\"doi\":\"10.1109/ISSPA.1999.815759\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Lyapunov stability-based adaptive backpropagation (LABP) for discrete systems is proposed in this paper. It can be applied to various aspects of adaptive signal processing. A Lyapunov function of the error between the desired and actual outputs of the neural network is first defined. 
Then the error is backward-propagated based on Lyapunov stability theory so that it can be used to adaptively adjust the weights of the inner layers of the neural networks. Subsequently, this will lead to an error between the desired and actual outputs converging to zero asymptotically. The proposed scheme possesses distinct advantages over the conventional BP by assuring that the system will not get stuck in local minima. Furthermore, this scheme has a faster convergence property and the stability is guaranteed by Lyapunov stability theory. A simulation example is performed to support the proposed scheme.\",\"PeriodicalId\":302569,\"journal\":{\"name\":\"ISSPA '99. Proceedings of the Fifth International Symposium on Signal Processing and its Applications (IEEE Cat. No.99EX359)\",\"volume\":\"26 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"1999-08-22\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"5\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"ISSPA '99. Proceedings of the Fifth International Symposium on Signal Processing and its Applications (IEEE Cat. No.99EX359)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ISSPA.1999.815759\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"ISSPA '99. Proceedings of the Fifth International Symposium on Signal Processing and its Applications (IEEE Cat. No.99EX359)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISSPA.1999.815759","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Lyapunov stability-based adaptive backpropagation for discrete time system
Lyapunov stability-based adaptive backpropagation (LABP) for discrete-time systems is proposed in this paper. It can be applied to various aspects of adaptive signal processing. A Lyapunov function of the error between the desired and actual outputs of the neural network is first defined. The error is then back-propagated, based on Lyapunov stability theory, to adaptively adjust the weights of the inner layers of the network, so that the error between the desired and actual outputs converges to zero asymptotically. The proposed scheme has a distinct advantage over conventional backpropagation: it ensures that the system does not become trapped in local minima. Furthermore, the scheme converges faster, and its stability is guaranteed by Lyapunov stability theory. A simulation example is presented to support the proposed scheme.
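The core idea of the abstract can be illustrated with a minimal sketch. This is not the authors' multilayer LABP algorithm; it is a hypothetical single linear neuron whose weight update is chosen so that the a posteriori output error contracts by a design factor kappa at every step. The Lyapunov function V(k) = e(k)^2 then decreases monotonically, so e(k) converges to zero asymptotically, which is the stability argument the abstract describes. All variable names (w_true, kappa, eps) are illustrative assumptions, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Unknown system to identify: d(k) = w_true . x(k)
w_true = np.array([0.5, -1.2, 0.8])
n = len(w_true)

w = np.zeros(n)      # adaptive weights of the single linear neuron
e_prev = 1.0         # previous a posteriori error e(k-1), arbitrary start
kappa = 0.5          # design constant, 0 < kappa < 1
eps = 1e-8           # regularizer to avoid division by zero

for k in range(200):
    x = rng.standard_normal(n)
    d = w_true @ x                        # desired output
    e_bar = d - w @ x                     # a priori error before the update
    # Choose the update so the new a posteriori error satisfies
    # e(k) ~= kappa * e(k-1); then V(k) = e(k)^2 < V(k-1), a Lyapunov
    # decrease, and the error converges to zero geometrically.
    w = w + x * (e_bar - kappa * e_prev) / (x @ x + eps)
    e_prev = d - w @ x                    # a posteriori error e(k)

print(abs(e_prev))   # error magnitude after 200 contraction steps
```

Because each step enforces |e(k)| = kappa * |e(k-1)| (up to the eps regularization), the error bound is independent of the gradient landscape, which is the sense in which a Lyapunov-designed update avoids the local-minimum behavior of gradient-descent backpropagation.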