{"title":"不等式约束下的改进拉格朗日非线性规划神经网络","authors":"Yuancan Huang, Chuang Yu","doi":"10.1109/IJCNN.2007.4371088","DOIUrl":null,"url":null,"abstract":"By redefining multiplier associated with inequality constraint as a positive definite function of the originally-defined multiplier, ui 2, i = 1, 2,..., m, say, the nonnegative constraints imposed on inequality constraints in Karush-Kuhn-Tucker necessary conditions are removed completely. Hence it is no longer necessary to convert inequality constraints into equality constraints by slack variables in order to reuse those results concerned only with equality constraints. Utilizing this technique, improved Lagrange nonlinear programming neural networks are devised, which handle inequality constraints directly without adding slack variables. Then the local stability of the proposed Lagrange neural networks is analyzed rigorously with Lyapunov's first approximation principle, and its convergence is discussed deeply with LaSalle's invariance principle. Finally, an illustrative example shows that the proposed neural networks can effectively solve the nonlinear programming problems","PeriodicalId":116729,"journal":{"name":"Sixth International Conference on Intelligent Systems Design and Applications","volume":"3 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2007-10-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Improved Lagrange Nonlinear Programming Neural Networks for Inequality Constraints\",\"authors\":\"Yuancan Huang, Chuang Yu\",\"doi\":\"10.1109/IJCNN.2007.4371088\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"By redefining multiplier associated with inequality constraint as a positive definite function of the originally-defined multiplier, ui 2, i = 1, 2,..., m, say, the nonnegative constraints imposed on inequality constraints in Karush-Kuhn-Tucker necessary conditions are removed completely. Hence it is no longer necessary to convert inequality constraints into equality constraints by slack variables in order to reuse those results concerned only with equality constraints. Utilizing this technique, improved Lagrange nonlinear programming neural networks are devised, which handle inequality constraints directly without adding slack variables. Then the local stability of the proposed Lagrange neural networks is analyzed rigorously with Lyapunov's first approximation principle, and its convergence is discussed deeply with LaSalle's invariance principle. 
Finally, an illustrative example shows that the proposed neural networks can effectively solve the nonlinear programming problems\",\"PeriodicalId\":116729,\"journal\":{\"name\":\"Sixth International Conference on Intelligent Systems Design and Applications\",\"volume\":\"3 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2007-10-29\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Sixth International Conference on Intelligent Systems Design and Applications\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/IJCNN.2007.4371088\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Sixth International Conference on Intelligent Systems Design and Applications","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IJCNN.2007.4371088","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1
Abstract
By redefining the multiplier associated with each inequality constraint as a positive definite function of the originally defined multiplier, say u_i^2, i = 1, 2, ..., m, the nonnegativity constraints imposed on the multipliers of the inequality constraints in the Karush-Kuhn-Tucker necessary conditions are removed completely. Hence it is no longer necessary to convert inequality constraints into equality constraints by means of slack variables in order to reuse results that concern only equality constraints. Using this technique, improved Lagrange nonlinear programming neural networks are devised that handle inequality constraints directly, without adding slack variables. The local stability of the proposed Lagrange neural networks is then analyzed rigorously with Lyapunov's first approximation principle, and their convergence is studied with LaSalle's invariance principle. Finally, an illustrative example shows that the proposed neural networks can effectively solve nonlinear programming problems.
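To make the squared-multiplier idea concrete, the following is a minimal numerical sketch, not taken from the paper: it substitutes u_i^2 for each KKT multiplier in the Lagrangian of a toy convex program and integrates a plain gradient-flow Lagrange network with forward Euler. The specific dynamics, the toy problem, and all function names below are illustrative assumptions; the paper's actual network equations and its illustrative example may differ.

```python
# Sketch of the squared-multiplier substitution: for  min f(x)  s.t.  g_i(x) <= 0,
# replace each KKT multiplier lambda_i by u_i**2 in the Lagrangian,
#     L(x, u) = f(x) + sum_i u_i**2 * g_i(x),
# so the effective multiplier is nonnegative by construction and no slack variables
# are needed.  The "network" here is the usual gradient flow
#     dx/dt = -grad_x L(x, u),    du_i/dt = +dL/du_i = 2 * u_i * g_i(x).
# Toy problem (chosen for illustration, not the paper's example):
#     min x1^2 + x2^2   s.t.   x1 + x2 >= 1,  i.e.  g(x) = 1 - x1 - x2 <= 0.

import numpy as np

def grad_f(x):                       # gradient of f(x) = x1^2 + x2^2
    return 2.0 * x

def g(x):                            # constraint values, written as g(x) <= 0
    return np.array([1.0 - x[0] - x[1]])

def jac_g(x):                        # Jacobian of g, one row per constraint
    return np.array([[-1.0, -1.0]])

def step(x, u, dt=0.01):
    """One forward-Euler step of the gradient-flow dynamics with multipliers u_i**2."""
    lam = u**2                                   # effective multipliers, nonnegative by construction
    dx = -(grad_f(x) + jac_g(x).T @ lam)         # dx/dt = -grad_x L(x, u)
    du = 2.0 * u * g(x)                          # du_i/dt = dL/du_i
    return x + dt * dx, u + dt * du

x, u = np.array([0.0, 0.0]), np.array([1.0])
for _ in range(5000):                            # 50 time units is ample for this toy problem
    x, u = step(x, u)

print("x   ->", x)       # expected near [0.5, 0.5]
print("u^2 ->", u**2)    # expected near [1.0], the KKT multiplier of the active constraint
```

For this example the equilibrium of the flow is (x1, x2, u^2) = (0.5, 0.5, 1), which matches the KKT point of the toy problem, and the trajectory settles there as a damped spiral; no projection or slack handling is required because u^2 can never become negative.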