Computation and Formal Verification of Neural Network Contraction Metrics
Maxwell Fitzsimmons; Jun Liu
IEEE Control Systems Letters (Q2, Automation & Control Systems), published 2024-10-11
DOI: 10.1109/LCSYS.2024.3478272
https://ieeexplore.ieee.org/document/10714396/
A contraction metric defines a differential Lyapunov-like function that robustly captures the convergence between trajectories. In this letter, we investigate the use of neural networks for computing verifiable contraction metrics. We first prove the existence of a smooth neural network contraction metric within the domain of attraction of an exponentially stable equilibrium point. We then focus on the computation of a neural network contraction metric over a compact invariant set within the domain of attraction certified by a physics-informed neural network Lyapunov function. We consider both partial differential inequality (PDI) and equation (PDE) losses for computation. We show that sufficiently accurate neural approximate solutions to the PDI and PDE are guaranteed to be a contraction metric under mild technical assumptions. We rigorously verify the computed neural network contraction metric using a satisfiability modulo theories solver. Through numerical examples, we demonstrate that the proposed approach outperforms traditional semidefinite programming methods for finding sum-of-squares polynomial contraction metrics.
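To make the object being verified concrete: a (state-dependent) matrix function M(x) ≻ 0 is a contraction metric for ẋ = f(x) with rate λ > 0 when J(x)ᵀM(x) + M(x)J(x) + Ṁ(x) ⪯ -2λM(x), where J is the Jacobian of f. The sketch below is not the authors' code: it only spot-checks this matrix inequality numerically on a grid for a hypothetical toy system with a constant metric (so Ṁ = 0), whereas the letter trains the metric as a neural network and verifies the inequality rigorously with an SMT solver. The vector field, rate λ, and grid are all assumptions made for illustration.

```python
import numpy as np

lam = 0.9  # assumed contraction rate for this toy example

def jac(x):
    """Jacobian of the toy vector field f(x) = (-x1 + x2, -x1 - x2 - x2**3)."""
    x1, x2 = x
    return np.array([[-1.0, 1.0],
                     [-1.0, -1.0 - 3.0 * x2**2]])

# Candidate contraction metric: constant, so its derivative along
# trajectories (the Mdot term in the inequality) vanishes.
M = np.eye(2)

def contraction_lhs(x):
    """J(x)^T M + M J(x) + 2*lam*M; must be negative semidefinite."""
    J = jac(x)
    return J.T @ M + M @ J + 2.0 * lam * M

# Spot-check the inequality on a grid over a compact set [-2,2]^2.
# (A grid check is only heuristic; the paper uses an SMT solver for
# a rigorous verification over the whole set.)
pts = [np.array([a, b]) for a in np.linspace(-2.0, 2.0, 21)
                        for b in np.linspace(-2.0, 2.0, 21)]
ok = all(np.max(np.linalg.eigvalsh(contraction_lhs(p))) <= 0.0 for p in pts)
print("contraction condition holds on grid:", ok)
```

For this particular field, J + Jᵀ is diagonal with entries -2 and -2 - 6x₂², so the inequality holds everywhere for any λ < 1; the grid check merely confirms this numerically. Replacing the constant M with a neural network and the grid with an exact solver query is what the letter's pipeline does.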