Provably Bounded Dynamic Sparsifying Transform Network for Compressive Imaging

Authors: Baoshun Shi, Dan Li
DOI: 10.1109/TNNLS.2025.3543766
Journal: IEEE Transactions on Neural Networks and Learning Systems (JCR Q1, Computer Science, Artificial Intelligence; Impact Factor 10.2)
Publication date: 2025-03-14 (Journal Article)
Citation count: 0
Compressive imaging (CI) aims to recover the underlying image from under-sampled observations. Recently, deep unfolded CI (DUCI) algorithms, which unfold iterative algorithms into deep neural networks (DNNs), have achieved remarkable results. Theoretically, unfolding a convergent iterative algorithm yields a stable DUCI algorithm, i.e., its performance improves as the number of stages increases. However, ensuring convergence often involves imposing constraints, such as a bounded spectral norm or a tight-frame property, on the filter weights or the sparsifying transform. Unfortunately, these constraints may compromise algorithm performance. To address this challenge, we present a provably bounded dynamic sparsifying transform network (BSTNet), which can be explicitly proven to be a bounded network without imposing constraints on the analysis sparsifying transform. Leveraging this advantage, the analysis sparsifying transform can be adaptively generated via a trainable DNN. Specifically, we design a dynamic sparsifying transform generator capable of extracting multiple types of feature information from input instances, facilitating the creation of a faithful content-adaptive sparsifying transform. We explicitly demonstrate that the proposed BSTNet is a bounded network, and further embed it as the prior network into a DUCI framework to evaluate its performance on two CI tasks, i.e., spectral snapshot CI (SCI) and compressed sensing magnetic resonance imaging (CSMRI). Experimental results show that our DUCI algorithms achieve competitive recovery quality compared to benchmark algorithms. Theoretically, we explicitly prove that the proposed BSTNet is bounded, and we provide a comprehensive theoretical convergence analysis of the proposed iterative algorithms.
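To make the deep-unfolding idea in the abstract concrete, the following is a minimal illustrative sketch (not the paper's BSTNet): each "stage" of the unfolded network mirrors one proximal-gradient iteration, alternating a data-fidelity gradient step with a prior step that soft-thresholds in an analysis-transform domain. The function names, the fixed orthonormal transform `W`, and the ISTA-style update are all assumptions made for illustration; in BSTNet the transform would instead be generated per input by a trainable network, and the prior network is proven bounded.

```python
import numpy as np

def soft_threshold(z, tau):
    """Proximal operator of the l1 norm (elementwise soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def unfolded_ci(y, A, W, n_stages=200, step=0.2, tau=0.005):
    """Toy deep-unfolded CI recovery: each 'stage' mirrors one
    proximal-gradient iteration with an analysis sparsifying transform W.
    Here W is a fixed orthonormal matrix; in BSTNet it would be produced
    per input by a trainable generator network."""
    x = A.T @ y  # adjoint initialization from the measurements
    for _ in range(n_stages):
        # Data-fidelity gradient step: x <- x - step * A^T (A x - y).
        x = x - step * (A.T @ (A @ x - y))
        # Prior step: sparsify in the transform domain, map back
        # (exact synthesis W^T assumes W is orthonormal).
        x = W.T @ soft_threshold(W @ x, tau)
    return x

# Usage on a small synthetic problem: recover a sparse vector from
# under-sampled random linear measurements (identity transform).
rng = np.random.default_rng(0)
n, m = 32, 24
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[[3, 10, 20]] = [1.0, -0.5, 0.8]
y = A @ x_true
x_hat = unfolded_ci(y, A, np.eye(n))
```

With a fixed transform, training would learn `step` and `tau` per stage; the point of BSTNet's boundedness result is that the transform itself can also be made input-adaptive without spectral-norm or tight-frame constraints while keeping the convergence analysis intact.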
Journal Introduction:
The focus of IEEE Transactions on Neural Networks and Learning Systems is to present scholarly articles discussing the theory, design, and applications of neural networks as well as other learning systems. The journal primarily highlights technical and scientific research in this domain.