{"title":"通过 ket 增强和自动加权策略的非凸方法实现稳健的张量恢复","authors":"Wenhui Xie, Chen Ling, Hongjin He, Lei‐Hong Zhang","doi":"10.1002/nla.2580","DOIUrl":null,"url":null,"abstract":"In this article, we introduce a nonconvex tensor recovery approach, which employs the powerful ket augmentation technique to expand a low order tensor into a high‐order one so that we can exploit the advantage of tensor train (TT) decomposition tailored for high‐order tensors. Moreover, we define a new nonconvex surrogate function to approximate the tensor rank, and develop an auto‐weighted mechanism to adjust the weights of the resulting high‐order tensor's TT ranks. To make our approach robust, we add two mode‐unfolding regularization terms to enhance the model for the purpose of exploring spatio‐temporal continuity and self‐similarity of the underlying tensors. Also, we propose an implementable algorithm to solve the proposed optimization model in the sense that each subproblem enjoys a closed‐form solution. A series of numerical results demonstrate that our approach works well on recovering color images and videos.","PeriodicalId":49731,"journal":{"name":"Numerical Linear Algebra with Applications","volume":null,"pages":null},"PeriodicalIF":1.8000,"publicationDate":"2024-07-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Robust tensor recovery via a nonconvex approach with ket augmentation and auto‐weighted strategy\",\"authors\":\"Wenhui Xie, Chen Ling, Hongjin He, Lei‐Hong Zhang\",\"doi\":\"10.1002/nla.2580\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In this article, we introduce a nonconvex tensor recovery approach, which employs the powerful ket augmentation technique to expand a low order tensor into a high‐order one so that we can exploit the advantage of tensor train (TT) decomposition tailored for high‐order tensors. Moreover, we define a new nonconvex surrogate function to approximate the tensor rank, and develop an auto‐weighted mechanism to adjust the weights of the resulting high‐order tensor's TT ranks. To make our approach robust, we add two mode‐unfolding regularization terms to enhance the model for the purpose of exploring spatio‐temporal continuity and self‐similarity of the underlying tensors. Also, we propose an implementable algorithm to solve the proposed optimization model in the sense that each subproblem enjoys a closed‐form solution. 
A series of numerical results demonstrate that our approach works well on recovering color images and videos.\",\"PeriodicalId\":49731,\"journal\":{\"name\":\"Numerical Linear Algebra with Applications\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":1.8000,\"publicationDate\":\"2024-07-30\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Numerical Linear Algebra with Applications\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://doi.org/10.1002/nla.2580\",\"RegionNum\":3,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"MATHEMATICS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Numerical Linear Algebra with Applications","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1002/nla.2580","RegionNum":3,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MATHEMATICS","Score":null,"Total":0}
Citations: 0
Abstract
In this article, we introduce a nonconvex tensor recovery approach, which employs the powerful ket augmentation technique to expand a low‐order tensor into a high‐order one so that we can exploit the advantage of tensor train (TT) decomposition tailored for high‐order tensors. Moreover, we define a new nonconvex surrogate function to approximate the tensor rank, and develop an auto‐weighted mechanism to adjust the weights of the resulting high‐order tensor's TT ranks. To make our approach robust, we add two mode‐unfolding regularization terms to enhance the model for the purpose of exploring the spatio‐temporal continuity and self‐similarity of the underlying tensors. Also, we propose an implementable algorithm to solve the proposed optimization model in the sense that each subproblem enjoys a closed‐form solution. A series of numerical results demonstrates that our approach works well in recovering color images and videos.
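To illustrate the ket augmentation step mentioned in the abstract, the sketch below recasts a 2^n × 2^n × 3 color image as an order-(n+1) tensor with modes of size 4 by hierarchically grouping 2×2 pixel blocks, and then reads off TT ranks with a plain TT-SVD. This is a minimal NumPy sketch under common conventions only: the function names, the coarse-to-fine block ordering, and the singular-value truncation rule are illustrative assumptions, not the paper's actual model, which instead uses a nonconvex rank surrogate, auto-weighted TT ranks, and mode-unfolding regularization.

```python
import numpy as np

def ket_augment(img, levels):
    """Reshape a (2**levels, 2**levels, C) image into an order-(levels+1)
    tensor with `levels` modes of size 4 (one per 2x2 block level) plus a
    color mode of size C. Coarse-to-fine block ordering is an assumption."""
    H, W, C = img.shape
    assert H == W == 2 ** levels
    # split row and column indices into binary digits (coarsest digit first)
    x = img.reshape((2,) * levels + (2,) * levels + (C,))
    # interleave row/column digits: (r1, c1, r2, c2, ..., C)
    order = []
    for k in range(levels):
        order.extend([k, levels + k])
    order.append(2 * levels)
    x = np.transpose(x, order)
    # merge each (r_k, c_k) pair into a single mode of size 4
    return x.reshape((4,) * levels + (C,))

def tt_ranks(tensor, tol=1e-6):
    """Return the TT ranks of `tensor` via sequential SVDs (plain TT-SVD);
    singular values below tol * (largest singular value) are truncated."""
    dims = tensor.shape
    ranks = [1]
    mat = tensor.reshape(1, -1)
    for k in range(len(dims) - 1):
        # unfold: (previous rank * current mode) x (remaining modes)
        mat = mat.reshape(ranks[-1] * dims[k], -1)
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        r = max(1, int(np.sum(s > tol * s[0])))
        ranks.append(r)
        # carry the truncated remainder to the next unfolding
        mat = s[:r, None] * vt[:r]
    ranks.append(1)
    return ranks

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.random((256, 256, 3))      # 256 = 2**8
    hi = ket_augment(img, levels=8)      # shape (4, 4, 4, 4, 4, 4, 4, 4, 3)
    print(hi.shape, tt_ranks(hi, tol=1e-2))
```

The point of the augmentation is that local 2×2 structure in the image becomes separate small modes of the high-order tensor, so the TT ranks of the augmented tensor can stay low even when the original image, viewed as a third-order tensor, has a high matrix rank in its unfoldings.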
Journal introduction:
Manuscripts submitted to Numerical Linear Algebra with Applications should include large-scale broad-interest applications in which challenging computational results are integral to the approach investigated and analysed. Manuscripts that, in the Editor’s view, do not satisfy these conditions will not be accepted for review.
Numerical Linear Algebra with Applications receives submissions in areas that address developing, analysing and applying linear algebra algorithms for solving problems arising in multilinear (tensor) algebra, in statistics, such as Markov Chains, as well as in deterministic and stochastic modelling of large-scale networks, algorithm development, performance analysis or related computational aspects.
Topics covered include: Standard and Generalized Conjugate Gradients, Multigrid and Other Iterative Methods; Preconditioning Methods; Direct Solution Methods; Numerical Methods for Eigenproblems; Newton-like Methods for Nonlinear Equations; Parallel and Vectorizable Algorithms in Numerical Linear Algebra; Application of Methods of Numerical Linear Algebra in Science, Engineering and Economics.