Low-rank tensor completion via tensor tri-factorization and sparse transformation

IF 3.4 · CAS Zone 2 (Engineering & Technology) · JCR Q2 · ENGINEERING, ELECTRICAL & ELECTRONIC
Fanyin Yang, Bing Zheng, Ruijuan Zhao
Signal Processing, Volume 233, Article 109935. Published 2025-02-14.
DOI: 10.1016/j.sigpro.2025.109935
URL: https://www.sciencedirect.com/science/article/pii/S0165168425000507
Citations: 0

Abstract

Low-rank tensor factorization techniques have gained significant attention in low-rank tensor completion (LRTC) tasks due to their ability to reduce computational costs while maintaining the tensor’s low-rank structure. However, existing methods often overlook the significance of tensor singular values and the sparsity of the tensor’s third-mode fibers in the transformation domain, leading to an incomplete capture of both the low-rank structure and the inherent sparsity, which limits recovery accuracy. To address these issues, we propose a novel tensor tri-factorization logarithmic norm (TTF-LN) that more effectively captures the low-rank structure by emphasizing the significance of tensor singular values. Building on this, we introduce the tensor tri-factorization with sparse transformation (TTF-ST) model for LRTC, which integrates both low-rank and sparse priors to improve the accuracy of incomplete tensor recovery. The TTF-ST model incorporates a sparse transformation that represents the tensor as the product of a low-dimensional sparse representation tensor and a compact orthogonal matrix, which extracts sparsity while reducing computational complexity. To solve the proposed model, we design an optimization algorithm based on the alternating direction method of multipliers (ADMM) and provide a rigorous theoretical analysis. Extensive experiments demonstrate that the proposed method outperforms state-of-the-art methods in both recovery accuracy and computational efficiency.
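The two priors described above — a logarithmic surrogate on tensor singular values, and a sparse transformation that writes the tensor as a low-dimensional representation tensor times a compact orthogonal matrix — can be sketched concretely. The NumPy sketch below is an illustration only, not the paper's TTF-LN/TTF-ST definitions (those are in the full text): the choice of mode-3 unfolding, the SVD-based construction of the orthogonal matrix `Q`, and the t-SVD-style log-sum surrogate are assumptions made for the example.

```python
import numpy as np

def mode3_unfold(X):
    """Mode-3 unfolding: an n1 x n2 x n3 tensor becomes an n3 x (n1*n2)
    matrix whose columns are the tensor's third-mode fibers."""
    n1, n2, n3 = X.shape
    return X.transpose(2, 0, 1).reshape(n3, n1 * n2)

def mode3_fold(M, n1, n2):
    """Inverse of mode3_unfold: a k x (n1*n2) matrix -> n1 x n2 x k tensor."""
    k = M.shape[0]
    return M.reshape(k, n1, n2).transpose(1, 2, 0)

def sparse_transform(X, k):
    """Represent X as Z x_3 Q, where Q (n3 x k) has orthonormal columns
    taken from the left singular vectors of the mode-3 unfolding, and Z is
    the k-slice representation tensor (sparse for transform-sparse data)."""
    X3 = mode3_unfold(X)
    U, _, _ = np.linalg.svd(X3, full_matrices=False)
    Q = U[:, :k]                       # compact orthogonal matrix
    Z = mode3_fold(Q.T @ X3, *X.shape[:2])
    return Z, Q

def reconstruct(Z, Q):
    """X_hat = Z x_3 Q (mode-3 tensor-matrix product)."""
    n1, n2, _ = Z.shape
    return mode3_fold(Q @ mode3_unfold(Z), n1, n2)

def log_singular_surrogate(X, eps=1e-3):
    """Log-sum surrogate of t-SVD singular values, a common nonconvex
    relaxation of tubal rank: FFT along mode 3, then sum log(sigma + eps)
    over the singular values of every frontal slice in the Fourier domain."""
    Xf = np.fft.fft(X, axis=2)
    total = 0.0
    for i in range(X.shape[2]):
        s = np.linalg.svd(Xf[:, :, i], compute_uv=False)
        total += np.sum(np.log(s + eps))
    return total / X.shape[2]
```

A tensor whose mode-3 fibers lie in a k-dimensional subspace is represented exactly by the k slices of `Z`, which is the source of both the computational savings and the transform-domain sparsity the abstract refers to; the log-sum term penalizes large singular values far less aggressively than the nuclear norm, which is the usual motivation for log-type low-rank surrogates.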
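The solver is ADMM-based. The paper's exact variable splitting is only given in the full text, but the template such solvers follow can be shown on the simplest related problem: nuclear-norm matrix completion. Everything below (the splitting, the penalty `rho`, the function names) is a generic textbook sketch, not the TTF-ST algorithm.

```python
import numpy as np

def svt(Y, tau):
    """Singular-value thresholding: the proximal operator of tau*||.||_*."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def admm_complete(M, mask, rho=1.0, iters=200):
    """ADMM for  min ||Z||_*  s.t.  Z = X,  X agrees with M on observed
    entries (mask == 1). X-update projects onto the observation constraint,
    Z-update is SVT, Lam is the scaled dual variable."""
    X = mask * M
    Z = X.copy()
    Lam = np.zeros_like(M)
    for _ in range(iters):
        # X-update: keep observed entries, fill missing ones from Z - Lam
        X = mask * M + (1 - mask) * (Z - Lam)
        # Z-update: prox of the nuclear norm at X + Lam
        Z = svt(X + Lam, 1.0 / rho)
        # dual ascent on the consensus constraint X = Z
        Lam += X - Z
    return Z
```

The TTF-ST model replaces the nuclear norm with the TTF-LN surrogate and operates on tensors rather than matrices, so its subproblems differ, but the alternating structure — one proximal step per block of variables followed by a dual update — is the same.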
Source journal: Signal Processing (Engineering & Technology — Engineering: Electrical & Electronic)
CiteScore: 9.20
Self-citation rate: 9.10%
Articles published: 309
Review time: 41 days
Journal description: Signal Processing incorporates all aspects of the theory and practice of signal processing. It features original research work, tutorial and review articles, and accounts of practical developments. It is intended for a rapid dissemination of knowledge and experience to engineers and scientists working in the research, development or practical application of signal processing. Subject areas covered by the journal include: Signal Theory; Stochastic Processes; Detection and Estimation; Spectral Analysis; Filtering; Signal Processing Systems; Software Developments; Image Processing; Pattern Recognition; Optical Signal Processing; Digital Signal Processing; Multi-dimensional Signal Processing; Communication Signal Processing; Biomedical Signal Processing; Geophysical and Astrophysical Signal Processing; Earth Resources Signal Processing; Acoustic and Vibration Signal Processing; Data Processing; Remote Sensing; Signal Processing Technology; Radar Signal Processing; Sonar Signal Processing; Industrial Applications; New Applications.