A high-order tensor completion algorithm based on Fully-Connected Tensor Network weighted optimization

Pei-Yi Yang, Yonghui Huang, Yuning Qiu, Weijun Sun, Guoxu Zhou
{"title":"A high-order tensor completion algorithm based on Fully-Connected Tensor Network weighted optimization","authors":"Pei-Yi Yang, Yonghui Huang, Yuning Qiu, Weijun Sun, Guoxu Zhou","doi":"10.48550/arXiv.2204.01732","DOIUrl":null,"url":null,"abstract":". Tensor completion aimes at recovering missing data, and it is one of the popular concerns in deep learning and signal processing. Among the higher-order tensor decomposition algorithms, the recently proposed fully-connected tensor network decomposition (FCTN) algorithm is the most advanced. In this paper, by leveraging the superior expression of the fully-connected tensor network (FCTN) decomposition, we propose a new tensor completion method named the fully connected tensor network weighted optization(FCTN-WOPT). The algorithm per-forms a composition of the completed tensor by initialising the factors from the FCTN decomposition. We build a loss function with the weight tensor, the completed tensor and the incomplete tensor together, and then update the completed tensor using the lbfgs gradient descent algorithm to reduce the spatial memory occupation and speed up iterations. 
Finally we test the completion with synthetic data and real data (both image data and video data) and the results show the advanced performance of our FCTN-WOPT when it is applied to higher-order tensor completion.","PeriodicalId":420492,"journal":{"name":"Chinese Conference on Pattern Recognition and Computer Vision","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-04-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Chinese Conference on Pattern Recognition and Computer Vision","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.48550/arXiv.2204.01732","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 3

Abstract

Tensor completion aims at recovering missing data and is a popular concern in deep learning and signal processing. Among higher-order tensor decomposition algorithms, the recently proposed fully-connected tensor network (FCTN) decomposition is the most advanced. In this paper, by leveraging the superior expressive power of the FCTN decomposition, we propose a new tensor completion method named fully-connected tensor network weighted optimization (FCTN-WOPT). The algorithm composes the completed tensor by initializing the factors from the FCTN decomposition. We build a loss function from the weight tensor, the completed tensor, and the incomplete tensor together, and then update the completed tensor using the L-BFGS gradient-based algorithm to reduce memory occupation and speed up iterations. Finally, we test the completion on synthetic data and real data (both image and video data), and the results show the advanced performance of FCTN-WOPT when applied to higher-order tensor completion.
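The weighted-optimization scheme the abstract describes can be sketched in a few lines: a binary weight tensor masks the loss to the observed entries, and L-BFGS updates the factor parameters. The sketch below is a simplified illustration under stated assumptions: it uses a rank-R CP-style factorization of a third-order tensor in place of the full FCTN decomposition (whose factor contractions are more involved), and all variable names are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

# Simplified sketch of weighted tensor completion (NOT the authors' exact
# FCTN-WOPT): a CP-style factorization stands in for the FCTN factors, but
# the weighted loss and the L-BFGS update mirror the scheme in the abstract.
rng = np.random.default_rng(0)
shape, R = (6, 7, 8), 3

# Ground-truth low-rank tensor, binary weight tensor W, incomplete tensor T.
A, B, C = (rng.standard_normal((n, R)) for n in shape)
X_true = np.einsum('ir,jr,kr->ijk', A, B, C)
W = (rng.random(shape) < 0.6).astype(float)   # ~60% of entries observed
T = W * X_true                                # missing entries are zeroed

def unpack(z):
    """Split the flat parameter vector into the three factor matrices."""
    a = z[:shape[0] * R].reshape(shape[0], R)
    b = z[shape[0] * R:(shape[0] + shape[1]) * R].reshape(shape[1], R)
    c = z[(shape[0] + shape[1]) * R:].reshape(shape[2], R)
    return a, b, c

def loss_and_grad(z):
    """Weighted loss 0.5 * ||W * (X(z) - T)||_F^2 and its gradient."""
    a, b, c = unpack(z)
    X = np.einsum('ir,jr,kr->ijk', a, b, c)
    E = W * (X - T)                  # residual restricted to observed entries
    f = 0.5 * np.sum(E ** 2)
    ga = np.einsum('ijk,jr,kr->ir', E, b, c)
    gb = np.einsum('ijk,ir,kr->jr', E, a, c)
    gc = np.einsum('ijk,ir,jr->kr', E, a, b)
    return f, np.concatenate([ga.ravel(), gb.ravel(), gc.ravel()])

# Initialize factors, then let L-BFGS drive the weighted loss down.
z0 = 0.1 * rng.standard_normal(sum(shape) * R)
res = minimize(loss_and_grad, z0, jac=True, method='L-BFGS-B',
               options={'maxiter': 500})
X_hat = np.einsum('ir,jr,kr->ijk', *unpack(res.x))
```

The key design point is that the weight tensor confines the fit to observed entries, so the factorization's low-rank structure fills in the missing ones; L-BFGS only needs the flat parameter vector and its gradient, which keeps memory usage proportional to the factor sizes rather than the full tensor.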