Gradient Descent for Symmetric Tensor Decomposition

Jiong Cai, Haixia Liu, Yang Wang
{"title":"Gradient Descent for Symmetric Tensor Decomposition","authors":"Jiong Cai, Haixia Liu, Yang Wang","doi":"10.4208/aam.oa-2021-0090","DOIUrl":null,"url":null,"abstract":". Symmetric tensor decomposition is of great importance in applications. Several studies have employed a greedy approach, where the main idea is to (cid:12)rst (cid:12)nd a best rank-one approximation of a given tensor, and then repeat the process to the residual tensor by subtracting the rank-one component. In this paper, we focus on (cid:12)nding a best rank-one approximation of a given orthogonally order-3 symmetric tensor. We give a geometric landscape analysis of a nonconvex optimization for the best rank-one approximation of orthogonally symmetric tensors. We show that any local minimizer must be a factor in this orthogonally symmetric tensor decomposition, and any other critical points are linear combinations of the factors. Then, we propose a gradient descent algorithm with a carefully designed initialization to solve this nonconvex optimization problem, and we prove that the algorithm converges to the global minimum with high probability for orthogonal decomposable tensors. This result, combined with the landscape analysis, reveals that the greedy algorithm will get the tensor CP low-rank decomposition. Numerical results are provided to verify our theoretical results.","PeriodicalId":58853,"journal":{"name":"应用数学年刊:英文版","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2022-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"应用数学年刊:英文版","FirstCategoryId":"1089","ListUrlMain":"https://doi.org/10.4208/aam.oa-2021-0090","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1

Abstract

Symmetric tensor decomposition is of great importance in applications. Several studies have employed a greedy approach, whose main idea is to first find a best rank-one approximation of a given tensor and then repeat the process on the residual tensor obtained by subtracting the rank-one component. In this paper, we focus on finding a best rank-one approximation of a given orthogonally decomposable order-3 symmetric tensor. We give a geometric landscape analysis of a nonconvex optimization problem for the best rank-one approximation of orthogonally symmetric tensors. We show that any local minimizer must be a factor in this orthogonally symmetric tensor decomposition, and that any other critical point is a linear combination of the factors. We then propose a gradient descent algorithm with a carefully designed initialization to solve this nonconvex optimization problem, and we prove that the algorithm converges to the global minimum with high probability for orthogonally decomposable tensors. Combined with the landscape analysis, this result shows that the greedy algorithm recovers the tensor CP low-rank decomposition. Numerical results are provided to verify our theoretical results.
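To make the greedy scheme concrete, the following Python sketch implements one plausible reading of it: a best rank-one approximation of a symmetric order-3 tensor T is computed by projected gradient ascent of f(x) = T(x, x, x) over the unit sphere, and the resulting rank-one component is subtracted from the residual before repeating. This is an illustrative sketch only, not the authors' exact method; in particular it uses a random starting point rather than the paper's carefully designed initialization, and the function names (best_rank_one, greedy_decomposition) are hypothetical.

    import numpy as np

    def sym_rank_one(lam, x):
        # Rank-one symmetric tensor lam * (x tensor x tensor x).
        return lam * np.einsum('i,j,k->ijk', x, x, x)

    def best_rank_one(T, n_iters=500, step=0.05, seed=0):
        # Projected gradient ascent on the unit sphere for
        # f(x) = T(x, x, x); for symmetric T, grad f(x) = 3 T(I, x, x).
        # NOTE: random initialization here, unlike the paper's scheme.
        rng = np.random.default_rng(seed)
        x = rng.standard_normal(T.shape[0])
        x /= np.linalg.norm(x)
        for _ in range(n_iters):
            g = 3.0 * np.einsum('ijk,j,k->i', T, x, x)
            x = x + step * g
            x /= np.linalg.norm(x)                    # retract to the sphere
        lam = np.einsum('ijk,i,j,k->', T, x, x, x)    # best scale is T(x, x, x)
        return lam, x

    def greedy_decomposition(T, rank, **kw):
        # Greedy deflation: extract a best rank-one component,
        # subtract it from the residual, and repeat.
        residual = T.copy()
        factors = []
        for r in range(rank):
            lam, x = best_rank_one(residual, seed=r, **kw)
            factors.append((lam, x))
            residual = residual - sym_rank_one(lam, x)
        return factors

On an orthogonally decomposable input, e.g. T = 3·q1⊗q1⊗q1 + 2·q2⊗q2⊗q2 with orthonormal q1, q2, the recovered factors should match the qi up to ordering, which is the behavior predicted by the landscape analysis described in the abstract.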