{"title":"Randomized Tensor Wheel Decomposition","authors":"Mengyu Wang, Yajie Yu, Hanyu Li","doi":"10.1137/23m1583934","DOIUrl":null,"url":null,"abstract":"SIAM Journal on Scientific Computing, Volume 46, Issue 3, Page A1714-A1746, June 2024. <br/> Abstract. Tensor wheel (TW) decomposition is an elegant compromise of the popular tensor ring decomposition and fully connected tensor network decomposition, and it has many applications. In this work, we investigate the computation of this decomposition. Three randomized algorithms based on random sampling or random projection are proposed. Specifically, by defining a new tensor product called the subwheel product, the structures of the coefficient matrices of the alternating least squares subproblems from the minimization problem of TW decomposition are first figured out. Then, using the structures and the properties of the subwheel product, a random sampling algorithm based on leverage sampling and two random projection algorithms respectively based on Kronecker subsampled randomized Fourier transform and TensorSketch are derived. These algorithms can implement the sampling and projection on TW factors and hence can avoid forming the full coefficient matrices of subproblems. We present the complexity analysis and numerical performance on synthetic data, real data, and image reconstruction for our algorithms. Experimental results show that, compared with the deterministic algorithm in the literature, they need much less computing time while achieving similar accuracy and reconstruction effect. We also apply the proposed algorithms to tensor completion and find that the sampling-based algorithm always has excellent performance and the projection-based algorithms behave well when the sampling rate is higher than 50%.","PeriodicalId":3,"journal":{"name":"ACS Applied Electronic Materials","volume":null,"pages":null},"PeriodicalIF":4.3000,"publicationDate":"2024-05-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"ACS Applied Electronic Materials","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1137/23m1583934","RegionNum":3,"RegionCategory":"材料科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Abstract
Tensor wheel (TW) decomposition is an elegant compromise between the popular tensor ring decomposition and the fully connected tensor network decomposition, and it has many applications. In this work, we investigate the computation of this decomposition. Three randomized algorithms based on random sampling or random projection are proposed. Specifically, by defining a new tensor product called the subwheel product, we first work out the structures of the coefficient matrices of the alternating least squares subproblems arising from the minimization problem of TW decomposition. Then, using these structures and the properties of the subwheel product, we derive a random sampling algorithm based on leverage sampling and two random projection algorithms based, respectively, on the Kronecker subsampled randomized Fourier transform and TensorSketch. These algorithms perform the sampling and projection directly on the TW factors and hence avoid forming the full coefficient matrices of the subproblems. We present a complexity analysis and report numerical performance on synthetic data, real data, and image reconstruction. Experimental results show that, compared with the deterministic algorithm in the literature, our algorithms need much less computing time while achieving similar accuracy and reconstruction quality. We also apply the proposed algorithms to tensor completion and find that the sampling-based algorithm always performs excellently, while the projection-based algorithms behave well when the sampling rate is higher than 50%.
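To make the sampling idea concrete, here is a minimal, generic NumPy sketch of leverage-score row sampling for an overdetermined least squares solve. It is an illustration under stated assumptions only: the function name leverage_sample_lstsq, the dense coefficient matrix A, and the explicit QR-based leverage scores are invented for exposition; the paper's algorithms instead obtain the sampled rows directly from the TW factors via the subwheel product, so the full coefficient matrix is never formed.

```python
import numpy as np

def leverage_sample_lstsq(A, B, num_samples, rng=None):
    """Approximately solve min_X ||A X - B||_F by leverage-score row sampling.

    Illustrative only: here the leverage scores come from a thin QR of the
    explicit matrix A; the structure-exploiting algorithms in the paper
    never form A and sample rows via the TW factors instead.
    """
    rng = np.random.default_rng() if rng is None else rng
    m, _ = A.shape
    Q, _ = np.linalg.qr(A)                        # thin QR factorization
    lev = np.sum(Q**2, axis=1)                    # row leverage scores
    p = lev / lev.sum()                           # sampling probabilities
    idx = rng.choice(m, size=num_samples, replace=True, p=p)
    scale = 1.0 / np.sqrt(num_samples * p[idx])   # importance-sampling rescaling
    X, *_ = np.linalg.lstsq(A[idx] * scale[:, None],
                            B[idx] * scale[:, None], rcond=None)
    return X

# Toy usage on a random overdetermined system.
rng = np.random.default_rng(0)
A = rng.standard_normal((5000, 40))
B = rng.standard_normal((5000, 3))
X_sampled = leverage_sample_lstsq(A, B, num_samples=400, rng=rng)
X_exact, *_ = np.linalg.lstsq(A, B, rcond=None)
print(np.linalg.norm(X_sampled - X_exact) / np.linalg.norm(X_exact))
```

The projection-based variants likewise rely on sketches that can be applied factor by factor. Below is a hedged sketch of the generic TensorSketch construction for the Khatri-Rao (column-wise Kronecker) product of two factor matrices; the helper names and the two-factor setting are assumptions made for illustration, whereas the paper applies an analogous FFT-based construction to the subwheel product of TW factors.

```python
import numpy as np

def countsketch(A, h, s, m):
    """CountSketch: add the i-th row of A, times sign s[i], into bucket h[i]."""
    C = np.zeros((m, A.shape[1]))
    np.add.at(C, h, s[:, None] * A)
    return C

def tensorsketch_khatri_rao(A1, A2, m, rng=None):
    """TensorSketch of the Khatri-Rao product of A1 and A2 (column j is the
    Kronecker product of the j-th columns), computed from sketches of the
    factors without forming the n1*n2-row product."""
    rng = np.random.default_rng() if rng is None else rng
    ffts = []
    for A in (A1, A2):
        n = A.shape[0]
        h = rng.integers(0, m, size=n)          # independent hash per factor
        s = rng.choice([-1.0, 1.0], size=n)     # independent signs per factor
        ffts.append(np.fft.fft(countsketch(A, h, s, m), axis=0))
    # The combined hash (h1(i1) + h2(i2)) mod m induces a circular convolution
    # of the two CountSketches, which the FFT turns into an elementwise product.
    return np.real(np.fft.ifft(ffts[0] * ffts[1], axis=0))
```

In a sketched alternating least squares step, such a factor-wise sketch would replace the full coefficient matrix in the subproblem, which is what makes the projection-based algorithms cheap relative to the deterministic baseline.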