{"title":"投影张量-张量积用于最优多路数据表示的有效计算","authors":"Katherine Keegan , Elizabeth Newman","doi":"10.1016/j.laa.2025.09.018","DOIUrl":null,"url":null,"abstract":"<div><div>Tensor decompositions have become essential tools for feature extraction and compression of multiway data. Recent advances in tensor operators have enabled desirable properties of standard matrix algebra to be retained for multilinear factorizations. Behind this matrix-mimetic tensor operation is an invertible matrix whose size depends quadratically on certain dimensions of the data. As a result, for large-scale multiway data, the invertible matrix can be computationally demanding to apply and invert and can lead to inefficient tensor representations in terms of construction and storage costs. In this work, we propose a new projected tensor-tensor product that relaxes the invertibility restriction to reduce computational overhead and still preserves fundamental linear algebraic properties. The transformation behind the projected product is a tall-and-skinny matrix with unitary columns, which depends only linearly on certain dimensions of the data, thereby reducing computational complexity by an order of magnitude. We provide extensive theory to prove the matrix mimeticity and the optimality of compressed representations within the projected product framework. We further prove that projected-product-based approximations outperform a comparable, non-matrix-mimetic tensor factorization. We support the theoretical findings and demonstrate the practical benefits of projected products through numerical experiments on video, hyperspectral imaging, synthetic, and dynamical systems data. All code for this paper is available at <span><span>https://github.com/elizabethnewman/projected-products.git</span><svg><path></path></svg></span>.</div></div>","PeriodicalId":18043,"journal":{"name":"Linear Algebra and its Applications","volume":"729 ","pages":"Pages 100-147"},"PeriodicalIF":1.1000,"publicationDate":"2025-09-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Projected tensor-tensor products for efficient computation of optimal multiway data representations\",\"authors\":\"Katherine Keegan , Elizabeth Newman\",\"doi\":\"10.1016/j.laa.2025.09.018\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Tensor decompositions have become essential tools for feature extraction and compression of multiway data. Recent advances in tensor operators have enabled desirable properties of standard matrix algebra to be retained for multilinear factorizations. Behind this matrix-mimetic tensor operation is an invertible matrix whose size depends quadratically on certain dimensions of the data. As a result, for large-scale multiway data, the invertible matrix can be computationally demanding to apply and invert and can lead to inefficient tensor representations in terms of construction and storage costs. In this work, we propose a new projected tensor-tensor product that relaxes the invertibility restriction to reduce computational overhead and still preserves fundamental linear algebraic properties. The transformation behind the projected product is a tall-and-skinny matrix with unitary columns, which depends only linearly on certain dimensions of the data, thereby reducing computational complexity by an order of magnitude. 
We provide extensive theory to prove the matrix mimeticity and the optimality of compressed representations within the projected product framework. We further prove that projected-product-based approximations outperform a comparable, non-matrix-mimetic tensor factorization. We support the theoretical findings and demonstrate the practical benefits of projected products through numerical experiments on video, hyperspectral imaging, synthetic, and dynamical systems data. All code for this paper is available at <span><span>https://github.com/elizabethnewman/projected-products.git</span><svg><path></path></svg></span>.</div></div>\",\"PeriodicalId\":18043,\"journal\":{\"name\":\"Linear Algebra and its Applications\",\"volume\":\"729 \",\"pages\":\"Pages 100-147\"},\"PeriodicalIF\":1.1000,\"publicationDate\":\"2025-09-30\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Linear Algebra and its Applications\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S002437952500391X\",\"RegionNum\":3,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"MATHEMATICS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Linear Algebra and its Applications","FirstCategoryId":"100","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S002437952500391X","RegionNum":3,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MATHEMATICS","Score":null,"Total":0}
Projected tensor-tensor products for efficient computation of optimal multiway data representations
Abstract: Tensor decompositions have become essential tools for feature extraction and compression of multiway data. Recent advances in tensor operators have enabled desirable properties of standard matrix algebra to be retained for multilinear factorizations. Behind this matrix-mimetic tensor operation is an invertible matrix whose size depends quadratically on certain dimensions of the data. As a result, for large-scale multiway data, the invertible matrix can be computationally demanding to apply and invert, and it can lead to tensor representations that are inefficient to construct and store. In this work, we propose a new projected tensor-tensor product that relaxes the invertibility restriction to reduce computational overhead while still preserving fundamental linear algebraic properties. The transformation behind the projected product is a tall-and-skinny matrix with orthonormal columns, which depends only linearly on certain dimensions of the data, thereby reducing computational complexity by an order of magnitude. We provide extensive theory proving the matrix mimeticity and the optimality of compressed representations within the projected product framework. We further prove that projected-product-based approximations outperform a comparable, non-matrix-mimetic tensor factorization. We support the theoretical findings and demonstrate the practical benefits of projected products through numerical experiments on video, hyperspectral imaging, synthetic, and dynamical systems data. All code for this paper is available at https://github.com/elizabethnewman/projected-products.git.
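To make the construction concrete, below is a minimal NumPy sketch of a projected tensor-tensor product along the lines the abstract describes: a tall-and-skinny matrix Q with orthonormal columns compresses the tubes along the third mode, frontal slices are multiplied in the transform domain, and the result is lifted back with Q. The function name, the choice of mode-3 transform, and the placement of conjugates are illustrative assumptions, not the authors' reference implementation (see the linked repository for that).

```python
import numpy as np

def projected_product(A, B, Q):
    """Sketch of a projected tensor-tensor product (an assumed reading of the abstract).

    A : (n1, n2, n3) tensor, B : (n2, m, n3) tensor,
    Q : (n3, k) matrix with orthonormal columns (Q^H Q = I_k, k <= n3).

    Mode-3 tubes are compressed by Q^H, frontal slices are multiplied in the
    transform domain, and the result is lifted back with Q, so the product
    implicitly applies the orthogonal projection Q Q^H along the third mode.
    """
    A_hat = A @ Q.conj()                             # A x_3 Q^H: shape (n1, n2, k)
    B_hat = B @ Q.conj()                             # B x_3 Q^H: shape (n2, m, k)
    C_hat = np.einsum('ijt,jlt->ilt', A_hat, B_hat)  # facewise slice products
    return C_hat @ Q.T                               # lift back: C_hat x_3 Q, shape (n1, m, n3)

# Usage: a random example with n3 = 64 tubes compressed to k = 8 transform slices.
rng = np.random.default_rng(0)
n1, n2, m, n3, k = 10, 7, 5, 64, 8
A = rng.standard_normal((n1, n2, n3))
B = rng.standard_normal((n2, m, n3))
Q, _ = np.linalg.qr(rng.standard_normal((n3, k)))    # orthonormal columns via thin QR
C = projected_product(A, B, Q)
print(C.shape)                                       # (10, 5, 64)
```

Storing and applying Q costs O(n3 k) rather than the O(n3^2) of a square invertible transform, which is the linear-versus-quadratic dependence on the data's dimensions that the abstract refers to.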
About the journal:
Linear Algebra and its Applications publishes articles that contribute new information or new insights to matrix theory and finite dimensional linear algebra in their algebraic, arithmetic, combinatorial, geometric, or numerical aspects. It also publishes articles that give significant applications of matrix theory or linear algebra to other branches of mathematics and to other sciences. Articles that provide new information or perspectives on the historical development of matrix theory and linear algebra are also welcome. Expository articles that can serve as an introduction to a subject for workers in related areas, and that bring readers to the frontiers of research, are encouraged. Reviews of books are published occasionally, as are conference reports that provide a historical record of major meetings on matrix theory and linear algebra.