{"title":"多层稀疏矩阵分解的结构化支持探索","authors":"Quoc-Tung Le, R. Gribonval","doi":"10.1109/ICASSP39728.2021.9414238","DOIUrl":null,"url":null,"abstract":"Matrix factorization with sparsity constraints plays an important role in many machine learning and signal processing problems such as dictionary learning, data visualization, dimension reduction. Among the most popular tools for sparse matrix factorization are proximal algorithms, a family of algorithms based on proximal operators. In this paper, we address two problems with the application of proximal algorithms to sparse matrix factorization. On the one hand, we analyze a weakness of proximal algorithms in sparse matrix factorization: the premature convergence of the support. A remedy is also proposed to address this problem. On the other hand, we describe a new tractable proximal operator called Generalized Hungarian Method, associated to so-called k-regular matrices, which are useful for the factorization of a class of matrices associated to fast linear transforms. We further illustrate the effectiveness of our proposals by numerical experiments on the Hadamard Transform and magnetoencephalography matrix factorization.","PeriodicalId":347060,"journal":{"name":"ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-06-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"7","resultStr":"{\"title\":\"Structured Support Exploration for Multilayer Sparse Matrix Factorization\",\"authors\":\"Quoc-Tung Le, R. Gribonval\",\"doi\":\"10.1109/ICASSP39728.2021.9414238\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Matrix factorization with sparsity constraints plays an important role in many machine learning and signal processing problems such as dictionary learning, data visualization, dimension reduction. Among the most popular tools for sparse matrix factorization are proximal algorithms, a family of algorithms based on proximal operators. In this paper, we address two problems with the application of proximal algorithms to sparse matrix factorization. On the one hand, we analyze a weakness of proximal algorithms in sparse matrix factorization: the premature convergence of the support. A remedy is also proposed to address this problem. On the other hand, we describe a new tractable proximal operator called Generalized Hungarian Method, associated to so-called k-regular matrices, which are useful for the factorization of a class of matrices associated to fast linear transforms. 
We further illustrate the effectiveness of our proposals by numerical experiments on the Hadamard Transform and magnetoencephalography matrix factorization.\",\"PeriodicalId\":347060,\"journal\":{\"name\":\"ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-06-06\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"7\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICASSP39728.2021.9414238\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICASSP39728.2021.9414238","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Matrix factorization with sparsity constraints plays an important role in many machine learning and signal processing problems, such as dictionary learning, data visualization, and dimension reduction. Among the most popular tools for sparse matrix factorization are proximal algorithms, a family of algorithms based on proximal operators. In this paper, we address two problems that arise when applying proximal algorithms to sparse matrix factorization. On the one hand, we analyze a weakness of proximal algorithms in sparse matrix factorization, namely the premature convergence of the support, and we propose a remedy for it. On the other hand, we describe a new tractable proximal operator, called the Generalized Hungarian Method, associated with so-called k-regular matrices, which are useful for the factorization of a class of matrices associated with fast linear transforms. We further illustrate the effectiveness of our proposals with numerical experiments on the Hadamard Transform and on magnetoencephalography matrix factorization.
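To make the setting concrete, the following is a minimal NumPy sketch of the kind of proximal-gradient iteration the abstract refers to: alternating gradient steps on two factors of the Hadamard matrix, each followed by a hard-thresholding proximal step that fixes the number of nonzeros per column or per row. This is only an illustration of the baseline setting in which support freezing can occur; it is not the paper's remedy or the Generalized Hungarian Method, and the sparsity level, step sizes, and iteration count are illustrative assumptions.

```python
# Minimal sketch (not the paper's algorithm) of a proximal-gradient style
# iteration for a two-factor sparse factorization A ~ X @ Y of the Hadamard
# matrix. Each proximal step keeps the k largest-magnitude entries per column
# of X and per row of Y. All parameter values below are illustrative.
import numpy as np


def hadamard(n):
    """Sylvester construction of the n x n Hadamard matrix (n a power of 2)."""
    H = np.array([[1.0]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H


def keep_k_per_column(M, k):
    """Proximal step for a per-column k-sparsity constraint (hard thresholding)."""
    P = np.zeros_like(M)
    rows = np.argsort(-np.abs(M), axis=0)[:k]   # indices of top-k entries per column
    cols = np.arange(M.shape[1])
    P[rows, cols] = M[rows, cols]
    return P


def sparse_factorization(A, k=2, iters=500, seed=0):
    """Alternate gradient steps on X and Y with hard-thresholding projections."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    X = rng.standard_normal((n, n))
    Y = rng.standard_normal((n, n))
    for _ in range(iters):
        # Gradient of 0.5 * ||A - X Y||_F^2 w.r.t. X, step size 1 / ||Y||_2^2.
        grad_X = (X @ Y - A) @ Y.T
        X = keep_k_per_column(X - grad_X / (np.linalg.norm(Y, 2) ** 2 + 1e-12), k)
        # Gradient w.r.t. Y, then project each row of Y onto k nonzeros.
        grad_Y = X.T @ (X @ Y - A)
        Y_step = Y - grad_Y / (np.linalg.norm(X, 2) ** 2 + 1e-12)
        Y = keep_k_per_column(Y_step.T, k).T
    return X, Y


A = hadamard(8)
X, Y = sparse_factorization(A)
print("relative error:", np.linalg.norm(A - X @ Y) / np.linalg.norm(A))
```

Because the hard-thresholding step re-selects the support at every iteration from the current iterate, the selected nonzero pattern can stop changing long before the factors are accurate; this is the premature support convergence that the paper analyzes and mitigates.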