{"title":"Tuning-Free Online Robust Principal Component Analysis Through Implicit Regularization","authors":"Lakshmi Jayalal;Gokularam Muthukrishnan;Sheetal Kalyani","doi":"10.1109/LSP.2025.3599784","DOIUrl":null,"url":null,"abstract":"The performance of Online Robust Principal Component Analysis (OR-PCA) technique heavily depends on the optimum tuning of the explicit regularizers. This tuning is dataset-sensitive and often impractical to optimize in real-world scenarios. We aim to remove the dependency on these tuning parameters by using implicit regularization. To this end, we develop an approach that integrates implicit regularization properties of various gradient descent methods to estimate sparse outliers and low-dimensional representations in a streaming setting—a non-trivial extension of existing techniques. A key novelty lies in the design of a new parameterization for matrix estimation in OR-PCA. Our method incorporates three different versions of modified gradient descent that separate but naturally encourage sparsity and low-rank structures in the data. Experimental results on synthetic and real-world video datasets demonstrate that the proposed method, namely, Tuning-Free OR-PCA (TF-ORPCA), outperforms existing OR-PCA methods. TF-ORPCA makes it more scalable for large datasets.","PeriodicalId":13154,"journal":{"name":"IEEE Signal Processing Letters","volume":"32 ","pages":"3360-3364"},"PeriodicalIF":3.9000,"publicationDate":"2025-08-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Signal Processing Letters","FirstCategoryId":"5","ListUrlMain":"https://ieeexplore.ieee.org/document/11127046/","RegionNum":2,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Citations: 0
Abstract
The performance of the Online Robust Principal Component Analysis (OR-PCA) technique heavily depends on optimal tuning of its explicit regularizers. This tuning is dataset-sensitive and often impractical to optimize in real-world scenarios. We aim to remove the dependency on these tuning parameters by using implicit regularization. To this end, we develop an approach that integrates the implicit regularization properties of various gradient descent methods to estimate sparse outliers and low-dimensional representations in a streaming setting, a non-trivial extension of existing techniques. A key novelty lies in the design of a new parameterization for matrix estimation in OR-PCA. Our method incorporates three different versions of modified gradient descent that separately, yet naturally, encourage sparsity and low-rank structure in the data. Experimental results on synthetic and real-world video datasets demonstrate that the proposed method, namely Tuning-Free OR-PCA (TF-ORPCA), outperforms existing OR-PCA methods, and its tuning-free nature makes it more scalable to large datasets.
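For intuition, the sketch below illustrates the kind of implicit regularization the abstract alludes to: instead of an explicit l1 penalty, the outlier vector is over-parameterized as a difference of Hadamard (element-wise) squares, so plain gradient descent from a small initialization is biased toward sparse solutions. This is a minimal, hypothetical single-sample example assuming a known subspace basis U; the function name, parameters, and update schedule are illustrative and are not the authors' TF-ORPCA algorithm.

```python
import numpy as np

def robust_decompose_sample(x, U, lr=0.01, n_iter=500, alpha=1e-3):
    """Decompose one streaming sample x ~= U @ r + s into a subspace
    coefficient r and a sparse outlier s using plain gradient descent.
    Sparsity in s is encouraged implicitly by the over-parameterization
    s = g*g - h*h with a small initialization, not by an explicit penalty."""
    d, k = x.shape[0], U.shape[1]
    r = np.zeros(k)
    g = alpha * np.ones(d)   # small initialization drives the sparsity bias
    h = alpha * np.ones(d)
    for _ in range(n_iter):
        s = g * g - h * h
        residual = U @ r + s - x           # gradient of 0.5*||U r + s - x||^2
        r -= lr * (U.T @ residual)
        g -= lr * (2.0 * g * residual)     # chain rule through s = g*g - h*h
        h -= lr * (-2.0 * h * residual)
    return r, g * g - h * h

# Toy usage: a 50-dim sample near a rank-3 subspace with 2 gross corruptions.
rng = np.random.default_rng(0)
U_true = np.linalg.qr(rng.standard_normal((50, 3)))[0]
x = U_true @ rng.standard_normal(3)
x[[5, 17]] += 10.0                         # sparse outliers
r_hat, s_hat = robust_decompose_sample(x, U_true)
print(np.argsort(-np.abs(s_hat))[:2])      # indices of the detected outliers
```

In the same spirit, a factored parameterization of the low-rank component (e.g., L = A @ B.T with small, balanced initialization) is the usual way gradient descent is biased toward low rank without a nuclear-norm penalty; the paper's three modified gradient descent variants presumably combine such parameterizations in the streaming OR-PCA setting.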
Journal Introduction:
The IEEE Signal Processing Letters is a monthly, archival publication designed to provide rapid dissemination of original, cutting-edge ideas and timely, significant contributions in signal, image, speech, language and audio processing. Papers published in the Letters can be presented within one year of their appearance at signal processing conferences such as ICASSP, GlobalSIP and ICIP, and also at several workshops organized by the Signal Processing Society.