{"title":"基于极大似然估计的低秩矩阵补全","authors":"Jinhui Chen, Jian Yang","doi":"10.1109/ACPR.2013.120","DOIUrl":null,"url":null,"abstract":"Low-rank matrix completion has recently emerged in computational data analysis. The problem aims to recover a low-rank representation from the contaminated data. The errors in data are assumed to be sparse, which is generally characterized by minimizing the L1-norm of the residual. This actually assumes that the residual follows the Laplacian distribution. The Laplacian assumption, however, may not be accurate enough to describe various noises in real scenarios. In this paper, we estimate the error in data with robust regression. Assuming the noises are respectively independent and identically distributed, the minimization of noise is equivalent to find the maximum likelihood estimation (MLE) solution for the residuals. We also design an iteratively reweight inexact augmented Lagrange multiplier algorithm to solve the optimization. Experimental results confirm the efficiency of our proposed approach under different conditions.","PeriodicalId":365633,"journal":{"name":"2013 2nd IAPR Asian Conference on Pattern Recognition","volume":"30 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2013-11-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"Low-Rank Matrix Completion Based on Maximum Likelihood Estimation\",\"authors\":\"Jinhui Chen, Jian Yang\",\"doi\":\"10.1109/ACPR.2013.120\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Low-rank matrix completion has recently emerged in computational data analysis. The problem aims to recover a low-rank representation from the contaminated data. The errors in data are assumed to be sparse, which is generally characterized by minimizing the L1-norm of the residual. This actually assumes that the residual follows the Laplacian distribution. The Laplacian assumption, however, may not be accurate enough to describe various noises in real scenarios. In this paper, we estimate the error in data with robust regression. Assuming the noises are respectively independent and identically distributed, the minimization of noise is equivalent to find the maximum likelihood estimation (MLE) solution for the residuals. We also design an iteratively reweight inexact augmented Lagrange multiplier algorithm to solve the optimization. 
Experimental results confirm the efficiency of our proposed approach under different conditions.\",\"PeriodicalId\":365633,\"journal\":{\"name\":\"2013 2nd IAPR Asian Conference on Pattern Recognition\",\"volume\":\"30 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2013-11-05\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2013 2nd IAPR Asian Conference on Pattern Recognition\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ACPR.2013.120\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2013 2nd IAPR Asian Conference on Pattern Recognition","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ACPR.2013.120","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Low-Rank Matrix Completion Based on Maximum Likelihood Estimation
Low-rank matrix completion has recently emerged as an important problem in computational data analysis. The goal is to recover a low-rank representation from contaminated data. The errors in the data are assumed to be sparse, a property usually enforced by minimizing the L1-norm of the residual. This implicitly assumes that the residual follows a Laplacian distribution. The Laplacian assumption, however, may not be accurate enough to describe the various kinds of noise that arise in real scenarios. In this paper, we estimate the error in the data with robust regression. Assuming the noise entries are independent and identically distributed, minimizing the noise is equivalent to finding the maximum likelihood estimation (MLE) solution for the residuals. We also design an iteratively reweighted inexact augmented Lagrange multiplier algorithm to solve the resulting optimization problem. Experimental results confirm the efficiency of the proposed approach under different conditions.
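The abstract outlines a general recipe: replace the plain L1 penalty on the residual with a weighted penalty whose weights are re-estimated from an MLE-style noise model, and solve the result with an inexact augmented Lagrange multiplier (ALM) scheme. The NumPy sketch below illustrates that structure only; it is not the authors' exact algorithm. The parameter names (lam, mu, rho) and the simple 1/(|e| + eps) reweighting rule are illustrative assumptions standing in for the paper's MLE-derived weights.

```python
import numpy as np

def svd_shrink(X, tau):
    """Singular value thresholding: proximal operator of the nuclear norm."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def soft_shrink(X, tau):
    """Entry-wise soft thresholding (tau may be a matrix of per-entry thresholds)."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def reweighted_ialm(D, lam=None, mu=None, rho=1.5, n_iter=200, eps=1e-7, tol=1e-7):
    """Sketch of an iteratively reweighted inexact ALM for robust low-rank recovery.

    Model: D = A + E, with A low rank and E the residual/noise.
    The L1 penalty on E is replaced by a weighted L1 penalty whose per-entry
    weights are re-estimated from the current residual (an IRLS-style surrogate
    for a maximum-likelihood noise model). The weight rule used here is a
    stand-in assumption, not the weighting derived in the paper.
    """
    m, n = D.shape
    if lam is None:
        lam = 1.0 / np.sqrt(max(m, n))
    if mu is None:
        mu = 1.25 / np.linalg.norm(D, 2)   # spectral norm
    Y = np.zeros_like(D)                   # Lagrange multipliers
    E = np.zeros_like(D)
    W = np.ones_like(D)                    # uniform weights = plain L1 penalty
    d_norm = np.linalg.norm(D, 'fro')
    for _ in range(n_iter):
        # A-step: singular value thresholding on the current residual.
        A = svd_shrink(D - E + Y / mu, 1.0 / mu)
        # E-step: weighted soft thresholding; the weights encode the noise model.
        E = soft_shrink(D - A + Y / mu, lam * W / mu)
        # Reweighting: entries with large residuals get small weights (treated
        # as outliers), entries with small residuals get weights near one.
        W = 1.0 / (np.abs(E) + eps)
        W /= W.mean()                      # keep lam on its usual scale
        # Dual update and penalty increase.
        R = D - A - E
        Y = Y + mu * R
        mu = rho * mu
        if np.linalg.norm(R, 'fro') / d_norm < tol:
            break
    return A, E
```

With W held at all ones, the E-step reduces to ordinary soft thresholding, i.e. the standard L1/Laplacian case described in the abstract; the reweighting step is what allows other noise models to enter through the MLE view.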