{"title":"Deep Learning Approach of Sparse Autoencoders with Lp/L2 Regularization","authors":"Ziheng Wu, Cong Li, Baigen Pan","doi":"10.1145/3424978.3425011","DOIUrl":null,"url":null,"abstract":"In this paper, we put forward a novel deep learning approach with Lp / L2 regularization based on sparse autoencoders network, trained by a nonnegativity constraint algorithm. Since L2 norm regularization penalizes the negative weights with smaller magnitudes much weaker than those with bigger magnitudes, lots of the weights could take small negative values. In order to address this issue, non-Lipschitz nonconvex LP norm (0<p<1) regularization which could force most of the negative weights to become non-negative is introduced, and the combination of LP/L2 norm regularization is applied for nonnegativity constraint. The proposed approach is analyzed for accuracy on the MNIST dataset for image classification, the experimental results have indicated that using both LP and L2 regularizations could induce non-negativity of weights and have promising performance.","PeriodicalId":178822,"journal":{"name":"Proceedings of the 4th International Conference on Computer Science and Application Engineering","volume":"7 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-10-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 4th International Conference on Computer Science and Application Engineering","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3424978.3425011","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
In this paper, we put forward a novel deep learning approach with Lp/L2 regularization based on a sparse autoencoder network, trained by a nonnegativity-constraint algorithm. Since L2-norm regularization penalizes negative weights of small magnitude much more weakly than those of large magnitude, many of the weights can take small negative values. To address this issue, we introduce a non-Lipschitz, nonconvex Lp-norm (0 < p < 1) regularization, which forces most of the negative weights to become non-negative, and apply the combination of Lp/L2 regularization as the nonnegativity constraint. The proposed approach is evaluated for classification accuracy on the MNIST image dataset; the experimental results indicate that using both Lp and L2 regularization induces non-negativity of the weights and yields promising performance.
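The abstract does not include code, but the combined penalty it describes can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the function name lp_l2_nonnegativity_penalty, the weighting coefficients alpha and beta, the exponent p = 0.5, the eps smoothing constant, the choice of PyTorch, and the toy autoencoder architecture are all assumptions; the only elements taken from the abstract are that an Lp term (0 < p < 1) and an L2 term jointly penalize negative weights to push them toward non-negativity.

```python
# A hypothetical sketch of the Lp/L2 nonnegativity penalty from the abstract.
# Assumptions (not from the paper): hyperparameter values, eps smoothing,
# applying the penalty per Linear layer, and the tiny MNIST-sized autoencoder.
import torch

def lp_l2_nonnegativity_penalty(weights, p=0.5, alpha=1e-3, beta=1e-3, eps=1e-8):
    """Penalize only the negative entries of a weight tensor.

    The nonconvex Lp term (0 < p < 1) penalizes small negative weights
    nearly as strongly as large ones, pushing them toward zero, while
    the L2 term dominates for large-magnitude negative weights.
    """
    neg = torch.clamp(weights, max=0.0)        # zero out non-negative entries
    lp_term = (neg.abs() + eps).pow(p).sum()   # eps keeps the gradient finite near 0
                                               # (adds only a small constant for zero entries)
    l2_term = neg.pow(2).sum()
    return alpha * lp_term + beta * l2_term

# Usage example: add the penalty for every weight matrix of a small
# autoencoder on MNIST-sized inputs (28*28 = 784 pixels).
ae = torch.nn.Sequential(
    torch.nn.Linear(784, 196), torch.nn.Sigmoid(),   # encoder
    torch.nn.Linear(196, 784), torch.nn.Sigmoid(),   # decoder
)
x = torch.rand(32, 784)                              # dummy mini-batch
recon_loss = torch.nn.functional.mse_loss(ae(x), x)
reg = sum(lp_l2_nonnegativity_penalty(m.weight)
          for m in ae.modules() if isinstance(m, torch.nn.Linear))
loss = recon_loss + reg
loss.backward()
```

One design note on the sketch: because |w|^p with 0 < p < 1 has an unbounded derivative as w approaches 0 from below (the norm is non-Lipschitz there), the eps offset is added so gradient descent remains numerically stable while still exerting strong pressure on small negative weights.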