{"title":"Classification efficiency of LassoNet model in image recognition","authors":"Xingkai Wen, Zhiji Yang","doi":"10.1109/AEMCSE55572.2022.00083","DOIUrl":null,"url":null,"abstract":"LassoNet is a neural network framework proposed by Robert Tibshirani et al. and published in the \"Journal of Machine Learning Research\" in 2021. The model generalizes the existing Lasso regression and its feature sparsity to a feedforward neural network, and performs feature selection and parameter learning at the same time under the premise of unknown optimal number of selected features. In order to verify whether the classification efficiency of LassoNet is efficient, LassoNet is first compared with four shallow learning methods (logistic regression, Fisher linear discriminant, random forest and support vector machine) and three deep learning methods (CNN, Inception and Residual Module), respectively. For the classification of high-dimensional and large-sample datasets in five different fields, the experimental results show that LassoNet has a significant classification effect, which is significantly better than the general shallow learning method, and is comparable to the deep learning method. It can be seen that LassoNet has strong versatility and It is easy to use, but it takes a lot of time to run. In the follow-up work, the feedforward neural network can be optimized or replaced to further improve the classification efficiency.","PeriodicalId":309096,"journal":{"name":"2022 5th International Conference on Advanced Electronic Materials, Computers and Software Engineering (AEMCSE)","volume":"75 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-04-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 5th International Conference on Advanced Electronic Materials, Computers and Software Engineering (AEMCSE)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/AEMCSE55572.2022.00083","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
LassoNet is a neural network framework proposed by Robert Tibshirani et al. and published in the "Journal of Machine Learning Research" in 2021. The model generalizes Lasso regression and its feature sparsity to a feedforward neural network, performing feature selection and parameter learning simultaneously without requiring the optimal number of selected features to be known in advance. To assess the classification efficiency of LassoNet, it is first compared with four shallow learning methods (logistic regression, Fisher linear discriminant, random forest, and support vector machine) and three deep learning methods (CNN, Inception, and Residual Module). On high-dimensional, large-sample classification datasets from five different fields, the experimental results show that LassoNet achieves strong classification performance: it is significantly better than the general shallow learning methods and comparable to the deep learning methods. LassoNet is therefore versatile and easy to use, but its running time is considerable. In follow-up work, the feedforward neural network could be optimized or replaced to further improve the classification efficiency.
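To make the architecture described above concrete, the sketch below illustrates the core LassoNet idea in PyTorch: a linear skip connection (the Lasso-like part) added to a small feedforward network, with an L1 penalty on the skip weights to induce feature-level sparsity. This is an illustrative simplification, not the authors' implementation; the class name, hidden size, and penalty weight are placeholders, and the actual LassoNet additionally enforces the hierarchy constraint ||W_j||_inf <= M * |theta_j| via a proximal ("hier-prox") update so that a feature is fully dropped from the network once its skip weight is zeroed.

```python
# Minimal sketch of the LassoNet idea (simplified: plain L1 penalty on the
# skip weights instead of the full hierarchical proximal update).
import torch
import torch.nn as nn

class LassoNetSketch(nn.Module):
    def __init__(self, n_features: int, n_classes: int, hidden: int = 32):
        super().__init__()
        self.skip = nn.Linear(n_features, n_classes, bias=False)  # theta (linear part)
        self.net = nn.Sequential(                                  # nonlinear correction
            nn.Linear(n_features, hidden),
            nn.ReLU(),
            nn.Linear(hidden, n_classes),
        )

    def forward(self, x):
        # Prediction = linear (Lasso-like) skip + feedforward network output.
        return self.skip(x) + self.net(x)

def training_step(model, x, y, optimizer, lam=1e-3):
    optimizer.zero_grad()
    loss = nn.functional.cross_entropy(model(x), y)
    # L1 penalty on the skip weights drives per-feature sparsity.
    loss = loss + lam * model.skip.weight.abs().sum()
    loss.backward()
    optimizer.step()
    return loss.item()

# Toy usage on random data (10 features, 3 classes).
model = LassoNetSketch(n_features=10, n_classes=3)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
x = torch.randn(64, 10)
y = torch.randint(0, 3, (64,))
for _ in range(100):
    training_step(model, x, y, opt)
# Features whose skip-weight column shrinks toward zero are candidates for removal.
print("per-feature skip weight magnitude:", model.skip.weight.detach().abs().sum(dim=0))
```

In the full method, the penalty weight lambda is swept over a path, yielding a sequence of models with progressively fewer selected features, which is how LassoNet avoids fixing the number of selected features in advance.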