{"title":"从标记特征学习时文本分类中的L1与L2正则化","authors":"Sinziana Mazilu, J. Iria","doi":"10.1109/ICMLA.2011.85","DOIUrl":null,"url":null,"abstract":"In this paper we study the problem of building document classifiers using labeled features and unlabeled documents, where not all the features are helpful for the process of learning. This is an important setting, since building classifiers using labeled words has been recently shown to require considerably less human labeling effort than building classifiers using labeled documents. We propose the use of Generalized Expectation (GE) criteria combined with a L1 regularization term for learning from labeled features. This lets the feature labels guide model expectation constraints, while approaching feature selection from a regularization perspective. We show that GE criteria combined with L1 regularization consistently outperforms -- up to 12% increase in accuracy -- the best previously reported results in the literature under the same setting, obtained using L2 regularization. Furthermore, the results obtained with GE criteria and L1 regularizer are competitive to those obtained in the traditional instance-labeling setting, with the same labeling cost.","PeriodicalId":439926,"journal":{"name":"2011 10th International Conference on Machine Learning and Applications and Workshops","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2011-12-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"9","resultStr":"{\"title\":\"L1 vs. L2 Regularization in Text Classification when Learning from Labeled Features\",\"authors\":\"Sinziana Mazilu, J. Iria\",\"doi\":\"10.1109/ICMLA.2011.85\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In this paper we study the problem of building document classifiers using labeled features and unlabeled documents, where not all the features are helpful for the process of learning. This is an important setting, since building classifiers using labeled words has been recently shown to require considerably less human labeling effort than building classifiers using labeled documents. We propose the use of Generalized Expectation (GE) criteria combined with a L1 regularization term for learning from labeled features. This lets the feature labels guide model expectation constraints, while approaching feature selection from a regularization perspective. We show that GE criteria combined with L1 regularization consistently outperforms -- up to 12% increase in accuracy -- the best previously reported results in the literature under the same setting, obtained using L2 regularization. 
Furthermore, the results obtained with GE criteria and L1 regularizer are competitive to those obtained in the traditional instance-labeling setting, with the same labeling cost.\",\"PeriodicalId\":439926,\"journal\":{\"name\":\"2011 10th International Conference on Machine Learning and Applications and Workshops\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2011-12-18\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"9\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2011 10th International Conference on Machine Learning and Applications and Workshops\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICMLA.2011.85\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2011 10th International Conference on Machine Learning and Applications and Workshops","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICMLA.2011.85","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
L1 vs. L2 Regularization in Text Classification when Learning from Labeled Features
In this paper we study the problem of building document classifiers from labeled features and unlabeled documents, where not all of the features are helpful for learning. This is an important setting, since building classifiers from labeled words has recently been shown to require considerably less human labeling effort than building classifiers from labeled documents. We propose the use of Generalized Expectation (GE) criteria combined with an L1 regularization term for learning from labeled features. This lets the feature labels guide the model's expectation constraints, while approaching feature selection from a regularization perspective. We show that GE criteria combined with L1 regularization consistently outperform the best previously reported results in the literature under the same setting, obtained using L2 regularization, with accuracy gains of up to 12%. Furthermore, the results obtained with GE criteria and an L1 regularizer are competitive with those obtained in the traditional instance-labeling setting at the same labeling cost.
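The objective sketched below is not taken from the paper itself; it is a minimal illustration of the setup the abstract describes, following the standard GE formulation for labeled features and assuming an L1 penalty with a hypothetical regularization weight \(\lambda\). For each labeled feature \(k\), a reference label distribution \(\hat{p}_k\) (derived from the human-provided feature labels) is compared against the model's predicted label distribution over the unlabeled documents in which that feature occurs, while the L1 term encourages sparse weights:

\[
\mathcal{O}(\theta) \;=\; \sum_{k \in K} \mathrm{KL}\!\left( \hat{p}_k \,\middle\|\, \tilde{p}_\theta(y \mid x_k > 0) \right) \;+\; \lambda \sum_{j} \lvert \theta_j \rvert
\]

Replacing the L1 term with a Gaussian-prior (L2) penalty \(\frac{1}{2\sigma^2} \sum_j \theta_j^2\) recovers the baseline setting the paper compares against; the L1 penalty instead drives the weights of uninformative features to exactly zero, which is how the regularization-based view of feature selection mentioned in the abstract enters the objective.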