Overfitting of boosting and regularized Boosting algorithms
T. Onoda
Electronics and Communications in Japan (Part III: Fundamental Electronic Science), 90(9): 69–78, September 2007. DOI: 10.1002/ecjc.20344
Citations: 3
The impressive generalization capacity of AdaBoost has been explained using the concept of a margin, introduced in the context of support vector machines. However, this ability to generalize is limited to cases where the data do not contain labeling errors or significant amounts of noise. In addition, the research of Schapire and colleagues has provided theoretical support for these results from the perspective of margin maximization. In this paper we propose a set of new algorithms, AdaBoostReg, ν-Arc, and ν-Boost, that attempt to avoid the overfitting that can occur with AdaBoost by introducing a regularization term into the objective function minimized by AdaBoost. © 2007 Wiley Periodicals, Inc. Electron Comm Jpn Pt 3, 90(9): 69–78, 2007; Published online in Wiley InterScience (www.interscience.wiley.com). DOI 10.1002/ecjc.20344
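For context, the baseline the paper modifies can be summarized as follows. Standard AdaBoost greedily minimizes the exponential loss, reweighting examples by `exp(-α · y · h(x))` each round; this drives training margins up without bound, which is precisely what lets mislabeled or noisy examples dominate the weight distribution and cause the overfitting the paper addresses. The sketch below is a minimal, self-contained AdaBoost with decision stumps, not the paper's AdaBoostReg, ν-Arc, or ν-Boost algorithms (their regularized objectives are defined in the full text); names such as `train_adaboost` and `best_stump` are illustrative only.

```python
import numpy as np

def best_stump(X, y, w):
    """Exhaustive search for the weighted-error-minimizing decision stump
    (feature index j, threshold t, polarity s)."""
    best, best_err = None, np.inf
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for s in (1, -1):
                pred = s * np.where(X[:, j] <= t, 1, -1)
                err = w[pred != y].sum()          # weighted training error
                if err < best_err:
                    best_err, best = err, (j, t, s)
    return best, best_err

def stump_predict(stump, X):
    j, t, s = stump
    return s * np.where(X[:, j] <= t, 1, -1)

def train_adaboost(X, y, n_rounds=20):
    """Plain AdaBoost on the exponential loss (labels in {-1, +1})."""
    n = len(y)
    w = np.full(n, 1.0 / n)                      # uniform example weights
    stumps, alphas = [], []
    for _ in range(n_rounds):
        stump, err = best_stump(X, y, w)
        err = max(err, 1e-12)                    # guard against log(0)
        if err >= 0.5:                           # no better than chance: stop
            break
        alpha = 0.5 * np.log((1 - err) / err)    # hypothesis weight
        pred = stump_predict(stump, X)
        w *= np.exp(-alpha * y * pred)           # upweight misclassified points
        w /= w.sum()
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def predict(stumps, alphas, X):
    f = sum(a * stump_predict(st, X) for st, a in zip(stumps, alphas))
    return np.sign(f)
```

Note the reweighting line `w *= np.exp(-alpha * y * pred)`: on noisy data, a mislabeled point is misclassified round after round and its weight grows geometrically, so later stumps are fit almost entirely to noise. The regularized variants proposed in the paper temper exactly this mechanism by penalizing the objective that this update implicitly minimizes.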