{"title":"A Multi-class Incremental and Decremental SVM Approach Using Adaptive Directed Acyclic Graphs","authors":"H. Gâlmeanu, Răzvan Andonie","doi":"10.1109/ICAIS.2009.27","DOIUrl":null,"url":null,"abstract":"Multi-class approaches for SVMs are based on composition of binary SVM classifiers. Due to the numerous binary classifiers to be considered, for large training sets, this approach is known to be time expensive. In our approach, we improve time efficiency using concurrently two strategies: incremental training and reduction of trained binary SVMs. We present the exact migration conditions for the binary SVMs during their incremental training. We rewrite these conditions for the case when the regularization parameter is optimized. The obtained results are applied to a multi-class incremental / decremental SVM based on the Adaptive Directed Acyclic Graph. The regularization parameter is optimized on-line, and not by retraining the SVM with all input samples for each value of the regularization parameter.","PeriodicalId":161840,"journal":{"name":"2009 International Conference on Adaptive and Intelligent Systems","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2009-09-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2009 International Conference on Adaptive and Intelligent Systems","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICAIS.2009.27","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 6
Abstract
Multi-class approaches for SVMs are built by composing binary SVM classifiers. Because many binary classifiers must be trained, this approach is known to be time-consuming for large training sets. In our approach, we improve time efficiency by combining two strategies: incremental training and reduction of the trained binary SVMs. We present the exact migration conditions for the binary SVMs during incremental training, and we rewrite these conditions for the case in which the regularization parameter is optimized. The obtained results are applied to a multi-class incremental/decremental SVM based on the Adaptive Directed Acyclic Graph. The regularization parameter is optimized on-line, rather than by retraining the SVM on all input samples for each value of the parameter.
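To illustrate the multi-class composition the abstract refers to, below is a minimal sketch (not the authors' implementation) of prediction with an Adaptive Directed Acyclic Graph built from pairwise binary SVMs. The assumptions are ours: the binary classifiers are plain scikit-learn `SVC` models rather than the paper's incremental/decremental SVMs, and the "adaptive" layer is realized as a knockout tournament over the remaining candidate classes, where each binary classifier eliminates one candidate per round.

```python
# Sketch of ADAG-style multi-class prediction over pairwise binary SVMs.
# Assumption: one binary SVC per class pair; the paper's incremental training
# and on-line regularization-parameter tuning are not reproduced here.

from itertools import combinations
import numpy as np
from sklearn.svm import SVC


def train_pairwise_svms(X, y, C=1.0, kernel="rbf"):
    """Train one binary SVM per unordered pair of class labels."""
    classifiers = {}
    for a, b in combinations(sorted(set(int(c) for c in y)), 2):
        mask = (y == a) | (y == b)
        clf = SVC(C=C, kernel=kernel)
        clf.fit(X[mask], y[mask])
        classifiers[(a, b)] = clf
    return classifiers


def adag_predict(x, classes, classifiers):
    """Classify one sample by pairing off candidate classes round by round;
    each binary SVM decides which of its two classes survives."""
    candidates = list(classes)
    while len(candidates) > 1:
        survivors = []
        # Pair first vs. last, second vs. second-to-last, and so on.
        i, j = 0, len(candidates) - 1
        while i < j:
            a, b = candidates[i], candidates[j]
            key = (a, b) if (a, b) in classifiers else (b, a)
            survivors.append(int(classifiers[key].predict(x.reshape(1, -1))[0]))
            i += 1
            j -= 1
        if i == j:  # odd number of candidates: the middle one advances unopposed
            survivors.append(candidates[i])
        candidates = survivors
    return candidates[0]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Tiny synthetic 3-class problem: Gaussian blobs around distinct centers.
    centers = np.array([[0.0, 0.0], [3.0, 0.0], [0.0, 3.0]])
    X = np.vstack([c + rng.normal(scale=0.5, size=(40, 2)) for c in centers])
    y = np.repeat([0, 1, 2], 40)

    clfs = train_pairwise_svms(X, y)
    print(adag_predict(np.array([2.8, 0.1]), [0, 1, 2], clfs))  # expected: 1
```

For K classes this evaluates at most K-1 binary classifiers per sample, which is the efficiency argument for DAG-based composition; the paper's additional savings come from training those binary SVMs incrementally and adjusting the regularization parameter without full retraining.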