{"title":"Incremental / decremental SVM for function approximation","authors":"H. Gâlmeanu, Răzvan Andonie","doi":"10.1109/OPTIM.2008.4602473","DOIUrl":null,"url":null,"abstract":"Training a support vector regression (SVR) resumes to the process of migrating the vectors in and out of the support set along with modifying the associated thresholds. This paper gives a complete overview of all the boundary conditions implied by vector migration through the process. The process is similar to that of training a SVM, though the process of incrementing / decrementing of vectors into / out of the solution does not coincide with the increase / decrease of the associated threshold. The analysis shows the details of incremental and decremental procedures used to train the SVR. Vectors with duplicate contribution are also considered. The migration of vectors among sets on decreasing the regularization parameter C is particularly given attention. Eventually, experimental data show the possibility of modifying this parameter on a large scale, varying it from complete training (overfitting) to a calibrated value, to tune up the approximation performance of the regression.","PeriodicalId":244464,"journal":{"name":"2008 11th International Conference on Optimization of Electrical and Electronic Equipment","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2008-05-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"8","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2008 11th International Conference on Optimization of Electrical and Electronic Equipment","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/OPTIM.2008.4602473","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 8
Abstract
Training a support vector regression (SVR) reduces to a process of migrating vectors into and out of the support set while adjusting the associated thresholds. This paper gives a complete overview of the boundary conditions implied by vector migration during this process. The procedure is similar to training an SVM for classification, although incrementing/decrementing vectors into/out of the solution does not coincide with increasing/decreasing the associated threshold. The analysis details the incremental and decremental procedures used to train the SVR. Vectors with duplicate contributions are also considered. Particular attention is given to the migration of vectors among sets when the regularization parameter C is decreased. Finally, experimental data show that this parameter can be varied over a wide range, from complete training (overfitting) down to a calibrated value, to tune the approximation performance of the regression.
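The set migration the abstract describes can be illustrated with a minimal sketch, assuming the standard epsilon-SVR KKT partitioning (this is not the authors' code; the function names and tolerance are illustrative assumptions). In epsilon-SVR each training vector i carries a coefficient theta_i = alpha_i - alpha_i* bounded by -C <= theta_i <= C, and the KKT conditions place the vector in one of three sets, which training updates incrementally:

```python
# A hedged sketch of the three-set partitioning underlying incremental /
# decremental SVR training (assumed standard epsilon-SVR KKT conditions):
#   R (remaining): theta_i = 0        -> vector lies strictly inside the tube
#   S (support):   0 < |theta_i| < C  -> vector lies exactly on the tube edge
#   E (error):     |theta_i| = C      -> vector lies outside the tube
# Decreasing C shrinks the box bound, so any vector whose coefficient would
# exceed the new C must have it clipped and may migrate between sets -- the
# case the paper gives particular attention to.

def classify_vector(theta, C, tol=1e-9):
    """Assign a vector to 'R', 'S', or 'E' from its coefficient theta."""
    if abs(theta) <= tol:            # theta == 0
        return 'R'
    if abs(abs(theta) - C) <= tol:   # |theta| == C (at the box bound)
        return 'E'
    return 'S'                       # 0 < |theta| < C

def partition(thetas, C, tol=1e-9):
    """Group vector indices into the R / S / E sets by coefficient value."""
    sets = {'R': [], 'S': [], 'E': []}
    for i, theta in enumerate(thetas):
        sets[classify_vector(theta, C, tol)].append(i)
    return sets
```

For example, with C = 10, coefficients [0.0, 10.0, -4.0] place vector 0 in R, vector 1 in E, and vector 2 in S; lowering C below 4 would force vector 2's coefficient to be clipped to the new bound, triggering the kind of migration among sets that the incremental/decremental procedure tracks.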