M. Lázaro, I. Santamaría, F. Pérez-Cruz, Antonio Artés-Rodríguez
{"title":"支持向量机同时逼近一个函数和它的导数","authors":"M. Lázaro, I. Santamaría, F. Pérez-Cruz, Antonio Artés-Rodríguez","doi":"10.1109/NNSP.2003.1318018","DOIUrl":null,"url":null,"abstract":"In this paper, the problem of simultaneously approximating a function and its derivative is formulated within the support vector machine (SVM) framework. The problem has been solved by using the /spl epsiv/-insensitive loss function and introducing new linear constraints in the approximation of the derivative. The resulting quadratic problem can be solved by quadratic programming (QP) techniques. Moreover, a computationally efficient iterative re-weighted least square (IRWLS) procedure has been derived to solve the problem in large data sets. The performance of the method has been compared with the conventional SVM for regression, providing outstanding results.","PeriodicalId":315958,"journal":{"name":"2003 IEEE XIII Workshop on Neural Networks for Signal Processing (IEEE Cat. No.03TH8718)","volume":"3 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2003-09-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":"{\"title\":\"Support vector machine for the simultaneous approximation of a function and its derivative\",\"authors\":\"M. Lázaro, I. Santamaría, F. Pérez-Cruz, Antonio Artés-Rodríguez\",\"doi\":\"10.1109/NNSP.2003.1318018\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In this paper, the problem of simultaneously approximating a function and its derivative is formulated within the support vector machine (SVM) framework. The problem has been solved by using the /spl epsiv/-insensitive loss function and introducing new linear constraints in the approximation of the derivative. The resulting quadratic problem can be solved by quadratic programming (QP) techniques. 
Moreover, a computationally efficient iterative re-weighted least square (IRWLS) procedure has been derived to solve the problem in large data sets. The performance of the method has been compared with the conventional SVM for regression, providing outstanding results.\",\"PeriodicalId\":315958,\"journal\":{\"name\":\"2003 IEEE XIII Workshop on Neural Networks for Signal Processing (IEEE Cat. No.03TH8718)\",\"volume\":\"3 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2003-09-17\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"3\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2003 IEEE XIII Workshop on Neural Networks for Signal Processing (IEEE Cat. No.03TH8718)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/NNSP.2003.1318018\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2003 IEEE XIII Workshop on Neural Networks for Signal Processing (IEEE Cat. No.03TH8718)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/NNSP.2003.1318018","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Support vector machine for the simultaneous approximation of a function and its derivative
In this paper, the problem of simultaneously approximating a function and its derivative is formulated within the support vector machine (SVM) framework. The problem is solved by using the ε-insensitive loss function and introducing new linear constraints in the approximation of the derivative. The resulting quadratic problem can be solved by quadratic programming (QP) techniques. Moreover, a computationally efficient iteratively reweighted least squares (IRWLS) procedure has been derived to solve the problem on large data sets. The performance of the method has been compared with the conventional SVM for regression, providing outstanding results.