Regularized Least Squares Potential SVRs
A. K. Deb, Reshma Khemchandani, Suresh Chandra
2006 Annual IEEE India Conference, September 2006. DOI: 10.1109/INDCON.2006.302859
Abstract: In this paper, we propose a regularized least squares approach to potential SVRs. The proposed solution involves inverting a single matrix of small dimension; in the case of linear SVRs, the size of this matrix is independent of the number of data samples. Results on benchmark data sets demonstrate the computational advantages of the proposal. A recent publication has highlighted that the margin in support vector machines (SVMs) is not scale invariant, which implies that an appropriate scaling can affect the generalization performance of an SVM-based regressor. Potential SVMs address this issue and suggest a new approach to regression.
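The paper's Potential SVR formulation is not reproduced in this record, but the computational claim above — a single small linear solve whose size does not grow with the number of samples in the linear case — can be illustrated with a generic regularized least squares fit. The sketch below is plain ridge regression on a linear model, chosen only as an assumption-level stand-in: for n samples in d dimensions, the normal equations involve one (d+1) x (d+1) matrix, independent of n.

```python
import numpy as np

def rls_linear_fit(X, y, lam=1.0):
    """Regularized least squares fit of a linear model y ~ X @ w + b.

    Illustrative only: this is ordinary ridge regression, not the
    Potential SVR of Deb et al. It mirrors one property the abstract
    claims: for a linear model, the solve involves a single
    (d+1) x (d+1) matrix whose size is independent of the number of
    samples n.
    """
    n, d = X.shape
    Xb = np.hstack([X, np.ones((n, 1))])   # append a bias column
    A = Xb.T @ Xb + lam * np.eye(d + 1)    # (d+1) x (d+1), independent of n
    wb = np.linalg.solve(A, Xb.T @ y)      # one small linear solve
    return wb[:-1], wb[-1]                 # weights, bias

# Usage on synthetic data: the solve stays 4x4 whether n is 500 or 5 million.
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + 0.3 + 0.01 * rng.standard_normal(500)
w, b = rls_linear_fit(X, y, lam=0.1)
```

For a kernelized (nonlinear) SVR the corresponding matrix would instead scale with the number of samples, which is why the abstract singles out the linear case as the one with a sample-count-independent solve.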