Incremental Learning for Transductive SVMs
Boliang Sun, Yan Li, Xingchen Hu, Guangquan Cheng, Chao Chen, Zhong Liu
2019 IEEE 14th International Conference on Intelligent Systems and Knowledge Engineering (ISKE), November 2019. DOI: 10.1109/ISKE47853.2019.9170294
{"title":"转换支持向量机的增量学习","authors":"Boliang Sun, Yan Li, Xingchen Hu, Guangquan Cheng, Chao Chen, Zhong Liu","doi":"10.1109/ISKE47853.2019.9170294","DOIUrl":null,"url":null,"abstract":"A new incremental transductive SVMs framework dependeding on duality is put forward for constrained optimization issues. Based on the weak duality theorem, the procedure of incremental learning is simplified for the task dual function increment. Based on this, two incremental learning methods are developed by updating limited dual parameters: (1) aggressive dual ascending; (2) local concave-convex procedure (LCCCP). Experiments demonstrated that our methods achieve comparable risk and accuracy to batch TSVMs, with less time consumption and memory requirment. Besides, our incremental learning methods can cope with concept drift and maintain smaller error rate than batch learning methods. The design and analysis of incremental semi-supervised learning methods is fully discussed in this research.","PeriodicalId":399084,"journal":{"name":"2019 IEEE 14th International Conference on Intelligent Systems and Knowledge Engineering (ISKE)","volume":"17 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Incremental Learning for Transductive SVMs\",\"authors\":\"Boliang Sun, Yan Li, Xingchen Hu, Guangquan Cheng, Chao Chen, Zhong Liu\",\"doi\":\"10.1109/ISKE47853.2019.9170294\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"A new incremental transductive SVMs framework dependeding on duality is put forward for constrained optimization issues. Based on the weak duality theorem, the procedure of incremental learning is simplified for the task dual function increment. Based on this, two incremental learning methods are developed by updating limited dual parameters: (1) aggressive dual ascending; (2) local concave-convex procedure (LCCCP). Experiments demonstrated that our methods achieve comparable risk and accuracy to batch TSVMs, with less time consumption and memory requirment. Besides, our incremental learning methods can cope with concept drift and maintain smaller error rate than batch learning methods. 
The design and analysis of incremental semi-supervised learning methods is fully discussed in this research.\",\"PeriodicalId\":399084,\"journal\":{\"name\":\"2019 IEEE 14th International Conference on Intelligent Systems and Knowledge Engineering (ISKE)\",\"volume\":\"17 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2019-11-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2019 IEEE 14th International Conference on Intelligent Systems and Knowledge Engineering (ISKE)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ISKE47853.2019.9170294\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 IEEE 14th International Conference on Intelligent Systems and Knowledge Engineering (ISKE)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISKE47853.2019.9170294","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract: A new incremental transductive SVM (TSVM) framework based on duality is proposed for the underlying constrained optimization problem. By the weak duality theorem, incremental learning is reduced to the simpler task of incrementing the dual objective function. On this basis, two incremental learning methods are developed that update only a limited set of dual parameters: (1) aggressive dual ascent and (2) the local concave-convex procedure (LCCCP). Experiments demonstrate that our methods achieve risk and accuracy comparable to batch TSVMs while consuming less time and memory. Moreover, our incremental methods can cope with concept drift and maintain a lower error rate than batch learning methods. The design and analysis of incremental semi-supervised learning methods are discussed in full.
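The abstract does not spell out the update rules, but the weak-duality argument it relies on can be made concrete: for a primal minimum p* and dual maximum d*, weak duality gives D(α) ≤ d* ≤ p* for every dual-feasible α, so any step that increases the dual objective is certified progress toward the primal optimum. Below is a minimal, hypothetical Python sketch of one such step for the standard supervised SVM dual, in the spirit of "updating limited dual parameters": a newly arrived point enters with its dual variable at zero (leaving the objective unchanged) and receives a single closed-form coordinate-ascent update. The class and all names are illustrative assumptions, not the paper's implementation; the transductive part (pseudo-labelling unlabeled points, which makes the objective non-convex and motivates the LCCCP) is omitted.

    import numpy as np

    def rbf_kernel(x, z, gamma=1.0):
        # Gaussian RBF kernel between two feature vectors.
        return np.exp(-gamma * np.sum((x - z) ** 2))

    class IncrementalDualSVM:
        """Sketch of an incremental dual-ascent step for a kernel SVM.

        The bias term is omitted for brevity, which drops the equality
        constraint sum_i alpha_i y_i = 0 and leaves only the box
        constraint 0 <= alpha_i <= C on each dual variable.
        """

        def __init__(self, C=1.0, gamma=1.0):
            self.C, self.gamma = C, gamma
            self.X, self.y, self.alpha = [], [], []

        def _margin(self, x):
            # f(x) = sum_i alpha_i y_i K(x_i, x) over points seen so far.
            return sum(a * yi * rbf_kernel(xi, x, self.gamma)
                       for a, yi, xi in zip(self.alpha, self.y, self.X))

        def add_point(self, x, y):
            # As a function of the new dual variable a (with y^2 = 1),
            # the dual objective is a*(1 - y*f_old(x)) - 0.5*a^2*K(x,x),
            # so its unconstrained maximiser has the closed form below;
            # clipping to [0, C] enforces the box constraint. Starting
            # from a = 0 and moving to this point can only increase the
            # dual, hence the primal lower bound, by weak duality.
            g = 1.0 - y * self._margin(x)
            k_xx = rbf_kernel(x, x, self.gamma)
            a_new = float(np.clip(g / k_xx, 0.0, self.C))
            self.X.append(x); self.y.append(y); self.alpha.append(a_new)

    # Usage: stream labelled points one at a time.
    # model = IncrementalDualSVM(C=1.0, gamma=0.5)
    # model.add_point(np.array([0.0, 1.0]), +1)
    # model.add_point(np.array([1.0, 0.0]), -1)

Per the abstract's efficiency claim, the appeal of such an incremental view is that each arriving point costs one kernel row against the stored set, rather than the full re-optimization a batch TSVM would perform.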