{"title":"learning -on- on- go:物联网应用的自主跨学科上下文学习","authors":"Ramin Fallahzadeh, Parastoo Alinia, Hassan Ghasemzadeh","doi":"10.1109/ICCAD.2017.8203800","DOIUrl":null,"url":null,"abstract":"Developing machine learning algorithms for applications of Internet-of-Things requires collecting a large amount of labeled training data, which is an expensive and labor-intensive process. Upon a minor change in the context, for example utilization by a new user, the model will need re-training to maintain the initial performance. To address this problem, we propose a graph model and an unsupervised label transfer algorithm (learn-on-the-go) which exploits the relations between source and target user data to develop a highly-accurate and scalable machine learning model. Our analysis on real-world data demonstrates 54% and 22% performance improvement against baseline and state-of-the-art solutions, respectively.","PeriodicalId":126686,"journal":{"name":"2017 IEEE/ACM International Conference on Computer-Aided Design (ICCAD)","volume":"89 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-11-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"Learn-on-the-go: Autonomous cross-subject context learning for internet-of-things applications\",\"authors\":\"Ramin Fallahzadeh, Parastoo Alinia, Hassan Ghasemzadeh\",\"doi\":\"10.1109/ICCAD.2017.8203800\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Developing machine learning algorithms for applications of Internet-of-Things requires collecting a large amount of labeled training data, which is an expensive and labor-intensive process. Upon a minor change in the context, for example utilization by a new user, the model will need re-training to maintain the initial performance. To address this problem, we propose a graph model and an unsupervised label transfer algorithm (learn-on-the-go) which exploits the relations between source and target user data to develop a highly-accurate and scalable machine learning model. Our analysis on real-world data demonstrates 54% and 22% performance improvement against baseline and state-of-the-art solutions, respectively.\",\"PeriodicalId\":126686,\"journal\":{\"name\":\"2017 IEEE/ACM International Conference on Computer-Aided Design (ICCAD)\",\"volume\":\"89 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2017-11-13\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2017 IEEE/ACM International Conference on Computer-Aided Design (ICCAD)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICCAD.2017.8203800\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2017 IEEE/ACM International Conference on Computer-Aided Design (ICCAD)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICCAD.2017.8203800","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Learn-on-the-go: Autonomous cross-subject context learning for internet-of-things applications
Developing machine learning algorithms for Internet-of-Things applications requires collecting large amounts of labeled training data, an expensive and labor-intensive process. Moreover, even a minor change in context, such as use by a new user, requires retraining the model to maintain its initial performance. To address this problem, we propose a graph model and an unsupervised label transfer algorithm (learn-on-the-go) that exploit the relations between source-user and target-user data to build a highly accurate and scalable machine learning model. Our analysis on real-world data demonstrates 54% and 22% performance improvements over baseline and state-of-the-art solutions, respectively.
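
For intuition only, the following is a minimal sketch of generic graph-based unsupervised label transfer between a labeled source user and an unlabeled target user: a k-nearest-neighbor similarity graph is built over the pooled samples and source labels are propagated along its edges. It is not the paper's learn-on-the-go algorithm or graph construction; the function name, feature inputs, and parameters (k, n_iter) are assumptions made for illustration.

    # Sketch: graph-based label propagation from a source user to a target user.
    # NOT the paper's algorithm; all names and parameters are illustrative.
    import numpy as np

    def transfer_labels(source_X, source_y, target_X, k=5, n_iter=50):
        """Propagate source-user labels to target-user samples over a k-NN graph."""
        X = np.vstack([source_X, target_X])
        n_src, n_all = len(source_X), len(X)
        n_classes = int(source_y.max()) + 1

        # Gaussian affinity matrix over all samples (source + target).
        dists = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
        sigma = np.median(dists) + 1e-12
        W = np.exp(-(dists ** 2) / (2 * sigma ** 2))
        np.fill_diagonal(W, 0.0)
        # Keep only the k strongest edges per node, then symmetrize.
        for i in range(n_all):
            W[i, np.argsort(W[i])[:-k]] = 0.0
        W = np.maximum(W, W.T)
        row_inv = 1.0 / np.maximum(W.sum(axis=1), 1e-12)

        # Label distributions: one-hot for labeled source, uniform for target.
        F = np.full((n_all, n_classes), 1.0 / n_classes)
        F[:n_src] = np.eye(n_classes)[source_y]

        # Iterative propagation; source labels are clamped after every step.
        for _ in range(n_iter):
            F = row_inv[:, None] * (W @ F)
            F[:n_src] = np.eye(n_classes)[source_y]

        return F[n_src:].argmax(axis=1)  # inferred labels for the target user

In this kind of scheme the target user's model is trained on the inferred labels instead of newly collected annotations, which is the general cross-subject transfer setting the abstract describes.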