{"title":"Data Correction Management Method Using Temporal Data in Fog Computing","authors":"Tsukasa Kudo","doi":"10.23919/ICMU48249.2019.9006651","DOIUrl":null,"url":null,"abstract":"With the recent development and expansion of the Internet of Things, fog computing has been proposed to solve the problem of transferring large quantities of sensor data to a cloud server. Primary processing is performed at fog nodes installed near sensors and only its results are transferred to server to be shared and used by various analytical processing. When any missing or defective data is detected, the corrected data is retransferred to the server and the related analytical processing are executed again. And, when the results of the analytical and primary processing have a relationship, such as summaries and details, the analytical processing results must be updated with maintaining consistency between them. However, to maintain consistency at all times, analytical processing must be performed with every re-transfer thereby degrading the efficiency. In this paper, I propose a method to reflect many sequential corrections in a lump while maintaining consistency by using the temporal database concept and show its effectiveness.","PeriodicalId":348402,"journal":{"name":"2019 Twelfth International Conference on Mobile Computing and Ubiquitous Network (ICMU)","volume":"243 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 Twelfth International Conference on Mobile Computing and Ubiquitous Network (ICMU)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.23919/ICMU48249.2019.9006651","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
With the recent development and expansion of the Internet of Things, fog computing has been proposed to solve the problem of transferring large quantities of sensor data to a cloud server. Primary processing is performed at fog nodes installed near the sensors, and only its results are transferred to the server, where they are shared and used by various analytical processes. When missing or defective data is detected, the corrected data is re-transferred to the server and the related analytical processing is executed again. Moreover, when the results of the analytical and primary processing are related, such as summaries and their details, the analytical results must be updated while maintaining consistency between them. However, maintaining consistency at all times requires the analytical processing to be re-executed on every re-transfer, which degrades efficiency. In this paper, I propose a method that applies the temporal database concept to reflect many sequential corrections in a single batch while maintaining consistency, and I show its effectiveness.
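The following is a minimal Python sketch of the temporal-data idea the abstract alludes to: detail records are versioned by transaction time so that corrections are appended rather than overwritten, and summary (analytical) results are recomputed in a lump from a consistent "as of" snapshot. All class and function names here are hypothetical illustrations under these assumptions, not the paper's actual implementation.

```python
from dataclasses import dataclass
from datetime import datetime
from collections import defaultdict


# Hypothetical detail record held on the server: each sensor reading is
# versioned by the transaction time at which it was (re-)transferred, so a
# correction becomes a new version instead of an in-place overwrite.
@dataclass
class DetailVersion:
    sensor_id: str
    measured_at: datetime   # valid time: when the value was measured
    value: float
    recorded_at: datetime   # transaction time: when this version arrived


class TemporalDetailStore:
    def __init__(self) -> None:
        self.versions: list[DetailVersion] = []

    def append(self, v: DetailVersion) -> None:
        # Corrections simply add newer versions; nothing is deleted.
        self.versions.append(v)

    def snapshot(self, as_of: datetime) -> dict[tuple[str, datetime], float]:
        """Latest value per (sensor, measurement time) as of a transaction time."""
        latest: dict[tuple[str, datetime], DetailVersion] = {}
        for v in self.versions:
            if v.recorded_at > as_of:
                continue
            key = (v.sensor_id, v.measured_at)
            if key not in latest or v.recorded_at > latest[key].recorded_at:
                latest[key] = v
        return {k: v.value for k, v in latest.items()}


def summarize(store: TemporalDetailStore, as_of: datetime) -> dict[str, float]:
    """Recompute per-sensor averages in a single batch from a consistent snapshot."""
    sums: defaultdict[str, float] = defaultdict(float)
    counts: defaultdict[str, int] = defaultdict(int)
    for (sensor_id, _), value in store.snapshot(as_of).items():
        sums[sensor_id] += value
        counts[sensor_id] += 1
    return {s: sums[s] / counts[s] for s in sums}
```

Under this sketch's assumptions, many re-transferred corrections accumulate as new versions without touching the existing summary; the summary stays consistent with the older snapshot it was computed from, and a single later call to `summarize` with a newer `as_of` time absorbs all accumulated corrections at once, rather than re-running the analytical processing on every re-transfer.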