{"title":"时间序列物联网传感器数据异常检测的深度学习模型分析","authors":"Ujjwal Sachdeva, P. Vamsi","doi":"10.1145/3549206.3549218","DOIUrl":null,"url":null,"abstract":"The anomaly detection in Internet of Things (IoT) sensor data has become an important research area because of the possibility of noise and unavailability of labels in the sensors readings. The conventional machine learning algorithms cannot detect the anomalies when there is high correlation between the data points of the sensor data. Further, the volume and velocity of the data generated by the sensors in the IoT also a reason that the conventional statistical and machine learning algorithms fails to detect the anomalies. In recent years, the Deep Learning (DL) is gaining significant attention in the anomaly detection research due to the property of unsupervised learning of the high volume data and high detection accuracy of abnormalities. To this end, this paper proposed to study three DL models such as Autoencoders, Long Short Term Memory (LSTM) Autoencoder, and LSTM Recurrent Neural Networks (LSTM-RNN) for detecting anomalies in time series IoT sensor data. Simulations have been conducted using the Intel Berkeley Research Labs (IBRL) Sensor data to evaluate the performance. The results reveal which method performed better in terms of detection accuracy and training time.","PeriodicalId":199675,"journal":{"name":"Proceedings of the 2022 Fourteenth International Conference on Contemporary Computing","volume":"1521 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-08-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"Analysis of Deep Learning Models for Anomaly Detection in Time Series IoT Sensor Data\",\"authors\":\"Ujjwal Sachdeva, P. Vamsi\",\"doi\":\"10.1145/3549206.3549218\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The anomaly detection in Internet of Things (IoT) sensor data has become an important research area because of the possibility of noise and unavailability of labels in the sensors readings. The conventional machine learning algorithms cannot detect the anomalies when there is high correlation between the data points of the sensor data. Further, the volume and velocity of the data generated by the sensors in the IoT also a reason that the conventional statistical and machine learning algorithms fails to detect the anomalies. In recent years, the Deep Learning (DL) is gaining significant attention in the anomaly detection research due to the property of unsupervised learning of the high volume data and high detection accuracy of abnormalities. To this end, this paper proposed to study three DL models such as Autoencoders, Long Short Term Memory (LSTM) Autoencoder, and LSTM Recurrent Neural Networks (LSTM-RNN) for detecting anomalies in time series IoT sensor data. Simulations have been conducted using the Intel Berkeley Research Labs (IBRL) Sensor data to evaluate the performance. 
The results reveal which method performed better in terms of detection accuracy and training time.\",\"PeriodicalId\":199675,\"journal\":{\"name\":\"Proceedings of the 2022 Fourteenth International Conference on Contemporary Computing\",\"volume\":\"1521 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-08-04\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the 2022 Fourteenth International Conference on Contemporary Computing\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3549206.3549218\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2022 Fourteenth International Conference on Contemporary Computing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3549206.3549218","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Analysis of Deep Learning Models for Anomaly Detection in Time Series IoT Sensor Data
Anomaly detection in Internet of Things (IoT) sensor data has become an important research area because sensor readings are often noisy and rarely come with labels. Conventional machine learning algorithms cannot detect anomalies when the data points in a sensor stream are highly correlated. Furthermore, the volume and velocity of the data generated by IoT sensors are another reason why conventional statistical and machine learning algorithms fail to detect anomalies. In recent years, Deep Learning (DL) has gained significant attention in anomaly detection research because it can learn from large volumes of unlabelled data and detect abnormalities with high accuracy. To this end, this paper studies three DL models, namely Autoencoders, the Long Short-Term Memory (LSTM) Autoencoder, and LSTM Recurrent Neural Networks (LSTM-RNN), for detecting anomalies in time series IoT sensor data. Simulations have been conducted on the Intel Berkeley Research Lab (IBRL) sensor data to evaluate their performance. The results show which method performs better in terms of detection accuracy and training time.
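The abstract does not give implementation details for the models. As an illustration of the general approach it describes (unsupervised, reconstruction-based anomaly detection on windowed sensor readings), the sketch below builds a small LSTM autoencoder in Keras and flags windows whose reconstruction error exceeds a threshold. The window size, layer sizes, threshold rule, and placeholder data are illustrative assumptions, not values or code from the paper; real use would substitute windowed, scaled IBRL readings.

```python
# Minimal sketch of LSTM-autoencoder anomaly detection on time-series windows.
# Assumptions (not from the paper): window length 30, 4 features per reading,
# 16 latent units, and a mean + 3*std reconstruction-error threshold.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

TIMESTEPS, FEATURES = 30, 4  # e.g. temperature, humidity, light, voltage

def build_lstm_autoencoder(timesteps=TIMESTEPS, features=FEATURES, latent=16):
    model = keras.Sequential([
        layers.Input(shape=(timesteps, features)),
        layers.LSTM(latent, return_sequences=False),    # encoder: compress window
        layers.RepeatVector(timesteps),                 # repeat latent code per step
        layers.LSTM(latent, return_sequences=True),     # decoder: unroll in time
        layers.TimeDistributed(layers.Dense(features)), # reconstruct each reading
    ])
    model.compile(optimizer="adam", loss="mse")
    return model

# Train on (mostly) normal windows; no labels are needed.
# X_train would be windowed IBRL readings scaled to [0, 1]; random placeholder here.
X_train = np.random.rand(256, TIMESTEPS, FEATURES).astype("float32")
model = build_lstm_autoencoder()
model.fit(X_train, X_train, epochs=5, batch_size=32, verbose=0)

# Flag windows whose reconstruction error exceeds a threshold derived from
# the training-error distribution.
recon = model.predict(X_train, verbose=0)
errors = np.mean((X_train - recon) ** 2, axis=(1, 2))
threshold = errors.mean() + 3 * errors.std()
anomalies = errors > threshold
print(f"threshold={threshold:.4f}, flagged {anomalies.sum()} of {len(errors)} windows")
```

The plain Autoencoder variant studied in the paper follows the same train-and-threshold pattern but reconstructs flattened windows with dense layers instead of recurrent ones; the choice mainly affects how temporal correlation between readings is captured.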