{"title":"Accuracy vs Efficiency: Machine Learning Enabled Anomaly Detection on the Internet of Things","authors":"Xin-Wen Wu, Yongtao Cao, Richard Dankwa","doi":"10.1109/IoTaIS56727.2022.9975889","DOIUrl":null,"url":null,"abstract":"Anomaly detection is an important security mechanism for the Internet of Things (IoT). Existing works have been focused on developing accurate anomaly detection models. However, due to the resource-constrained nature of IoT networks and the requirement of real-time security operation, cost efficient (regarding computational efficiency and memory-consumption efficiency) approaches for anomaly detection are highly desirable in IoT applications. In this paper, we investigated machine learning (ML) enabled anomaly detection models for the IoT with regard to multi-objective optimization (Pareto optimization) that minimizes the detection error, execution time, and memory consumption simultaneously. Making use of well-known datasets consisting of network traffic traces captured in an IoT environment, we studied a variety of machine learning algorithms through the world-class H2O AI platform. Our experimental results show that the Gradient Boosting Machine, Random Forest, and Deep Learning models are the most accurate and fastest anomaly detection models; the Gradient Boosting Machine and Random Forest are the most accurate and memory-efficient models. These ML models form the Pareto-optimal set of anomaly detection models. Our results can be used by the industry to facilitate their selection of ML models for anomaly detection on various IoT networks based on their security requirements and system constraints.","PeriodicalId":138894,"journal":{"name":"2022 IEEE International Conference on Internet of Things and Intelligence Systems (IoTaIS)","volume":"66 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-11-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 IEEE International Conference on Internet of Things and Intelligence Systems (IoTaIS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IoTaIS56727.2022.9975889","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
Anomaly detection is an important security mechanism for the Internet of Things (IoT). Existing work has focused on developing accurate anomaly detection models. However, due to the resource-constrained nature of IoT networks and the requirement of real-time security operation, cost-efficient approaches to anomaly detection (in terms of both computational efficiency and memory consumption) are highly desirable in IoT applications. In this paper, we investigated machine learning (ML)-enabled anomaly detection models for the IoT from the perspective of multi-objective (Pareto) optimization, minimizing detection error, execution time, and memory consumption simultaneously. Using well-known datasets of network traffic traces captured in an IoT environment, we studied a variety of machine learning algorithms through the H2O AI platform. Our experimental results show that the Gradient Boosting Machine, Random Forest, and Deep Learning models are the most accurate and fastest anomaly detection models, while the Gradient Boosting Machine and Random Forest are the most accurate and memory-efficient models. These ML models form the Pareto-optimal set of anomaly detection models. Our results can be used by industry to guide the selection of ML models for anomaly detection on various IoT networks based on their security requirements and system constraints.
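To illustrate the kind of evaluation the abstract describes, the sketch below trains several H2O classifiers on an IoT traffic dataset, records their wall-clock training time and a detection-error proxy, and then filters the candidates down to the Pareto-optimal set under simultaneous minimization of the objectives. This is a minimal sketch, not the authors' code: the file name "iot_traffic.csv", the "label" column, the use of 1 - AUC as the error measure, and the omission of memory measurement are all assumptions made for illustration.

```python
# Minimal sketch (assumed setup, not the paper's experimental pipeline):
# benchmark a few H2O models on an IoT traffic dataset, then keep only the
# Pareto-optimal models when minimizing (detection error, training time).
import time
import h2o
from h2o.estimators import (
    H2OGradientBoostingEstimator,
    H2ORandomForestEstimator,
    H2ODeepLearningEstimator,
)

h2o.init()
data = h2o.import_file("iot_traffic.csv")        # hypothetical dataset path
data["label"] = data["label"].asfactor()         # assumed binary anomaly label
train, test = data.split_frame(ratios=[0.8], seed=42)
features = [c for c in data.columns if c != "label"]

models = {
    "GBM": H2OGradientBoostingEstimator(seed=42),
    "RandomForest": H2ORandomForestEstimator(seed=42),
    "DeepLearning": H2ODeepLearningEstimator(seed=42),
}

# Objective vector per model: (detection-error proxy, wall-clock training time).
# Memory consumption is omitted here; it could be added as a third objective
# by sampling process memory around training and scoring.
scores = {}
for name, model in models.items():
    start = time.time()
    model.train(x=features, y="label", training_frame=train)
    elapsed = time.time() - start
    perf = model.model_performance(test_data=test)
    error = 1.0 - perf.auc()                     # proxy for detection error
    scores[name] = (error, elapsed)

def dominates(a, b):
    """a dominates b if it is no worse on every objective and strictly better on at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

# A model is Pareto-optimal if no other candidate dominates it.
pareto = {
    name: objs for name, objs in scores.items()
    if not any(dominates(other, objs)
               for other_name, other in scores.items() if other_name != name)
}
print("Pareto-optimal models:", pareto)
```

The same dominance filter extends directly to three objectives (error, time, memory) as studied in the paper; adding a memory measurement to each tuple is the only change needed.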