IoT systems testing: Taxonomy, empirical findings, and recommendations
Jean Baptiste Minani, Yahia El Fellah, Fatima Sabir, Naouel Moha, Yann-Gaël Guéhéneuc, Martin Kuradusenge, Tomoaki Masuda
Journal of Systems and Software, Volume 226, Article 112408, published 2025-03-08
DOI: 10.1016/j.jss.2025.112408 (https://www.sciencedirect.com/science/article/pii/S0164121225000767)
Citations: 0
Abstract
The Internet of Things (IoT) is reshaping our lives, increasing the need for thorough pre-deployment testing. However, traditional software testing may not address the testing requirements of IoT systems, leading to quality challenges. A specific testing taxonomy is crucial, yet no widely recognized taxonomy exists for IoT system testing. We introduced an IoT-specific testing taxonomy that categorizes aspects of IoT systems testing into seven distinct categories. We mined testing aspects from 83 primary studies in IoT systems testing and built an initial taxonomy. This taxonomy was refined and validated through two rounds of surveys involving 16 and then 204 IoT industry practitioners. We assessed its effectiveness by conducting an empirical evaluation on two separate IoT systems, each involving 12 testers. Our findings categorize seven testing aspects: (1) testing objectives, (2) testing tools and artifacts, (3) testers, (4) testing stage, (5) testing environment, (6) Object Under Test (OUT) and metrics, and (7) testing approaches. The evaluation showed that testers equipped with the taxonomy could more effectively identify diverse test cases and scenarios. Additionally, we recommend new research opportunities to enhance the testing of IoT systems.
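The seven aspects listed in the abstract lend themselves to a simple checklist-style representation. The sketch below is purely illustrative and is not part of the paper's artifacts: the `TestScenario` class, its fields, and the example values are hypothetical names assumed here only to show how the taxonomy's categories might be used to spot aspects a test scenario has not yet covered.

```python
from dataclasses import dataclass, field
from enum import Enum


class TaxonomyCategory(Enum):
    """The seven IoT testing aspects identified in the paper's taxonomy."""
    TESTING_OBJECTIVES = "testing objectives"
    TESTING_TOOLS_AND_ARTIFACTS = "testing tools and artifacts"
    TESTERS = "testers"
    TESTING_STAGE = "testing stage"
    TESTING_ENVIRONMENT = "testing environment"
    OUT_AND_METRICS = "object under test (OUT) and metrics"
    TESTING_APPROACHES = "testing approaches"


@dataclass
class TestScenario:
    """Hypothetical record tagging a test scenario with taxonomy categories.

    The data model is assumed for illustration; the paper does not prescribe
    a concrete implementation.
    """
    name: str
    categories: dict = field(default_factory=dict)

    def uncovered(self):
        """Return taxonomy categories this scenario does not yet address."""
        return [c for c in TaxonomyCategory if c not in self.categories]


# Example: check which aspects a (hypothetical) smart-thermostat test still lacks.
scenario = TestScenario(
    name="thermostat-overheat-alert",
    categories={
        TaxonomyCategory.TESTING_OBJECTIVES: "functional correctness",
        TaxonomyCategory.TESTING_ENVIRONMENT: "hardware-in-the-loop bench",
    },
)
print([c.value for c in scenario.uncovered()])
```

Used as a checklist in this way, the taxonomy plays the role the evaluation describes: it prompts testers to consider categories (e.g., testers, testing stage, OUT and metrics) they might otherwise overlook when deriving test cases and scenarios.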
About the journal:
The Journal of Systems and Software publishes papers covering all aspects of software engineering and related hardware-software-systems issues. All articles should include a validation of the idea presented, e.g. through case studies, experiments, or systematic comparisons with other approaches already in practice. Topics of interest include, but are not limited to:
• Methods and tools for, and empirical studies on, software requirements, design, architecture, verification and validation, maintenance and evolution
• Agile, model-driven, service-oriented, open source and global software development
• Approaches for mobile, multiprocessing, real-time, distributed, cloud-based, dependable and virtualized systems
• Human factors and management concerns of software development
• Data management and big data issues of software systems
• Metrics and evaluation, data mining of software development resources
• Business and economic aspects of software development processes
The journal welcomes state-of-the-art surveys and reports of practical experience for all of these topics.