Indrajeet Ghosh, Adam Goldstein, Avijoy Chakma, Jade Freeman, T. Gregory, Niranjan Suri, S. R. Ramamurthy, Nirmalya Roy
{"title":"HeteroSys: Heterogeneous and Collaborative Sensing in the Wild","authors":"Indrajeet Ghosh, Adam Goldstein, Avijoy Chakma, Jade Freeman, T. Gregory, Niranjan Suri, S. R. Ramamurthy, Nirmalya Roy","doi":"10.1109/SMARTCOMP58114.2023.00073","DOIUrl":null,"url":null,"abstract":"Advances in Internet-of-Things, artificial intelligence, and ubiquitous computing technologies have contributed to building the next generation of context-aware heterogeneous systems with robust interoperability to control and monitor the environmental variables of smart environments. Motivated by this, we propose HeteroSys, an end-to-end multi-functional smart IoT-based system prototype for heterogeneous and collaborative sensing in a smart IoT-based environment. A unique characteristic of HeteroSys is that it relies on Home Assistant (HA) to collate heterogeneous sensors (e.g., passive infrared sensors (PIR), reed (door) switches, object tags, wearable wrist-mounted, water leak sensors, and internet protocol cameras), and uses a variety of networking protocols such as Zigbee open standard for mesh networking, WiFi, and Bluetooth Low Energy (BLE) for communication. The reliance on HA (and its broad community support) makes HeteroSys ideal for various applications such as object detection, human activity recognition and behavior patterns. We articulated the development phase, integration, testing challenges and evaluation of the HeteroSys. We conducted an extensive 24-hour longitudinal data collection from 5 participants performing 6 activities by deploying in an indoor home environment. 
Our assessment of the acquired dataset reveals that the representations learned using deep learning architecture aid in improving the detection of activities to 83.1% accuracy.","PeriodicalId":163556,"journal":{"name":"2023 IEEE International Conference on Smart Computing (SMARTCOMP)","volume":"56 2 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 IEEE International Conference on Smart Computing (SMARTCOMP)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/SMARTCOMP58114.2023.00073","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
Advances in Internet-of-Things, artificial intelligence, and ubiquitous computing technologies have contributed to building the next generation of context-aware heterogeneous systems with robust interoperability to control and monitor the environmental variables of smart environments. Motivated by this, we propose HeteroSys, an end-to-end, multi-functional IoT system prototype for heterogeneous and collaborative sensing in smart environments. A unique characteristic of HeteroSys is that it relies on Home Assistant (HA) to collate heterogeneous sensors (e.g., passive infrared (PIR) sensors, reed (door) switches, object tags, wearable wrist-mounted sensors, water leak sensors, and Internet Protocol cameras) and uses a variety of networking protocols for communication, such as the Zigbee open standard for mesh networking, WiFi, and Bluetooth Low Energy (BLE). The reliance on HA (and its broad community support) makes HeteroSys well suited to applications such as object detection, human activity recognition, and behavior pattern analysis. We articulate the development phases, integration and testing challenges, and evaluation of HeteroSys. We conducted an extensive 24-hour longitudinal data collection from 5 participants performing 6 activities in an indoor home deployment. Our assessment of the acquired dataset reveals that representations learned using a deep learning architecture improve activity detection accuracy to 83.1%.
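The abstract's central idea of collating heterogeneous sensors into one system can be illustrated with a minimal sketch: merging per-sensor event streams (PIR, reed switch, BLE wearable, etc.) into a single time-ordered log, then slicing it into fixed-length windows, a common preprocessing step before an activity classifier. This is an illustrative assumption, not the authors' actual pipeline; the event fields and window length are hypothetical.

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class SensorEvent:
    """One reading from any sensor modality; ordered by timestamp only."""
    timestamp: float
    sensor_id: str = field(compare=False)
    modality: str = field(compare=False)   # e.g. "pir", "reed", "ble_wearable"
    value: object = field(compare=False)

def collate_streams(*streams):
    """Merge per-sensor streams (each already time-ordered)
    into one unified, time-ordered event log."""
    return list(heapq.merge(*streams, key=lambda e: e.timestamp))

def window_events(events, window_s=10.0):
    """Group the unified log into fixed-length time windows,
    the typical input unit for an activity-recognition model."""
    if not events:
        return []
    windows, current, start = [], [], events[0].timestamp
    for e in events:
        if e.timestamp - start >= window_s:
            windows.append(current)
            current, start = [], e.timestamp
        current.append(e)
    if current:
        windows.append(current)
    return windows

# Hypothetical sensor names and readings for illustration:
pir = [SensorEvent(0.0, "pir_kitchen", "pir", 1),
       SensorEvent(12.0, "pir_kitchen", "pir", 0)]
reed = [SensorEvent(3.5, "door_front", "reed", "open")]
log = collate_streams(pir, reed)
windows = window_events(log, window_s=10.0)
```

Each window (a short burst of multi-modal events) would then be featurized and fed to the deep learning classifier that the paper reports reaching 83.1% accuracy over 6 activities.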