Internet of Things: Latest Articles

Threat detection in the 6G enabled Industrial IoT Networks using Deep Learning: A review on the state-of-the-art solutions, challenges and future research directions
IF 6.0 · CAS Zone 3 · Computer Science
Internet of Things Pub Date: 2025-07-17 DOI: 10.1016/j.iot.2025.101686
Gaoyang Guo, Faizan Qamar, Syed Hussain Ali Kazmi, Muhammad Habib ur Rehman
Abstract: The integration of the Industrial Internet of Things (IIoT) with sixth-generation (6G) communication technology is a critical foundation for the next generation of intelligent manufacturing and industrial automation. However, this advancement introduces significant security challenges, particularly in threat detection for IIoT systems. This paper systematically reviews existing research on threat detection in 6G-IIoT environments using Deep Learning (DL) techniques. It examines key challenges related to data processing, privacy protection, and model performance. The study first outlines the security requirements of IIoT within a 6G network environment and evaluates the application of various DL models for threat detection. It then identifies key limitations in current research, including dataset imbalance and the limited generalization capability of existing models. Finally, potential future research directions are discussed to advance the development of more intelligent and efficient threat detection mechanisms, ensuring the security and stability of IIoT systems in the 6G era. (Internet of Things, vol. 33, Article 101686)
Cited by: 0
Exploiting usage control for implementation and enforcement of security by contract
IF 6.0 · CAS Zone 3 · Computer Science
Internet of Things Pub Date: 2025-07-16 DOI: 10.1016/j.iot.2025.101697
Marco Rasori, Paolo Mori, Andrea Saracino, Alessandro Aldini
Abstract: The widespread adoption of IoT-based smart home technologies has transformed how people interact with their living spaces, offering greater control over everyday tasks. However, this increased connectivity introduces significant security challenges, particularly in managing applications that can control devices within the smart home. Users need effective ways to define and enforce security policies that permit or deny specific behaviors of these applications. Such policies should allow users to control what actions applications can perform, ensuring that they comply with security and privacy preferences. This paper proposes a hybrid framework that combines Security by Contract (S×C) and Usage Control (UCON) to address these challenges and provide a comprehensive security solution with low impact on system performance. S×C ensures verification of the application behavior, described formally as a contract, against predefined XACML-based policies. UCON enables continuous monitoring and enforcement of security policies during application execution. The theoretical foundations of the methodology combining these frameworks are based on labeled state/transition systems and their model-checking-based verification. Through experimental validation on a real testbed, we explore the feasibility of the proposed approach by evaluating its performance across various test campaigns, offering insights into its ability to manage policy enforcement and revocation processes with low overhead. (Internet of Things, vol. 33, Article 101697)
Cited by: 0
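The framework above continuously re-evaluates security policies while an application runs (the UCON side) and revokes access once a policy no longer holds. The Python sketch below illustrates only that control loop under stated assumptions: the attribute names, the plain-predicate policy standing in for the paper's XACML-based policies, and the simulated smart-home session are hypothetical and not taken from the article.

```python
import time

# Hypothetical attribute-based policy: the paper uses XACML-based policies;
# here a plain Python predicate stands in for the policy decision point.
def policy_permits(attributes: dict) -> bool:
    return attributes["user_present"] and attributes["app_trust_level"] >= 2

def usage_control_monitor(get_attributes, revoke, interval_s=1.0, max_checks=5):
    """UCON-style loop: keep re-evaluating the policy while usage is ongoing
    and revoke the session as soon as the decision flips to deny."""
    for _ in range(max_checks):
        if not policy_permits(get_attributes()):
            revoke()
            return "revoked"
        time.sleep(interval_s)
    return "completed"

if __name__ == "__main__":
    # Simulated mutable attributes: the user leaves home after two checks.
    state = {"user_present": True, "app_trust_level": 3, "checks": 0}

    def get_attributes():
        state["checks"] += 1
        if state["checks"] > 2:
            state["user_present"] = False
        return state

    print(usage_control_monitor(get_attributes,
                                revoke=lambda: print("access revoked"),
                                interval_s=0.01))
```

In the paper the decision point would be an XACML engine and revocation would terminate the offending application; both are reduced to callbacks here so the loop stays self-contained.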
A Variational Quantum Classifier for predictive analysis in industrial production
IF 6.0 · CAS Zone 3 · Computer Science
Internet of Things Pub Date: 2025-07-16 DOI: 10.1016/j.iot.2025.101695
Antimo Angelino, Enrico Landolfi, Alfredo Massa, Alfredo Troiano
Abstract: Quantum Computing (QC) is a novel and disruptive paradigm of computation that leverages the properties of quantum mechanical systems to represent and process information. The interest in this emerging technology and its applications has been growing in recent years, especially regarding Quantum Machine Learning (QML). In QML, QC and Machine Learning (ML) techniques are combined to build more powerful and accurate learning models. Industries and research centers worldwide have been devoting significant efforts to finding use cases of practical interest for which QML may be a suitable approach. In this work, one of the most common QML algorithms, namely a Variational Quantum Classifier (VQC), has been adopted for a supervised classification task in the defence industry. The goal is to predict the failures that may happen during the final acceptance test of a finished product, based on the knowledge of test data related to its subassemblies. The test data have been collected using advanced IoT systems, and the prediction is made before the final product is assembled, so as to improve the efficiency of the testing process. The VQC has been applied to a problem already approached with classical ML techniques, and the classical and quantum performances have then been compared. The results indicate promising performance and highlight the potential of QML algorithms in the industrial sector for predictive analysis. (Internet of Things, vol. 33, Article 101695)
Cited by: 0
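A VQC follows the pipeline implied by the abstract: a data-encoding circuit, a trainable parameterized circuit, a measurement, and a classical optimizer that updates the circuit parameters. The toy single-qubit NumPy simulation below only illustrates that structure; the angle encoding, the single trainable RY rotation, the parameter-shift gradient, and the synthetic data are illustrative assumptions, not the classifier or dataset used in the paper.

```python
import numpy as np

# Pauli-Z observable and RY rotation gate for a single qubit.
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def ry(angle):
    c, s = np.cos(angle / 2), np.sin(angle / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

def vqc_expectation(x, theta):
    """Encode feature x with RY(x), apply trainable RY(theta), measure <Z>."""
    state = ry(theta) @ ry(x) @ np.array([1, 0], dtype=complex)
    return float(np.real(state.conj() @ Z @ state))

def predict(x, theta):
    return 1 if vqc_expectation(x, theta) >= 0 else 0

# Toy training data: two classes separated along one scaled feature.
X = np.array([0.2, 0.4, 2.6, 2.9])
y = np.array([1, 1, 0, 0])

# Parameter-shift gradient descent on a squared loss over <Z>.
theta = 0.0
for _ in range(200):
    grad = 0.0
    for xi, yi in zip(X, y):
        target = 1.0 if yi == 1 else -1.0
        err = vqc_expectation(xi, theta) - target
        shift = (vqc_expectation(xi, theta + np.pi / 2)
                 - vqc_expectation(xi, theta - np.pi / 2)) / 2
        grad += 2 * err * shift
    theta -= 0.1 * grad / len(X)

print([predict(xi, theta) for xi in X])  # expected: [1, 1, 0, 0]
```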
Deep Learning based Payload Optimization for Image Transmission over LoRa with HARQ
IF 6.0 · CAS Zone 3 · Computer Science
Internet of Things Pub Date: 2025-07-15 DOI: 10.1016/j.iot.2025.101701
Khondoker Ziaul Islam, David Murray, Dean Diepeveen, Michael G.K. Jones, Ferdous Sohel
Abstract: LoRa is a wireless technology suited for long-range IoT applications. Leveraging LoRa technology for image transmission could revolutionize many applications, such as surveillance and monitoring, at low cost. However, transmitting images through LoRa is challenging due to LoRa's limited data rate and bandwidth. To address this, we propose a pipeline to prepare a reduced image payload for transmission, captured by a camera against a reasonably static background, which is common in surveillance settings. The main goal is to minimize the uplink payload while maintaining image quality. We use a selective transmission approach where dissimilar images are divided into patches, and a deep learning Siamese network determines whether an image or patch has new content compared to previously transmitted ones. The data is then compressed and sent in constant packets via HARQ to reduce downlink requirements. Enhanced super-resolution generative adversarial networks and principal component analysis are used to reconstruct the images/patches. We tested our approach with two surveillance videos at two sites using LoRaWAN gateways, end devices, and a ChirpStack server. Assuming no duty cycle restrictions, our pipeline can transmit the videos (converted to 1616 and 584 frames) in 7 and 26 min, respectively. Increased duty cycle restrictions and significant image changes extend the transmission time. At Murdoch Oval, we achieved 100% throughput with no retransmissions required for both sets. At Whitby Falls Farm, throughput was 98.3%, with approximately 71 and 266 packets needing retransmission for Sets 1 and 2, respectively. (Internet of Things, vol. 33, Article 101701)
Cited by: 0
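The core payload-reduction idea is selective transmission: split a frame into patches and uplink only the patches whose content differs from what was already sent. The sketch below shows that decision logic with a stand-in embedding (mean-pooled pixel blocks) in place of the paper's Siamese network; the patch size, threshold, and embedding are assumptions for illustration, and frame dimensions are assumed divisible by the patch size.

```python
import numpy as np

PATCH = 32          # patch size in pixels (assumed)
THRESHOLD = 0.08    # distance above which a patch counts as "new" (assumed)

def embed(patch: np.ndarray) -> np.ndarray:
    """Stand-in for the Siamese encoder: 4x4 mean-pooled, normalized features."""
    h, w = patch.shape
    pooled = patch.reshape(4, h // 4, 4, w // 4).mean(axis=(1, 3))
    return pooled.flatten().astype(np.float64) / 255.0

def patches_to_send(frame: np.ndarray, reference: np.ndarray):
    """Yield (row, col, patch) only for patches that differ from the reference frame."""
    for r in range(0, frame.shape[0], PATCH):
        for c in range(0, frame.shape[1], PATCH):
            cur = frame[r:r + PATCH, c:c + PATCH]
            ref = reference[r:r + PATCH, c:c + PATCH]
            if np.linalg.norm(embed(cur) - embed(ref)) > THRESHOLD:
                yield r, c, cur

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    ref = rng.integers(0, 256, (128, 128), dtype=np.uint8)
    frame = ref.copy()
    frame[0:32, 0:32] = 255          # simulate new content in one patch
    changed = list(patches_to_send(frame, ref))
    print(f"{len(changed)} of 16 patches queued for LoRa uplink")
```

Only the queued patches would be compressed and handed to the HARQ-based LoRa link; unchanged patches are reconstructed at the receiver from previously transmitted content.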
An IoT-Enabled Hybrid Deep Q-Learning and Elman Neural Network Framework for Proactive Crop Healthcare in the Agriculture Sector
IF 6.0 · CAS Zone 3 · Computer Science
Internet of Things Pub Date: 2025-07-14 DOI: 10.1016/j.iot.2025.101700
Meshari Alazmi, Majid Alshammari, Dina A. Alabbad, Hamad Ali Abosaq, Ola Hegazy, Khaled M. Alalayah, Nahla O.A. Mustafa, Abu Sarwar Zamani, Shahid Hussain
Abstract: Emerging sensing technologies and artificial intelligence (AI) have lifted the agriculture sector by offering crop health monitoring and enabling real-time decision making. However, the heterogeneous nature of IoT devices results in massive data with distinct features that is difficult for individual AI models to comprehend, thereby necessitating advanced models. Consequently, we introduce an IoT-coupled hybrid framework that integrates a Deep Q-Network (DQN) and an Elman Neural Network (ENN) for proactive crop healthcare in the agriculture sector. The developed hybrid framework utilizes the IoT system for crop monitoring data and incorporates the ENN, which leverages the Recursive Pattern Elimination technique to evaluate the data patterns and extract the optimal pattern related to crop health. Subsequently, the framework utilizes the DQN to comprehend the inherent data pattern related to crop health for informed decision making. The proposed hybrid framework is applied to publicly available Field and Greenhouse crop datasets collected through the IoT system and is validated against state-of-the-art models focused on crop healthcare. The results show that the proposed ENN-DQN framework achieved a high accuracy of 99.77%, precision of 99.52%, recall of 99.93%, and F-score of 99.76%. Moreover, details of the DQN action distribution are presented, and the results are validated through robustness analysis against different levels of heterogeneity, statistical analysis with a 95% confidence interval, and computational complexity analysis. The source code for this study is openly accessible in a GitHub repository. (Internet of Things, vol. 33, Article 101700)
Cited by: 0
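An Elman Neural Network is a simple recurrent network in which a context layer feeds the previous hidden state back into the hidden layer, letting it track temporal patterns in sensor streams. The sketch below shows only a generic Elman forward pass over a toy sequence of IoT readings; the layer sizes, activations, random weights, and two-class output are illustrative assumptions, not the paper's ENN-DQN architecture or its Recursive Pattern Elimination step.

```python
import numpy as np

class ElmanNN:
    """Minimal Elman recurrent network: the hidden state depends on the current
    input and the previous hidden state (the 'context' units)."""

    def __init__(self, n_in, n_hidden, n_out, seed=0):
        rng = np.random.default_rng(seed)
        self.W_in = rng.normal(0, 0.3, (n_hidden, n_in))
        self.W_ctx = rng.normal(0, 0.3, (n_hidden, n_hidden))
        self.W_out = rng.normal(0, 0.3, (n_out, n_hidden))
        self.b_h = np.zeros(n_hidden)
        self.b_o = np.zeros(n_out)

    def forward(self, sequence):
        h = np.zeros(self.W_ctx.shape[0])          # context starts at zero
        for x in sequence:                          # one time step per sensor reading
            h = np.tanh(self.W_in @ x + self.W_ctx @ h + self.b_h)
        logits = self.W_out @ h + self.b_o
        return np.exp(logits) / np.exp(logits).sum()  # class probabilities

if __name__ == "__main__":
    # Toy sequence of 5 readings from 4 sensors (e.g. temperature, humidity, ...).
    readings = np.random.default_rng(1).normal(size=(5, 4))
    net = ElmanNN(n_in=4, n_hidden=8, n_out=2)
    print(net.forward(readings))  # e.g. [healthy, stressed] probabilities
```

In the paper's hybrid design, features distilled on the ENN side inform a DQN whose actions correspond to crop-health decisions; that reinforcement-learning half is not reproduced here.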
A hybrid machine learning and metaheuristic optimization framework for energy-efficient data aggregation in real-time for rural IoT networks
IF 6.0 · CAS Zone 3 · Computer Science
Internet of Things Pub Date: 2025-07-12 DOI: 10.1016/j.iot.2025.101685
Abhishek Bajpai, Anita Yadav
Abstract: Rapid deployment of Internet-of-Things (IoT) devices has led to a surge in real-time data generation, intensifying challenges related to energy consumption, bandwidth limitations, and network congestion. Traditional transmission methods suffer from decreased packet delivery ratios and long end-to-end delays. This work proposes a novel data aggregation model that addresses spatial and temporal redundancy while optimizing network parameters. Initially, a clustering approach employs K-Means to analyze spatial data patterns, refined using an Exponential Weighted Moving Average (EWMA). Cluster head (CH) selection is energy-aware to extend network longevity. A synergistic metaheuristic method, integrating Grey Wolf Optimization (GWO) with Greedy Perimeter Stateless Routing (GPSR), determines the optimal routing path from the CH to the sink node. Designed for rural and agricultural IoT networks, the proposed method achieves an average energy efficiency improvement of 11.1% over DA-MOMLOA, 4.7% over MOCRAW, and 20.9% over MOEA. After 2000 simulation iterations, the proposed model retains 40% of nodes alive, indicating significantly enhanced network longevity. It also improves the packet delivery ratio (PDR) by 2.1% over DA-MOMLOA, 1.3% over MOCRAW, 4.6% over MOEA, and 4.3% over LEACH, achieving a 97.32% PDR at high node density. Simulations in NS-3 confirm the model's superior efficiency, reliability, and scalability in real-time IoT deployments. (Internet of Things, vol. 33, Article 101685)
Cited by: 0
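Two of the building blocks named in the abstract are easy to illustrate in isolation: EWMA smoothing of temporal readings and energy-aware cluster-head selection. In the sketch below the smoothing factor and the scoring rule that trades residual energy against distance to the cluster centroid are assumptions made for illustration; the paper's actual CH criterion and the GWO-GPSR routing stage are not reproduced.

```python
import numpy as np

ALPHA = 0.3  # EWMA smoothing factor (assumed)

def ewma(series, alpha=ALPHA):
    """Exponentially weighted moving average used to smooth temporal readings."""
    smoothed = np.empty_like(series, dtype=float)
    smoothed[0] = series[0]
    for t in range(1, len(series)):
        smoothed[t] = alpha * series[t] + (1 - alpha) * smoothed[t - 1]
    return smoothed

def select_cluster_head(positions, residual_energy, w_energy=0.7):
    """Energy-aware CH choice: favour nodes with high residual energy that sit
    close to the cluster centroid (the weighting is an illustrative assumption)."""
    centroid = positions.mean(axis=0)
    dist = np.linalg.norm(positions - centroid, axis=1)
    score = (w_energy * residual_energy / residual_energy.max()
             - (1 - w_energy) * dist / dist.max())
    return int(np.argmax(score))

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    pos = rng.uniform(0, 100, (10, 2))        # 10 nodes in a 100 m x 100 m field
    energy = rng.uniform(0.2, 1.0, 10)        # residual energy (toy values, Joules)
    print("smoothed:", ewma(rng.normal(25, 2, 5)).round(2))
    print("cluster head:", select_cluster_head(pos, energy))
```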
Signer-optimal multiple-time post-quantum hash-based signature for heterogeneous IoT Systems
IF 6.0 · CAS Zone 3 · Computer Science
Internet of Things Pub Date: 2025-07-10 DOI: 10.1016/j.iot.2025.101694
Kiarash Sedghighadikolaei, Attila A. Yavuz, Saif E. Nouma
Abstract: Heterogeneous Internet of Things (IoTs) harboring resource-limited devices like wearable sensors are essential for next-generation networks. Ensuring the authentication and integrity of security-sensitive telemetry in these applications is vital. Digital signatures provide scalable authentication with non-repudiation and public verifiability, making them essential tools for IoTs. However, current NIST-PQC standards are significantly resource-intensive for practical use on constrained IoT devices. This highlights a critical need for lightweight PQ-secure digital signatures that align with the limitations of low-end IoTs.

We propose a new multiple-time hash-based signature called Maximum Utilization Multiple HORS (MUM-HORS) that offers PQ security, short signatures, fast signing, and high key utilization for an extended lifespan. MUM-HORS addresses the inefficiency and key loss issues of HORS in offline/online settings by introducing compact key management data structures and optimized resistance to weak-message attacks. We tested MUM-HORS on two embedded platforms (ARM Cortex A-72 and 8-bit AVR ATmega2560) and commodity hardware. Results show 40× lower resource usage at the same signing capacity (2^20 messages, 128-bit security) than multiple-time HORS. Furthermore, MUM-HORS achieves 2× and up to 4000× faster signing than conventional secure schemes on the ARM Cortex and state-of-the-art PQ-secure schemes for IoTs, respectively. (Internet of Things, vol. 33, Article 101694)
Cited by: 0
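MUM-HORS builds on HORS, where signing reveals a message-dependent subset of one-time secret keys whose hashes form the public key. The sketch below is the textbook one-time HORS flow with illustrative parameters (t = 1024, k = 16, SHA-256); it does not include the paper's multiple-time key-management structures or weak-message protections.

```python
import hashlib
import os

T, K = 1024, 16                  # HORS parameters (illustrative)
LOGT = T.bit_length() - 1        # bits per index: log2(T) = 10

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def keygen():
    sk = [os.urandom(32) for _ in range(T)]       # t one-time secrets
    pk = [H(s) for s in sk]                       # public key = their hashes
    return sk, pk

def msg_indices(message: bytes):
    """Split H(message) into k chunks of log2(t) bits, each indexing the key set."""
    digest = int.from_bytes(H(message), "big")
    return [(digest >> (i * LOGT)) & (T - 1) for i in range(K)]

def sign(sk, message: bytes):
    return [sk[i] for i in msg_indices(message)]  # reveal the selected secrets

def verify(pk, message: bytes, signature):
    return all(H(s) == pk[i] for s, i in zip(signature, msg_indices(message)))

if __name__ == "__main__":
    sk, pk = keygen()
    msg = b"telemetry: temp=21.4C node=7"
    sig = sign(sk, msg)
    print(verify(pk, msg, sig), verify(pk, b"tampered", sig))  # True False
```

Revealing k of the t secrets per message is what makes HORS fast to sign, but it also limits how many messages one key pair can safely cover; that key-utilization problem is exactly what the paper's construction targets.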
A systematic literature review on applications of explainable artificial intelligence in the financial sector
IF 6.0 · CAS Zone 3 · Computer Science
Internet of Things Pub Date: 2025-07-10 DOI: 10.1016/j.iot.2025.101696
Abdullah Emir Cil, Kazim Yildiz
Abstract: The development of artificial intelligence undoubtedly affects every field containing data. Today, the integration of artificial intelligence into systems has gained momentum in a remarkable way. The use of artificial intelligence in finance has an important share among the areas where artificial intelligence is integrated. However, how safe it is to use artificial intelligence in a critical area such as finance is an issue that needs careful consideration. Many of the artificial intelligence algorithms in use work as a closed box, with no clear view of how they reach their outputs. At this point, the importance of the concept of explainable artificial intelligence emerges. In this study, a literature review has been conducted to examine the studies on explainable artificial intelligence algorithms used in the financial sector. The review sought answers to the questions of “which explainable artificial intelligence algorithms are used for which financial services”, “whether explainable artificial intelligence algorithms are really suitable for financial services”, “what effect the use of explainable artificial intelligence has on the performance of the financial services offered”, and “what kind of data sets are preferred for the applications of explainable artificial intelligence algorithms in financial services”. (Internet of Things, vol. 33, Article 101696)
Cited by: 0
Optimization strategies for neural network deployment on FPGA: An energy-efficient real-time face detection use case
IF 6.0 · CAS Zone 3 · Computer Science
Internet of Things Pub Date: 2025-07-10 DOI: 10.1016/j.iot.2025.101676
Mhd Rashed Al Koutayni, Gerd Reis, Didier Stricker
Abstract: Field programmable gate arrays (FPGAs) are considered promising platforms for accelerating deep neural networks (DNNs) due to their parallel processing capabilities and energy efficiency. However, deploying DNNs on FPGA platforms for computer vision tasks presents unique challenges, such as limited computational resources, constrained power budgets, and the need for real-time performance. This work presents a set of optimization methodologies to enhance the efficiency of real-time DNN inference on FPGA system-on-a-chip (SoC) platforms. These optimizations include architectural modifications, fixed-point quantization, computation reordering, and parallelization. Additionally, hardware/software partitioning is employed to optimize task allocation between the processing system (PS) and programmable logic (PL), along with system integration and interface configuration. To validate these strategies, we apply them to a baseline face detection DNN (FaceBoxes) as a use case. The proposed techniques not only improve the efficiency of FaceBoxes on FPGA but also provide a roadmap for optimizing other DNN-based applications for resource-constrained platforms. Experimental results on the AMD Xilinx ZCU102 board with VGA-resolution (480×640×3) input demonstrate a significant increase in efficiency, achieving real-time performance while substantially reducing dynamic energy consumption. (Internet of Things, vol. 33, Article 101676)
Cited by: 0
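One of the listed optimizations, fixed-point quantization, maps floating-point weights and activations onto low-bit integers that FPGA multiply-accumulate units can process cheaply. The sketch below shows a generic symmetric fixed-point round trip; the 8-bit width with 7 fractional bits is an assumed example, not the precision chosen in the paper.

```python
import numpy as np

def to_fixed_point(x, frac_bits=7, total_bits=8):
    """Quantize to signed fixed-point with `frac_bits` fractional bits."""
    scale = 1 << frac_bits
    lo, hi = -(1 << (total_bits - 1)), (1 << (total_bits - 1)) - 1
    return np.clip(np.round(x * scale), lo, hi).astype(np.int32)

def from_fixed_point(q, frac_bits=7):
    """Dequantize back to floating point for error inspection."""
    return q.astype(np.float32) / (1 << frac_bits)

if __name__ == "__main__":
    weights = np.random.default_rng(3).uniform(-0.9, 0.9, 8).astype(np.float32)
    q = to_fixed_point(weights)                   # integers an FPGA MAC unit can use
    err = np.abs(weights - from_fixed_point(q)).max()
    print(q, f"max quantization error = {err:.4f}")
```

Smaller bit widths shrink multipliers and on-chip memory on the PL side at the cost of quantization error, which is the trade-off such deployment optimizations navigate.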
Data-driven compressive sensing approach for ECG signals in IoT healthcare applications
IF 6.0 · CAS Zone 3 · Computer Science
Internet of Things Pub Date: 2025-07-09 DOI: 10.1016/j.iot.2025.101690
Bharat Lal, Pasquale Corsonello, Raffaele Gravina
Abstract: The rapid adoption of Internet of Things (IoT) technologies in healthcare has transformed patient monitoring, particularly in continuous ECG monitoring for early detection of cardiac abnormalities. However, traditional ECG monitoring methods face challenges such as high data volume, power consumption, and transmission inefficiencies, complicating real-time monitoring in resource-constrained environments. This study introduces a novel data-driven compressive sensing framework designed for ECG signal processing in IoT healthcare applications. The framework incorporates a Data-Driven Sensing Matrix (DSM) and Binary Thresholding Matrix (BTM) to optimize hardware efficiency while maintaining high reconstruction accuracy. DSM leverages machine learning to adapt to ECG signal properties, while BTM employs a novel thresholding technique for efficient hardware implementation. Additionally, overcomplete dictionaries, such as Gaussian and K-SVD, enhance sparsity and reconstruction accuracy. Performance validation using the MIT-BIH Arrhythmia Database demonstrates that the reconstructed signal preserves key features, with Percent Root Mean Square Difference values below 9% at compression ratios up to 85%. Comparative evaluations confirm the superiority of DSM and BTM over conventional sensing matrices like Random Gaussian, Bernoulli Binary, and Signed Matrices in compression efficiency and reconstruction accuracy. These findings highlight the potential of data-adaptive compressive sensing for energy-efficient, secure, and real-time ECG monitoring in IoT-driven healthcare. The proposed BTM, with its low computational requirements and efficient hardware integration, addresses key challenges in wearable and portable ECG devices, ensuring scalable and reliable performance in real-world applications. (Internet of Things, vol. 33, Article 101690)
Cited by: 0
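Compressive sensing acquires y = Φx with far fewer measurements than signal samples and recovers the signal from its sparse structure. The sketch below pairs a random ±1 (Bernoulli-style) sensing matrix with a basic orthogonal matching pursuit on a synthetic sparse signal and reports the percent root-mean-square difference; it illustrates the generic pipeline only, not the paper's data-driven DSM/BTM matrices or its dictionary-based reconstruction.

```python
import numpy as np

def omp(Phi, y, sparsity):
    """Basic orthogonal matching pursuit: greedily pick the column most
    correlated with the residual, then re-fit the coefficients by least squares."""
    residual, support = y.copy(), []
    x_hat = np.zeros(Phi.shape[1])
    for _ in range(sparsity):
        support.append(int(np.argmax(np.abs(Phi.T @ residual))))
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
    x_hat[support] = coef
    return x_hat

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    n, m, k = 256, 96, 8                      # signal length, measurements, sparsity
    x = np.zeros(n)
    x[rng.choice(n, k, replace=False)] = rng.normal(0, 1, k)   # k-sparse toy signal
    Phi = rng.choice([-1.0, 1.0], size=(m, n)) / np.sqrt(m)    # binary-style sensing matrix
    y = Phi @ x                                                # compressed measurements
    x_rec = omp(Phi, y, k)
    prd = 100 * np.linalg.norm(x - x_rec) / np.linalg.norm(x)  # percent RMS difference
    print(f"compression ratio = {100 * (1 - m / n):.0f}%, PRD = {prd:.2f}%")
```

The paper reports PRD below 9% at compression ratios up to 85% on MIT-BIH records; the synthetic example here only demonstrates how the measurement step and the PRD metric fit together.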