{"title":"Aluminum Doped Zinc Oxide and Silver Nanowire Composite Based Printed CO$_{2}$ Gas Sensor","authors":"Nikhila Patil;Neethu Thomas;Neha Sharma;Parasuraman Swaminathan;P. Sumathi","doi":"10.1109/LSENS.2025.3571193","DOIUrl":"https://doi.org/10.1109/LSENS.2025.3571193","url":null,"abstract":"Carbon dioxide (CO<inline-formula><tex-math>$_{2}$</tex-math></inline-formula>) is a significant greenhouse gas and an essential indicator of effective air circulation in enclosed spaces, requiring precise and continuous monitoring. Traditional chemiresistive CO<inline-formula><tex-math>$_{2}$</tex-math></inline-formula> sensors have high operating temperatures that require external heating elements limiting their applicability in low-power portable electronics. This work demonstrates a miniaturized printed CO<inline-formula><tex-math>$_{2}$</tex-math></inline-formula> gas sensor, based on aluminum-doped zinc oxide (AZO) and silver nanowire (Ag NW) nanocomposite ink, which operates efficiently at room temperature. The AZO-Ag NW nanocomposite ink is optimized for direct ink writing (DIW) to obtain a uniform printed pattern. The composite ink helps overcome the inherent high resistance of AZO nanostructures by taking advantage of Ag NW's high conductivity and surface reactivity. The sensor shows a quick response time of 19 s and a recovery time of 36 s for 400 ppm CO<inline-formula><tex-math>$_{2}$</tex-math></inline-formula>. The sensor exhibits a response (R) of 32.5% with a limit of detection of 24.04 ppm, while operating at a low bias of 1 V. The integration of DIW, cost-effective ink formulation, and scalable fabrication is a significant advancement for real-time CO<inline-formula><tex-math>$_{2}$</tex-math></inline-formula> monitoring at low power.","PeriodicalId":13014,"journal":{"name":"IEEE Sensors Letters","volume":"9 6","pages":"1-4"},"PeriodicalIF":2.2,"publicationDate":"2025-03-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144179174","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Non-intrusive, Non-obstructive, Versatile Venous Reservoir Blood Volume Sensor Based on Computer Vision for Clinical Cardiopulmonary Bypass","authors":"Ryan Kaddis;Chan-Jin Chung;Sean Murtha;Hao Jiang","doi":"10.1109/LSENS.2025.3551948","DOIUrl":"https://doi.org/10.1109/LSENS.2025.3551948","url":null,"abstract":"We present an automatic venous reservoir blood volume sensor system, based on computer vision, for clinical cardiopulmonary bypass. A camera connected to a computer reads the blood volume inside the venous reservoir in real time, using custom-developed algorithms and software. The system's performance in blood volume measurement has been characterized, achieving a mean absolute percentage error as small as 2.5%. Through comprehensive testing in a simulated clinical environment, the system has proven to be nonintrusive to blood, nonobstructive for operators, and versatile for different reservoir models.","PeriodicalId":13014,"journal":{"name":"IEEE Sensors Letters","volume":"9 5","pages":"1-4"},"PeriodicalIF":2.2,"publicationDate":"2025-03-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143821658","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Engineering a Point-of-Care Device for Multiplex Monitoring of Urinary Parameters","authors":"Samiha Ahmed;Majed O. Althumayri;Weiming Xu;Azra Y. Tarman;Ramu Banavath;Jayesh Korgaonkar;Emily Wussow;Justin McMurray;Souvik Paul;Daniel Wollin;Gerard L. Coté;Hatice Ceylan Koydemir","doi":"10.1109/LSENS.2025.3551929","DOIUrl":"https://doi.org/10.1109/LSENS.2025.3551929","url":null,"abstract":"Urinary tract infections (UTIs) are common and costly healthcare issues, particularly among catheterized patients. Current diagnostic methods, such as urine cultures and dipstick tests, suffer from delayed results and limited accuracy, highlighting the need for more reliable, real-time approaches. This letter presents a proof-of-concept point-of-care device for multiplex detection of key urine parameters—pH and nitrite levels—and urine volume in catheter bags used for intermittent or long-term urinary drainage. The system integrates commercial pH and nitrite probes with an ultrasonic liquid level sensor, all of which are managed by a control unit that processes and transmits data wirelessly via Bluetooth or near-field communication. The sensors were calibrated via standard solutions and tested for stability over four days, which revealed minimal drift. The system was further validated using human urine spiked with varying concentrations of nitrites and pH. The results demonstrated reliable sensor performance and accurate detection of urinary biomarkers in real time. While the system shows promise for the early detection of catheter-associated urinary tract infections, further validation is needed to confirm its validity over standard diagnostic methods. This device offers a feasible solution for real-time, noninvasive UTI monitoring at the point of care.","PeriodicalId":13014,"journal":{"name":"IEEE Sensors Letters","volume":"9 4","pages":"1-4"},"PeriodicalIF":2.2,"publicationDate":"2025-03-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143740328","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Rotational Odometry Using Ultra Low Resolution Thermal Cameras","authors":"Ali Safa","doi":"10.1109/LSENS.2025.3552135","DOIUrl":"https://doi.org/10.1109/LSENS.2025.3552135","url":null,"abstract":"This letter provides what is, to the best of the authors' knowledge, a first study on the applicability of ultra-low-resolution thermal cameras for providing rotational odometry measurements to navigational devices, such as rovers and drones. Our use of an ultra-low-resolution thermal camera instead of other modalities, such as an RGB camera is motivated by its robustness to lighting conditions, while being one order of magnitude less cost-expensive compared to higher-resolution thermal cameras. After setting up a custom data acquisition system and acquiring thermal camera data together with its associated rotational speed label, we train a small four-layer convolutional neural network (CNN) for regressing the rotational speed from the thermal data. Experiments and ablation studies are conducted for determining the impact of thermal camera resolution and the number of successive frames on the CNN estimation precision. Finally, our novel dataset for the study of low-resolution thermal odometry is openly released with the hope of benefiting future research.","PeriodicalId":13014,"journal":{"name":"IEEE Sensors Letters","volume":"9 4","pages":"1-4"},"PeriodicalIF":2.2,"publicationDate":"2025-03-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143777782","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"3D-Printed Conductance-Based Force Sensors Using Single Traxels","authors":"Anders Frem Wolstrup;Thomas Schlaikjer Holst;Jon Spangenberg;Tiberiu Gabriel Zsurzsan","doi":"10.1109/LSENS.2025.3571198","DOIUrl":"https://doi.org/10.1109/LSENS.2025.3571198","url":null,"abstract":"This letter investigates the use of 3-D printing for fabricating conductance-based force sensors with cell-based geometries. Three mathematically defined structures, i.e., sine wave, circle, and Reuleaux triangle, were implemented using single traxels (3D-printed conductive tracks) to maximize contact area and enabling consistent fabrication. The sensors were produced via fused deposition modeling and programmed using FullControl G-code, enabling direct translation of mathematical functions into print paths. The sine wave design achieved the highest sensitivity (0.035 N<inline-formula><tex-math>$^{-1}$</tex-math></inline-formula>) and 95% linearity, consistent with constriction resistance theory. All designs demonstrated reliable performance with minimal process-induced variation. These findings highlight the potential of traxel-based 3-D printing as a cost-effective and customizable approach for producing force sensors suited for applications in human–machine interfacing and soft robotics.","PeriodicalId":13014,"journal":{"name":"IEEE Sensors Letters","volume":"9 7","pages":"1-4"},"PeriodicalIF":2.2,"publicationDate":"2025-03-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144255665","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Former-HGR: Hand Gesture Recognition With Hybrid Feature-Aware Transformer","authors":"Monu Verma;Garvit Gopalani;Saiyam Bharara;Santosh Kumar Vipparthi;Subrahmanyam Murala;Mohamed Abdel-Mottaleb","doi":"10.1109/LSENS.2025.3566022","DOIUrl":"https://doi.org/10.1109/LSENS.2025.3566022","url":null,"abstract":"Hand gesture recognition (HGR) systems, using cameras and sensors, offer an intuitive method for human–machine interaction, sparking interest across various applications. However, these systems face challenges from environmental factors such as variations in illumination, complex backgrounds, diverse hand shapes, and similarities between different gesture classes. Achieving accurate gesture recognition under such conditions remains a complex task, necessitating robust solutions to ensure reliable performance. This letter proposes a novel approach named Former-HGR, a hybrid feature-aware transformer for HGR. Unlike traditional transformer-based HGR systems that heavily rely on computationally intensive self-attention mechanisms, Former-HGR enhances global feature perception by applying self-attention across channels through the integration of multidconv head transposed attention. In addition, Former-HGR improves feature extraction by incorporating multiscale features and effectively filters redundant information using a hybrid feature-aware network. Extensive experiments conducted on three datasets: NUSII, OUHANDS, and MUGD, demonstrate that Former-HGR outperforms recent benchmark HGR approaches, achieving accuracy improvements of up to 14% in person-independent validation schemes.","PeriodicalId":13014,"journal":{"name":"IEEE Sensors Letters","volume":"9 6","pages":"1-4"},"PeriodicalIF":2.2,"publicationDate":"2025-03-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144196723","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Lightweight Electrocardiogram Signal Quality-Aware VT/VF Detector for Resource-Constrained Life-Threatening Monitoring Devices","authors":"Nabasmita Phukan;M. Sabarimalai Manikandan;Ram Bilas Pachori;Niranjan Garg","doi":"10.1109/LSENS.2025.3570346","DOIUrl":"https://doi.org/10.1109/LSENS.2025.3570346","url":null,"abstract":"Ventricular tachycardia (VT) and ventricular fibrillation (VF) are life-threatening arrhythmias, which lead to sudden cardiac arrest (SCA). The timely detection of VT and VF is vital, as automated external defibrillators rely on accurate VT/VF identification to deliver life-saving defibrillation and restore normal sinus rhythm during SCA. Continuous monitoring of electrocardiogram (ECG) signals plays a pivotal role in the early detection of VT/VF, potentially reducing mortality associated with SCA. However, the reliability of continuous ECG monitoring is often compromised by various noise sources, necessitating assessment of signal quality to ensure accurate VT/VF detection. This letter presents a real-time signal quality assessment (SQA)-based VT/VF detection method using zero-crossing rate. The SQA-based VT/VF detection method is tested on single and multilead datasets. The method is tested on real-time ECG signals collected from subjects with cardiac arrhythmias. Compared to zero-crossing rate-based VT/VF detection without SQA, the proposed SQA-based method reduced the false detection rate by up to 7.38% on a single-lead dataset and 59.22% on lead 1 of a multilead dataset. The method, implemented on the Arduino Due, consumed energy of 5.79 mJ and processing time of 13 ms, validating its real-time feasibility on resource-constrained wearable health monitoring devices.","PeriodicalId":13014,"journal":{"name":"IEEE Sensors Letters","volume":"9 7","pages":"1-4"},"PeriodicalIF":2.2,"publicationDate":"2025-03-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144472545","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Human Activity Recognition Using WiFi Signal Features and Efficient Residual Packet Attention Network","authors":"Senquan Yang;Junjie Yang;Chao Yang;Wei Yan;Pu Li","doi":"10.1109/LSENS.2025.3551337","DOIUrl":"https://doi.org/10.1109/LSENS.2025.3551337","url":null,"abstract":"WiFi signal features, particularly channel state information (CSI), have gained considerable attention in human activity recognition (HAR) due to their nonintrusive and privacy–friendly nature. However, CSI packets are often nonstationary and exhibit fluctuations across various human activities. In this letter, we propose an end-to-end deep neural network (DNN) called efficient residual packet attention network (ERPANet) to tackle these challenges. In the proposed framework, we introduce the multilayer residual module composed of an attention residual (AR) operation and a downsampling attention residual (DAR) operation to effectively capture spatial-temporal features of CSI packets. In addition, a self-attention mechanism is embedded within AR and DAR to emphasize the importance of interrelationship among these multiscale CSI packet features. The proposed ERPANet aims to encode both channel information and long-range dependencies of CSI packet features. Extensive experiments show that ERPANet outperforms state-of-the-art methods, achieving average accuracies of 99.4% and 99.6% on the university of toronto human activity recognition (UT-HAR) and nanyang technological university human activity recognition (NTU-HAR) datasets, respectively.","PeriodicalId":13014,"journal":{"name":"IEEE Sensors Letters","volume":"9 4","pages":"1-4"},"PeriodicalIF":2.2,"publicationDate":"2025-03-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143777748","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}