{"title":"Design of Autoencoder Algorithm for Compression of Lightweight EEG Signals Based on 2-D Rhythm Feature Maps","authors":"Peijun Ma;Cong Yao;Jiangyi Shi;Gongzhi Zhao;Mingyu Ma","doi":"10.1109/LSENS.2025.3541231","DOIUrl":"https://doi.org/10.1109/LSENS.2025.3541231","url":null,"abstract":"In recent years, with the development of brain science, the value of electroencephalography (EEG) data has become prominent. However, because EEG involves real-time transmission of large amounts of data, there is an urgent need for efficient and lightweight EEG compression algorithms. Existing EEG compression methods have many shortcomings, such as a limited compression ratio (CR), poor reconstructed signal quality, and excessive model size, and therefore cannot meet the developmental needs of portable wearable EEG detection devices. In this letter, a method of EEG signal compression based on 2-D rhythm feature maps is proposed. The rhythm characteristics of the signal are extracted through the discrete wavelet transform (DWT), and the signal is compressed and reconstructed using the encoding and reconstruction channels of an autoencoder network. At the output of the encoding channel, entropy coding is applied to further reduce the data volume. After comparing several coding algorithms, JPEG2000 is selected as the locally optimal one. In addition, based on grouped convolutions and dilated convolution kernels, a lightweight structure is designed that simplifies the proposed network and greatly reduces the number of model parameters. Experiments show that, compared with other similar algorithms, the percentage-root-mean-square distortion and mean squared error (MSE) of the proposed algorithm are 14.76% and 2.95%, respectively, at a relatively high CR (about 16). Only 87.9 k parameters are used, making the model more suitable for embedded scenarios and wearable devices.","PeriodicalId":13014,"journal":{"name":"IEEE Sensors Letters","volume":"9 3","pages":"1-4"},"PeriodicalIF":2.2,"publicationDate":"2025-02-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143564223","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
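As a rough illustration of the rhythm-map idea in the abstract above, the following sketch builds a 2-D feature map from a single EEG channel via the DWT. The pywt library, the "db4" wavelet, five decomposition levels, and the nearest-index upsampling are all assumptions, not details from the letter.

```python
# Minimal sketch of building a 2-D rhythm feature map from an EEG channel via DWT.
# Assumptions not taken from the letter: pywt, 'db4' wavelet, 5 decomposition
# levels, and nearest-index upsampling of each sub-band to a common width.
import numpy as np
import pywt

def rhythm_feature_map(eeg: np.ndarray, wavelet: str = "db4", levels: int = 5) -> np.ndarray:
    """Stack DWT sub-bands (roughly delta..gamma rhythms) into a 2-D map."""
    coeffs = pywt.wavedec(eeg, wavelet, level=levels)  # [cA5, cD5, ..., cD1]
    width = len(eeg)
    rows = []
    for band in coeffs:
        # Repeat coefficients so every band spans the same width.
        idx = np.linspace(0, len(band) - 1, width).round().astype(int)
        rows.append(band[idx])
    return np.stack(rows)  # shape: (levels + 1, len(eeg))

fmap = rhythm_feature_map(np.random.randn(1024))
print(fmap.shape)  # (6, 1024) -- this 2-D map would feed the encoding channel
```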
{"title":"Fiber Bragg Grating-Based Sensor System for Strain and Angle Assessment in Passive Orthosis","authors":"João Coimbra;Pedro Lorenzutti;Arnaldo Leal-Junior","doi":"10.1109/LSENS.2025.3541344","DOIUrl":"https://doi.org/10.1109/LSENS.2025.3541344","url":null,"abstract":"In this letter, we present the development and application of a fiber Bragg grating (FBG) sensor system for the instrumentation of a lower limb passive orthosis for gait assistance. The sensors include FBG strain sensors attached to the orthosis structure for monitoring the strains in the structure during gait. In addition, another FBG, positioned on the user's lumbar region, is used as an angle sensor for monitoring the trunk angle in the frontal plane. The sensors were characterized as a function of strain and angle, resulting in root-mean-squared errors of 6.15 με and 0.30° for strain and angle, respectively. Then, in the application tests, the strain sensor demonstrated its feasibility by means of strain estimation within the range of 10–200 με, as well as the periodic strain pattern following the ground reaction force variation during the stance phase of the gait. Furthermore, the angle measurement during the wearable gait tests indicated a measurement range of −15° to 30° at an estimated linear velocity of 4.0 km/h, the reference velocity used on the treadmill. Therefore, the proposed sensor system is an integrated solution for in situ gait analysis, which can extend the functionality not only of the passive orthosis but also of active rehabilitation robots.","PeriodicalId":13014,"journal":{"name":"IEEE Sensors Letters","volume":"9 3","pages":"1-4"},"PeriodicalIF":2.2,"publicationDate":"2025-02-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143535526","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
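For context on how FBG readings map to the microstrain figures reported above, here is the generic Bragg-grating strain relation Δλ_B/λ_B = (1 − p_e)·ε. The photoelastic coefficient p_e ≈ 0.22 for silica fiber is a textbook value, not the authors' calibration.

```python
# Generic FBG strain relation, not the calibration from the letter:
# delta_lambda / lambda_B = (1 - p_e) * strain, with p_e ~= 0.22 for silica fiber.
def fbg_strain(lambda_b_nm: float, delta_lambda_pm: float, p_e: float = 0.22) -> float:
    """Return strain in microstrain from a Bragg-wavelength shift."""
    strain = (delta_lambda_pm * 1e-3 / lambda_b_nm) / (1.0 - p_e)  # dimensionless
    return strain * 1e6  # microstrain

# Example: a 1550-nm grating shifted by 1.2 pm -> ~0.99 microstrain.
print(round(fbg_strain(1550.0, 1.2), 2))
```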
{"title":"Hybrid Sensor System for Integrated Structural Motion Measurement","authors":"Jörg F. Wagner;Sebastian Wille;Carl C. Rheinländer;Michael Kohl;Benedikt Györfi","doi":"10.1109/LSENS.2025.3540358","DOIUrl":"https://doi.org/10.1109/LSENS.2025.3540358","url":null,"abstract":"Over the last three decades, gyroscopes have become increasingly miniaturized and have enabled the design of very small, light, and cost-effective inertial measurement units (IMUs). This allows new applications of inertial sensors. One such new application is the motion measurement, or motion capture, of structures that, in contrast to classic inertial navigation, are no longer single rigid bodies but can change shape. However, this approach is only realizable if such a “flexible” structure is equipped with several spatially distributed and closely networked IMUs. At the same time, this measurement approach, similar to classic inertial navigation, requires additional, complementary aiding sensors, which must also be distributed across the structure. Such systems or networks, consisting of inertial and structural sensors in parallel, are novel and not yet state of the art. This letter therefore presents a modular, experimental sensor system of this type designed as a demonstrator for basic research purposes, describes the specific challenges in designing the system, and reports on initial practical experiences in using it on a test rig for the motion measurement of a flexible beam.","PeriodicalId":13014,"journal":{"name":"IEEE Sensors Letters","volume":"9 3","pages":"1-4"},"PeriodicalIF":2.2,"publicationDate":"2025-02-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143512936","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
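The aided-inertial idea described above can be illustrated, in heavily simplified form, by a first-order complementary filter that blends an integrated gyro rate with an aiding angle sensor. This is a generic sketch only; the letter does not specify its fusion algorithm, and the function names and blend factor below are invented.

```python
# Illustration only: a first-order complementary filter fusing a gyroscope rate
# with an aiding angle sensor -- the basic idea behind aided inertial
# measurement. The letter's actual distributed-IMU fusion is not specified here.
def complementary_filter(gyro_rates, aid_angles, dt: float, alpha: float = 0.98):
    """Blend integrated gyro rate (high-pass) with aiding angles (low-pass)."""
    angle = aid_angles[0]
    fused = []
    for rate, aid in zip(gyro_rates, aid_angles):
        angle = alpha * (angle + rate * dt) + (1.0 - alpha) * aid
        fused.append(angle)
    return fused

# Toy usage: a constant 0.1 rad/s rate against a stationary aiding sensor.
print(complementary_filter([0.1] * 5, [0.0] * 5, dt=0.01)[-1])
```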
{"title":"A Novel Approach for Exploring Resolution Tunable Displacement Measurement System by Using Extension Spring and Ultrasonic Sensors","authors":"Ram Kishore Roy;Nityananda Hazarika;Tulshi Bezboruah","doi":"10.1109/LSENS.2025.3540499","DOIUrl":"https://doi.org/10.1109/LSENS.2025.3540499","url":null,"abstract":"In this letter, we propose an ultrasonic sensor-based resolution-tunable displacement measurement system using an extension spring. In the proposed method, a linear extension spring is vertically suspended between two clamps. The top clamp is fixed, while the bottom clamp is movable. An ultrasonic sensor is mounted on top of the arrangement. A circular reflective metallic plate is fixed at a certain number of active coils of the spring to reflect the ultrasonic waves. It is experimentally observed that changing the position of the reflective plate on the spring changes the resolution of the measured displacement. The novelty of the proposed method lies in its ability to tune the resolution; it is also free of hysteresis, simple in design, and has a response time of less than 1 s.","PeriodicalId":13014,"journal":{"name":"IEEE Sensors Letters","volume":"9 3","pages":"1-4"},"PeriodicalIF":2.2,"publicationDate":"2025-02-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143535474","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
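A minimal sketch of the measurement chain, assuming a uniformly wound spring (an assumption not stated in the letter): a plate fixed at coil k of N active coils moves k/N of the free-end displacement, so the sensor's native resolution is effectively scaled by N/k.

```python
# Sketch of displacement recovery under a uniform-spring assumption (not stated
# in the letter): a plate at coil k of N active coils moves k/N of the free-end
# displacement, so the ultrasonic sensor's resolution is scaled by N/k.
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 degC

def displacement_mm(echo_s_before: float, echo_s_after: float,
                    k: int, n_coils: int) -> float:
    """Free-end displacement from two ultrasonic round-trip times."""
    plate_shift = SPEED_OF_SOUND * (echo_s_after - echo_s_before) / 2.0  # m
    return plate_shift * (n_coils / k) * 1000.0  # mm

# Plate at coil 5 of 20: each 1-mm plate shift maps to ~4 mm at the free end.
print(displacement_mm(2.000e-3, 2.0058e-3, k=5, n_coils=20))
```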
{"title":"Development of Self-Sensing Cantilever Structures Using Additive Manufacturing and Optical Fiber Sensing Technology","authors":"Robertson Pires-Junior;Leandro Macedo;Anselmo Frizera;Arnaldo Leal-Junior","doi":"10.1109/LSENS.2025.3539980","DOIUrl":"https://doi.org/10.1109/LSENS.2025.3539980","url":null,"abstract":"Fused filament fabrication (FFF) can produce parts that can be integrated with optical fiber sensors to obtain a multifunctional structure. In this study, fibers inscribed with Bragg gratings were incorporated into single cantilever-type accelerometers (fabricated using FFF) from nylon and 17–4 PH filaments to produce structures sensitive to mechanical vibration. The materials were characterized using dynamic mechanical analysis with an optical fiber incorporated into the test specimens, where the nylon specimens presented a transition in elastic modulus up to 60 °C, while for the 17–4 PH the transition occurred up to 180 °C. Linear regressions were used to estimate the sensitivity of the accelerometers, with calculated coefficients of determination (R²) greater than 0.97. The 17–4 PH cantilever had a sensitivity to variation in vibration amplitude (which ranged from 0.5 to 2.0 V) of 1.787 pm/V when vibrated at a frequency of 10 Hz and 3.605 pm/V at 100 Hz, while the nylon accelerometer had sensitivities of 83.114 pm/V and 104.385 pm/V at 10 and 100 Hz, respectively. The manufactured accelerometers have the potential to be used as low-frequency vibration sensors and self-sensing cantilevers in diverse environments.","PeriodicalId":13014,"journal":{"name":"IEEE Sensors Letters","volume":"9 3","pages":"1-4"},"PeriodicalIF":2.2,"publicationDate":"2025-02-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143455319","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
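The sensitivity figures quoted above come from linear regressions; the sketch below shows how such a slope and R² can be computed. The data points are made up for illustration; only the method mirrors the letter.

```python
# How a sensitivity such as 104.385 pm/V can be obtained: ordinary least-squares
# fit of wavelength shift versus excitation amplitude. Data values below are
# hypothetical; only the regression procedure follows the letter.
import numpy as np

amplitude_v = np.array([0.5, 1.0, 1.5, 2.0])        # excitation amplitude (V)
shift_pm = np.array([52.5, 104.1, 156.9, 208.6])    # hypothetical shifts (pm)

slope, intercept = np.polyfit(amplitude_v, shift_pm, 1)
pred = slope * amplitude_v + intercept
r2 = 1 - np.sum((shift_pm - pred) ** 2) / np.sum((shift_pm - shift_pm.mean()) ** 2)
print(f"sensitivity = {slope:.3f} pm/V, R^2 = {r2:.4f}")
```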
{"title":"Dielectric-Based Calibration Algorithm for Rapid and Nondestructive Determination of Multiple Quality Attributes of In-Shell Nuts","authors":"Samir Trabelsi;Micah A. Lewis","doi":"10.1109/LSENS.2025.3539583","DOIUrl":"https://doi.org/10.1109/LSENS.2025.3539583","url":null,"abstract":"Quality assessment of many agricultural commodities requires measuring several quality attributes to determine their value, optimum processing conditions, and conditions for safe storage. A dielectric-based calibration algorithm is proposed for the simultaneous and nondestructive determination of three quality attributes of in-shell nuts, namely, bulk density, moisture content (pods, kernels, and shells), and “meat” content, from the measurement of their dielectric properties at a single microwave frequency. The calibration algorithm relies on the existence of direct correlations between the quality attributes and the dielectric properties of in-shell nuts measured at a single microwave frequency. The performance of this algorithm is demonstrated through measurements on in-shell peanuts with a low-cost microwave sensor operating at 5.8 GHz. Calibration equations correlating each quality attribute with the in-shell peanuts’ dielectric properties are given along with the standard error of calibration.","PeriodicalId":13014,"journal":{"name":"IEEE Sensors Letters","volume":"9 4","pages":"1-4"},"PeriodicalIF":2.2,"publicationDate":"2025-02-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143654901","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
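A hedged sketch of what a single-frequency dielectric calibration with a standard error of calibration (SEC) might look like; the linear model form and all data values below are hypothetical, not the letter's equations.

```python
# Generic single-frequency dielectric calibration sketch: fit a linear equation
# relating one quality attribute to the measured dielectric constant and report
# the standard error of calibration (SEC). All data values are hypothetical.
import numpy as np

eps_prime = np.array([2.1, 2.4, 2.8, 3.1, 3.5])        # dielectric constant at 5.8 GHz
moisture_pct = np.array([6.2, 7.9, 10.1, 11.8, 14.0])  # reference moisture content (%)

a, b = np.polyfit(eps_prime, moisture_pct, 1)          # moisture ~= a*eps' + b
residuals = moisture_pct - (a * eps_prime + b)
sec = np.sqrt(np.sum(residuals ** 2) / (len(residuals) - 2))  # n - 2 dof for a line
print(f"moisture = {a:.2f}*eps' + {b:.2f}, SEC = {sec:.3f} %")
```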
{"title":"No-Reference Laparoscopic Video Quality Assessment for Sensor Distortions Using Optimized Long Short-Term Memory Framework","authors":"Sria Biswas;Rohini Palanisamy","doi":"10.1109/LSENS.2025.3539186","DOIUrl":"https://doi.org/10.1109/LSENS.2025.3539186","url":null,"abstract":"Laparoscopic surgery relies on sensor-based video systems vulnerable to visual distortions, requiring rigorous quality checks to meet regulatory standards. This letter introduces a no-reference laparoscopic video quality assessment algorithm designed to replicate human perceptual judgments in the presence of sensor distortions. The method models the statistical interdependencies between luminance and motion features and combines them with texture variations to formulate a perceptually relevant feature vector. This vector is used as input to train a memory-retentive deep learning model, optimized by chaotic maps, to predict frame quality scores, which are utilized to evaluate the overall video quality. Performance comparisons with state-of-the-art methods show that the proposed model aligns closely with both expert and nonexpert subjective ratings, with experts achieving higher accuracy. Ablation studies further emphasize the effectiveness of the selected feature combinations and regression frameworks, demonstrating the capability of the model to replicate human opinions. These findings highlight the potential of the proposed method as a reliable tool for automating quality assessment in sensor-based laparoscopic systems to ensure high standards in clinical applications.","PeriodicalId":13014,"journal":{"name":"IEEE Sensors Letters","volume":"9 4","pages":"1-4"},"PeriodicalIF":2.2,"publicationDate":"2025-02-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143621679","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
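A skeleton of the kind of memory-retentive (LSTM) frame-quality regressor the abstract describes, assuming PyTorch; the feature dimension, hidden size, and mean-pooling of frame scores into a video score are guesses, and the chaotic-map optimization step is omitted.

```python
# Skeleton of an LSTM frame-quality regressor in the spirit of the letter.
# Assumptions: PyTorch, feature dim 36, hidden size 64, mean-pooled video score.
import torch
import torch.nn as nn

class FrameQualityLSTM(nn.Module):
    def __init__(self, feat_dim: int = 36, hidden: int = 64):
        super().__init__()
        self.lstm = nn.LSTM(feat_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                            # x: (batch, frames, feat_dim)
        h, _ = self.lstm(x)
        frame_scores = self.head(h).squeeze(-1)      # one quality score per frame
        return frame_scores.mean(dim=1)              # pooled video-level score

model = FrameQualityLSTM()
video_score = model(torch.randn(2, 120, 36))         # two 120-frame clips
print(video_score.shape)                             # torch.Size([2])
```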
{"title":"Harnessing Haptic Technology for Real-Time Emotion Detection","authors":"Lital Levy;Yuval Blum;Asmare Ambaw;Roi Yozevitch;Eldad Holdengreber","doi":"10.1109/LSENS.2025.3538804","DOIUrl":"https://doi.org/10.1109/LSENS.2025.3538804","url":null,"abstract":"This letter introduces a novel multimodal environmental translator for real-time emotion recognition. The system integrates facial expression recognition (FER) and speech emotion recognition (SER) to analyze visual and vocal cues while conveying emotional feedback through vibrotactile signals. Emotions are mapped to distinct vibration frequencies, ranging from 0.4 Hz for neutral to 35 Hz for anger, enabling users to intuitively identify seven core emotions through tactile sensation. A user study involving ten participants demonstrated an average adaptation time of less than 7 min, indicating the system's effectiveness in quickly familiarizing users with the vibration signals. Overall, this solution provides a robust approach to enhancing real-time emotion recognition through haptic feedback, making it suitable for everyday social interactions.","PeriodicalId":13014,"journal":{"name":"IEEE Sensors Letters","volume":"9 3","pages":"1-4"},"PeriodicalIF":2.2,"publicationDate":"2025-02-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143489280","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
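A plausible lookup table for the emotion-to-frequency mapping; only the 0.4 Hz (neutral) and 35 Hz (anger) endpoints come from the abstract, while the other five frequencies, and the specific seven-emotion set, are placeholders.

```python
# Emotion-to-frequency lookup in the spirit of the letter. Only the neutral
# (0.4 Hz) and anger (35 Hz) values come from the abstract; the other five
# frequencies, and the seven-emotion set itself, are placeholders.
EMOTION_HZ = {
    "neutral": 0.4,     # from the abstract
    "happiness": 3.0,   # placeholder
    "surprise": 6.0,    # placeholder
    "sadness": 10.0,    # placeholder
    "disgust": 15.0,    # placeholder
    "fear": 25.0,       # placeholder
    "anger": 35.0,      # from the abstract
}

def vibration_frequency(emotion: str) -> float:
    """Return the vibrotactile drive frequency (Hz) for a detected emotion."""
    return EMOTION_HZ[emotion.lower()]

print(vibration_frequency("anger"))  # 35.0
```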
{"title":"uThaw: Ultra-Wideband Wireless Solid–Liquid State Transition Sensor to Detect Thawing of Food","authors":"Rahul Bulusu;Ashutosh Dhekne","doi":"10.1109/LSENS.2025.3537922","DOIUrl":"https://doi.org/10.1109/LSENS.2025.3537922","url":null,"abstract":"This letter explores the potential use of ultra-wideband (UWB) wireless sensors in detecting the transition of food items from a frozen state to a thawed state (or vice versa), marking a significant advancement in monitoring the thawing process. By exploiting the drastic change in complex permittivity at microwave frequencies (we use frequencies close to 4 GHz) during the solid–liquid state transition, this letter introduces a novel approach to ensuring frozen food safety and enhancing the efficiency of the frozen food industry and cold chain transportation. The developed system, capable of operating through food packaging, offers a noninvasive, cost-effective solution for real-time monitoring, addressing the limitations of the conventional temperature-based and timing-based methods widely used today in household and professional cooking and in the food industry. Our findings from the raw UWB channel impulse responses and computed similarity scores show significant promise and validate the feasibility of the proposed system across various real-world applications.","PeriodicalId":13014,"journal":{"name":"IEEE Sensors Letters","volume":"9 3","pages":"1-4"},"PeriodicalIF":2.2,"publicationDate":"2025-01-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143438409","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
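The abstract's "computed similarity scores" could, for example, take the form of a cosine similarity between channel impulse response (CIR) magnitudes; the exact metric is not specified in the letter, so the following is purely illustrative.

```python
# One plausible form of the similarity score computed on UWB channel impulse
# responses (CIRs): cosine similarity of the CIR magnitude against a
# frozen-state reference. The letter's exact metric is not specified.
import numpy as np

def cir_similarity(cir_ref: np.ndarray, cir_now: np.ndarray) -> float:
    """Cosine similarity between two complex CIR snapshots' magnitudes."""
    a, b = np.abs(cir_ref), np.abs(cir_now)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# A falling score across successive measurements would indicate the permittivity
# change that accompanies the solid-to-liquid transition.
```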
{"title":"Ball Trajectory and Spin Analysis From Asynchronous Videos","authors":"Aakanksha;Ashish Kumar;Rajagopalan A. N.","doi":"10.1109/LSENS.2025.3537116","DOIUrl":"https://doi.org/10.1109/LSENS.2025.3537116","url":null,"abstract":"Existing systems for ball trajectory and spin estimation use embedded sensors or expensive high-frame-rate cameras, which severely limits their accessibility. We propose an easy-to-set-up, low-cost vision sensor pipeline using two static, asynchronous, consumer-grade cameras, and we propose the use of epipolar geometry for synchronizing them. We estimate the 3-D ball trajectory and spin with only one distinguishable feature on the ball. A mixture of Gaussians and adaptive color-based thresholding are used to localize the ball in 2-D, followed by triangulation. To estimate the spin magnitude and axis, we employ feature detection and plane fitting. Extensive experiments with three different balls across multiple varied environments are reported, and the approach is validated by recovering the standard gravitational acceleration value from the estimated ball trajectory. To validate the spin, we compare our results with the true spin of a rotating ball fixed on a motor shaft. The average reprojection error was below 10 pixels for all our experiments, and a maximum deviation of 17 rotations per minute in spin magnitude was observed.","PeriodicalId":13014,"journal":{"name":"IEEE Sensors Letters","volume":"9 3","pages":"1-4"},"PeriodicalIF":2.2,"publicationDate":"2025-01-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143521352","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
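The triangulation step can be sketched with OpenCV's cv2.triangulatePoints; the projection matrices and pixel coordinates below are toy placeholders, not the paper's calibration.

```python
# Sketch of the triangulation step using OpenCV; the projection matrices and
# pixel coordinates are toy placeholders, not the paper's calibration.
# cv2.triangulatePoints returns homogeneous coordinates to be dehomogenized.
import cv2
import numpy as np

P1 = np.hstack([np.eye(3), np.zeros((3, 1))])              # camera 1 at origin
P2 = np.hstack([np.eye(3), np.array([[-0.5], [0], [0]])])  # camera 2, 0.5-m baseline

pts1 = np.array([[320.0], [240.0]])   # ball center in view 1 (pixels, toy values)
pts2 = np.array([[300.0], [240.0]])   # ball center in view 2

X_h = cv2.triangulatePoints(P1, P2, pts1, pts2)  # 4x1 homogeneous point
X = (X_h[:3] / X_h[3]).ravel()                   # 3-D ball position
print(X)
```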