K. Balamurugan, G. Sudhakar, Kavin Francis Xavier, N. Bharathiraja, Gaganpreet Kaur
Title: Human-machine interaction in mechanical systems through sensor enabled wearable augmented reality interfaces
Journal: Measurement Sensors, vol. 39, Article 101880 (Q4, Engineering)
DOI: 10.1016/j.measen.2025.101880
Published: 2025-05-15
URL: https://www.sciencedirect.com/science/article/pii/S2665917425000741
Citations: 0
Abstract
This research improves mechanical systems by using wearable sensor-based Augmented Reality (AR) interfaces for better Human-Machine Interaction (HMI). Current industrial AR systems face problems arising from static programming methods, delayed responsiveness, restricted sensor data collection, and insufficient wireless throughput, which result in system inefficiency and elevated stress on users. A new wearable AR system, built around gloves equipped with haptic feedback, flex sensors, and Inertial Measurement Units (IMUs), provides precise gesture control while displaying real-time contextual information. The dynamic gesture recognition system uses Random Forest as its lightweight machine learning model, achieving 93.4% accuracy in mapping gestures to command sequences, a 14.6% improvement over conventional static models. The system leverages Edge Computing for low-latency processing (average latency <47 ms) and cloud-based analytics for predictive maintenance insights. In a simulated environment, the proposed setup demonstrated improved industrial performance: a 22.3% reduction in errors, a 31.1% increase in task speed, and a 27.8% improvement in situational awareness, as recorded through NASA-TLX cognitive load evaluations. The findings show that the system addresses fundamental weaknesses of current AR-assisted industrial HMI systems by providing automatic adaptation, improved safety measures, and precise operational capability.
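The gesture-to-command pipeline the abstract describes (flex-sensor and IMU readings classified by a lightweight Random Forest, with the predicted class dispatched as a machine command) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the feature layout (5 flex channels, a 6-axis IMU), the gesture vocabulary, and the synthetic data are all assumptions made for the example.

```python
# Hypothetical sketch of Random Forest gesture recognition for a sensor glove.
# Sensor counts, gesture names, and the synthetic data are illustrative only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

N_SAMPLES = 600
N_FLEX, N_IMU = 5, 6                        # 5 finger flex sensors + 6-axis IMU (accel, gyro)
GESTURES = ["grip", "point", "swipe", "stop"]  # hypothetical command gestures

# Synthetic stand-in for glove telemetry: each gesture class shifts the
# mean of every sensor channel so the classes are separable.
labels = rng.integers(0, len(GESTURES), size=N_SAMPLES)
features = rng.normal(size=(N_SAMPLES, N_FLEX + N_IMU)) + labels[:, None] * 0.8

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.25, random_state=0)

# A small, shallow forest keeps inference cheap, in line with the paper's
# emphasis on low-latency edge processing.
clf = RandomForestClassifier(n_estimators=50, max_depth=8, random_state=0)
clf.fit(X_train, y_train)

pred = clf.predict(X_test)
accuracy = accuracy_score(y_test, pred)
print(f"held-out accuracy: {accuracy:.3f}")

# Map a predicted class index back to a named command for dispatch.
command = GESTURES[clf.predict(features[:1])[0]]
print("dispatched command:", command)
```

In a real deployment the feature vector would come from windowed, filtered sensor streams rather than raw samples, and the dispatched command would drive the haptic feedback and AR overlay described in the abstract.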