A mixed reality-based aircraft cable harness installation assistance system with fully occluded gesture recognition
Zhuo Wang, Weichu Li, Jiacheng Zhang, Yiliang Zhou, Shisong Chen, Yuwei Dai, Jiale Song, Yeming Cheng, Xiaoting Du
Robotics and Computer-Integrated Manufacturing, Volume 93, Article 102930, published 2024-12-10. DOI: 10.1016/j.rcim.2024.102930
https://www.sciencedirect.com/science/article/pii/S0736584524002175
Citations: 0
Abstract
In limited-visibility human-machine environments, little attention has been paid to hand motion parameter extraction, behavioral intention data analysis, and the effectiveness of 3D assembly instructions. To address this gap, we developed a mixed reality system for fully occluded gesture recognition (Fog-MR), which supports the extraction of complete hand motion models, determines the directional relationship between hand motion models and task intentions, and provides timely, natural visual feedback for hand operations. First, a 3D hand pose registration method for cyber-physical objects is proposed, which uses BundleFusion to obtain a 3D point cloud model of the virtual hands in VR space. Second, a spectral clustering analysis method for hand motion features, built on the hand reconstruction model, is constructed; it combines a dual autoencoder network with mutual information metrics to achieve precise matching of hand behavioral intentions. Finally, a new industrial mixed reality visual prompt is designed, providing operators with more vivid and specific operational guidance. Experimental data indicate that, compared to our previously developed mixed reality assembly system (Ug-MR), the Fog-MR system significantly improves the accuracy of motion synchronization between the virtual hands and the operator's hands, the accuracy of clustering hand motion parameters, and the naturalness of the 3D assembly instructions that express behavioral intentions. These improvements markedly reduce the frequency of errors and omissions in such human-machine collaborative assembly processes, which is crucial for enhancing the reliability of high-precision manual operations.
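To make the clustering step above concrete, the following is a minimal Python sketch, assuming hand motion features are already available as fixed-length vectors (for example, latent codes from the dual autoencoder, which is not reproduced here). It builds a pairwise mutual-information affinity matrix via histogram discretization and passes it to spectral clustering; the function names, bin count, and cluster count are illustrative assumptions, not the authors' implementation.

import numpy as np
from sklearn.metrics import mutual_info_score
from sklearn.cluster import SpectralClustering

def mi_affinity(features: np.ndarray, bins: int = 16) -> np.ndarray:
    """Pairwise mutual information between feature vectors, estimated by
    histogram discretization (an assumption; other MI estimators could be used)."""
    n = features.shape[0]
    # Discretize each feature vector into `bins` levels for MI estimation.
    digitized = np.array([
        np.digitize(f, np.histogram_bin_edges(f, bins=bins)) for f in features
    ])
    affinity = np.zeros((n, n))
    for i in range(n):
        for j in range(i, n):
            mi = mutual_info_score(digitized[i], digitized[j])
            affinity[i, j] = affinity[j, i] = mi
    return affinity

# Example: 200 hand motion samples, each a 64-dim latent feature vector
# (random placeholders standing in for encoder output).
rng = np.random.default_rng(0)
features = rng.normal(size=(200, 64))
affinity = mi_affinity(features)
labels = SpectralClustering(
    n_clusters=5, affinity="precomputed", random_state=0
).fit_predict(affinity)
print(labels[:10])  # cluster index per motion sample -> candidate intention class

The precomputed-affinity route is used here because mutual information is not a standard kernel; in practice the number of clusters would correspond to the set of behavioral intentions defined for the assembly task.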
Journal introduction:
The journal, Robotics and Computer-Integrated Manufacturing, focuses on sharing research applications that contribute to the development of new or enhanced robotics, manufacturing technologies, and innovative manufacturing strategies that are relevant to industry. Papers that combine theory and experimental validation are preferred, while review papers on current robotics and manufacturing issues are also considered. However, papers on traditional machining processes, modeling and simulation, supply chain management, and resource optimization are generally not within the scope of the journal, as there are more appropriate journals for these topics. Similarly, papers that are overly theoretical or mathematical will be directed to other suitable journals. The journal welcomes original papers in areas such as industrial robotics, human-robot collaboration in manufacturing, cloud-based manufacturing, cyber-physical production systems, big data analytics in manufacturing, smart mechatronics, machine learning, adaptive and sustainable manufacturing, and other fields involving unique manufacturing technologies.