Sophokles Ktistakis, Lucas Gimeno, Fatima-Zahra Laftissi, Alexis Hoss, Antonio De Donno, Mirko Meboldt
Robot assistance primitives with force-field guidance for shared task collaboration

Robotics and Computer-Integrated Manufacturing, Volume 96, Article 103061. Published 2025-05-23. DOI: 10.1016/j.rcim.2025.103061
Citations: 0
Abstract
This paper proposes a novel framework for human-robot collaboration (HRC) that addresses the critical need for robots to effectively collaborate with humans on shared tasks within unstructured and dynamic environments. While prior research focused on safety-related aspects, such as collision avoidance in shared workspaces, the task-oriented aspects of human-robot collaboration remain largely underexplored. To address this gap, our framework introduces Robot Assistance Primitives (RAPs). These low-level robot actions integrate both safety and task-related behaviours, enabling the robot to function as a collaborative "third hand", and provide assistance across the full spectrum of both physical and contactless interactions. A key component of our approach is an extension of impedance control with virtual force fields, which unifies task-related interactions and safety-related aspects within a single control scheme. The framework leverages a state-of-the-art visual perception pipeline that constructs and tracks real-time 3D digital representations of the workspace and the human operator. Additionally, an Augmented Reality Head-Mounted Display (AR-HMD) facilitates multimodal task programming through user gaze, gestures, and speech, as well as providing visual feedback to foster trust during interactions. We validate the feasibility of the proposed framework and conduct a user study to further evaluate user interactions in a collaborative soldering and assembly task. This research not only addresses limitations of current HRC frameworks but also paves the way for exploring novel collaborative applications.
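The abstract describes extending Cartesian impedance control with virtual force fields so that task guidance and safety share one control scheme. A minimal sketch of that idea follows; the gains, the radial repulsion profile, and all function names here are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def virtual_force(x, obstacle, gain=5.0, influence=0.3):
    """Repulsive virtual force pushing the end-effector away from an
    obstacle (e.g. the tracked human hand) inside an influence radius.
    Outside that radius the field contributes nothing."""
    d_vec = x - obstacle
    d = np.linalg.norm(d_vec)
    if d >= influence or d == 0.0:
        return np.zeros(3)
    # Magnitude grows steeply as the end-effector approaches the obstacle.
    return gain * (1.0 / d - 1.0 / influence) * (d_vec / d**3)

def impedance_command(x, x_dot, x_des, obstacle,
                      K=np.diag([200.0] * 3), D=np.diag([30.0] * 3)):
    """Cartesian impedance law F = K(x_des - x) - D*x_dot, extended with a
    virtual force-field term so safety-related repulsion enters the same
    control scheme as the task-related attraction toward x_des."""
    f_task = K @ (x_des - x) - D @ x_dot
    f_safety = virtual_force(x, obstacle)
    return f_task + f_safety
```

With the obstacle far away, the command reduces to plain impedance control toward the goal; as the obstacle enters the influence radius, the repulsive term dominates and deflects the commanded wrench, which is how a single scheme can cover both physical guidance and contactless avoidance.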
Journal Introduction:
The journal, Robotics and Computer-Integrated Manufacturing, focuses on sharing research applications that contribute to the development of new or enhanced robotics, manufacturing technologies, and innovative manufacturing strategies that are relevant to industry. Papers that combine theory and experimental validation are preferred, while review papers on current robotics and manufacturing issues are also considered. However, papers on traditional machining processes, modeling and simulation, supply chain management, and resource optimization are generally not within the scope of the journal, as there are more appropriate journals for these topics. Similarly, papers that are overly theoretical or mathematical will be directed to other suitable journals. The journal welcomes original papers in areas such as industrial robotics, human-robot collaboration in manufacturing, cloud-based manufacturing, cyber-physical production systems, big data analytics in manufacturing, smart mechatronics, machine learning, adaptive and sustainable manufacturing, and other fields involving unique manufacturing technologies.