Xiongjun Chen, Chenguang Yang, Cheng Fang, Zhijun Li
Title: Impedance matching strategy for physical human robot interaction control
DOI: 10.1109/COASE.2017.8256093
Published in: 2017 13th IEEE Conference on Automation Science and Engineering (CASE), August 2017
Citations: 4
Abstract
Effective and stable execution of a human-robot interaction task requires that the force and position trajectories of the robot be commanded properly according to the time-varying behavior of the human arm. In this paper, we aim to realize a direct physical interaction task between the human hand and the end-effector of a robotic arm. A computationally efficient Cartesian stiffness estimation model of the human arm is first employed, which accounts for the geometric and volumetric modifications of the Cartesian stiffness profile through the arm posture and the activation levels of the two dominant upper-arm muscles (i.e., the biceps and triceps), respectively. Two Myo armbands are attached to the upper arm and the forearm, with their built-in gyroscopes and wireless electromyography (EMG) sensors tracking the arm posture and the activation levels of the two muscles, respectively. This stiffness estimation model is then extended to a full impedance model by additionally considering mass and damping terms. Once the impedance estimation model is available, after calibration over various arm configurations and muscle activation levels, a Linear Quadratic Regulator (LQR) is employed to compute the corresponding impedance model of the robot so that it matches the estimated human arm behavior. An adaptive controller based on the Function Approximation Technique (FAT) is employed to control the robot trajectory in joint space to realize the matched impedance behavior. The corresponding simulation results show that the proposed scheme is stable and effective.
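The LQR impedance-matching step described in the abstract can be sketched for a single Cartesian degree of freedom. This is a minimal illustration, not the paper's implementation: the human-arm mass, damping, and stiffness values and the LQR weights below are hypothetical placeholders, and the algebraic Riccati equation is solved by simple forward integration of the differential Riccati equation rather than a dedicated solver.

```python
import numpy as np

def lqr_gain(A, B, Q, R, dt=1e-4, iters=200_000, tol=1e-9):
    """Continuous-time LQR gain K = R^{-1} B^T P, where P solves the
    algebraic Riccati equation. P is found by integrating the
    differential Riccati equation from P = 0 until steady state."""
    P = np.zeros_like(A)
    Rinv = np.linalg.inv(R)
    for _ in range(iters):
        dP = A.T @ P + P @ A - P @ B @ Rinv @ B.T @ P + Q
        P_new = P + dt * dP
        if np.max(np.abs(P_new - P)) < tol * dt:
            P = P_new
            break
        P = P_new
    return Rinv @ B.T @ P

# Hypothetical estimated human-arm impedance along one Cartesian axis:
#   m x'' + d x' + k x = u   (values are illustrative, not from the paper)
m_h, d_h, k_h = 2.0, 8.0, 120.0
A = np.array([[0.0, 1.0], [-k_h / m_h, -d_h / m_h]])  # state: [position, velocity]
B = np.array([[0.0], [1.0 / m_h]])
Q = np.diag([100.0, 1.0])  # weight on position and velocity deviation
R = np.array([[0.01]])     # weight on control effort

K = lqr_gain(A, B, Q, R)
# With feedback u = -K x, the closed loop behaves like the impedance
#   m x'' + (d + K[0, 1]) x' + (k + K[0, 0]) x = 0,
# i.e. the gains directly shift the robot's apparent stiffness and damping.
k_robot = k_h + K[0, 0]
d_robot = d_h + K[0, 1]
```

Under this reading, tuning Q and R trades off how tightly the robot's impedance tracks the estimated human-arm behavior against the control effort spent doing so.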