Chunyuan Shi, Jingdong Zhao, Dapeng Yang, Li Jiang
{"title":"i-MYO:基于凝视运动、增强现实技术和肌电信号的多抓假手控制系统","authors":"Chunyuan Shi, Jingdong Zhao, Dapeng Yang, Li Jiang","doi":"10.1002/rcs.2617","DOIUrl":null,"url":null,"abstract":"<div>\n \n \n <section>\n \n <h3> Background</h3>\n \n <p>Controlling a multi-grasp prosthetic hand still remains a challenge. This study explores the influence of merging gaze movements and augmented reality in bionics on improving prosthetic hand control.</p>\n </section>\n \n <section>\n \n <h3> Methods</h3>\n \n <p>A control system based on gaze movements, augmented reality, and myoelectric signals (i-MYO) was proposed. In the i-MYO, the GazeButton was introduced into the controller to detect the grasp-type intention from the eye-tracking signals, and the proportional velocity scheme based on the i-MYO was used to control hand movement.</p>\n </section>\n \n <section>\n \n <h3> Results</h3>\n \n <p>The able-bodied subjects with no prior training successfully transferred objects in 91.6% of the cases and switched the optimal grasp types in 97.5%. The patient could successfully trigger the EMG to control the hand holding the objects in 98.7% of trials in around 3.2 s and spend around 1.3 s switching the optimal grasp types in 99.2% of trials.</p>\n </section>\n \n <section>\n \n <h3> Conclusions</h3>\n \n <p>Merging gaze movements and augmented reality in bionics can widen the control bandwidth of prosthetic hand. With the help of i-MYO, the subjects can control a prosthetic hand using six grasp types if they can manipulate two muscle signals and gaze movement.</p>\n </section>\n </div>","PeriodicalId":50311,"journal":{"name":"International Journal of Medical Robotics and Computer Assisted Surgery","volume":null,"pages":null},"PeriodicalIF":2.3000,"publicationDate":"2023-12-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"i-MYO: A multi-grasp prosthetic hand control system based on gaze movements, augmented reality, and myoelectric signals\",\"authors\":\"Chunyuan Shi, Jingdong Zhao, Dapeng Yang, Li Jiang\",\"doi\":\"10.1002/rcs.2617\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div>\\n \\n \\n <section>\\n \\n <h3> Background</h3>\\n \\n <p>Controlling a multi-grasp prosthetic hand still remains a challenge. This study explores the influence of merging gaze movements and augmented reality in bionics on improving prosthetic hand control.</p>\\n </section>\\n \\n <section>\\n \\n <h3> Methods</h3>\\n \\n <p>A control system based on gaze movements, augmented reality, and myoelectric signals (i-MYO) was proposed. In the i-MYO, the GazeButton was introduced into the controller to detect the grasp-type intention from the eye-tracking signals, and the proportional velocity scheme based on the i-MYO was used to control hand movement.</p>\\n </section>\\n \\n <section>\\n \\n <h3> Results</h3>\\n \\n <p>The able-bodied subjects with no prior training successfully transferred objects in 91.6% of the cases and switched the optimal grasp types in 97.5%. The patient could successfully trigger the EMG to control the hand holding the objects in 98.7% of trials in around 3.2 s and spend around 1.3 s switching the optimal grasp types in 99.2% of trials.</p>\\n </section>\\n \\n <section>\\n \\n <h3> Conclusions</h3>\\n \\n <p>Merging gaze movements and augmented reality in bionics can widen the control bandwidth of prosthetic hand. 
With the help of i-MYO, the subjects can control a prosthetic hand using six grasp types if they can manipulate two muscle signals and gaze movement.</p>\\n </section>\\n </div>\",\"PeriodicalId\":50311,\"journal\":{\"name\":\"International Journal of Medical Robotics and Computer Assisted Surgery\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":2.3000,\"publicationDate\":\"2023-12-30\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"International Journal of Medical Robotics and Computer Assisted Surgery\",\"FirstCategoryId\":\"3\",\"ListUrlMain\":\"https://onlinelibrary.wiley.com/doi/10.1002/rcs.2617\",\"RegionNum\":3,\"RegionCategory\":\"医学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"SURGERY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Medical Robotics and Computer Assisted Surgery","FirstCategoryId":"3","ListUrlMain":"https://onlinelibrary.wiley.com/doi/10.1002/rcs.2617","RegionNum":3,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"SURGERY","Score":null,"Total":0}
i-MYO: A multi-grasp prosthetic hand control system based on gaze movements, augmented reality, and myoelectric signals
Background
Controlling a multi-grasp prosthetic hand remains a challenge. This study explores whether merging gaze movements and augmented reality into bionic control improves prosthetic hand control.
Methods
A control system based on gaze movements, augmented reality, and myoelectric signals (i-MYO) was proposed. In i-MYO, a GazeButton was introduced into the controller to detect grasp-type intention from eye-tracking signals, and a proportional-velocity scheme within i-MYO was used to control hand movement.
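The following is a minimal, hypothetical sketch of the control flow described above: gaze selects a grasp type via AR GazeButton regions, while two EMG channels drive the selected grasp with a proportional-velocity command. All class names, thresholds, dwell times, and the example grasp set are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch of a gaze-plus-EMG multi-grasp controller.
# Names and parameters are assumptions for illustration only.
from dataclasses import dataclass

GRASP_TYPES = ["power", "lateral", "tripod", "pinch", "hook", "open_palm"]  # example set of six


@dataclass
class GazeButton:
    """An AR-overlay button that selects a grasp type when gazed at."""
    grasp_type: str
    x: float            # button centre in normalised screen coordinates
    y: float
    radius: float = 0.05

    def contains(self, gaze_x: float, gaze_y: float) -> bool:
        return (gaze_x - self.x) ** 2 + (gaze_y - self.y) ** 2 <= self.radius ** 2


class GazeEmgController:
    """Combines gaze-based grasp selection with EMG proportional-velocity control."""

    def __init__(self, buttons, dwell_time_s=1.0, emg_threshold=0.1, gain=1.0):
        self.buttons = buttons
        self.dwell_time_s = dwell_time_s    # how long gaze must rest on a button to select it
        self.emg_threshold = emg_threshold  # noise floor below which EMG is ignored
        self.gain = gain                    # maps EMG amplitude to hand speed
        self.selected_grasp = GRASP_TYPES[0]
        self._dwell = 0.0
        self._last_button = None

    def update_gaze(self, gaze_x, gaze_y, dt):
        """Switch the selected grasp type after the gaze dwells on a GazeButton."""
        hit = next((b for b in self.buttons if b.contains(gaze_x, gaze_y)), None)
        if hit is not None and hit is self._last_button:
            self._dwell += dt
            if self._dwell >= self.dwell_time_s:
                self.selected_grasp = hit.grasp_type
        else:
            self._dwell = 0.0
        self._last_button = hit

    def hand_velocity(self, emg_close, emg_open):
        """Proportional velocity: stronger muscle signal -> faster motion.
        Positive output closes the hand, negative output opens it."""
        close = max(emg_close - self.emg_threshold, 0.0)
        open_ = max(emg_open - self.emg_threshold, 0.0)
        return self.gain * (close - open_)
```

In this sketch, dwell-based selection avoids accidental grasp switches during natural eye movements, while the proportional EMG mapping gives continuous speed control; the actual i-MYO selection logic, thresholds, and grasp set may differ.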
Results
Able-bodied subjects with no prior training successfully transferred objects in 91.6% of cases and switched to the optimal grasp type in 97.5%. The patient successfully triggered the EMG to control the hand and hold the objects in 98.7% of trials, taking around 3.2 s, and switched to the optimal grasp type in 99.2% of trials, taking around 1.3 s.
Conclusions
Merging gaze movements and augmented reality into bionic control can widen the control bandwidth of a prosthetic hand. With the help of i-MYO, subjects can control a prosthetic hand with six grasp types, provided they can modulate two muscle signals and their gaze.
About the journal:
The International Journal of Medical Robotics and Computer Assisted Surgery provides a cross-disciplinary platform for presenting the latest developments in robotics and computer assisted technologies for medical applications. The journal publishes cutting-edge papers and expert reviews, complemented by commentaries, correspondence and conference highlights that stimulate discussion and exchange of ideas. Areas of interest include robotic surgery aids and systems, operative planning tools, medical imaging and visualisation, simulation and navigation, virtual reality, intuitive command and control systems, haptics and sensor technologies. In addition to research and surgical planning studies, the journal welcomes papers detailing clinical trials and applications of computer-assisted workflows and robotic systems in neurosurgery, urology, paediatric, orthopaedic, craniofacial, cardiovascular, thoraco-abdominal, musculoskeletal and visceral surgery. Articles providing critical analysis of clinical trials, assessment of the benefits and risks of the application of these technologies, commenting on ease of use, or addressing surgical education and training issues are also encouraged. The journal aims to foster a community that encompasses medical practitioners, researchers, and engineers and computer scientists developing robotic systems and computational tools in academic and commercial environments, with the intention of promoting and developing these exciting areas of medical technology.