A Bi-directional User Interface for a Prosthetic Hand Using a Head-Mounted Display
Dayeon Kim, Su-Bin Joo, Joonho Seo, P. Kazanzides
2021 International Symposium on Medical Robotics (ISMR), November 17, 2021
DOI: 10.1109/ismr48346.2021.9661504
Prosthetic hands have the potential to restore functionality to humans who have lost their hands, but it remains challenging to design a prosthetic hand that mimics a biological hand and then to effectively integrate that mechanical hand with the human brain for both sensing and control. We focus on the human/prosthesis integration and propose an Augmented Reality Manipulation Interface (ARMI) to facilitate that bi-directional integration. ARMI enables the user to specify intent (control) to the prosthesis while providing guidance (feedback) to the user through perception of the environment via artificial intelligence. Specifically, ARMI identifies objects in the environment and automatically determines the grasping configuration and timing; once the user selects an object, ARMI provides guidance for the user to correctly position the prosthesis with respect to the object and then initiate an autonomous grasp. We perform preliminary experiments with the Microsoft HoloLens head-mounted display (HMD) and a robotic hand to demonstrate the concept. Results suggest that ARMI would currently provide the greatest benefit for novice users who have not yet mastered the prosthetic hand, whereas further system improvements are necessary to provide a benefit for more experienced users.