Learning-based object abstraction method from simple instructions for human support robot HSR

Authors: Kotaro Nagahama, Hiroaki Yaguchi, Hirohito Hattori, Kiyohiro Sogen, Takashi Yamamoto, M. Inaba
Venue: 2016 IEEE International Conference on Advanced Intelligent Mechatronics (AIM)
Published: 2016-07-12
DOI: 10.1109/AIM.2016.7576812 (https://doi.org/10.1109/AIM.2016.7576812)
Citations: 2
Abstract
This study proposes a simple remote-controlled daily assistive robot for physically challenged individuals. Specifically, we present a method for selecting a target object with a single click on a graphical user interface. From this single input, the robot automatically estimates the region of the unknown target object and plans how to grasp and fetch it. The challenging task is to estimate the region of the object of interest correctly. The proposed system estimates the object region in two stages. First, the robot automatically estimates the region from the user's click. Second, the user can intervene by interactively drawing and erasing parts of the estimated region, while the system sequentially updates its estimation method based only on these corrections. The advantage of this system is that it requires only limited input from the user, which is especially valuable for users with physical disabilities. Moreover, we introduce (1) graph cuts combining "HyperPixels" and three-dimensional information, which let the system exploit rich features around the user-specified region for robust segmentation, (2) interactive correction of the automatically estimated object region, during which the system learns graph parameters that yield the correct estimation, and (3) recall and reuse of the learned parameters, retrieved from a database of features around the clicked point.
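The click-seeded estimation and correction loop described above can be illustrated with a deliberately simplified sketch. The paper uses graph cuts over "HyperPixels" with 3D information; the code below substitutes a basic intensity-based region growing from the clicked pixel, plus a crude parameter update from user corrections, purely to make the interaction pattern concrete. All function names, the similarity threshold, and the toy image are assumptions for illustration, not the authors' implementation.

```python
from collections import deque

def grow_region(image, seed, threshold):
    """Estimate an object region from a single clicked pixel by
    growing over 4-connected neighbors whose intensity is within
    `threshold` of the seed (a stand-in for graph-cut segmentation)."""
    h, w = len(image), len(image[0])
    seed_val = image[seed[0]][seed[1]]
    region = {seed}
    queue = deque([seed])
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < h and 0 <= nc < w and (nr, nc) not in region
                    and abs(image[nr][nc] - seed_val) <= threshold):
                region.add((nr, nc))
                queue.append((nr, nc))
    return region

def refine_threshold(image, seed, threshold, user_added, user_erased):
    """Update the segmentation parameter from user corrections:
    widen the threshold so drawn-in pixels become reachable, and
    narrow it so erased pixels fall outside the region (a crude
    analogue of learning graph parameters from corrections)."""
    seed_val = image[seed[0]][seed[1]]
    for r, c in user_added:
        threshold = max(threshold, abs(image[r][c] - seed_val))
    for r, c in user_erased:
        threshold = min(threshold, abs(image[r][c] - seed_val) - 1)
    return threshold

# Toy 5x5 image: a bright object (100) on a dark background (10).
image = [
    [10, 10, 10, 10, 10],
    [10, 100, 100, 10, 10],
    [10, 100, 100, 10, 10],
    [10, 10, 10, 10, 10],
    [10, 10, 10, 10, 10],
]

# The user clicks pixel (1, 1); the four bright pixels are recovered.
region = grow_region(image, (1, 1), threshold=20)

# The user then draws pixel (3, 3) into the region; the parameter
# update widens the threshold so the next estimate includes it.
new_t = refine_threshold(image, (1, 1), 20, user_added=[(3, 3)], user_erased=[])
```

In the actual system the updated quantities are graph-cut parameters rather than a single threshold, and the learned parameters are stored in a database keyed by features around the clicked point so they can be recalled for similar objects later.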