{"title":"Autonomous robotic bin picking platform generated from human demonstration and YOLOv5","authors":"Jinho Park, C. Han, M. Jun, Huitaek Yun","doi":"10.1115/1.4063107","DOIUrl":null,"url":null,"abstract":"\n Vision-based robots have been utilized for pick-and-place operations by their ability to find object poses. As they progress into handling a variety of objects with cluttered state, more flexible and lightweight operations have been presented. In this paper, an autonomous robotic bin-picking platform which combines human demonstration with a collaborative robot for the flexibility of the objects and YOLOv5 neural network model for the faster object localization without prior CAD models or dataset in the training. After simple human demonstration of which target object to pick and place, the raw color and depth images were refined, and the one on top of the bin was utilized to create synthetic images and annotations for the YOLOv5 model. To pick up the target object, the point cloud was lifted using the depth data corresponding to the result of the trained YOLOv5 model, and the object pose was estimated through matching them by Iterative Closest Points (ICP) algorithm. After picking up the target object, the robot placed it where the user defined in the previous human demonstration stage. From the result of experiments with four types of objects and four human demonstrations, it took a total of 0.5 seconds to recognize the target object and estimate the object pose. The success rate of object detection was 95.6%, and the pick-and-place motion of all the found objects were successful.","PeriodicalId":16299,"journal":{"name":"Journal of Manufacturing Science and Engineering-transactions of The Asme","volume":null,"pages":null},"PeriodicalIF":2.4000,"publicationDate":"2023-08-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Manufacturing Science and Engineering-transactions of The Asme","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.1115/1.4063107","RegionNum":3,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"ENGINEERING, MANUFACTURING","Score":null,"Total":0}
Citations: 0
Abstract
Vision-based robots have been utilized for pick-and-place operations owing to their ability to find object poses. As they progress toward handling a variety of objects in cluttered states, more flexible and lightweight approaches have been presented. In this paper, an autonomous robotic bin-picking platform is presented that combines human demonstration with a collaborative robot for flexibility across objects and a YOLOv5 neural network model for faster object localization without prior CAD models or datasets for training. After a simple human demonstration of which target object to pick and place, the raw color and depth images were refined, and the object on top of the bin was used to create synthetic images and annotations for the YOLOv5 model. To pick up the target object, a point cloud was lifted using the depth data corresponding to the detection result of the trained YOLOv5 model, and the object pose was estimated by matching the point clouds with the Iterative Closest Point (ICP) algorithm. After picking up the target object, the robot placed it at the location the user had defined in the earlier human demonstration stage. In experiments with four types of objects and four human demonstrations, recognizing the target object and estimating its pose took a total of 0.5 seconds. The success rate of object detection was 95.6%, and the pick-and-place motions for all detected objects were successful.
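The abstract outlines a detection-to-pose pipeline: a trained YOLOv5 model localizes the target in the color image, the corresponding depth pixels are lifted to a point cloud, and ICP aligns that cloud with a reference captured during the demonstration. The sketch below illustrates this flow under stated assumptions; the file names, weight path, camera intrinsics, and the choice of Open3D's point-to-point ICP are illustrative placeholders, not details taken from the paper.

```python
# Minimal sketch of the detection -> point-cloud -> ICP pose-estimation step described
# in the abstract. All file names, the weights path, and the camera intrinsics are
# placeholder assumptions, not values from the paper.
import numpy as np
import open3d as o3d
import torch

# Load a YOLOv5 model trained on the synthetic images generated from the demonstration
# (hypothetical weights file).
model = torch.hub.load("ultralytics/yolov5", "custom", path="bin_picking_weights.pt")

color = o3d.io.read_image("color.png")   # RGB frame from the depth camera (placeholder)
depth = o3d.io.read_image("depth.png")   # aligned depth frame (placeholder)

# Detect the target object in the color image and take the highest-confidence box.
detections = model(np.asarray(color)).xyxy[0]
x1, y1, x2, y2, conf, cls = detections[0].tolist()

# Keep only the depth pixels inside the detected box, then lift them to a point cloud.
depth_np = np.asarray(depth).astype(np.float32)
mask = np.zeros_like(depth_np)
mask[int(y1):int(y2), int(x1):int(x2)] = 1.0
rgbd = o3d.geometry.RGBDImage.create_from_color_and_depth(
    color, o3d.geometry.Image(depth_np * mask), convert_rgb_to_intensity=False)
intrinsics = o3d.camera.PinholeCameraIntrinsic(
    o3d.camera.PinholeCameraIntrinsicParameters.PrimeSenseDefault)  # placeholder intrinsics
scene_crop = o3d.geometry.PointCloud.create_from_rgbd_image(rgbd, intrinsics)

# Estimate the 6-DoF object pose by matching the cropped cloud against a reference
# cloud recorded during the human demonstration (point-to-point ICP).
reference = o3d.io.read_point_cloud("demonstrated_object.pcd")
result = o3d.pipelines.registration.registration_icp(
    scene_crop, reference, max_correspondence_distance=0.01,
    estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint())
print("Estimated object pose (4x4 transform):\n", result.transformation)
```

The resulting 4x4 transform would then be converted to a robot grasp pose; the paper's reported timing (about 0.5 seconds for detection plus pose estimation) refers to its own implementation, not this sketch.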
Journal introduction:
Areas of interest include, but are not limited to: Additive manufacturing; Advanced materials and processing; Assembly; Biomedical manufacturing; Bulk deformation processes (e.g., extrusion, forging, wire drawing, etc.); CAD/CAM/CAE; Computer-integrated manufacturing; Control and automation; Cyber-physical systems in manufacturing; Data science-enhanced manufacturing; Design for manufacturing; Electrical and electrochemical machining; Grinding and abrasive processes; Injection molding and other polymer fabrication processes; Inspection and quality control; Laser processes; Machine tool dynamics; Machining processes; Materials handling; Metrology; Micro- and nano-machining and processing; Modeling and simulation; Nontraditional manufacturing processes; Plant engineering and maintenance; Powder processing; Precision and ultra-precision machining; Process engineering; Process planning; Production systems optimization; Rapid prototyping and solid freeform fabrication; Robotics and flexible tooling; Sensing, monitoring, and diagnostics; Sheet and tube metal forming; Sustainable manufacturing; Tribology in manufacturing; Welding and joining