{"title":"Application of Object Grasping Using Dual-Arm Autonomous Mobile Robot—Path Planning by Spline Curve and Object Recognition by YOLO—","authors":"Naoya Mukai, Masato Suzuki, Tomokazu Takahashi, Yasushi Mae, Yasuhiko Arai, S. Aoyagi","doi":"10.20965/jrm.2023.p1524","DOIUrl":null,"url":null,"abstract":"In the trash-collection challenge of the Nakanoshima Robot Challenge, an autonomous robot must collect trash (bottles, cans, and bentos) scattered in a defined area within a time limit. A method for collecting the trash is to use machine learning to recognize the objects, move to the target location, and grasp the objects. An autonomous robot can achieve the target position and posture by rotating on the spot at the starting point, moving in a straight line, and rotating on the spot at the destination, but the rotation requires stopping and starting. To achieve faster movement, we implemented a smooth movement approach without sequential stops using a spline curve. When using the training data previously generated by the authors in their laboratory for object recognition, the robot could not correctly recognize objects in the environment of the robot competition, where strong sunlight shines through glass, because of the varying brightness and darkness. To solve this problem, we added our newly generated training data to YOLO, an image-recognition algorithm based on deep learning, and performed machine learning to achieve object recognition under various conditions.","PeriodicalId":51661,"journal":{"name":"Journal of Robotics and Mechatronics","volume":"50 2","pages":""},"PeriodicalIF":0.9000,"publicationDate":"2023-12-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Robotics and Mechatronics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.20965/jrm.2023.p1524","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"ROBOTICS","Score":null,"Total":0}
Citations: 0
Abstract
In the trash-collection challenge of the Nakanoshima Robot Challenge, an autonomous robot must collect trash (bottles, cans, and bento boxes) scattered in a defined area within a time limit. One approach is to recognize the objects with machine learning, move to the target location, and grasp them. An autonomous robot can reach the target position and posture by rotating on the spot at the starting point, moving in a straight line, and rotating on the spot again at the destination, but each in-place rotation requires the robot to stop and restart. To move faster, we implemented a smooth motion approach based on a spline curve that avoids these intermediate stops. When object recognition relied only on the training data the authors had previously generated in their laboratory, the robot could not correctly recognize objects in the competition environment, where strong sunlight shining through glass produces large variations in brightness. To solve this problem, we added newly generated training data and retrained YOLO, a deep-learning-based image-recognition algorithm, to achieve object recognition under various lighting conditions.
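To illustrate the path-planning idea, the sketch below is a minimal example (not the authors' implementation) that interpolates a few hypothetical waypoints with cubic splines parameterized by chord length; the heading is taken from the curve's tangent, so the robot can steer continuously instead of rotating in place at each waypoint. The waypoint coordinates, the sampling resolution, and the use of SciPy's CubicSpline are assumptions made for this example.

```python
# Minimal sketch: smooth path through waypoints via cubic splines,
# so a mobile robot follows one continuous curve instead of
# stop-rotate-move-rotate segments. Waypoints below are hypothetical.
import numpy as np
from scipy.interpolate import CubicSpline

# Hypothetical waypoints (x, y) in metres: start, intermediate targets, goal.
waypoints = np.array([[0.0, 0.0], [1.5, 0.5], [3.0, 2.0], [4.0, 2.5]])

# Parameterize the curve by cumulative chord length between waypoints.
d = np.linalg.norm(np.diff(waypoints, axis=0), axis=1)
s = np.concatenate(([0.0], np.cumsum(d)))

# Independent cubic splines for x(s) and y(s).
spline_x = CubicSpline(s, waypoints[:, 0])
spline_y = CubicSpline(s, waypoints[:, 1])

# Sample the spline densely; the heading at each sample comes from the
# first derivatives, so the robot can steer continuously without
# stopping to rotate on the spot.
s_dense = np.linspace(0.0, s[-1], 200)
path = np.stack([spline_x(s_dense), spline_y(s_dense)], axis=1)
heading = np.arctan2(spline_y(s_dense, 1), spline_x(s_dense, 1))
```

In practice, the sampled positions and headings would feed the robot's velocity controller; the spline formulation and control scheme actually used by the authors are described in the paper itself.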
Journal Introduction
First published in 1989, the Journal of Robotics and Mechatronics (JRM) has the longest publication history in the world in this field, having published over 2,000 works exclusively on robotics and mechatronics since its first issue. The Journal publishes academic papers, development reports, reviews, letters, notes, and discussions. The JRM is a peer-reviewed journal in fields such as robotics, mechatronics, automation, and system integration. Its editorial board includes well-established researchers and engineers in the field from around the world. The scope of the journal includes any and all topics on robotics and mechatronics. Key technologies within this scope include actuator design, motion control, sensor design, sensor fusion, sensor networks, robot vision, audition, mechanism design, robot kinematics and dynamics, mobile robots, path planning, navigation, SLAM, robot hands, manipulators, nano/micro robots, humanoids, service and home robots, universal design, middleware, human-robot interaction, human interfaces, networked robotics, telerobotics, ubiquitous robots, learning, and intelligence. The scope also includes applications of robotics and automation, and system integration in the fields of manufacturing, construction, underwater, space, agriculture, sustainability, energy conservation, ecology, rescue, hazardous environments, safety and security, dependability, medical care, and welfare.