Enhancing robotic grasping detection accuracy with the R2CNN algorithm and force-closure

Impact factor 2.6 · CAS Tier 3 (Engineering & Technology) · JCR Q2, Computer Science, Interdisciplinary Applications
Hsien-I Lin, M. Shodiq, Hong-Qi Chu
{"title":"Enhancing robotic grasping detection accuracy with the R2CNN algorithm and force-closure","authors":"Hsien-I Lin, M. Shodiq, Hong-Qi Chu","doi":"10.1115/1.4065311","DOIUrl":null,"url":null,"abstract":"\n This study aims to use an improved rotational region convolutional neural network (R2CNN) algorithm to detect the grasping bounding box for the robotic arm that reaches supermarket goods. This algorithm can calculate the final predicted grasping bounding box without any additional architecture, which greatly improves the speed of grasp inferences. In this study, we added the force-closure condition, so that the final grasping bounding box could achieve grasping stability in a physical sense. We experimentally demonstrated that the deep model treated object detection and grasping detection are the same tasks. We used transfer learning to improve the prediction accuracy of the grasping bounding box. In particular, the ResNet-101 network weights, which were originally used in object detection, were used to continue training with the Cornell dataset. In terms of grasping detection, we used the trained model weights that were originally used in object detection as the features of the to-be-grasped objects and fed them to the network for continuous training. For 2,828 test images, this method achieved nearly 98% accuracy and a speed of 14–17 frames per second.","PeriodicalId":54856,"journal":{"name":"Journal of Computing and Information Science in Engineering","volume":null,"pages":null},"PeriodicalIF":2.6000,"publicationDate":"2024-04-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Computing and Information Science in Engineering","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.1115/1.4065311","RegionNum":3,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS","Score":null,"Total":0}
Citations: 0

Abstract

This study uses an improved rotational region convolutional neural network (R2CNN) algorithm to detect the grasping bounding box for a robotic arm that picks up supermarket goods. The algorithm computes the final predicted grasping bounding box without any additional architecture, which greatly improves the speed of grasp inference. We added a force-closure condition so that the final grasping bounding box achieves grasp stability in a physical sense. We experimentally demonstrated that the deep model can treat object detection and grasping detection as the same task, and we used transfer learning to improve the prediction accuracy of the grasping bounding box. In particular, ResNet-101 network weights originally trained for object detection were used to continue training on the Cornell dataset: the weights learned for object detection serve as features of the to-be-grasped objects and are fed to the network for further training. On 2,828 test images, this method achieved nearly 98% accuracy at a speed of 14–17 frames per second.
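The force-closure condition referenced in the abstract can be illustrated for the simplest case of a planar two-finger (antipodal) grasp: the line joining the two contact points must lie inside the friction cone at each contact. The Python snippet below is a minimal sketch of that geometric check, not the paper's implementation; the contact points, surface normals, and friction coefficient `mu` are assumed inputs.

```python
import numpy as np

def antipodal_force_closure(p1, n1, p2, n2, mu=0.4):
    """Planar force-closure test for a two-finger (antipodal) grasp.

    p1, p2 : 2-D contact points on the object boundary
    n1, n2 : inward-pointing surface normals at those contacts
    mu     : Coulomb friction coefficient (assumed value)

    The grasp is force-closure when the line joining the contacts lies
    inside both friction cones, i.e. its angle to each contact normal
    is at most arctan(mu).
    """
    p1, p2 = np.asarray(p1, dtype=float), np.asarray(p2, dtype=float)
    n1 = np.asarray(n1, dtype=float) / np.linalg.norm(n1)
    n2 = np.asarray(n2, dtype=float) / np.linalg.norm(n2)

    axis = (p2 - p1) / np.linalg.norm(p2 - p1)   # line of action between the fingers
    half_angle = np.arctan(mu)                   # friction-cone half-angle

    ang1 = np.arccos(np.clip(np.dot(axis, n1), -1.0, 1.0))    # finger 1 pushes along +axis
    ang2 = np.arccos(np.clip(np.dot(-axis, n2), -1.0, 1.0))   # finger 2 pushes along -axis
    return bool(ang1 <= half_angle and ang2 <= half_angle)
```

In the grasp-rectangle setting described here, the two contacts would come from the short edges of a predicted rectangle, and a candidate rectangle failing this test could be rejected or down-weighted.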
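The transfer-learning step can likewise be sketched: weights from a ResNet-101 backbone trained for object detection initialize the grasp-detection network, which is then fine-tuned on the Cornell dataset. The snippet below only illustrates that idea with torchvision's pretrained ResNet-101 (ImageNet weights stand in for the detection-trained weights described in the paper); the five-parameter grasp-rectangle head and `cornell_loader` are hypothetical.

```python
import torch
import torch.nn as nn
from torchvision.models import resnet101

# Start from pretrained ResNet-101 weights (a stand-in for the detection-trained
# weights the paper describes) and replace the head with a grasp-rectangle
# regressor predicting (x, y, w, h, theta) -- an assumed parameterization.
model = resnet101(weights="IMAGENET1K_V1")
model.fc = nn.Linear(model.fc.in_features, 5)

optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)
criterion = nn.SmoothL1Loss()

def fine_tune(cornell_loader, epochs=10):
    # cornell_loader is a hypothetical DataLoader yielding (image, grasp_box)
    # pairs from the Cornell grasping dataset.
    model.train()
    for _ in range(epochs):
        for images, boxes in cornell_loader:
            loss = criterion(model(images), boxes)
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
```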
Source journal
CiteScore: 6.30
Self-citation rate: 12.90%
Annual publications: 100
Review time: 6 months
Journal description: The ASME Journal of Computing and Information Science in Engineering (JCISE) publishes articles related to Algorithms, Computational Methods, Computing Infrastructure, Computer-Interpretable Representations, Human-Computer Interfaces, Information Science, and/or System Architectures that aim to improve some aspect of the product and system lifecycle (e.g., design, manufacturing, operation, maintenance, disposal, recycling, etc.). Applications considered in JCISE manuscripts should be relevant to the mechanical engineering discipline. Papers can be focused on fundamental research leading to new methods, or on adaptation of existing methods for new applications.
Scope: Advanced Computing Infrastructure; Artificial Intelligence; Big Data and Analytics; Collaborative Design; Computer Aided Design; Computer Aided Engineering; Computer Aided Manufacturing; Computational Foundations for Additive Manufacturing; Computational Foundations for Engineering Optimization; Computational Geometry; Computational Metrology; Computational Synthesis; Conceptual Design; Cybermanufacturing; Cyber Physical Security for Factories; Cyber Physical System Design and Operation; Data-Driven Engineering Applications; Engineering Informatics; Geometric Reasoning; GPU Computing for Design and Manufacturing; Human Computer Interfaces/Interactions; Industrial Internet of Things; Knowledge Engineering; Information Management; Inverse Methods for Engineering Applications; Machine Learning for Engineering Applications; Manufacturing Planning; Manufacturing Automation; Model-based Systems Engineering; Multiphysics Modeling and Simulation; Multiscale Modeling and Simulation; Multidisciplinary Optimization; Physics-Based Simulations; Process Modeling for Engineering Applications; Qualification, Verification and Validation of Computational Models; Symbolic Computing for Engineering Applications; Tolerance Modeling; Topology and Shape Optimization; Virtual and Augmented Reality Environments; Virtual Prototyping