Accurate Robotic Grasp Detection with Angular Label Smoothing

IF 1.2 · CAS Tier 3 (Computer Science) · JCR Q4 (COMPUTER SCIENCE, HARDWARE & ARCHITECTURE)
Min Shi, Hao Lu, Zhao-Xin Li, Deng-Ming Zhu, Zhao-Qi Wang
DOI: 10.1007/s11390-022-1458-5 (https://doi.org/10.1007/s11390-022-1458-5)
Journal: Journal of Computer Science and Technology
Published: 2023-09-30 (Journal Article)
Citations: 0

Abstract

Grasp detection is a visual recognition task in which a robot uses its sensors to detect graspable objects in its environment. Despite steady progress in robotic grasping, it remains difficult to achieve both real-time performance and high accuracy in grasp detection. In this paper, we propose a real-time robotic grasp detection method that can accurately predict potential grasps for parallel-plate robotic grippers from RGB images. Our work employs an end-to-end convolutional neural network consisting of a feature descriptor and a grasp detector. For the first time, we add an attention mechanism to the grasp detection task, which enables the network to focus on grasp regions rather than the background. In addition, we present an angular label smoothing strategy in our grasp detection method to enhance the fault tolerance of the network. We quantitatively and qualitatively evaluate our grasp detection method from different aspects on the public Cornell dataset and Jacquard dataset. Extensive experiments demonstrate that our grasp detection method achieves performance superior to state-of-the-art methods. In particular, our grasp detection method ranked first on both the Cornell dataset and the Jacquard dataset, achieving accuracies of 98.9% and 95.6%, respectively, at real-time speed.
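The abstract does not give the exact formulation of the angular label smoothing strategy. As a minimal sketch of the general idea: grasp orientation is commonly discretized into angle bins, and because the angle is periodic, a smoothed label can spread probability mass to neighboring bins with circular wrap-around instead of using a hard one-hot target. The bin count, smoothing weight, and spread below are illustrative choices, not values from the paper.

```python
import numpy as np

def angular_label_smoothing(true_bin, num_bins=18, eps=0.1, spread=1):
    """Turn a one-hot angle label into a smoothed distribution.

    Grasp angles are periodic, so neighboring bins wrap around
    circularly (bin 0 neighbors bin num_bins - 1). `num_bins`,
    `eps`, and `spread` are hypothetical hyperparameters.
    """
    label = np.zeros(num_bins)
    label[true_bin] = 1.0 - eps
    # Distribute eps equally over `spread` neighbors on each side.
    share = eps / (2 * spread)
    for k in range(1, spread + 1):
        label[(true_bin + k) % num_bins] += share
        label[(true_bin - k) % num_bins] += share
    return label
```

A target built this way keeps the network from being penalized as harshly for predicting an angle bin adjacent to the ground truth, which is one plausible way to improve fault tolerance for near-correct angle predictions.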

Source Journal

Journal of Computer Science and Technology (Engineering & Technology – Computer Science: Software Engineering)
CiteScore: 4.00
Self-citation rate: 0.00%
Articles published: 2255
Review time: 9.8 months
Journal introduction: Journal of Computer Science and Technology (JCST), the first English-language journal in the computer field published in China, is an international forum for scientists and engineers involved in all aspects of computer science and technology to publish high-quality, refereed papers. Papers reporting original research and innovative applications from all parts of the world are welcome. Papers for publication in the journal are selected through rigorous peer review, to ensure originality, timeliness, relevance, and readability. While the journal emphasizes the publication of previously unpublished materials, selected conference papers with exceptional merit that require wider exposure are, at the discretion of the editors, also published, provided they meet the journal's peer review standards. The journal also seeks clearly written survey and review articles from experts in the field, to promote insightful understanding of the state of the art and technology trends. Topics covered by Journal of Computer Science and Technology include but are not limited to:
- Computer Architecture and Systems
- Artificial Intelligence and Pattern Recognition
- Computer Networks and Distributed Computing
- Computer Graphics and Multimedia
- Software Systems
- Data Management and Data Mining
- Theory and Algorithms
- Emerging Areas