Grasping Force Estimation for Markerless Visuotactile Sensors

IF 4.3 · Region 2 (comprehensive journals) · Q1 ENGINEERING, ELECTRICAL & ELECTRONIC
Julio Castaño-Amoros;Pablo Gil
{"title":"无标记视觉触觉传感器抓取力估计","authors":"Julio Castaño-Amoros;Pablo Gil","doi":"10.1109/JSEN.2024.3489052","DOIUrl":null,"url":null,"abstract":"Tactile sensors have been used for force estimation in the past, especially vision-based tactile sensors (VBTSs) have recently become a new trend due to their high spatial resolution and low cost. In this work, we have designed and implemented several approaches to estimate the normal grasping force using different types of markerless visuotactile representations obtained from VBTS. Our main goal is to determine the most appropriate visuotactile representation, based on a performance analysis during robotic grasping tasks. Our proposal has been tested on the dataset generated with our DIGIT sensors and another one obtained using GelSight Mini sensors from another state-of-the-art work. We have also tested the generalization capabilities of our best approach, called RGBmod. The results led to two main conclusions. First, the RGB visuotactile representation is a better input option than the depth image or a combination of the two for estimating normal grasping forces. Second, RGBmod achieved a good performance when tested on ten unseen everyday objects in real-world scenarios, achieving an average relative error (RE) of \n<inline-formula> <tex-math>$0.125~\\pm ~0.153$ </tex-math></inline-formula>\n. Furthermore, we show that our proposal outperforms other works in the literature that use RGB and depth information for the same task.","PeriodicalId":447,"journal":{"name":"IEEE Sensors Journal","volume":"24 24","pages":"42538-42548"},"PeriodicalIF":4.3000,"publicationDate":"2024-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10740944","citationCount":"0","resultStr":"{\"title\":\"Grasping Force Estimation for Markerless Visuotactile Sensors\",\"authors\":\"Julio Castaño-Amoros;Pablo Gil\",\"doi\":\"10.1109/JSEN.2024.3489052\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Tactile sensors have been used for force estimation in the past, especially vision-based tactile sensors (VBTSs) have recently become a new trend due to their high spatial resolution and low cost. In this work, we have designed and implemented several approaches to estimate the normal grasping force using different types of markerless visuotactile representations obtained from VBTS. Our main goal is to determine the most appropriate visuotactile representation, based on a performance analysis during robotic grasping tasks. Our proposal has been tested on the dataset generated with our DIGIT sensors and another one obtained using GelSight Mini sensors from another state-of-the-art work. We have also tested the generalization capabilities of our best approach, called RGBmod. The results led to two main conclusions. First, the RGB visuotactile representation is a better input option than the depth image or a combination of the two for estimating normal grasping forces. Second, RGBmod achieved a good performance when tested on ten unseen everyday objects in real-world scenarios, achieving an average relative error (RE) of \\n<inline-formula> <tex-math>$0.125~\\\\pm ~0.153$ </tex-math></inline-formula>\\n. 
Furthermore, we show that our proposal outperforms other works in the literature that use RGB and depth information for the same task.\",\"PeriodicalId\":447,\"journal\":{\"name\":\"IEEE Sensors Journal\",\"volume\":\"24 24\",\"pages\":\"42538-42548\"},\"PeriodicalIF\":4.3000,\"publicationDate\":\"2024-11-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10740944\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Sensors Journal\",\"FirstCategoryId\":\"103\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10740944/\",\"RegionNum\":2,\"RegionCategory\":\"综合性期刊\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"ENGINEERING, ELECTRICAL & ELECTRONIC\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Sensors Journal","FirstCategoryId":"103","ListUrlMain":"https://ieeexplore.ieee.org/document/10740944/","RegionNum":2,"RegionCategory":"综合性期刊","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Citations: 0

Abstract

Tactile sensors have been used for force estimation in the past; in particular, vision-based tactile sensors (VBTSs) have recently become a new trend due to their high spatial resolution and low cost. In this work, we have designed and implemented several approaches to estimate the normal grasping force using different types of markerless visuotactile representations obtained from VBTSs. Our main goal is to determine the most appropriate visuotactile representation, based on a performance analysis during robotic grasping tasks. Our proposal has been tested on a dataset generated with our DIGIT sensors and on another obtained with GelSight Mini sensors in another state-of-the-art work. We have also tested the generalization capabilities of our best approach, called RGBmod. The results led to two main conclusions. First, the RGB visuotactile representation is a better input option than the depth image, or a combination of the two, for estimating normal grasping forces. Second, RGBmod achieved good performance when tested on ten unseen everyday objects in real-world scenarios, achieving an average relative error (RE) of $0.125 \pm 0.153$. Furthermore, we show that our proposal outperforms other works in the literature that use RGB and depth information for the same task.
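The abstract reports the average relative error (RE) as a mean ± standard deviation but does not spell out the formula; below is a minimal sketch assuming the standard definition, |predicted − true| / |true| averaged over samples. The function name `relative_error` and the example force values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def relative_error(f_pred: np.ndarray, f_true: np.ndarray) -> tuple[float, float]:
    """Average relative error (RE) between predicted and ground-truth
    normal grasping forces, reported as mean +/- standard deviation.

    Assumes the standard definition |pred - true| / |true|; the paper's
    exact formulation may differ (see the full text).
    """
    re = np.abs(f_pred - f_true) / np.abs(f_true)  # per-sample relative error
    return float(re.mean()), float(re.std())

# Illustrative usage with made-up force readings in newtons:
pred = np.array([4.8, 10.5, 2.1])
true = np.array([5.0, 10.0, 2.0])
mean_re, std_re = relative_error(pred, true)
print(f"RE = {mean_re:.3f} +/- {std_re:.3f}")  # -> "RE = 0.047 +/- 0.005"
```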
Source Journal
IEEE Sensors Journal (Engineering Technology - Engineering: Electrical & Electronic)
CiteScore: 7.70
Self-citation rate: 14.00%
Annual publications: 2058
Review time: 5.2 months
Journal description: The fields of interest of the IEEE Sensors Journal are the theory, design, fabrication, manufacturing, and applications of devices for sensing and transducing physical, chemical, and biological phenomena, with emphasis on the electronics and physics aspects of sensors and integrated sensor-actuators. IEEE Sensors Journal deals with the following:
- Sensor Phenomenology, Modelling, and Evaluation
- Sensor Materials, Processing, and Fabrication
- Chemical and Gas Sensors
- Microfluidics and Biosensors
- Optical Sensors
- Physical Sensors: Temperature, Mechanical, Magnetic, and others
- Acoustic and Ultrasonic Sensors
- Sensor Packaging
- Sensor Networks
- Sensor Applications
- Sensor Systems: Signals, Processing, and Interfaces
- Actuators and Sensor Power Systems
- Sensor Signal Processing for high precision and stability (amplification, filtering, linearization, modulation/demodulation) and under harsh conditions (EMC, radiation, humidity, temperature); energy consumption/harvesting
- Sensor Data Processing (soft computing with sensor data, e.g., pattern recognition, machine learning, evolutionary computation; sensor data fusion; processing of wave, e.g., electromagnetic and acoustic, and non-wave, e.g., chemical, gravity, particle, thermal, radiative and non-radiative, sensor data; detection, estimation, and classification based on sensor data)
- Sensors in Industrial Practice