Active learning strategies for robotic tactile texture recognition tasks

Shemonto Das, Vinicius Prado da Fonseca, Amilcar Soares
{"title":"Active learning strategies for robotic tactile texture recognition tasks","authors":"Shemonto Das, Vinicius Prado da Fonseca, Amilcar Soares","doi":"10.3389/frobt.2024.1281060","DOIUrl":null,"url":null,"abstract":"Accurate texture classification empowers robots to improve their perception and comprehension of the environment, enabling informed decision-making and appropriate responses to diverse materials and surfaces. Still, there are challenges for texture classification regarding the vast amount of time series data generated from robots’ sensors. For instance, robots are anticipated to leverage human feedback during interactions with the environment, particularly in cases of misclassification or uncertainty. With the diversity of objects and textures in daily activities, Active Learning (AL) can be employed to minimize the number of samples the robot needs to request from humans, streamlining the learning process. In the present work, we use AL to select the most informative samples for annotation, thus reducing the human labeling effort required to achieve high performance for classifying textures. We also use a sliding window strategy for extracting features from the sensor’s time series used in our experiments. Our multi-class dataset (e.g., 12 textures) challenges traditional AL strategies since standard techniques cannot control the number of instances per class selected to be labeled. Therefore, we propose a novel class-balancing instance selection algorithm that we integrate with standard AL strategies. Moreover, we evaluate the effect of sliding windows of two-time intervals (3 and 6 s) on our AL Strategies. Finally, we analyze in our experiments the performance of AL strategies, with and without the balancing algorithm, regarding f1-score, and positive effects are observed in terms of performance when using our proposed data pipeline. Our results show that the training data can be reduced to 70% using an AL strategy regardless of the machine learning model and reach, and in many cases, surpass a baseline performance. Finally, exploring the textures with a 6-s window achieves the best performance, and using either Extra Trees produces an average f1-score of 90.21% in the texture classification data set.","PeriodicalId":504612,"journal":{"name":"Frontiers in Robotics and AI","volume":"9 8","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-02-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Frontiers in Robotics and AI","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.3389/frobt.2024.1281060","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Accurate texture classification empowers robots to improve their perception and comprehension of the environment, enabling informed decision-making and appropriate responses to diverse materials and surfaces. Still, texture classification remains challenging because of the vast amount of time series data generated by robots’ sensors. For instance, robots are expected to leverage human feedback during interactions with the environment, particularly in cases of misclassification or uncertainty. Given the diversity of objects and textures in daily activities, Active Learning (AL) can be employed to minimize the number of samples the robot needs to request from humans, streamlining the learning process. In the present work, we use AL to select the most informative samples for annotation, thus reducing the human labeling effort required to achieve high texture classification performance. We also use a sliding window strategy to extract features from the sensor time series used in our experiments. Our multi-class dataset (12 textures) challenges traditional AL strategies since standard techniques cannot control the number of instances per class selected for labeling. We therefore propose a novel class-balancing instance selection algorithm that we integrate with standard AL strategies. Moreover, we evaluate the effect of sliding windows of two time intervals (3 and 6 s) on our AL strategies. We then analyze the performance of the AL strategies, with and without the balancing algorithm, in terms of f1-score, and observe positive effects on performance when using our proposed data pipeline. Our results show that the training data can be reduced to 70% using an AL strategy regardless of the machine learning model, while reaching, and in many cases surpassing, the baseline performance. Finally, exploring the textures with a 6-s window achieves the best performance, and using Extra Trees produces an average f1-score of 90.21% on the texture classification dataset.
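
The abstract describes two technical components: sliding-window feature extraction over the tactile time series and a class-balancing instance selection step layered on top of a standard AL query strategy. The sketch below is a minimal, illustrative reading of that pipeline rather than the paper's actual implementation: it assumes pool-based active learning with least-confidence uncertainty sampling, simple statistical window features, and a per-class cap on queried instances. The function names (`window_features`, `balanced_uncertainty_query`), the synthetic data, and all parameter values are hypothetical stand-ins for the 12-texture dataset and the algorithm proposed in the paper.

```python
# Hedged sketch: sliding-window features plus class-balanced uncertainty
# sampling in a pool-based AL loop. Names and data are illustrative only.
import numpy as np
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.metrics import f1_score

def window_features(signal, fs, window_s=6.0):
    """Split a 1-D tactile signal into fixed-length windows (e.g., 3 s or 6 s)
    and compute simple statistical features per window."""
    win = int(window_s * fs)
    n_windows = len(signal) // win
    feats = []
    for i in range(n_windows):
        seg = signal[i * win:(i + 1) * win]
        feats.append([seg.mean(), seg.std(), seg.min(), seg.max()])
    return np.asarray(feats)

def balanced_uncertainty_query(model, X_pool, n_queries, n_classes, per_class_cap):
    """Rank unlabeled windows by least-confidence uncertainty and keep at most
    `per_class_cap` instances per *predicted* class, so that no single texture
    dominates the batch sent to the human annotator."""
    proba = model.predict_proba(X_pool)
    uncertainty = 1.0 - proba.max(axis=1)          # least-confidence score
    predicted = proba.argmax(axis=1)
    order = np.argsort(-uncertainty)               # most uncertain first
    picked, counts = [], np.zeros(n_classes, dtype=int)
    for idx in order:
        c = predicted[idx]
        if counts[c] < per_class_cap:
            picked.append(idx)
            counts[c] += 1
        if len(picked) == n_queries:
            break
    return np.asarray(picked)

# Illustrative AL loop; random vectors stand in for windowed tactile features
# that window_features() would produce from the raw sensor streams.
rng = np.random.default_rng(0)
X = rng.normal(size=(1200, 4))                     # windowed feature vectors
y = rng.integers(0, 12, size=1200)                 # 12 texture classes
labeled = rng.choice(len(X), size=60, replace=False)
pool = np.setdiff1d(np.arange(len(X)), labeled)

model = ExtraTreesClassifier(n_estimators=200, random_state=0)
for _ in range(5):                                 # a few query rounds
    model.fit(X[labeled], y[labeled])
    q = balanced_uncertainty_query(model, X[pool], n_queries=36,
                                   n_classes=12, per_class_cap=3)
    labeled = np.concatenate([labeled, pool[q]])   # annotator supplies labels
    pool = np.delete(pool, q)

print("macro f1 on remaining pool:",
      f1_score(y[pool], model.predict(X[pool]), average="macro"))
```

In this sketch the per-class cap is applied to the model's predicted class, since the true labels of pool instances are unknown until the annotator provides them; the balancing algorithm proposed in the paper may distribute queries differently.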