Robotic Odor Source Localization via Vision and Olfaction Fusion Navigation Algorithm

Sunzid Hassan, Lingxiao Wang, Khan Raqib Mahmud
Journal: Sensors (Basel, Switzerland)
Published: 2024-04-01 (Journal Article)
DOI: 10.3390/s24072309

Abstract

Robotic odor source localization (OSL) is a technology that enables mobile robots or autonomous vehicles to find an odor source in unknown environments. An effective navigation algorithm that guides the robot to approach the odor source is the key to successfully locating the odor source. While traditional OSL approaches primarily utilize an olfaction-only strategy, guiding robots to find the odor source by tracing emitted odor plumes, our work introduces a fusion navigation algorithm that combines both vision and olfaction-based techniques. This hybrid approach addresses challenges such as turbulent airflow, which disrupts olfaction sensing, and physical obstacles inside the search area, which may impede vision detection. In this work, we propose a hierarchical control mechanism that dynamically shifts the robot’s search behavior among four strategies: Crosswind Maneuver, Obstacle-Avoid Navigation, Vision-Based Navigation, and Olfaction-Based Navigation. Our methodology includes a custom-trained deep-learning model for visual target detection and a moth-inspired algorithm for Olfaction-Based Navigation. To assess the effectiveness of our approach, we implemented the proposed algorithm on a mobile robot in a search environment with obstacles. Experimental results demonstrate that our Vision and Olfaction Fusion algorithm significantly outperforms vision-only and olfaction-only methods, reducing average search time by 54% and 30%, respectively.
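The hierarchical control mechanism described above can be pictured as a priority-based behavior switcher. The following is a minimal illustrative sketch, not the paper's implementation: the four strategy names come from the abstract, but the priority ordering, function names, and boolean sensor interface are assumptions made for illustration only.

```python
# Hypothetical sketch of the hierarchical behavior switcher described in the
# abstract. Strategy names come from the paper; the priority order and the
# sensor interface are assumptions for illustration.

from enum import Enum, auto

class Strategy(Enum):
    OBSTACLE_AVOID = auto()   # Obstacle-Avoid Navigation
    VISION_BASED = auto()     # Vision-Based Navigation
    OLFACTION_BASED = auto()  # Olfaction-Based Navigation (moth-inspired)
    CROSSWIND = auto()        # crosswind maneuver to reacquire the plume

def select_strategy(obstacle_ahead: bool,
                    target_in_view: bool,
                    odor_detected: bool) -> Strategy:
    """Pick a navigation behavior from current sensor readings.

    Assumed ordering: avoid obstacles before anything else; prefer
    vision over olfaction when the visual target detector fires;
    with no cue at all, sweep crosswind to reacquire the odor plume.
    """
    if obstacle_ahead:
        return Strategy.OBSTACLE_AVOID
    if target_in_view:
        return Strategy.VISION_BASED
    if odor_detected:
        return Strategy.OLFACTION_BASED
    return Strategy.CROSSWIND

# Example: odor sensed but no visual contact -> olfaction-based navigation
print(select_strategy(False, False, True).name)  # OLFACTION_BASED
```

In a real fusion controller the booleans would be replaced by the outputs of the deep-learning target detector, the obstacle sensor, and the chemical sensor, evaluated at each control step.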