Visual application of navigation framework in cyber-physical system for mobile robot to prevent disease

Impact Factor 2.3 · JCR Q2, Computer Science · CAS Zone 4
Thanh Phuong Nguyen, Hung Nguyen, H. Ngo
{"title":"Visual application of navigation framework in cyber-physical system for mobile robot to prevent disease","authors":"Thanh Phuong Nguyen, Hung Nguyen, H. Ngo","doi":"10.1177/17298806231162202","DOIUrl":null,"url":null,"abstract":"In this article, we propose the visual application of a navigation framework for a wheeled robot to disinfect surfaces. Since dynamic environments are complicated, advanced sensors are integrated into the hardware platform to enhance the navigation task. The 2D lidar UTM-30LX from Hokuyo attached to the front of the robot can cover a wide scanning area. To provide better results in laser scan matching, an inertial measurement unit was integrated into the robot’s body. The output of this combination feeds into a global costmap for monitoring and navigation. Additionally, incremental encoders that obtain high-resolution position data are connected to the rear wheels. The role of the positioning sensor is to identify the existing location of the robot in a local costmap. To detect the appearance of a human, a Kinect digital camera is fixed to the top of the robot. All feedback signals are combined in the host computer to navigate the autonomous robot. For disinfection missions, the robot must carry several ultraviolet lamps to autonomously patrol in unknown environments. To visualize the robot’s effectiveness, our approach was validated using both a virtual simulation and an experimental test. The contributions of this work are summarized as follows: (i) a structure for ultraviolet-based hardware was first established; (ii) the theoretical computations for the robot’s localization in the 3D workspace will play a fundamental role in further developments; and (iii) data fusion from advanced sensing devices was integrated to enable navigation in uncertain environments.","PeriodicalId":50343,"journal":{"name":"International Journal of Advanced Robotic Systems","volume":" ","pages":""},"PeriodicalIF":2.3000,"publicationDate":"2023-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Advanced Robotic Systems","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1177/17298806231162202","RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"Computer Science","Score":null,"Total":0}
引用次数: 3

Abstract

In this article, we propose the visual application of a navigation framework for a wheeled robot that disinfects surfaces. Because dynamic environments are complex, advanced sensors are integrated into the hardware platform to support the navigation task. A Hokuyo UTM-30LX 2D lidar mounted at the front of the robot covers a wide scanning area. To improve laser scan matching, an inertial measurement unit is integrated into the robot’s body. The output of this combination feeds a global costmap used for monitoring and navigation. Additionally, incremental encoders that provide high-resolution position data are attached to the rear wheels. The role of this positioning sensor is to locate the robot within a local costmap. To detect the presence of a human, a Kinect digital camera is fixed to the top of the robot. All feedback signals are fused in the host computer to navigate the autonomous robot. For disinfection missions, the robot carries several ultraviolet lamps while autonomously patrolling unknown environments. To demonstrate the robot’s effectiveness, our approach was validated in both a virtual simulation and an experimental test. The contributions of this work are summarized as follows: (i) a structure for the ultraviolet-based hardware is established; (ii) theoretical computations for the robot’s localization in the 3D workspace lay a foundation for further developments; and (iii) data from advanced sensing devices is fused to enable navigation in uncertain environments.
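The abstract does not include implementation details. As a rough illustration of the sensor-fusion step it describes (rear-wheel encoders plus an IMU yielding a pose that indexes into a local costmap), the following minimal Python sketch shows one way such a fusion could be written for a differential-drive platform. All constants, function names, and the strategy of letting the IMU yaw override the integrated heading are illustrative assumptions, not the authors' implementation.

```python
import math

# Illustrative parameters (assumptions, not values from the paper)
WHEEL_RADIUS = 0.075   # wheel radius in metres
WHEEL_BASE = 0.40      # distance between the rear wheels in metres
TICKS_PER_REV = 2048   # incremental encoder resolution
CELL_SIZE = 0.05       # metres per local-costmap cell
GRID_SIZE = 200        # 10 m x 10 m local costmap, robot at the centre


def ticks_to_distance(ticks: int) -> float:
    """Convert incremental encoder ticks to travelled distance in metres."""
    return 2.0 * math.pi * WHEEL_RADIUS * ticks / TICKS_PER_REV


def update_pose(pose, left_ticks, right_ticks, imu_yaw=None):
    """Dead-reckon an (x, y, theta) pose from rear-wheel encoder ticks.

    If an IMU yaw estimate is available it replaces the integrated heading,
    which is one simple way to combine the two sensors.
    """
    x, y, theta = pose
    d_left = ticks_to_distance(left_ticks)
    d_right = ticks_to_distance(right_ticks)
    d_center = 0.5 * (d_left + d_right)
    d_theta = (d_right - d_left) / WHEEL_BASE

    theta = imu_yaw if imu_yaw is not None else theta + d_theta
    x += d_center * math.cos(theta)
    y += d_center * math.sin(theta)
    return (x, y, theta)


def pose_to_cell(pose):
    """Map a metric pose into local-costmap (row, col) indices."""
    x, y, _ = pose
    col = int(GRID_SIZE / 2 + x / CELL_SIZE)
    row = int(GRID_SIZE / 2 + y / CELL_SIZE)
    return row, col


if __name__ == "__main__":
    pose = (0.0, 0.0, 0.0)
    # One control cycle: both wheels advance, IMU reports a small heading change.
    pose = update_pose(pose, left_ticks=512, right_ticks=540, imu_yaw=0.02)
    print("pose:", pose, "cell:", pose_to_cell(pose))
```

In practice a filter (e.g. an extended Kalman filter) would weight the encoder and IMU estimates rather than letting one override the other; the hard override above is only to keep the sketch short.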
Source journal metrics: CiteScore 6.50 · Self-citation rate 0.00% · Articles published per year: 65 · Review time: 6 months
Journal description: International Journal of Advanced Robotic Systems (IJARS) is a JCR-ranked, peer-reviewed open access journal covering the full spectrum of robotics research. The journal is addressed to both practicing professionals and researchers in the field of robotics and its specialty areas. IJARS features fourteen topic areas, each headed by a Topic Editor-in-Chief, integrating all aspects of research in robotics under the journal's domain.