Visual application of navigation framework in cyber-physical system for mobile robot to prevent disease

Thanh Phuong Nguyen, Hung Nguyen, H. Ngo
International Journal of Advanced Robotic Systems, March 2023. DOI: 10.1177/17298806231162202

In this article, we propose the visual application of a navigation framework for a wheeled robot that disinfects surfaces. Because dynamic environments are complex, advanced sensors are integrated into the hardware platform to support the navigation task. A Hokuyo UTM-30LX 2D lidar mounted at the front of the robot covers a wide scanning area. To improve laser scan matching, an inertial measurement unit was integrated into the robot’s body; the output of this combination feeds a global costmap used for monitoring and navigation. In addition, incremental encoders that provide high-resolution position data are attached to the rear wheels. This positioning sensor identifies the robot’s current location within a local costmap. To detect the presence of a human, a Kinect camera is fixed to the top of the robot. All feedback signals are fused in the host computer to navigate the autonomous robot. For disinfection missions, the robot carries several ultraviolet lamps while autonomously patrolling unknown environments. To demonstrate the robot’s effectiveness, the approach was validated in both a virtual simulation and an experimental test. The contributions of this work are summarized as follows: (i) a hardware structure for ultraviolet-based disinfection was established; (ii) theoretical computations for the robot’s localization in the 3D workspace lay a foundation for further developments; and (iii) data from advanced sensing devices was fused to enable navigation in uncertain environments.
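The abstract states that incremental encoders on the rear wheels supply position data for localization in the local costmap, but does not give the robot's geometry or the exact odometry computation. As a minimal sketch of how such encoder ticks are typically turned into a pose estimate, the following assumes a differential-drive model with illustrative placeholder values for wheel radius, track width, and encoder resolution (none of which are specified in the paper):

```python
import math

def update_pose(x, y, theta, ticks_left, ticks_right,
                ticks_per_rev=2048, wheel_radius=0.075, wheel_base=0.40):
    """Dead-reckoning pose update from incremental rear-wheel encoder ticks.

    ticks_per_rev, wheel_radius (m), and wheel_base (m) are hypothetical
    parameters for illustration; the actual robot's values are not given.
    """
    # Convert tick counts to wheel travel distances
    dist_per_tick = 2.0 * math.pi * wheel_radius / ticks_per_rev
    d_left = ticks_left * dist_per_tick
    d_right = ticks_right * dist_per_tick

    # Differential-drive kinematics: mean travel and heading change
    d_center = (d_left + d_right) / 2.0
    d_theta = (d_right - d_left) / wheel_base

    # Integrate the pose, evaluating heading at the interval midpoint
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    theta = (theta + d_theta) % (2.0 * math.pi)
    return x, y, theta
```

In a system like the one described, this encoder-only estimate would be one input to the sensor fusion on the host computer, corrected by the lidar/IMU scan matching that feeds the global costmap.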
Journal description:
International Journal of Advanced Robotic Systems (IJARS) is a JCR-ranked, peer-reviewed open access journal covering the full spectrum of robotics research. The journal is addressed to both practicing professionals and researchers in the field of robotics and its specialty areas. IJARS features fourteen topic areas, each headed by a Topic Editor-in-Chief, integrating all aspects of research in robotics under the journal's domain.