{"title":"Testing a Vision-Based Autonomous Drone Navigation Model in a Forest Environment","authors":"Alvin Lee, Suet-Peng Yong, W. Pedrycz, J. Watada","doi":"10.3390/a17040139","DOIUrl":null,"url":null,"abstract":"Drones play a pivotal role in various industries of Industry 4.0. For achieving the application of drones in a dynamic environment, finding a clear path for their autonomous flight requires more research. This paper addresses the problem of finding a navigation path for an autonomous drone based on visual scene information. A deep learning-based object detection approach can localize obstacles detected in a scene. Considering this approach, we propose a solution framework that includes masking with a color-based segmentation method to identify an empty area where the drone can fly. The scene is described using segmented regions and localization points. The proposed approach can be used to remotely guide drones in dynamic environments that have poor coverage from global positioning systems. The simulation results show that the proposed framework with object detection and the proposed masking technique support drone navigation in a dynamic environment based only on the visual input from the front field of view.","PeriodicalId":7636,"journal":{"name":"Algorithms","volume":null,"pages":null},"PeriodicalIF":1.8000,"publicationDate":"2024-03-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Algorithms","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.3390/a17040139","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Abstract
Drones play a pivotal role in many Industry 4.0 applications. Deploying them in dynamic environments, however, requires further research on how a drone can find a clear path for autonomous flight. This paper addresses the problem of finding a navigation path for an autonomous drone based on visual scene information. A deep learning-based object detection approach can localize obstacles detected in a scene. Building on this approach, we propose a solution framework that combines object detection with masking via a color-based segmentation method to identify an empty area into which the drone can fly. The scene is described using the segmented regions and localization points. The proposed approach can be used to remotely guide drones in dynamic environments with poor global positioning system coverage. The simulation results show that the proposed framework, combining object detection with the proposed masking technique, supports drone navigation in a dynamic environment using only the visual input from the front field of view.
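The abstract does not specify implementation details, so the following is only a minimal sketch of the general idea of color-based segmentation masking for finding a flyable region, written with OpenCV in Python. The HSV thresholds, the function names (free_space_mask, waypoint_from_mask), and the input file path are illustrative assumptions, not the authors' actual method or parameters.

```python
import cv2
import numpy as np

# Hypothetical HSV range for "open sky / gap" pixels; thresholds are illustrative only.
LOWER_FREE = np.array([90, 20, 120])
UPPER_FREE = np.array([135, 90, 255])


def free_space_mask(frame_bgr: np.ndarray) -> np.ndarray:
    """Return a binary mask of candidate obstacle-free pixels via color thresholding."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_FREE, UPPER_FREE)
    # Morphological opening removes small speckles so only coherent empty regions remain.
    kernel = np.ones((5, 5), np.uint8)
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)


def waypoint_from_mask(mask: np.ndarray):
    """Pick the centroid of the largest free region as a candidate navigation target."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])


if __name__ == "__main__":
    frame = cv2.imread("front_view.jpg")  # hypothetical front-camera frame
    mask = free_space_mask(frame)
    print("candidate waypoint (pixel coords):", waypoint_from_mask(mask))
```

In the paper's framework, such a mask would be combined with the bounding boxes produced by the deep learning object detector, so that regions flagged as obstacles are excluded before a waypoint is chosen; the sketch above shows only the color-segmentation side of that combination.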