{"title":"Indoor Drone 3-D Tracking Using Reflected Light From Floor Surfaces","authors":"Yusei Onishi;Hiroki Watanabe;Masanari Nakamura;Hiromichi Hashizume;Masanori Sugimoto","doi":"10.1109/JISPIN.2024.3453775","DOIUrl":null,"url":null,"abstract":"Because of the drone's penetration into our society, the demand for their indoor positioning has increased. However, its standard technology has not been established yet. This article describes an indoor 3-D tracking method for drones, using the drone's built-in camera to capture light reflected from the floor. Using a captured image and video data captured during the drone's flight, the proposed method can estimate the drone's position and trajectory. A drone's built-in camera is usually unable to capture light directly from ceiling light sources because of its limited field of view and gimbal angles. To address this problem, the proposed method captures the light indirectly, as the reflections from the floor of ceiling light-emitting diodes (LEDs), in the video stream acquired by its rolling-shutter camera. The 3-D position is estimated by calculating the received signal strength of each individual LED for a single video frame during the flight and fitting this data to a model generated by simulation images. In an indoor environment without external lights, we captured the reflected light from floor surfaces using the drone's camera under gimbal control and analyzed the captured images offline. Experimental results gave an absolute error of 0.34 m at the 90th percentile for 3-D positioning when hovering and using a single-frame image. For a linear flight path, the error was 0.31 m. The computation time for 3-D position estimation was 1.12 s. We also discussed limitations related to real-time and real-world applications, together with approaches to addressing these limitations.","PeriodicalId":100621,"journal":{"name":"IEEE Journal of Indoor and Seamless Positioning and Navigation","volume":"2 ","pages":"251-262"},"PeriodicalIF":0.0000,"publicationDate":"2024-09-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10664003","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Journal of Indoor and Seamless Positioning and Navigation","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/10664003/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
As drones become increasingly widespread in society, the demand for indoor drone positioning has grown; however, no standard technology for it has yet been established. This article describes an indoor 3-D tracking method for drones that uses the drone's built-in camera to capture light reflected from the floor. From a single captured image and from video data recorded during flight, the proposed method estimates the drone's position and trajectory, respectively. A drone's built-in camera usually cannot capture light directly from ceiling light sources because of its limited field of view and gimbal angles. To address this problem, the proposed method captures the light indirectly, as floor reflections of ceiling light-emitting diodes (LEDs), in the video stream acquired by the drone's rolling-shutter camera. The 3-D position is estimated by calculating the received signal strength of each individual LED in a single video frame during flight and fitting these data to a model generated from simulation images. In an indoor environment without external lights, we captured the light reflected from floor surfaces using the drone's camera under gimbal control and analyzed the captured images offline. Experimental results gave an absolute error of 0.34 m at the 90th percentile for 3-D positioning when hovering and using a single-frame image; for a linear flight path, the error was 0.31 m. The computation time for 3-D position estimation was 1.12 s. We also discuss limitations related to real-time and real-world applications, together with approaches to addressing them.
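The pipeline described in the abstract (recovering a per-LED received signal strength, or RSS, from a single rolling-shutter frame and then matching it against a simulation-derived model) can be sketched roughly as below. This is a minimal illustrative sketch, not the authors' implementation: the modulation frequencies, row-readout rate, grid-matching scheme, and all function names are assumptions introduced for illustration only.

```python
import numpy as np

# Assumed, hypothetical parameters (the paper's actual values are not given
# in the abstract): distinct modulation frequencies for each ceiling LED and
# the camera's rolling-shutter row-readout rate.
LED_FREQS_HZ = [1000.0, 1200.0, 1400.0, 1600.0]
ROW_RATE_HZ = 30000.0


def estimate_rss(frame: np.ndarray) -> np.ndarray:
    """Recover one RSS value per LED from a grayscale rolling-shutter frame.

    A rolling shutter exposes rows sequentially, so temporal LED modulation
    appears as horizontal banding; the FFT magnitude of the row-wise mean
    intensity at each LED's modulation frequency serves as that LED's RSS.
    """
    row_signal = frame.mean(axis=1)              # average each row -> 1-D "time" series
    row_signal = row_signal - row_signal.mean()  # remove the DC component
    spectrum = np.abs(np.fft.rfft(row_signal))
    freqs = np.fft.rfftfreq(row_signal.size, d=1.0 / ROW_RATE_HZ)
    # Take the spectral magnitude at the bin nearest each LED's frequency.
    return np.array([spectrum[np.argmin(np.abs(freqs - f))] for f in LED_FREQS_HZ])


def fit_position(rss: np.ndarray,
                 grid_points: np.ndarray,  # (N, 3) candidate 3-D positions
                 grid_rss: np.ndarray      # (N, n_leds) simulated RSS per candidate
                 ) -> np.ndarray:
    """Return the grid position whose simulated RSS best matches the measurement."""
    meas = rss / np.linalg.norm(rss)
    model = grid_rss / np.linalg.norm(grid_rss, axis=1, keepdims=True)
    best = np.argmin(np.linalg.norm(model - meas, axis=1))
    return grid_points[best]
```

Normalizing the measured and simulated RSS vectors makes the match depend on the relative signal pattern across LEDs rather than on absolute brightness; an actual implementation would additionally need radiometric calibration and a finer search than a simple nearest-grid-point lookup.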