Cheng Tang, Ryota Inoue, Kohei Oshio, M. Tsujimoto, K. Taniguchi, N. Kubota
Published in: 2020 IEEE Symposium Series on Computational Intelligence (SSCI), December 2020
DOI: 10.1109/SSCI47803.2020.9308595
Automation of Illuminance Measurement in a Large Scene by an Autonomous Mobile Robot
In recent years, the labor shortage caused by Japan's aging population and low birth rate has driven extensive research on autonomous mobile robots. Such robots are valuable in many industries, including illuminance measurement. Illuminance measurement is time-consuming, and accuracy in large environments remains a problem because of accumulated localization error. To improve the accuracy of illuminance measurement by an autonomous mobile robot in a large environment, various methods have been proposed, including loop closing, sensor fusion, and motion analysis. In this paper, we propose a method that reduces the accumulated error while simultaneously measuring the illuminance of the surroundings. The proposed method is based on an occupancy grid map and an evolution strategy (ES). We use data gathered by laser range finders to calculate the fitness of the robot's position in both the ground-truth map and the constructed map. By monitoring the fitness of the robot's position, the evolution strategy adjusts the position estimate to overcome the accumulated error. The proposed method is analyzed and evaluated in terms of accuracy through a series of real-robot experiments in real-world environments.
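The core idea described in the abstract — scoring a candidate robot pose by how well its laser-scan endpoints match an occupancy grid map, then letting an evolution strategy correct a drifted pose estimate — can be sketched as follows. This is a minimal toy under stated assumptions: the grid, the endpoint-matching fitness, the (1+λ) ES parameters, and all names are illustrative, not the authors' implementation.

```python
# Toy (1+lambda) evolution strategy for correcting accumulated pose error
# against an occupancy grid map. All parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Occupancy grid map: 1 = occupied (wall), 0 = free. A 20x20 walled room.
grid = np.zeros((20, 20), dtype=int)
grid[0, :] = grid[-1, :] = grid[:, 0] = grid[:, -1] = 1

def ray_range(pose, bearing, max_r=15.0, step=0.1):
    """March a ray from the pose until it hits an occupied cell."""
    x, y, theta = pose
    r = step
    while r < max_r:
        cx = int(round(x + r * np.cos(theta + bearing)))
        cy = int(round(y + r * np.sin(theta + bearing)))
        if grid[cx, cy]:
            return r
        r += step
    return max_r

def fitness(pose, bearings, ranges):
    """Fraction of scan endpoints that land on occupied map cells."""
    x, y, theta = pose
    gx = np.round(x + ranges * np.cos(theta + bearings)).astype(int)
    gy = np.round(y + ranges * np.sin(theta + bearings)).astype(int)
    ok = (gx >= 0) & (gx < grid.shape[0]) & (gy >= 0) & (gy < grid.shape[1])
    return grid[gx[ok], gy[ok]].sum() / len(bearings)

# Simulate a laser scan taken from the robot's true pose.
true_pose = np.array([10.0, 10.0, 0.0])
bearings = np.linspace(0.0, 2.0 * np.pi, 36, endpoint=False)
ranges = np.array([ray_range(true_pose, b) for b in bearings])

# The odometry estimate has drifted; its fitness drops accordingly.
drifted = true_pose + np.array([1.0, -0.8, 0.05])
drift_f = fitness(drifted, bearings, ranges)

# (1+lambda) ES: keep the elite, mutate it, accept only improvements.
best, best_f = drifted.copy(), drift_f
sigma = np.array([0.5, 0.5, 0.05])   # mutation scale per pose dimension
for _ in range(200):                 # generations
    for _ in range(10):              # lambda offspring per generation
        cand = best + rng.normal(0.0, sigma)
        f = fitness(cand, bearings, ranges)
        if f > best_f:
            best, best_f = cand, f

print(f"fitness before correction: {drift_f:.2f}, after: {best_f:.2f}")
```

Because the elite is never discarded, the monitored fitness is non-decreasing, which mirrors the paper's idea of triggering ES-based adjustments whenever the position fitness degrades.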