Authors: Asad Ali, Sarah Gul, Tallat Mahmood, A. Ullah
DOI: 10.1109/ICRAI57502.2023.10089589
Published: 2023-03-03, 2023 International Conference on Robotics and Automation in Industry (ICRAI)
Exploration of Unknown Environment using Deep Reinforcement Learning
Exploring an unknown environment is a crucial task in situations where human life is at risk, such as search-and-rescue operations, abandoned nuclear plants, and covert operations. Autonomous robots could serve this task efficiently. Existing methods use uncertainty models for localization and map building to explore unknown areas, which demand substantial onboard computation and time. We propose using Deep Reinforcement Learning (DRL) for the autonomous exploration of unknown environments. In DRL, the agent interacts with the environment and learns from experience (feedback/reward). We propose extrinsic and curiosity-driven reward functions for exploration. The curiosity-driven reward motivates the agent to explore unseen areas by predicting future states, while the extrinsic reward helps the agent avoid collisions. We train a differential-drive robot in one environment and evaluate its performance in another, unknown environment. We observe that the curiosity-driven reward function outperformed the extrinsic reward by exploring more of the unknown environment. The test results demonstrate the generalization capability of the proposed methods for exploring unknown environments.
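The curiosity-driven reward described above, in which the agent is rewarded for transitions its model cannot yet predict, can be sketched as follows. This is a minimal illustrative implementation, not the authors' code: it assumes a simple linear forward model whose squared prediction error serves as the intrinsic reward, so the reward is high in unvisited regions of the state space and decays as the model learns them.

```python
import numpy as np

class ForwardModelCuriosity:
    """Toy curiosity signal: intrinsic reward = squared error of a linear
    forward model that predicts the next state from (state, action)."""

    def __init__(self, state_dim, action_dim, lr=0.01, seed=0):
        rng = np.random.default_rng(seed)
        # Linear forward model: next_state ~ W @ [state; action]
        self.W = rng.normal(scale=0.1, size=(state_dim, state_dim + action_dim))
        self.lr = lr

    def intrinsic_reward(self, state, action, next_state):
        x = np.concatenate([state, action])
        pred = self.W @ x
        err = next_state - pred
        # Update the forward model by gradient descent on the squared error,
        # so repeatedly seen transitions become predictable (less rewarding).
        self.W += self.lr * np.outer(err, x)
        # Prediction error is the curiosity reward: high for novel transitions.
        return float(np.mean(err ** 2))
```

Visiting the same transition repeatedly drives the curiosity reward toward zero, which is what pushes the agent toward unseen areas; in practice this intrinsic term would be combined with the extrinsic (collision-avoidance) reward when training the policy.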