A General Framework for Human-Drone Interaction under Limited On-board Sensing
S. Nayhouse, S. Chadha, P. Hourican, C. Moore, N. Bezzo
2023 Systems and Information Engineering Design Symposium (SIEDS), 2023-04-27. DOI: 10.1109/SIEDS58326.2023.10137774
Recent advancements in unmanned aerial vehicles (UAVs) have enabled their deployment in numerous applications such as aerial photography, infrastructure inspection, search and rescue, and surveillance. Despite the potential for full autonomy, many applications still require human operators for navigating complex environments and for decision-making. Existing solutions often employ high-precision sensors such as 2-D or 3-D LiDAR, which may provide more data than necessary and increase system complexity and cost. To address these challenges and bridge the gap between fully autonomous and human-controlled UAVs, this work develops a shared-autonomy framework for UAVs that leverages lightweight, low-cost 1-D LiDAR sensors combined with mobility behaviors to achieve performance comparable to more advanced 2-D/3-D LiDAR sensors while minimizing energy, computation overhead, and weight. Our framework includes a novel state-machine method that exploits UAV mobility to compensate for the limitations of 1-D LiDAR sensors, ensuring safety and obstacle avoidance through a physics-based algorithm that transitions between teleoperation and autonomous modes as needed based on environmental conditions and safety-critical events. Experimental validation on real UAVs demonstrates the effectiveness of this shared-autonomy scheme in complex environments, and the system is further generalized to larger UAVs and prototyped with a custom sensor configuration and onboard obstacle avoidance.
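The core mechanism the abstract describes, a state machine that hands control to an autonomous avoidance behavior when 1-D LiDAR range readings indicate a nearby obstacle and returns to teleoperation once the path is clear, can be illustrated with a minimal sketch. The code below is not the authors' implementation: the Mode and Reading types, the braking-distance rule, and all thresholds are assumptions introduced only to make the transition logic concrete.

# Minimal sketch (hypothetical, not the paper's implementation) of a
# teleoperation/autonomous mode switch driven by a forward-facing 1-D LiDAR.
from dataclasses import dataclass
from enum import Enum, auto


class Mode(Enum):
    TELEOP = auto()      # operator commands pass through
    AUTONOMOUS = auto()  # onboard avoidance behavior takes over


@dataclass
class Reading:
    distance_m: float    # range from a forward-facing 1-D LiDAR
    velocity_mps: float  # current UAV speed toward the obstacle


def stopping_distance(v: float, max_decel: float = 3.0) -> float:
    """Physics-based safety margin: distance needed to brake from speed v
    at a constant (assumed) maximum deceleration, d = v^2 / (2 * a)."""
    return v * v / (2.0 * max_decel)


def next_mode(mode: Mode, r: Reading, margin_m: float = 0.5) -> Mode:
    """Transition rule: switch to AUTONOMOUS when the measured range falls
    inside the braking distance plus a margin; switch back once clear."""
    unsafe = r.distance_m < stopping_distance(r.velocity_mps) + margin_m
    if mode is Mode.TELEOP and unsafe:
        return Mode.AUTONOMOUS
    if mode is Mode.AUTONOMOUS and not unsafe:
        return Mode.TELEOP
    return mode


if __name__ == "__main__":
    mode = Mode.TELEOP
    for reading in [Reading(5.0, 2.0), Reading(1.0, 2.0), Reading(4.0, 0.5)]:
        mode = next_mode(mode, reading)
        print(reading, "->", mode.name)

A braking-distance threshold is one plausible reading of the abstract's "physics-based algorithm"; the paper's actual transition conditions may differ.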