Autonomous wheelchair navigation in unmapped indoor environments
H. Grewal, Neha Thotappala Jayaprakash, Aaron Matthews, Chinmay Shrivastav, K. George
2018 IEEE International Instrumentation and Measurement Technology Conference (I2MTC), published 2018-05-14
DOI: 10.1109/I2MTC.2018.8409854
Citations: 10
Abstract
Recent developments in robot automation have fostered the development of many assistive devices to improve the quality of life for individuals with disabilities. Notable among these devices are autonomous wheelchairs, which are capable of navigating to given destinations while avoiding obstacles. However, destination selection and navigation in unmapped indoor environments remain a challenge for these autonomous wheelchairs. In this work, a novel approach to selecting a destination for an autonomous wheelchair in an unmapped indoor environment using a camera, ranging LIDAR, and computer vision is presented. The system scans the environment at startup and compiles a list of possible destinations from which the user can easily make a selection. The proposed system was tested in a simulated shopping mall environment where destinations included various stores. The computer vision system was tested with images of store-fronts at various distances and angles. Ten trials were conducted to test the navigation system with destinations at close, mid, and long range. The system successfully navigated to the destination in 100% of the trials for close-range destinations and 90% of the trials for mid-range and long-range destinations. Based on these results, we conclude that the proposed design is a promising means of destination selection for autonomous wheelchairs in unmapped indoor environments for individuals with severe disabilities.
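The startup scan described above — fusing camera-based storefront recognition with LIDAR ranging to build a selectable destination list — can be sketched in outline. This is a minimal illustration, not the authors' implementation: the `Detection` type, the `lidar_range_at` callback, and `compile_destinations` are hypothetical names, and the real system's vision and ranging details are not specified in the abstract.

```python
from dataclasses import dataclass
from typing import Callable, List, Optional, Tuple

@dataclass
class Detection:
    """A storefront sign recognized by the (hypothetical) vision stage."""
    label: str          # e.g. the store name read from its sign
    bearing_deg: float  # direction of the sign relative to the wheelchair

def compile_destinations(
    detections: List[Detection],
    lidar_range_at: Callable[[float], Optional[float]],
) -> List[Tuple[str, float, float]]:
    """Pair each vision detection with the LIDAR range along its bearing,
    producing (label, bearing, distance) candidates sorted nearest-first.
    Detections with no valid LIDAR return are dropped."""
    destinations = []
    for d in detections:
        dist = lidar_range_at(d.bearing_deg)
        if dist is not None:
            destinations.append((d.label, d.bearing_deg, dist))
    return sorted(destinations, key=lambda item: item[2])

# Example with stand-in data: two stores detected during the startup scan.
detections = [Detection("Bookstore", 45.0), Detection("Cafe", -30.0)]
ranges = {45.0: 12.5, -30.0: 4.2}
menu = compile_destinations(detections, lambda b: ranges.get(b))
# The user would then pick one entry from `menu` as the navigation goal.
```

Sorting by distance is one plausible way to order the menu for the user; the abstract does not state how the actual list is presented.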