{"title":"Mobile-Based Navigation Assistant for Visually Impaired Person with Real-time Obstacle Detection Using YOLO-based Deep Learning Algorithm","authors":"G. Catedrilla","doi":"10.1109/ACMLC58173.2022.00020","DOIUrl":null,"url":null,"abstract":"This project mainly aims to develop a mobile-based application for navigation with real-time obstacle detection to provide fair access to people with visual impairment to some activities, specifically navigating outdoors. It is a navigation mobile application equipped with speech and gesture recognition, to allow the people with visual impairment to access and use the application, and obstacle detection to provide audio prompts to the user, so they will know whenever an object or obstacle is within the frame of the phone camera. The research was structured and accomplished through different scientific and technological process and approach. With the use of Dialog flow, it was possible to create a speech recognition feature for the application, while YOLO algorithm allowed the process of object detection using mobile phone camera, possible. In this research, it was found out that the application was applicable to improving the navigation of the visually impaired, it is ideal that it serves as supplement to the white stick in order to improve their navigation experience. Also, this project would like to emphasize that researches that seeks to help person with disability be considered and conducted by other researchers.","PeriodicalId":375920,"journal":{"name":"2022 5th Asia Conference on Machine Learning and Computing (ACMLC)","volume":"50 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 5th Asia Conference on Machine Learning and Computing (ACMLC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ACMLC58173.2022.00020","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
This project aims to develop a mobile application for navigation with real-time obstacle detection, giving people with visual impairment fairer access to everyday activities, specifically navigating outdoors. The application is equipped with speech and gesture recognition so that visually impaired users can access and operate it, and with obstacle detection that issues audio prompts whenever an object or obstacle appears within the frame of the phone camera. The research was structured and carried out through several scientific and technological processes and approaches. Dialogflow made it possible to build the application's speech recognition feature, while the YOLO algorithm made object detection through the mobile phone camera possible. The study found that the application improves navigation for the visually impaired; ideally, it serves as a supplement to the white cane to improve their navigation experience. The project also encourages other researchers to consider and conduct studies that seek to help persons with disabilities.
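To make the described obstacle-detection flow concrete, the sketch below pairs a pretrained YOLO detector with a text-to-speech prompt: each camera frame is run through the model and any detected object classes are announced aloud. This is an illustrative approximation, not the authors' implementation; the ultralytics and pyttsx3 packages, the yolov8n.pt weights, and the use of a webcam in place of the phone camera are all assumptions.

```python
# Minimal sketch (not the paper's code): real-time obstacle detection with a
# pretrained YOLO model plus spoken prompts. Assumes the "ultralytics" and
# "pyttsx3" packages and a standard webcam standing in for the phone camera.
import cv2
import pyttsx3
from ultralytics import YOLO

model = YOLO("yolov8n.pt")      # small pretrained YOLO model (assumed weights)
speaker = pyttsx3.init()        # offline text-to-speech engine
camera = cv2.VideoCapture(0)    # default camera device

while camera.isOpened():
    ok, frame = camera.read()
    if not ok:
        break

    # Run detection on the current frame.
    results = model(frame, verbose=False)
    labels = {model.names[int(box.cls)] for box in results[0].boxes}

    # Announce whatever obstacles are inside the camera frame.
    if labels:
        speaker.say("Ahead: " + ", ".join(sorted(labels)))
        speaker.runAndWait()

camera.release()
```

In a production mobile app this loop would run on-device (e.g., a mobile-optimized YOLO model) and the prompt would go through the platform's accessibility or TTS service, but the structure, detect per frame, then speak the detected classes, is the same idea the abstract describes.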