Title: Development of smart navigation robot for the visually impaired
Authors: Jin Yien Lee, Taiga Eguchi, Wen Liang Yeoh, Hiroshi Okumura, Osamu Fukuda
Journal: Artificial Life and Robotics, 30(2), pp. 265-275
Published: 2025-03-08 (Journal Article)
DOI: 10.1007/s10015-025-01012-6
URL: https://link.springer.com/article/10.1007/s10015-025-01012-6
Citations: 0
Abstract
Individuals with visual impairments often rely on assistive tools such as white canes and guide dogs to navigate their environments. While these tools provide a certain level of support, their effectiveness is frequently constrained in complex or dynamically changing environments, even with extensive user training. To address these limitations, we have developed a smart navigation robot that integrates artificial intelligence for object detection, offering a viable alternative to traditional assistive tools. The robot provides real-time assistance through auditory alerts while allowing the user to retain full control over the robot's direction. The robot's effectiveness was evaluated in an experimental study in which participants navigated diverse environments using both the smart navigation robot and a white cane. Participant perceptions of the robot's usability, reliability, safety, and interaction quality were assessed using the Godspeed Questionnaire Series. The comparative analysis revealed that the smart navigation robot outperformed the white cane, particularly in dynamic scenarios. These findings suggest that the robot has the potential to substantially improve the quality of life and independence of individuals with visual impairments, offering a greater degree of freedom than was previously attainable.
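The abstract describes a pipeline in which AI object detections are converted into real-time auditory alerts while the user keeps steering control. The paper does not specify its detection format or alert logic; the sketch below is a minimal illustration under assumed conventions — the `Detection` fields (label, distance, bearing) and the 2-metre alert radius are hypothetical, not taken from the paper.

```python
from dataclasses import dataclass

# Hypothetical detection record: the label/distance/bearing fields are
# assumptions for illustration, not the paper's actual data format.
@dataclass
class Detection:
    label: str          # object class reported by the AI detector
    distance_m: float   # estimated distance to the object, in metres
    bearing_deg: float  # angle relative to the robot's heading (negative = left)

def make_alert(detections, alert_radius_m=2.0):
    """Turn nearby detections into a short spoken-alert string.

    Only objects within alert_radius_m trigger an alert; the nearest
    object is announced first. The threshold value is an assumption.
    """
    nearby = sorted(
        (d for d in detections if d.distance_m <= alert_radius_m),
        key=lambda d: d.distance_m,
    )
    phrases = []
    for d in nearby:
        # Coarse direction words suit speech output better than raw angles.
        side = "left" if d.bearing_deg < -10 else "right" if d.bearing_deg > 10 else "ahead"
        phrases.append(f"{d.label} {d.distance_m:.1f} metres {side}")
    return "; ".join(phrases)

if __name__ == "__main__":
    scene = [
        Detection("pedestrian", 1.2, -25.0),
        Detection("bicycle", 3.5, 40.0),    # outside the alert radius, ignored
        Detection("trash can", 0.8, 5.0),
    ]
    print(make_alert(scene))  # -> "trash can 0.8 metres ahead; pedestrian 1.2 metres left"
```

In practice the returned string would be passed to a text-to-speech engine; announcing only the nearest obstacles keeps the auditory channel uncluttered so the user can still steer by intent, as the abstract emphasises.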