A Comprehensive Study on Supermarket Indoor Navigation for Visually Impaired using Computer Vision Techniques

S. Kayalvizhi, S. Roshni, Riya Ponraj, S. Priya Dharshini
Published in: 2022 OPJU International Technology Conference on Emerging Technologies for Sustainable Development (OTCON), 2023-02-08. DOI: 10.1109/OTCON56053.2023.10114030

Abstract

The ability to navigate is a fundamental skill that every person must have. People who are visually impaired need regular support when travelling from one place to another, and it is difficult for them to have an autonomous shopping experience in a supermarket. The project's goal is to make one of daily life's tasks easier for visually impaired persons. The project entails aiding visually impaired customers by guiding them to the corresponding product sections using an indoor navigation technique. The program provides the visually impaired with a shopping environment as well as assistance in buying their preferred goods. After logging into the Android app, the user, who is blind or visually impaired, speaks their shopping-list preferences (voice input). The camera is turned on to provide live-streaming video, and the user can then begin navigating indoors to reach the section where the product is found. This incorporates technology used in automated self-driving cars: modelled high-definition simulated maps of the supermarkets are used to navigate inside along the shortest path. The first item from the list is selected, the user's location is identified by matching features in the image frame against the map, and the path from there to the product section is computed. Speech output is provided for navigation. An assistant who retrieves products from the shelves helps the user obtain the exact item, and the buying process continues until the last item on the list is bought. As a result, this makes tasks comparatively easier for visually impaired persons and makes indoor navigation possible.
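The abstract does not give implementation details, but the "shortest distance" routing over a simulated store map can be sketched as a standard shortest-path search (e.g. Dijkstra's algorithm) over a graph of product sections. The section names and aisle distances below are hypothetical, purely for illustration.

```python
import heapq

def shortest_path(graph, start, goal):
    """Dijkstra's algorithm over a weighted adjacency dict.

    graph: {node: [(neighbour, distance), ...]}
    Returns (total_distance, [start, ..., goal]),
    or (float('inf'), []) if the goal is unreachable.
    """
    # Priority queue of (distance so far, node, path taken).
    queue = [(0.0, start, [start])]
    visited = set()
    while queue:
        dist, node, path = heapq.heappop(queue)
        if node == goal:
            return dist, path
        if node in visited:
            continue
        visited.add(node)
        for neighbour, weight in graph.get(node, []):
            if neighbour not in visited:
                heapq.heappush(queue, (dist + weight, neighbour, path + [neighbour]))
    return float('inf'), []

# Hypothetical supermarket layout: the entrance and product sections are
# graph nodes; edge weights are aisle distances in metres.
store_map = {
    "entrance": [("produce", 5.0), ("dairy", 15.0)],
    "produce":  [("entrance", 5.0), ("bakery", 4.0)],
    "bakery":   [("produce", 4.0), ("dairy", 3.0)],
    "dairy":    [("entrance", 15.0), ("bakery", 3.0)],
}

dist, route = shortest_path(store_map, "entrance", "dairy")
# The route through produce and bakery (12 m) beats the direct aisle (15 m).
```

In the described system, the current node would come from localizing the camera frame against the map, and the computed route would be read out step by step as speech; both of those components are outside the scope of this sketch.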