{"title":"使用谷歌视觉API为视障人士设计和实现语音辅助智能眼镜","authors":"P. Rajendran, Padmaveni Krishnan, D. Aravindhar","doi":"10.1109/ICECA49313.2020.9297553","DOIUrl":null,"url":null,"abstract":"Generally, visually challenged people tends to have difficulties in traveling and managing many kinds of challenges in their routine life. Mostly, wooden Sticks are used to sense barriers and obstacles next to them. As a result, visually impaired people cannot know exactly what kind of challenges they face and must thus rely entirely on lead sticks and training to navigate safely and in the right direction. This research work focuses on the development of a guidance system that uses smart glass paired with a sensor to continually capture images from the environment by the user wearable smart glass. The smart glass is equipped with a processor to process the captured images and objecst will be detected to inform the user about the results of the image and the user would have a much more comprehensive view of the method. This system allows visually impaired people not only to inform about traveling route and distance to the obstacle, but it also can inform about what the obstacle is. This smart glass can sense the distance from the obstacle and produce a warning to alert the user in advance. This application is developed to provide such a speech-based interface for the user, i.e. the user sends a voice that interprets his destination location when and when he is about to reach the destination. Here, instead of an alarm signal, the blind man can hear the location recorded by the user.","PeriodicalId":297285,"journal":{"name":"2020 4th International Conference on Electronics, Communication and Aerospace Technology (ICECA)","volume":"37 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-11-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"9","resultStr":"{\"title\":\"Design and Implementation of Voice Assisted Smart Glasses for Visually Impaired People Using Google Vision API\",\"authors\":\"P. Rajendran, Padmaveni Krishnan, D. Aravindhar\",\"doi\":\"10.1109/ICECA49313.2020.9297553\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Generally, visually challenged people tends to have difficulties in traveling and managing many kinds of challenges in their routine life. Mostly, wooden Sticks are used to sense barriers and obstacles next to them. As a result, visually impaired people cannot know exactly what kind of challenges they face and must thus rely entirely on lead sticks and training to navigate safely and in the right direction. This research work focuses on the development of a guidance system that uses smart glass paired with a sensor to continually capture images from the environment by the user wearable smart glass. The smart glass is equipped with a processor to process the captured images and objecst will be detected to inform the user about the results of the image and the user would have a much more comprehensive view of the method. This system allows visually impaired people not only to inform about traveling route and distance to the obstacle, but it also can inform about what the obstacle is. This smart glass can sense the distance from the obstacle and produce a warning to alert the user in advance. This application is developed to provide such a speech-based interface for the user, i.e. the user sends a voice that interprets his destination location when and when he is about to reach the destination. 
Here, instead of an alarm signal, the blind man can hear the location recorded by the user.\",\"PeriodicalId\":297285,\"journal\":{\"name\":\"2020 4th International Conference on Electronics, Communication and Aerospace Technology (ICECA)\",\"volume\":\"37 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2020-11-05\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"9\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2020 4th International Conference on Electronics, Communication and Aerospace Technology (ICECA)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICECA49313.2020.9297553\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 4th International Conference on Electronics, Communication and Aerospace Technology (ICECA)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICECA49313.2020.9297553","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Design and Implementation of Voice Assisted Smart Glasses for Visually Impaired People Using Google Vision API
Visually impaired people generally have difficulty traveling and managing many kinds of challenges in their routine life. Most often, wooden sticks are used to sense barriers and obstacles around them. As a result, visually impaired people cannot know exactly what kind of obstacle they face and must rely entirely on the stick and their training to navigate safely and in the right direction. This research work focuses on the development of a guidance system in which wearable smart glasses, paired with a sensor, continually capture images of the user's surroundings. The smart glasses are equipped with a processor that processes the captured images and detects objects, informing the user of what each image contains and giving a much more comprehensive picture of the environment. The system not only informs the visually impaired user about the travel route and the distance to an obstacle, but also tells the user what the obstacle is. The smart glasses sense the distance to the obstacle and issue a warning to alert the user in advance. The application provides a speech-based interface: the user speaks the destination, and when the destination is about to be reached, the system announces it. Instead of a plain alarm signal, the user hears the location that was recorded.
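
The abstract does not give implementation details, but the recognition-and-announce step it describes can be sketched roughly as follows: one captured frame is sent to the Google Cloud Vision API for label detection, and the top results are spoken back to the user. The file name, the confidence threshold, and the choice of pyttsx3 for offline text-to-speech are assumptions made for illustration only, not details from the paper.

```python
# Minimal sketch, not the authors' implementation: describe one captured frame
# with Google Cloud Vision label detection and read the result aloud.
# Assumes google-cloud-vision and pyttsx3 are installed and that
# GOOGLE_APPLICATION_CREDENTIALS points to a valid service-account key.
from google.cloud import vision  # pip install google-cloud-vision
import pyttsx3                   # pip install pyttsx3


def describe_frame(image_path: str, min_score: float = 0.6) -> str:
    """Return a short spoken-style description of what the frame contains."""
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())

    # Label detection returns coarse object/scene labels with confidence scores.
    response = client.label_detection(image=image)
    labels = [l.description for l in response.label_annotations if l.score >= min_score]
    if not labels:
        return "Nothing recognized ahead"
    return "I can see " + ", ".join(labels[:3])


def speak(text: str) -> None:
    """Convert the description to speech with an offline TTS engine (assumed pyttsx3)."""
    engine = pyttsx3.init()
    engine.say(text)
    engine.runAndWait()


if __name__ == "__main__":
    # "frame.jpg" is a placeholder for one image grabbed from the glasses' camera.
    speak(describe_frame("frame.jpg"))
```

On the actual hardware described in the paper, a routine of this kind would presumably run in a loop over frames grabbed from the glasses' camera, with the spoken description combined with the distance warning produced by the onboard obstacle sensor.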