{"title":"一种用于训练回声定位的移动应用程序,该应用程序具有空间渲染回声的新方法","authors":"M. Bujacz, Krzysztof Matysik, Grzegorz Górski","doi":"10.1109/IoD55468.2022.9987159","DOIUrl":null,"url":null,"abstract":"In this short demo paper we wish to present a mobile application developed for both Android and iOS for the purpose of training echolocation skills and general listening acuity of blind persons. The tasks in the app were based on real echolocation tests performed by blind and sighted volunteers as part of the Echovis research project and primarily consisted of guessing the direction and distance to a small wall at distances from one to three meters. The sounds in the app come from binaural recordings and artificial renders using DearVR and custom directional echo filters. The developed Waypoint Acousitic Stimulation (WAS) algorithm allows to render spatial room echoes with the computational power of a mobile processor. The WAS algorithm was also used to create renders of more complex training scenes, such as an office building through which a player could navigate using echoes. Tests of the app show that although the performance in real scenes is slightly better than tests using recordings or renders, there is no statistically significant difference between them and the sounds in the app are sufficient to demonstrate non-random levels of echolocation in both sighted and blind testers.","PeriodicalId":376545,"journal":{"name":"2022 IEEE 1st International Conference on Internet of Digital Reality (IoD)","volume":"70 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-06-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A mobile application for training echolocation with a novel method for spatially rendered echoes\",\"authors\":\"M. 
Bujacz, Krzysztof Matysik, Grzegorz Górski\",\"doi\":\"10.1109/IoD55468.2022.9987159\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In this short demo paper we wish to present a mobile application developed for both Android and iOS for the purpose of training echolocation skills and general listening acuity of blind persons. The tasks in the app were based on real echolocation tests performed by blind and sighted volunteers as part of the Echovis research project and primarily consisted of guessing the direction and distance to a small wall at distances from one to three meters. The sounds in the app come from binaural recordings and artificial renders using DearVR and custom directional echo filters. The developed Waypoint Acousitic Stimulation (WAS) algorithm allows to render spatial room echoes with the computational power of a mobile processor. The WAS algorithm was also used to create renders of more complex training scenes, such as an office building through which a player could navigate using echoes. 
Tests of the app show that although the performance in real scenes is slightly better than tests using recordings or renders, there is no statistically significant difference between them and the sounds in the app are sufficient to demonstrate non-random levels of echolocation in both sighted and blind testers.\",\"PeriodicalId\":376545,\"journal\":{\"name\":\"2022 IEEE 1st International Conference on Internet of Digital Reality (IoD)\",\"volume\":\"70 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-06-23\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2022 IEEE 1st International Conference on Internet of Digital Reality (IoD)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/IoD55468.2022.9987159\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 IEEE 1st International Conference on Internet of Digital Reality (IoD)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IoD55468.2022.9987159","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
In this short demo paper we present a mobile application, developed for both Android and iOS, for training the echolocation skills and general listening acuity of blind persons. The tasks in the app are based on real echolocation tests performed by blind and sighted volunteers as part of the Echovis research project, and primarily consist of guessing the direction of, and distance to, a small wall placed one to three meters away. The sounds in the app come from binaural recordings and from artificial renders produced with DearVR and custom directional echo filters. The developed Waypoint Acoustic Stimulation (WAS) algorithm renders spatial room echoes within the computational budget of a mobile processor. The WAS algorithm was also used to render more complex training scenes, such as an office building through which a player can navigate using echoes. Tests of the app show that although performance in real scenes is slightly better than in tests using recordings or renders, the difference is not statistically significant, and the sounds in the app are sufficient to demonstrate above-chance levels of echolocation in both sighted and blind testers.
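The abstract does not detail the WAS algorithm itself, but the core task it describes, making a wall audible at a given distance on a mobile CPU, reduces at minimum to a delayed, attenuated copy of the emitted click. The sketch below is an illustrative delay-and-attenuate render under that assumption; the function name, gain model, and parameters are hypothetical and are not the authors' method.

```python
import numpy as np

def render_wall_echo(source, fs, distance_m, speed_of_sound=343.0):
    """Illustrative echo render: mix the direct click with one delayed,
    attenuated reflection from a wall at distance_m (round-trip delay).

    source      -- 1-D float array holding the emitted click
    fs          -- sample rate in Hz
    distance_m  -- distance to the reflecting wall in meters
    """
    # Round-trip travel time: sound goes to the wall and back.
    delay_s = 2.0 * distance_m / speed_of_sound
    delay_samples = int(round(delay_s * fs))

    # Simple spreading-loss gain (assumed model, not from the paper).
    gain = 1.0 / (1.0 + 2.0 * distance_m) ** 2

    out = np.zeros(len(source) + delay_samples)
    out[:len(source)] += source            # direct sound
    out[delay_samples:] += gain * source   # wall echo
    return out
```

For example, a wall 2 m away at fs = 44100 Hz yields a round-trip delay of 4/343 s, about 514 samples, which is the kind of short inter-click interval the training tasks in the 1–3 m range would expose listeners to. A real spatial render would additionally apply directional (e.g. HRTF or custom directional-filter) processing per ear, which this mono sketch omits.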