Do you see what I see?: designing a sensory substitution device to access non-verbal modes of communication
Md. Iftekhar Tanveer, A. Anam, M. Yeasin, Majid Khan
Proceedings of the 15th International ACM SIGACCESS Conference on Computers and Accessibility, 21 October 2013. DOI: 10.1145/2513383.2513438
The inability to access non-verbal cues is a setback for people who are blind or visually impaired. A visual-to-auditory Sensory Substitution Device (SSD) may help improve the quality of their lives by transforming visual cues into auditory cues. In this paper, we describe the design and development of a robust, real-time SSD called iFEPS -- improved Facial Expression Perception through Sound. The implementation of iFEPS evolved over time through a participatory design process. We conducted both subjective and objective experiments to quantify the usability of the system. Evaluation with 14 subjects (7 blind + 7 blindfolded) shows that the users were able to perceive the facial expressions most of the time. In addition, the overall subjective usability of the system scored 4.02 on a 5-point Likert scale.
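To make the visual-to-auditory idea concrete, the sketch below turns a facial-expression label into a short sine-tone cue written to a WAV file. This is a minimal illustration under assumed conventions: the expression labels, the frequency mapping, and the classify_expression stub are placeholders, not the iFEPS pipeline described in the paper.

```python
# Illustrative sketch only: sonify a recognized facial-expression label as a
# short sine tone. The labels, frequencies, and classifier stub are assumptions
# for illustration; they do not reproduce the iFEPS design.
import math
import struct
import wave

SAMPLE_RATE = 44100  # samples per second

# Hypothetical mapping from expression label to tone frequency (Hz).
EXPRESSION_TO_FREQ = {
    "happy": 880.0,
    "sad": 220.0,
    "surprised": 660.0,
    "neutral": 440.0,
}

def classify_expression(frame) -> str:
    """Placeholder for a real facial-expression classifier."""
    return "happy"

def tone(freq_hz: float, duration_s: float = 0.5) -> bytes:
    """Generate 16-bit mono PCM samples for a sine tone."""
    n = int(SAMPLE_RATE * duration_s)
    samples = (
        int(32767 * 0.5 * math.sin(2 * math.pi * freq_hz * i / SAMPLE_RATE))
        for i in range(n)
    )
    return b"".join(struct.pack("<h", s) for s in samples)

def sonify(frame, path: str = "cue.wav") -> None:
    """Map the expression detected in `frame` to an audible cue on disk."""
    label = classify_expression(frame)
    freq = EXPRESSION_TO_FREQ.get(label, EXPRESSION_TO_FREQ["neutral"])
    with wave.open(path, "wb") as wav:
        wav.setnchannels(1)       # mono
        wav.setsampwidth(2)       # 16-bit samples
        wav.setframerate(SAMPLE_RATE)
        wav.writeframes(tone(freq))

if __name__ == "__main__":
    sonify(frame=None)  # writes cue.wav containing the tone for "happy"
```

In a real-time SSD the cue would be streamed to the audio device as frames arrive rather than written to disk; the WAV output here simply keeps the example dependency-free.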