Kelly Miles, Ronny Ibrahim, Yvonne Tran, Alan Kan, Joerg M Buchholz
Journal: The Journal of the Acoustical Society of America
DOI: 10.1121/10.0022837
Published: 2023-10-01 (Journal Article)
Sensor-fusion to understand communication difficulty during conversations in noise
Difficulty communicating is the most challenging consequence of living with hearing loss, substantially affecting personal and professional relationships. While hearing devices help to redress this challenge, there is often a mismatch between performance measures obtained in clinical and laboratory settings and observed real-world behaviour. This discrepancy is likely due to an array of parameters, the most notable being unrealistic speech stimuli (e.g., contrived sentence materials), artificial background noise, and tasks that do not reflect real-world communication behaviour or scenarios (e.g., sentence recall). To bridge this gap, we used sensor fusion to understand communication difficulties in familiar communication partners engaged in natural, unrestricted conversations while listening to different levels of realistic background noise. We tallied communication breakdowns as a robust, overt metric of communication difficulty and fused data from an array of sensors, including microphones, eye and motion trackers, and wearables that detect autonomic nervous system activity, to objectively index communication difficulty. Our approach aims to find biomarkers that may predict the communication difficulties faced by individuals with hearing loss in the real world. Ultimately, this research will contribute to enhancing the effectiveness of hearing devices, leading to improved social connection and quality of life for people with hearing loss.
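The abstract does not specify how the sensor streams are combined into a difficulty index. A minimal, purely hypothetical sketch of one common fusion approach — time-aligning per-window features from each sensor, standardising them, and averaging the z-scores into a single index — is shown below; the function names, sensor features, and example values are illustrative assumptions, not the authors' method.

```python
import statistics

def zscore(values):
    """Standardise a list of readings to zero mean, unit variance."""
    mu = statistics.mean(values)
    sigma = statistics.stdev(values)
    return [(v - mu) / sigma for v in values]

def difficulty_index(heart_rate, gaze_fixation_ms, noise_level_db):
    """Fuse three hypothetical, time-aligned sensor streams (equal length)
    into a per-window communication-difficulty index by averaging their
    z-scored values. Assumes each feature increases with difficulty."""
    streams = [zscore(heart_rate),
               zscore(gaze_fixation_ms),
               zscore(noise_level_db)]
    return [sum(vals) / len(vals) for vals in zip(*streams)]

# Hypothetical per-window readings for one conversation
hr = [72, 75, 90, 88]        # beats per minute (wearable)
gaze = [150, 140, 300, 280]  # mean fixation duration, ms (eye tracker)
level = [55, 56, 70, 72]     # background noise level, dB SPL (microphone)

index = difficulty_index(hr, gaze, level)
# Later windows (louder noise, elevated heart rate, longer fixations)
# receive higher index values than earlier, easier windows.
```

In practice, a weighted or learned combination (e.g., a regression against the tallied communication breakdowns) would likely replace the unweighted average, but the standardise-then-combine pattern is the core of feature-level sensor fusion.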