Ambisonics and Sonic Simulation in Virtual Reality
Yulia Yagunova, M. Poletti, Paul D. Teal
TENCON 2021 - 2021 IEEE Region 10 Conference (TENCON), 7 December 2021
DOI: 10.1109/TENCON54134.2021.9707299
It can be challenging for opera singers to access their performance venues for rehearsal due to venue schedules or travel restrictions. Virtual reality (VR) and augmented reality (AR) technologies make rehearsal in a virtual venue possible. However, these technologies focus mainly on visuals rather than on the sonic plausibility of a virtual space. Moreover, existing self-auralization methods have one or more of the following limitations: a small choice of virtual venues, restricted user movement, generic rather than individualized configuration, and an expensive rehearsal space. This paper presents an Ambisonics method that addresses these limitations. The method simulates the acoustics of a chosen performance venue in real time by simulating oral-binaural room impulse responses (OBRIRs). It allows the virtual venue and user-related data to be changed, and provides three degrees of freedom (3DoF) for user head movement. The method is validated using quantitative and qualitative measures, and the challenges of a future real-time implementation are discussed. Despite these challenges, the method can facilitate virtual rehearsal in real time while offering greater user flexibility.
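The core operation behind the self-auralization described in the abstract — letting singers hear their own dry voice as it would sound in the virtual venue — amounts to convolving the voice signal with a two-channel OBRIR. The sketch below is illustrative only and is not the paper's implementation: the function and signal names are invented, the impulse responses are synthetic decaying exponentials, and a real-time system would use low-latency partitioned convolution plus Ambisonics rotation for 3DoF head tracking rather than offline NumPy calls.

```python
import numpy as np

def auralize(dry_voice, obrir_left, obrir_right):
    """Convolve a dry (anechoic) voice signal with the left- and
    right-ear channels of an oral-binaural room impulse response
    (OBRIR), yielding a binaural signal that simulates how the
    singer would hear themselves in the virtual venue.
    All inputs are 1-D float arrays at the same sample rate."""
    left = np.convolve(dry_voice, obrir_left)
    right = np.convolve(dry_voice, obrir_right)
    return np.stack([left, right])  # shape: (2, N_voice + N_ir - 1)

# Toy example: a unit impulse "voice" through a fake decaying room.
fs = 48000
voice = np.zeros(fs // 10)
voice[0] = 1.0                                   # impulse as dry input
ir_l = np.exp(-np.arange(fs // 20) / 2000.0)     # synthetic left-ear OBRIR
ir_r = 0.8 * ir_l                                # synthetic right-ear OBRIR
binaural = auralize(voice, ir_l, ir_r)
```

Because the dry input is a unit impulse, the output's left channel simply reproduces the left-ear impulse response — a quick sanity check that the convolution is wired correctly before feeding in real recorded voice.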