Journal of the Audio Engineering Society: Latest Publications

Auralization of Measured Room Transitions in Virtual Reality
IF 1.4, Q4, Engineering & Technology
Journal of the Audio Engineering Society Pub Date: 2023-06-06 DOI: 10.17743/jaes.2022.0084
Thomas McKenzie, Nils Meyer-Kahlen, C. Hold, Sebastian J. Schlecht, V. Pulkki
{"title":"Auralization of Measured Room Transitions in Virtual Reality","authors":"Thomas McKenzie, Nils Meyer-Kahlen, C. Hold, Sebastian J. Schlecht, V. Pulkki","doi":"10.17743/jaes.2022.0084","DOIUrl":"https://doi.org/10.17743/jaes.2022.0084","url":null,"abstract":"To auralise a room’s acoustics in six degrees-of-freedom (6DoF) virtual reality (VR), a dense set of spatial room impulse response (SRIR) measurements is required, so interpolating between a sparse set is desirable. This paper studies the auralisation of room transitions by proposing a baseline interpolation method for higher-order Ambisonic SRIRs and evaluating it in VR. The presented method is simple yet applicable to coupled rooms and room transitions. It is based on linear interpolation with RMS compensation, though direct sound, early reflec-tions and late reverberation are processed separately, whereby the input direct sounds are first steered to the relative direction-of-arrival before summation and interpolated early reflections are directionally equalised. The proposed method is first evaluated numerically, which demonstrates its improvements over a basic linear interpolation. A listening test is then conducted in 6DoF VR, to assess the density of SRIR measurements needed in order to plausibly auralise a room transition using the presented interpolation method. The results suggest that, given the tested scenario, a 50 cm to 1 m inter-measurement distance can be perceptually sufficient.","PeriodicalId":50008,"journal":{"name":"Journal of the Audio Engineering Society","volume":" ","pages":""},"PeriodicalIF":1.4,"publicationDate":"2023-06-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"43969491","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 1
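The core of the method described above, linear interpolation between neighbouring SRIRs with RMS compensation, can be sketched as follows. This is not the authors' implementation: it applies the compensation broadband, skips the separate handling of direct sound, early reflections, and late reverberation, and the function name and array shapes are assumptions for illustration.

```python
import numpy as np

def interpolate_srir(srir_a, srir_b, alpha):
    """Illustrative linear interpolation of two Ambisonic SRIRs with RMS compensation.

    srir_a, srir_b : ndarrays of shape (num_samples, num_ambisonic_channels),
        SRIRs measured at two neighbouring positions (assumed time-aligned).
    alpha : float in [0, 1], interpolation factor (0 -> position A, 1 -> position B).
    """
    # Plain linear interpolation of the two responses.
    mixed = (1.0 - alpha) * srir_a + alpha * srir_b

    # RMS compensation: match the energy of the mix to the interpolated
    # energies of the inputs, so partial cancellation does not thin the result.
    rms_a = np.sqrt(np.mean(srir_a ** 2))
    rms_b = np.sqrt(np.mean(srir_b ** 2))
    rms_target = (1.0 - alpha) * rms_a + alpha * rms_b
    rms_mixed = np.sqrt(np.mean(mixed ** 2))
    gain = rms_target / max(rms_mixed, 1e-12)  # guard against division by zero
    return mixed * gain

# Example: listener halfway between two measurement positions.
# srir_mid = interpolate_srir(srir_pos_1, srir_pos_2, alpha=0.5)
```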
Measuring Motion-to-Sound Latency in Virtual Acoustic Rendering Systems
IF 1.4, Q4, Engineering & Technology
Journal of the Audio Engineering Society Pub Date: 2023-06-06 DOI: 10.17743/jaes.2022.0089
Nils Meyer-Kahlen, Miranda Kastemaa, Sebastian J. Schlecht, T. Lokki
{"title":"Measuring Motion-to-Sound Latency in Virtual Acoustic Rendering Systems","authors":"Nils Meyer-Kahlen, Miranda Kastemaa, Sebastian J. Schlecht, T. Lokki","doi":"10.17743/jaes.2022.0089","DOIUrl":"https://doi.org/10.17743/jaes.2022.0089","url":null,"abstract":"","PeriodicalId":50008,"journal":{"name":"Journal of the Audio Engineering Society","volume":" ","pages":""},"PeriodicalIF":1.4,"publicationDate":"2023-06-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"48943865","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Evaluation of Metaverse Music Performance With BBC Maida Vale Recording Studios
IF 1.4, Q4, Engineering & Technology
Journal of the Audio Engineering Society Pub Date: 2023-06-06 DOI: 10.17743/jaes.2022.0086
Patrick Cairns, Anthony Hunt, D. Johnston, J. Cooper, Ben Lee, H. Daffern, G. Kearney
{"title":"Evaluation of Metaverse Music Performance With BBC Maida Vale Recording Studios","authors":"Patrick Cairns, Anthony Hunt, D. Johnston, J. Cooper, Ben Lee, H. Daffern, G. Kearney","doi":"10.17743/jaes.2022.0086","DOIUrl":"https://doi.org/10.17743/jaes.2022.0086","url":null,"abstract":"","PeriodicalId":50008,"journal":{"name":"Journal of the Audio Engineering Society","volume":" ","pages":""},"PeriodicalIF":1.4,"publicationDate":"2023-06-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"48519263","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 5
Virtual-Reality-Based Research in Hearing Science: A Platforming Approach
IF 1.4, Q4, Engineering & Technology
Journal of the Audio Engineering Society Pub Date: 2023-06-06 DOI: 10.17743/jaes.2022.0083
Rasmus Lundby Pedersen, L. Picinali, Nynne Kajs, F. Patou
{"title":"Virtual-Reality-Based Research in Hearing Science: A Platforming Approach","authors":"Rasmus Lundby Pedersen, L. Picinali, Nynne Kajs, F. Patou","doi":"10.17743/jaes.2022.0083","DOIUrl":"https://doi.org/10.17743/jaes.2022.0083","url":null,"abstract":"The lack of ecological validity in clinical assessment, as well as the challenge of investigat- ing multimodal sensory processing, remain key challenges in hearing science. Virtual Reality (VR) can support hearing research in these domains by combining experimental control with situational realism. However, the development of VR-based experiments is traditionally highly resource demanding, which places a significant entry barrier for basic and clinical researchers looking to embrace VR as the research tool of choice. The Oticon Medical Virtual Reality (OMVR) experiment platform fast-tracks the creation or adaptation of hearing research experi- ment templates to be used to explore areas such as binaural spatial hearing, multimodal sensory integration, cognitive hearing behavioral strategies, auditory-visual training, etc. In this paper, the OMVR’s functionalities, architecture, and key elements of implementation are presented, important performance indicators are characterized, and a use-case perceptual evaluation is presented.","PeriodicalId":50008,"journal":{"name":"Journal of the Audio Engineering Society","volume":"1 1","pages":""},"PeriodicalIF":1.4,"publicationDate":"2023-06-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"41360635","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
The Sonic Interactions in Virtual Environments (SIVE) Toolkit
Q4, Engineering & Technology
Journal of the Audio Engineering Society Pub Date: 2023-06-06 DOI: 10.17743/jaes.2022.0082
Silvin Willemsen, Helmer Nuijens, Titas Lasickas, Stefania Serafin
{"title":"The Sonic Interactions in Virtual Environments (SIVE) Toolkit","authors":"Silvin Willemsen, Helmer Nuijens, Titas Lasickas, Stefania Serafin","doi":"10.17743/jaes.2022.0082","DOIUrl":"https://doi.org/10.17743/jaes.2022.0082","url":null,"abstract":"In this paper, the Sonic Interactions in Virtual Environments (SIVE) toolkit, a virtual reality (VR) environment for building musical instruments using physical models, is presented. The audio engine of the toolkit is based on finite-difference time-domain (FDTD) methods and works in a modular fashion. The authors show how the toolkit is built and how it can be imported in Unity to create VR musical instruments, and future developments and possible applications are discussed.","PeriodicalId":50008,"journal":{"name":"Journal of the Audio Engineering Society","volume":"43 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-06-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"135494498","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
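For readers unfamiliar with finite-difference time-domain (FDTD) physical modelling, the sketch below simulates an ideal plucked string with the standard leapfrog scheme. It is a textbook illustration, not code from the SIVE toolkit; the grid size, wave speed, sample rate, and function name are assumptions.

```python
import numpy as np

def simulate_string(num_points=100, num_steps=2000, c=300.0, length=1.0, fs=44100.0):
    """Minimal FDTD simulation of an ideal 1D string with fixed ends.

    Solves u_tt = c^2 u_xx with the leapfrog update and returns the
    displacement read at one point over time (a crude "pickup" signal).
    """
    dx = length / (num_points - 1)
    dt = 1.0 / fs
    lam = c * dt / dx  # Courant number, must be <= 1 for stability
    assert lam <= 1.0, "Unstable scheme: reduce c or increase num_points"

    u_prev = np.zeros(num_points)  # displacement at time step n - 1
    u_curr = np.zeros(num_points)  # displacement at time step n

    # Pluck: triangular initial displacement peaking a quarter of the way along.
    pluck = num_points // 4
    u_curr[1:pluck] = np.linspace(0.0, 1.0, pluck - 1)
    u_curr[pluck:-1] = np.linspace(1.0, 0.0, num_points - pluck - 1)
    u_prev[:] = u_curr  # start from rest

    out = np.zeros(num_steps)
    readout = num_points // 2
    for n in range(num_steps):
        u_next = np.zeros(num_points)
        # Leapfrog update on interior points; boundary points stay fixed at zero.
        u_next[1:-1] = (2.0 * u_curr[1:-1] - u_prev[1:-1]
                        + lam ** 2 * (u_curr[2:] - 2.0 * u_curr[1:-1] + u_curr[:-2]))
        u_prev, u_curr = u_curr, u_next
        out[n] = u_curr[readout]
    return out
```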
Spatial Integration of Dynamic Auditory Feedback in Electric Vehicle Interior
IF 1.4, Q4, Engineering & Technology
Journal of the Audio Engineering Society Pub Date: 2023-06-06 DOI: 10.17743/jaes.2022.0087
Théophile Dupré, Sébastien Denjean, M. Aramaki, R. Kronland-Martinet
{"title":"Spatial Integration of Dynamic Auditory Feedback in Electric Vehicle Interior","authors":"Théophile Dupré, Sébastien Denjean, M. Aramaki, R. Kronland-Martinet","doi":"10.17743/jaes.2022.0087","DOIUrl":"https://doi.org/10.17743/jaes.2022.0087","url":null,"abstract":"With the development of electric motor vehicles, the domain of automotive sound design addresses new issues, and is now concerned by creating suitable and pleasant soundscapes inside the vehicle. For instance, the absence of predominant engine sound changes the driver perception of the dynamic of his car. Previous studies proposed relevant sonification strategies to augment the interior sound environment by bringing back vehicle dynamics with synthetic auditory cues. Yet, users report a lack of blending with the existing soundscape. In this study, we analyze acoustical and perceptual spatial characteristics of the car soundscape and show that that the spatial attributes of sound sources are fundamental to improve the perceptual coherency of the global environment.","PeriodicalId":50008,"journal":{"name":"Journal of the Audio Engineering Society","volume":" ","pages":""},"PeriodicalIF":1.4,"publicationDate":"2023-06-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"42902625","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 1
The SONICOM HRTF Dataset
IF 1.4, Q4, Engineering & Technology
Journal of the Audio Engineering Society Pub Date: 2023-05-17 DOI: 10.17743/jaes.2022.0066
Isaac Engel, Rapolas Daugintis, Thibault Vicente, Aidan O. T. Hogg, J. Pauwels, Arnaud J. Tournier, Lorenzo Picinali
{"title":"The SONICOM HRTF Dataset","authors":"Isaac Engel, Rapolas Daugintis, Thibault Vicente, Aidan O. T. Hogg, J. Pauwels, Arnaud J. Tournier, Lorenzo Picinali","doi":"10.17743/jaes.2022.0066","DOIUrl":"https://doi.org/10.17743/jaes.2022.0066","url":null,"abstract":"","PeriodicalId":50008,"journal":{"name":"Journal of the Audio Engineering Society","volume":" ","pages":""},"PeriodicalIF":1.4,"publicationDate":"2023-05-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"41421529","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 2
The Ability to Memorize Acoustic Features in a Discrimination Task
IF 1.4, Q4, Engineering & Technology
Journal of the Audio Engineering Society Pub Date: 2023-05-17 DOI: 10.17743/jaes.2022.0073
Florian Klein, Tatiana Surdu, Lukas Treybig, S. Werner
{"title":"The Ability to Memorize Acoustic Features in a Discrimination Task","authors":"Florian Klein, Tatiana Surdu, Lukas Treybig, S. Werner","doi":"10.17743/jaes.2022.0073","DOIUrl":"https://doi.org/10.17743/jaes.2022.0073","url":null,"abstract":"","PeriodicalId":50008,"journal":{"name":"Journal of the Audio Engineering Society","volume":" ","pages":""},"PeriodicalIF":1.4,"publicationDate":"2023-05-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"44964886","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Spatial Reconstruction-Based Rendering of Microphone Array Room Impulse Responses
IF 1.4, Q4, Engineering & Technology
Journal of the Audio Engineering Society Pub Date: 2023-05-17 DOI: 10.17743/jaes.2022.0072
L. McCormack, Nils Meyer-Kahlen, A. Politis
{"title":"Spatial Reconstruction-Based Rendering of Microphone Array Room Impulse Responses","authors":"L. McCormack, Nils Meyer-Kahlen, A. Politis","doi":"10.17743/jaes.2022.0072","DOIUrl":"https://doi.org/10.17743/jaes.2022.0072","url":null,"abstract":"A reconstruction-based rendering approach is explored for the task of imposing the spatial characteristics of a measured space onto a monophonic signal while also reproducing it over a target playback setup. The foundation of this study is a parametric rendering framework, which can operate either on arbitrary microphone array room impulse responses (RIRs) or Ambisonic RIRs. Spatial filtering techniques are used to decompose the input RIR into individual reflections and anisotropic diffuse reverberation, which are reproduced using dedicated rendering strategies. The proposed approach operates by considering several hypotheses involving different rendering configurations and thereafter determining which hypothesis reconstructs the input RIR most faithfully. With regard to the present study, these hypotheses involved considering different potential reflection numbers. Once the optimal number of reflections to render has been determined over time and frequency, the array directional responses used to reconstruct the input RIR are substituted with spatialization gains for the target playback setup. The results of formal listening experiments suggest that the proposed approach produces renderings that are perceptually more similar to reference responses, when compared with the use of an established subspace-based detection algorithm. The proposed approach also demonstrates similar or better performance than that achieved with existing state-of-the-art methods.","PeriodicalId":50008,"journal":{"name":"Journal of the Audio Engineering Society","volume":" ","pages":""},"PeriodicalIF":1.4,"publicationDate":"2023-05-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"41791005","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 1
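The hypothesis-selection step described in the abstract, choosing how many reflections to render by checking which reconstruction matches the measured RIR best, can be outlined generically. The decompose and reconstruct callables below are hypothetical placeholders, not the paper's estimator; only the selection loop is illustrated.

```python
import numpy as np

def choose_num_reflections(rir_tile, decompose, reconstruct, candidate_counts):
    """Pick the reflection count whose reconstruction best matches a measured
    RIR tile (one time-frequency region of the array RIR).

    decompose(rir_tile, k)  -> parameters for k reflections plus a diffuse residual
    reconstruct(params)     -> reconstruction of the tile from those parameters
    Both callables are placeholders standing in for the spatial filtering steps.
    """
    best_count, best_error = None, np.inf
    for k in candidate_counts:
        params = decompose(rir_tile, k)
        error = np.mean(np.abs(rir_tile - reconstruct(params)) ** 2)
        if error < best_error:
            best_count, best_error = k, error
    return best_count
```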
Perceptual Significance of Tone-Dependent Directivity Patterns of Musical Instruments
IF 1.4, Q4, Engineering & Technology
Journal of the Audio Engineering Society Pub Date: 2023-05-17 DOI: 10.17743/jaes.2022.0076
Andrea Corcuera, V. Chatziioannou, J. Ahrens
{"title":"Perceptual Significance of Tone-Dependent Directivity Patterns of Musical Instruments","authors":"Andrea Corcuera, V. Chatziioannou, J. Ahrens","doi":"10.17743/jaes.2022.0076","DOIUrl":"https://doi.org/10.17743/jaes.2022.0076","url":null,"abstract":"Musical instruments are complex sound sources that exhibit directivity patterns that not only vary depending on the frequency, but can also change as a function of the played tone. It is yet unclear whether the directivity variation as a function of the played tone leads to a perceptible difference compared to an auralization that uses an averaged directivity pattern. This paper examines the directivity of 38 musical instruments from a publicly available database and then selects three representative instruments among those with similar radiation characteristics (oboe, violin, and trumpet). To evaluate the listeners’ ability to perceive a difference between auralizations of virtual environments using tone-dependent and averaged directivities, a listening test was conducted using the directivity patterns of the three selected instruments in both anechoic and reverberant conditions. The results show that, in anechoic conditions, listeners can reliably detect differences between the tone-dependent and averaged directivities for the oboe but not for the violin or the trumpet. Nevertheless, in reverberant conditions, listeners can distinguish tone-dependent directivity from averaged directivity for all instruments under study.","PeriodicalId":50008,"journal":{"name":"Journal of the Audio Engineering Society","volume":" ","pages":""},"PeriodicalIF":1.4,"publicationDate":"2023-05-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"43673443","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 1
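As a small illustration of the "averaged directivity" condition the study compares against, per-tone directivity patterns can be collapsed into one pattern by energy averaging. The array layout and the energy-domain averaging below are assumptions for illustration, not the paper's exact procedure.

```python
import numpy as np

def average_directivity(tone_directivities):
    """Collapse per-tone directivity patterns into a single averaged pattern.

    tone_directivities : ndarray of shape (num_tones, num_freq_bins, num_directions),
        magnitude directivity of one instrument, measured separately per played tone.
    Returns an array of shape (num_freq_bins, num_directions).
    """
    # Average in the energy domain so dominant radiation directions are preserved.
    return np.sqrt(np.mean(tone_directivities ** 2, axis=0))

# Example: how far one tone's pattern deviates from the average, in dB.
# avg = average_directivity(patterns)                     # patterns: (T, F, D)
# deviation_db = 20 * np.log10(patterns[0] / np.maximum(avg, 1e-12))
```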