{"title":"AudioHaptics: audio and haptic rendering based on a physical model","authors":"H. Yano, H. Igawa, T. Kameda, K. Muzutani","doi":"10.1109/HAPTIC.2004.1287203","DOIUrl":null,"url":null,"abstract":"In this paper, we propose a method for the synthesis of haptic and auditory senses that is based on a physical model called AudioHaptics. We have developed a haptic environment that incorporates auditory sensation. We achieved this by fitting a speaker at the end effecter of a haptic interface. The FEM (finite element method) was used to calculate the vibration of a virtual object when an impact is occurred, and the sound pressure data at the speaker position was then calculated based on the 2D complex amplitude of the object surface in real time. The AudioHaptics system can generate sounds originating from virtual objects, which can have arbitrary shapes, attributes and inner structures. Experiments for evaluation with real users demonstrated that this method is effective for rendering audio and haptic sensation.","PeriodicalId":384123,"journal":{"name":"12th International Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, 2004. HAPTICS '04. Proceedings.","volume":"7 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2000-06-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"8","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"12th International Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, 2004. HAPTICS '04. Proceedings.","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/HAPTIC.2004.1287203","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 8
Abstract
In this paper, we propose a method, called AudioHaptics, for synthesizing haptic and auditory sensations based on a physical model. We have developed a haptic environment that incorporates auditory sensation by fitting a speaker to the end effector of a haptic interface. The finite element method (FEM) is used to calculate the vibration of a virtual object when an impact occurs, and the sound pressure at the speaker position is then calculated in real time from the 2D complex amplitude of the object's surface. The AudioHaptics system can generate sounds originating from virtual objects of arbitrary shape, attributes, and inner structure. Evaluation experiments with real users demonstrated that this method is effective for rendering audio and haptic sensations.
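The abstract describes a pipeline in which FEM vibration analysis of the struck object is mapped to sound pressure at the speaker position. As a rough illustration of that idea only (not the authors' implementation), the sketch below synthesizes an impact sound by modal superposition: each vibration mode, characterized by a natural frequency, a damping ratio, and a gain at the listening position, rings down as a damped sinusoid after the impact. All function names and modal values are hypothetical; in the paper these quantities would be derived from the FEM analysis and the 2D complex surface amplitude rather than hard-coded.

```python
# Minimal sketch of impact-sound synthesis from modal data (assumed to come
# from an FEM eigenanalysis of the virtual object). Not the paper's method;
# the modal data below are illustrative placeholders.

import numpy as np

def impact_sound(mode_freqs_hz, damping_ratios, gains, impact_strength=1.0,
                 sample_rate=44100, duration=1.0):
    """Synthesize an impact sound as a sum of exponentially damped sinusoids.

    mode_freqs_hz  : natural frequencies of the object's vibration modes (Hz)
    damping_ratios : modal damping ratios (dimensionless)
    gains          : contribution of each mode at the listening position
                     (e.g., mode-shape amplitude near the speaker)
    """
    t = np.arange(int(sample_rate * duration)) / sample_rate
    sound = np.zeros_like(t)
    for f, zeta, g in zip(mode_freqs_hz, damping_ratios, gains):
        omega = 2.0 * np.pi * f
        # Each mode rings down as a damped sinusoid excited by the impact.
        sound += impact_strength * g * np.exp(-zeta * omega * t) * np.sin(omega * t)
    # Normalize to avoid clipping when the signal is sent to the speaker.
    peak = np.max(np.abs(sound))
    return sound / peak if peak > 0 else sound

# Hypothetical modal data for a small struck plate (illustrative values only).
freqs = [440.0, 1230.0, 2150.0, 3870.0]
zetas = [0.002, 0.004, 0.006, 0.010]
gains = [1.0, 0.5, 0.3, 0.15]
audio = impact_sound(freqs, zetas, gains)
```

Precomputing a modal decomposition of this kind is a common way to make physically based sound rendering fast enough for interactive use, which is consistent with the abstract's emphasis on real-time calculation at the speaker position.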