{"title":"A New Implementation of Head-Coupled Perspective for Virtual Architecture","authors":"M. R. Asl, Gil Rosen-Thal Chengde Wu, Wei Yan","doi":"10.52842/conf.caadria.2015.251","DOIUrl":null,"url":null,"abstract":"The process of projecting 3D scenes onto a two-dimensional (2D) surface results in the loss of depth cues, which are essential for immersive experience in the scenes. Various solutions are provided to address this problem, but there are still fundamental issues need to be addressed in the existing approaches for compensating the change in the 2D image due to the change in observer’s position. Existing studies use head-coupled perspective, stereoscopy, and motion parallax methods to achieve a realistic image representation but a true natural image could not be perceived because of the inaccuracy in the calculations. This paper describes in detail an implementation method of the technique to correctly project a 3D virtual environment model onto a 2D surface to yield a more natural interaction with the virtual world. The proposed method overcomes the inaccuracies in the existing head-coupled perspective viewing and can be used with common stereoscopic displays to naturally represent virtual architecture.","PeriodicalId":191179,"journal":{"name":"Proceedings of the 20th Conference on Computer Aided Architectural Design Research in Asia (CAADRIA)","volume":"27 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 20th Conference on Computer Aided Architectural Design Research in Asia (CAADRIA)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.52842/conf.caadria.2015.251","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
The process of projecting 3D scenes onto a two-dimensional (2D) surface results in the loss of depth cues, which are essential for an immersive experience of the scenes. Various solutions have been proposed to address this problem, but fundamental issues remain in the existing approaches for compensating for changes in the 2D image caused by changes in the observer's position. Existing studies use head-coupled perspective, stereoscopy, and motion parallax to achieve a realistic image representation, but a truly natural image cannot be perceived because of inaccuracies in the calculations. This paper describes in detail an implementation method for correctly projecting a 3D virtual environment model onto a 2D surface to yield a more natural interaction with the virtual world. The proposed method overcomes the inaccuracies of existing head-coupled perspective viewing and can be used with common stereoscopic displays to naturally represent virtual architecture.
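The abstract does not reproduce the paper's formulas, but head-coupled perspective is commonly realized by recomputing an asymmetric (off-axis) view frustum each frame from the tracked eye position relative to the physical screen. The following minimal sketch follows the widely used generalized perspective projection formulation and is not the authors' implementation; the function names, the screen-corner parameters (pa, pb, pc), and the NumPy frustum helper are illustrative assumptions.

```python
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def frustum(l, r, b, t, n, f):
    """OpenGL-style asymmetric (off-axis) frustum matrix."""
    return np.array([
        [2*n/(r-l), 0,          (r+l)/(r-l),  0],
        [0,         2*n/(t-b),  (t+b)/(t-b),  0],
        [0,         0,         -(f+n)/(f-n), -2*f*n/(f-n)],
        [0,         0,         -1,            0],
    ])

def head_coupled_projection(pa, pb, pc, pe, near, far):
    """
    Off-axis projection for a tracked eye position `pe` and a physical screen
    defined by three corners: pa (lower-left), pb (lower-right), pc (upper-left),
    all in the same world/tracker units. Returns a 4x4 matrix:
    asymmetric frustum * screen-plane rotation * eye translation.
    """
    pa, pb, pc, pe = map(np.asarray, (pa, pb, pc, pe))

    # Orthonormal basis of the screen plane.
    vr = normalize(pb - pa)            # screen right
    vu = normalize(pc - pa)            # screen up
    vn = normalize(np.cross(vr, vu))   # screen normal, toward the viewer

    # Vectors from the eye to the screen corners.
    va, vb, vc = pa - pe, pb - pe, pc - pe

    # Perpendicular distance from the eye to the screen plane.
    d = -np.dot(va, vn)

    # Frustum extents on the near plane, scaled down from the screen plane.
    l = np.dot(vr, va) * near / d
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d

    # Rotate world coordinates into the screen-aligned frame ...
    M = np.identity(4)
    M[0, :3], M[1, :3], M[2, :3] = vr, vu, vn

    # ... and translate the tracked eye position to the origin.
    T = np.identity(4)
    T[:3, 3] = -pe

    return frustum(l, r, b, t, near, far) @ M @ T

# Example: a 0.4 m x 0.3 m screen centred at the origin, with the eye 0.6 m
# away and slightly off-centre, which yields an asymmetric frustum.
if __name__ == "__main__":
    P = head_coupled_projection(
        pa=[-0.2, -0.15, 0.0],   # lower-left corner
        pb=[ 0.2, -0.15, 0.0],   # lower-right corner
        pc=[-0.2,  0.15, 0.0],   # upper-left corner
        pe=[ 0.1,  0.05, 0.6],   # tracked eye position
        near=0.1, far=100.0)
    print(np.round(P, 3))
```

In a head-coupled setup this matrix would be recomputed every frame as the tracker reports a new eye position, while the screen corners stay fixed; for stereoscopic display the same routine is evaluated once per eye with two slightly offset eye positions.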