Experimental research on depth perception of comfortable interactions in virtual reality

IF 1.7 | JCR Q3, ENGINEERING, ELECTRICAL & ELECTRONIC | CAS Tier 4 (Engineering & Technology)
Mei Guo, Haolin Gao, Yue Liu, Weitao Song, Songyue Yang, Yongtian Wang
DOI: 10.1002/jsid.2030 (https://onlinelibrary.wiley.com/doi/10.1002/jsid.2030)
Journal of the Society for Information Display, Vol. 33, No. 4, pp. 263-273
Published: 2025-02-22 (Journal Article)
Citations: 0

Abstract


Virtual reality (VR) displays aim to create highly immersive virtual environments based on the principle of binocular disparity, which reproduces the spatial information of virtual scenes through the visual system's fusion of binocular disparity. However, because VR displays differ from real-world scenes, the challenge of rendering in VR displays in a manner that aligns with users' natural depth perception has not been fully addressed. In this paper, the virtual image distances (VIDs) of the RGB channels of a head-mounted display (HMD) were measured, and a depth perception experiment based on random dot stereograms (RDS) was designed according to the measured VID values. The comfortable depth perception fusion thresholds of the VR system were determined by psychophysical methods, and the results demonstrate that the comfort fusion threshold for uncrossed disparity is significantly lower than that for crossed disparity. Additionally, user interaction at the determined comfortable virtual depths showed a 12.94% reduction in reaction time and a 16.86% improvement in accuracy compared to other virtual depths. Our findings provide a further understanding of comfortable depth presentation in VR displays, which is crucial for enhancing user experience and promoting the widespread adoption of VR technology across various applications.
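The experiment described above rests on two standard pieces: converting a binocular (crossed or uncrossed) disparity into the perceived depth around the virtual image plane, and generating RDS stimuli that carry only disparity information. A minimal Python sketch of both is given below; the interpupillary distance (63 mm), the VID (2.0 m), and all stimulus parameters are illustrative assumptions, not values reported in the paper:

```python
import numpy as np

# Illustrative assumptions -- not values reported in the paper.
IPD_M = 0.063   # interpupillary distance (m)
VID_M = 2.0     # measured virtual image distance of the HMD (m)

def depth_from_disparity(disparity_arcmin, vid_m=VID_M, ipd_m=IPD_M):
    """Perceived depth (m) of a point with angular disparity eta, using
    the small-angle relation eta = IPD/z - IPD/VID. Positive disparity
    is crossed (nearer than the virtual image plane), negative is
    uncrossed (farther)."""
    eta = np.deg2rad(disparity_arcmin / 60.0)   # arcmin -> radians
    return ipd_m * vid_m / (eta * vid_m + ipd_m)

def make_rds(size=256, patch=96, disparity_px=6, dot_density=0.5, rng=None):
    """Random dot stereogram pair: a central square patch is shifted
    horizontally between the two eyes' images, so it pops out in depth
    when fused but is invisible in either image alone."""
    rng = np.random.default_rng(rng)
    base = (rng.random((size, size)) < dot_density).astype(np.uint8)
    left = base.copy()
    right = base.copy()
    lo = (size - patch) // 2
    hi = lo + patch
    # Shift the patch in the right eye's image ...
    right[lo:hi, lo + disparity_px:hi + disparity_px] = base[lo:hi, lo:hi]
    # ... and fill the uncovered strip with fresh random dots.
    right[lo:hi, lo:lo + disparity_px] = (
        rng.random((patch, disparity_px)) < dot_density)
    return left, right
```

With these conventions, zero disparity places a point exactly at the VID, a crossed disparity of +10 arcmin brings it in front of the 2.0 m plane, and -10 arcmin pushes it behind; sweeping `disparity_px` (scaled by the display's pixels-per-degree) is how an RDS staircase would probe the fusion thresholds.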

Source journal: Journal of the Society for Information Display (Engineering & Technology; Materials Science: General)
CiteScore: 4.80
Self-citation rate: 8.70%
Articles per year: 98
Review time: 3 months
Journal description: The Journal of the Society for Information Display publishes original works dealing with the theory and practice of information display. Coverage includes materials, devices and systems; the underlying chemistry, physics, physiology and psychology; measurement techniques, manufacturing technologies; and all aspects of the interaction between equipment and its users. Review articles are also published in all of these areas. Occasional special issues or sections consist of collections of papers on specific topical areas or collections of full-length papers based in part on oral or poster presentations given at SID-sponsored conferences.