{"title":"运动线索、颜色和亮度对光学透视 AR 显示器深度知觉的影响","authors":"Omeed Ashtiani, Hung-Jui Guo, Balakrishnan Prabhakaran","doi":"10.3389/frvir.2023.1243956","DOIUrl":null,"url":null,"abstract":"Introduction: Augmented Reality (AR) systems are systems in which users view and interact with virtual objects overlaying the real world. AR systems are used across a variety of disciplines, i.e., games, medicine, and education to name a few. Optical See-Through (OST) AR displays allow users to perceive the real world directly by combining computer-generated imagery overlaying the real world. While perception of depth and visibility of objects is a widely studied field, we wanted to observe how color, luminance, and movement of an object interacted with each other as well as external luminance in OST AR devices. Little research has been done regarding the issues around the effect of virtual objects’ parameters on depth perception, external lighting, and the effect of an object’s mobility on this depth perception.Methods: We aim to perform an analysis of the effects of motion cues, color, and luminance on depth estimation of AR objects overlaying the real world with OST displays. We perform two experiments, differing in environmental lighting conditions (287 lux and 156 lux), and analyze the effects and differences on depth and speed perceptions.Results: We have found that while stationary objects follow previous research with regards to depth perception, motion and both object and environmental luminance play a factor in this perception.Discussion: These results will be significantly useful for developers to account for depth estimation issues that may arise in AR environments. Awareness of the different effects of speed and environmental illuminance on depth perception can be utilized when performing AR or MR applications where precision matters.","PeriodicalId":73116,"journal":{"name":"Frontiers in virtual reality","volume":"47 6","pages":""},"PeriodicalIF":3.2000,"publicationDate":"2023-12-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Impact of motion cues, color, and luminance on depth perception in optical see-through AR displays\",\"authors\":\"Omeed Ashtiani, Hung-Jui Guo, Balakrishnan Prabhakaran\",\"doi\":\"10.3389/frvir.2023.1243956\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Introduction: Augmented Reality (AR) systems are systems in which users view and interact with virtual objects overlaying the real world. AR systems are used across a variety of disciplines, i.e., games, medicine, and education to name a few. Optical See-Through (OST) AR displays allow users to perceive the real world directly by combining computer-generated imagery overlaying the real world. While perception of depth and visibility of objects is a widely studied field, we wanted to observe how color, luminance, and movement of an object interacted with each other as well as external luminance in OST AR devices. Little research has been done regarding the issues around the effect of virtual objects’ parameters on depth perception, external lighting, and the effect of an object’s mobility on this depth perception.Methods: We aim to perform an analysis of the effects of motion cues, color, and luminance on depth estimation of AR objects overlaying the real world with OST displays. 
We perform two experiments, differing in environmental lighting conditions (287 lux and 156 lux), and analyze the effects and differences on depth and speed perceptions.Results: We have found that while stationary objects follow previous research with regards to depth perception, motion and both object and environmental luminance play a factor in this perception.Discussion: These results will be significantly useful for developers to account for depth estimation issues that may arise in AR environments. Awareness of the different effects of speed and environmental illuminance on depth perception can be utilized when performing AR or MR applications where precision matters.\",\"PeriodicalId\":73116,\"journal\":{\"name\":\"Frontiers in virtual reality\",\"volume\":\"47 6\",\"pages\":\"\"},\"PeriodicalIF\":3.2000,\"publicationDate\":\"2023-12-06\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Frontiers in virtual reality\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.3389/frvir.2023.1243956\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"COMPUTER SCIENCE, SOFTWARE ENGINEERING\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Frontiers in virtual reality","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.3389/frvir.2023.1243956","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, SOFTWARE ENGINEERING","Score":null,"Total":0}
Impact of motion cues, color, and luminance on depth perception in optical see-through AR displays
Introduction: Augmented Reality (AR) systems let users view and interact with virtual objects overlaid on the real world. They are used across a variety of domains, e.g., games, medicine, and education. Optical See-Through (OST) AR displays allow users to perceive the real world directly while computer-generated imagery is overlaid onto it. Although depth perception and object visibility are widely studied, we wanted to observe how an object's color, luminance, and movement interact with one another, and with external luminance, in OST AR devices. Little research has examined how a virtual object's parameters, external lighting, and the object's motion affect depth perception.

Methods: We analyze the effects of motion cues, color, and luminance on depth estimation of AR objects overlaid on the real world with OST displays. We perform two experiments under different environmental lighting conditions (287 lux and 156 lux) and analyze the resulting effects on, and differences in, depth and speed perception.

Results: We found that while depth perception of stationary objects is consistent with previous research, motion as well as both object and environmental luminance influence this perception.

Discussion: These results will help developers account for depth estimation issues that may arise in AR environments. Awareness of the distinct effects of speed and environmental illuminance on depth perception can be applied in AR or MR applications where precision matters.
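As a purely illustrative sketch, and not the authors' analysis pipeline, the kind of between-condition comparison described in the Methods could be summarized as a mean signed depth-estimation error per ambient-illuminance condition. The record layout, the numeric values, and the helper name mean_signed_error below are all assumptions made for illustration.

```python
# Hypothetical illustration only: compare signed depth-estimation error
# across two ambient-illuminance conditions (287 lux vs. 156 lux).
# Trial values and field layout are assumptions, not data from the paper.
from statistics import mean

trials = [
    # (condition_lux, true_depth_m, judged_depth_m)
    (287, 3.0, 2.7),
    (287, 5.0, 4.4),
    (156, 3.0, 2.9),
    (156, 5.0, 4.8),
]

def mean_signed_error(records, lux):
    """Mean (judged - true) depth in metres for one lighting condition."""
    errors = [judged - true for cond, true, judged in records if cond == lux]
    return mean(errors)

for lux in (287, 156):
    print(f"{lux} lux: mean signed depth error = {mean_signed_error(trials, lux):+.2f} m")
```

A negative mean error in this sketch would indicate underestimation of depth in that lighting condition; the actual study would additionally factor in object color, luminance, and motion as independent variables.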