{"title":"PaRUS: a Virtual Reality Shopping Method Focusing on Contextual Information between Products and Real Usage Scenes.","authors":"Yinyu Lu, Weitao You, Ziqing Zheng, Yizhan Shao, Changyuan Yang, Zhibin Zhou","doi":"10.1109/TVCG.2025.3549539","DOIUrl":null,"url":null,"abstract":"<p><p>The development of AR and VR technologies is enhancing users' online shopping experiences in various ways. However, in existing VR shopping applications, shopping contexts merely refer to the products and virtual malls or metaphorical scenes where users select products. This leads to the defect that users can only imagine rather than intuitively feel whether the selected products are suitable for their real usage scenes, resulting in a significant discrepancy between their expectations before and after the purchase. To address this issue, we propose PaRUS, a VR shopping approach that focuses on the context between products and their real usage scenexns. PaRUS begins by rebuilding the virtual scenario of the products' real usage scene through a new semantic scene reconstruction pipeline (manual operation needed), which preserves both the structured scene and textured object models in the scene. Afterwards, intuitive visualization of how the selected products fit the reconstructed virtual scene is provided. We conducted two user studies to evaluate how PaRUS impacts user experience, behavior, and satisfaction with their purchase. The results indicated that PaRUS significantly reduced the perceived performance risk and improved users' trust and expectation with their results of purchase.</p>","PeriodicalId":94035,"journal":{"name":"IEEE transactions on visualization and computer graphics","volume":"PP ","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2025-03-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE transactions on visualization and computer graphics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/TVCG.2025.3549539","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
The development of AR and VR technologies is enhancing users' online shopping experiences in various ways. However, in existing VR shopping applications, the shopping context refers only to the products and the virtual malls or metaphorical scenes in which users select them. As a result, users can only imagine, rather than intuitively perceive, whether the selected products suit their real usage scenes, leading to a significant discrepancy between their expectations before and after the purchase. To address this issue, we propose PaRUS, a VR shopping approach that focuses on the contextual information between products and their real usage scenes. PaRUS first rebuilds a virtual scenario of the product's real usage scene through a new semantic scene reconstruction pipeline (with some manual operation required), which preserves both the structured scene and the textured object models within it. It then provides an intuitive visualization of how the selected products fit into the reconstructed virtual scene. We conducted two user studies to evaluate how PaRUS affects user experience, behavior, and purchase satisfaction. The results indicate that PaRUS significantly reduced perceived performance risk and improved users' trust in and expectations of their purchases.
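The abstract gives no implementation details of the reconstruction pipeline or the product-placement visualization. As a rough, purely illustrative sketch of the kind of data such a pipeline might hand to a VR preview step, the following minimal Python example keeps a structured scene layout plus textured object models and checks whether a candidate product can be placed inside it; all class names, fields, and file paths are hypothetical and not taken from the paper.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

# Hypothetical illustration only: the paper does not publish this API.
# A reconstructed usage scene is assumed to keep (a) the structured scene
# layout (room footprint, anchor surfaces) and (b) textured models of the
# objects already present, so a candidate product can be previewed in place.

Vec3 = Tuple[float, float, float]

@dataclass
class TexturedObject:
    name: str          # e.g. "sofa", "dining_table"
    mesh_path: str     # path to a reconstructed, textured mesh (assumed format)
    position: Vec3     # placement in scene coordinates, meters

@dataclass
class ReconstructedScene:
    floor_extent: Tuple[float, float]                    # room footprint: width x depth
    objects: List[TexturedObject] = field(default_factory=list)

    def place_product(self, product: TexturedObject) -> bool:
        """Naive placement check: accept the product if its position lies
        inside the room footprint. A real system would also test collisions
        against the existing object meshes before rendering the preview."""
        x, _, z = product.position
        w, d = self.floor_extent
        if 0.0 <= x <= w and 0.0 <= z <= d:
            self.objects.append(product)
            return True
        return False

# Example: preview a candidate sofa inside a 4 m x 5 m living room.
scene = ReconstructedScene(floor_extent=(4.0, 5.0))
sofa = TexturedObject("candidate_sofa", "models/sofa.glb", (1.5, 0.0, 2.0))
print(scene.place_product(sofa))  # True: the position falls within the footprint
```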