SiteSCAN: Intuitive Exchange of Spatial Information in Mixed Reality
Yu Chen Chen, Kai-Hsin Chiu, Yu Ju Yang, Tsaiping Chang, Li Fei Kung, Kaiyuan Lin
Adjunct Publication of the 23rd International Conference on Mobile Human-Computer Interaction, 27 September 2021. DOI: 10.1145/3447527.3474876
Abstract
People today receive information through many different platforms and are bombarded with overwhelming amounts of it as it accumulates rapidly over time. Consequently, suitable and efficient approaches to accessing specific information have become crucial in daily life. However, there is hardly any way to obtain information directly from our surroundings, and existing displays of information fail to describe its relation to space precisely. This research introduces SiteSCAN, a system that provides an intuitive design solution for exchanging and exploring spatial information. By letting people attach information directly onto the physical environment and access it from that environment, the system connects information to the surroundings it is intended for. The system also updates information in real time and enables a network effect that allows it to grow gradually. Through image retrieval technology and geolocation positioning, the system identifies the user's current location and the surrounding space, which helps users retrieve information from, or attach information to, a specific spot. With the help of SiteSCAN, users experience intuitive interaction with spatial information and are presented with a seamless integration of virtual and physical reality.
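The abstract states only that geolocation narrows the search to the user's surroundings and image retrieval then identifies the specific spot. The following is a minimal sketch of that two-stage lookup, assuming a hypothetical anchor store, descriptor format, and thresholds; none of these names or values come from the paper.

```python
# Hypothetical sketch of a SiteSCAN-style anchor lookup: GPS first prunes candidate
# anchors to the user's surroundings, then an image-descriptor match picks the spot.
# The Anchor structure, radius, and similarity threshold are illustrative assumptions.
import math
from dataclasses import dataclass, field

@dataclass
class Anchor:
    lat: float
    lon: float
    descriptor: list[float]                          # image feature vector for the spot
    notes: list[str] = field(default_factory=list)   # information attached by users

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS coordinates."""
    r = 6_371_000
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a)) or 1e-9
    nb = math.sqrt(sum(y * y for y in b)) or 1e-9
    return dot / (na * nb)

def retrieve(anchors, user_lat, user_lon, query_descriptor,
             radius_m=50.0, min_similarity=0.8):
    """Return the best-matching nearby anchor (with its attached notes), or None."""
    # Stage 1: geolocation positioning restricts the search to nearby anchors.
    nearby = [a for a in anchors
              if haversine_m(user_lat, user_lon, a.lat, a.lon) <= radius_m]
    # Stage 2: image retrieval, approximated here by a cosine match on descriptors.
    scored = [(cosine(query_descriptor, a.descriptor), a) for a in nearby]
    scored = [s for s in scored if s[0] >= min_similarity]
    if not scored:
        return None
    return max(scored, key=lambda s: s[0])[1]
```

Attaching information to a spot would follow the same two stages in reverse: the current GPS fix and a captured image descriptor define a new Anchor, and subsequent users' notes append to it, which is one plausible reading of the network effect described above.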