{"title":"Spatial display system for designing live audiovisual content","authors":"Naoya Umeta, Tomohito Yamamoto","doi":"10.1145/3132787.3132811","DOIUrl":"https://doi.org/10.1145/3132787.3132811","url":null,"abstract":"Many types of displays have been developed for expressing high levels of presence. However, these systems tend to be expensive. To solve this problem, we developed a spatial display system using mobile devices. In this study, we implement a design system for 3D audiovisual live content based on our system.","PeriodicalId":243902,"journal":{"name":"SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-11-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128775206","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
S. Delabrida, M. Billinghurst, B. Thomas, Ricardo A. R. Rabelo, S. Ribeiro
{"title":"Design of a wearable system for 3D data acquisition and reconstruction for tree climbers","authors":"S. Delabrida, M. Billinghurst, B. Thomas, Ricardo A. R. Rabelo, S. Ribeiro","doi":"10.1145/3132787.3139198","DOIUrl":"https://doi.org/10.1145/3132787.3139198","url":null,"abstract":"Ecologists often need to extract data from forests to determine the flora and fauna conditions. Tree climbing is one technique used for this. Climbers go to the top of the tree and use handheld equipment to collect data and make notes while they are descending. This process demands both expert climbing skills and biology expertise, a combination not commonly found. This paper describes the design of a mobile system for 3D data collection and reconstruction of the field research environment for remote evaluation. The prototype can be used to reduce the time spent up trees and to collect data that can be used for crowd-sourced evaluation.","PeriodicalId":243902,"journal":{"name":"SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications","volume":"442 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-11-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124605238","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Alaeddin Nassani, Gun A. Lee, M. Billinghurst, T. Langlotz, R. Lindeman
{"title":"Using visual and spatial cues to represent social contacts in AR","authors":"Alaeddin Nassani, Gun A. Lee, M. Billinghurst, T. Langlotz, R. Lindeman","doi":"10.1145/3132787.3139199","DOIUrl":"https://doi.org/10.1145/3132787.3139199","url":null,"abstract":"One of the key problems with representing social networks in Augmented Reality (AR) is how to differentiate between contacts. In this paper we explore how visual and spatial cues based on social relationships can be used to represent contacts in social AR applications, making it easier to distinguish between them. Previous implementations of social AR have mostly focused on location-based visualization, with no focus on the social relationship to the user. In contrast, we explore how to visualise social relationships in mobile AR environments using proximity and visual fidelity filters. We ran a focus group to explore different options for representing social contacts in a mobile AR application. We also conducted a user study to test a head-worn AR prototype using proximity and visual fidelity filters. We found that filtering social contacts on wearable AR is preferred and useful. We discuss the results of the focus group and the user study, and provide insights into directions for future work.","PeriodicalId":243902,"journal":{"name":"SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications","volume":"144 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-11-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124598718","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Effective ray tracing of large 3D scenes through mobile distributed computing","authors":"Woong Seo, Yeonsoo Kim, I. Ihm","doi":"10.1145/3132787.3139206","DOIUrl":"https://doi.org/10.1145/3132787.3139206","url":null,"abstract":"Ray tracing large-scale 3D scenes at interactive frame rates is a challenging problem on mobile devices. In this paper, we present a mobile ray tracing system that aims to render large scenes with many millions of triangles at interactive speeds on a small-scale mobile cluster. To mitigate performance degradation due to excessive data communication on mobile and wireless networks, which still have high latency, we employ a tile-based rendering strategy in which each participating mobile device keeps an entire copy of the necessary rendering data. To realize such a system, we compress the 3D scene data to a size loadable into graphics memory, which enables effective mobile GPU ray tracing. Also, by using a careful interaction scheme between the master and slave devices in the mobile cluster, we markedly enhance the efficiency of the mobile distributed GPU ray tracing.","PeriodicalId":243902,"journal":{"name":"SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications","volume":"53 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-11-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126767487","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
G. Pintore, F. Ganovelli, Roberto Scopigno, E. Gobbetti
{"title":"Mobile metric capture and reconstruction in indoor environments","authors":"G. Pintore, F. Ganovelli, Roberto Scopigno, E. Gobbetti","doi":"10.1145/3132787.3139202","DOIUrl":"https://doi.org/10.1145/3132787.3139202","url":null,"abstract":"Mobile devices have become progressively more attractive for solving environment sensing problems. Thanks to their multi-modal acquisition capabilities and their growing processing power, they can perform increasingly sophisticated computer vision and data fusion tasks. In this context, we summarize our recent advances in the acquisition and reconstruction of indoor structures, describing the evolution of the methods from current single-view approaches to novel mobile multi-view methodologies. Starting from an overview of the features and capabilities of current hardware (ranging from commodity smartphones to recent 360° cameras), we present in detail specific real-world cases which exploit modern devices to acquire structural, visual and metric information.","PeriodicalId":243902,"journal":{"name":"SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications","volume":"19 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-11-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127589723","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Duopography: using back-of-device multi-touch input to manipulate spatial data on mobile tangible interactive topography","authors":"Nico Li, E. Sharlin, M. Sousa","doi":"10.1145/3132787.3139197","DOIUrl":"https://doi.org/10.1145/3132787.3139197","url":null,"abstract":"In this short paper we present the design of Duopography, a dual-surface mobile tangible interface for spatial representation and manipulation of topography. The 3D physical topographic front of Duopography acts as a tangible interface, enabling sketching directly on the 3D terrain, as well as visual augmentation of the topography. At the same time, Duopography's flat back-of-device supports gestures that are hard to perform on the irregular front, allowing common interaction techniques such as panning and pinching. We contribute a prototype and the results of a preliminary evaluation of a dual-surface topography interface combining a 3D-printed front and a flat back-of-device.","PeriodicalId":243902,"journal":{"name":"SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications","volume":"308 ","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-11-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"120930599","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Alaeddin Nassani, Gun A. Lee, M. Billinghurst, T. Langlotz, R. Lindeman
{"title":"AR social continuum: representing social contacts","authors":"Alaeddin Nassani, Gun A. Lee, M. Billinghurst, T. Langlotz, R. Lindeman","doi":"10.1145/3132787.3132812","DOIUrl":"https://doi.org/10.1145/3132787.3132812","url":null,"abstract":"One of the key problems with representing social networks in Augmented Reality (AR) is how to differentiate between contacts. In this paper we explore how visual and spatial cues based on social relationships can be used to represent contacts in social AR applications, making it easier to distinguish between them. Previous implementations of social AR have mostly focused on location-based visualization, with no focus on the social relationship to the user. In contrast, we explore how to visualise social relationships in mobile AR environments using proximity and visual fidelity filters. We ran a focus group to explore different options for representing social contacts in a mobile AR application. We also conducted a user study to test a head-worn AR prototype using proximity and visual fidelity filters. We found that filtering social contacts on wearable AR is preferred and useful. We discuss the results of the focus group and the user study, and provide insights into directions for future work.","PeriodicalId":243902,"journal":{"name":"SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications","volume":"286 3-4","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-11-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134426665","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Zi Siang See, M. S. Sunar, M. Billinghurst, Arindam Dey, Delas Santano, H. Esmaeili, H. Thwaites
{"title":"Exhibition approach using an AR and VR pillar","authors":"Zi Siang See, M. S. Sunar, M. Billinghurst, Arindam Dey, Delas Santano, H. Esmaeili, H. Thwaites","doi":"10.1145/3132787.3132810","DOIUrl":"https://doi.org/10.1145/3132787.3132810","url":null,"abstract":"This demonstration presents the development of an Augmented Reality (AR) and Virtual Reality (VR) pillar, a novel approach for showing AR and VR content in a public setting. A pillar in a public exhibition venue was converted into a four-sided AR and VR showcase. A cultural heritage theme, Boatbuilders of Pangkor, was featured in an experiment with the AR and VR Pillar. Multimedia tablets and mobile AR head-mounted displays (HMDs) were freely provided for public visitors to experience the multisensory content demonstrated on the pillar. The content included AR-based videos, maps, images and text, and VR experiences that allowed visitors to view reconstructed 3D subjects and remote locations in a 360° virtual environment. A miniature version of the pillar will be used for the demonstration, where users can experience features of the prototype system.","PeriodicalId":243902,"journal":{"name":"SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications","volume":"362 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-11-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121722076","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Christian Zimmer, Michael Bertram, Fabian Büntig, Daniel Drochtert, C. Geiger
{"title":"Mobile augmented reality illustrations that entertain and inform: design and implementation issues with the hololens","authors":"Christian Zimmer, Michael Bertram, Fabian Büntig, Daniel Drochtert, C. Geiger","doi":"10.1145/3132787.3132804","DOIUrl":"https://doi.org/10.1145/3132787.3132804","url":null,"abstract":"We present design and implementation issues of selected mixed reality prototypes using the Microsoft Hololens. The main focus of the applications is to demonstrate possible ways of providing information and entertainment in mixed reality space in different scenarios, while considering the technical capabilities and restrictions of the current Hololens device. The prototypes allow for several input modalities, including voice input, gesture recognition and spatial input. Design and implementation recommendations, derived from informal user feedback and the experience gained during the development process, are presented at the end of the paper.","PeriodicalId":243902,"journal":{"name":"SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications","volume":"5 2","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-11-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"120982298","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Lei Gao, Huidong Bai, R. Lindeman, M. Billinghurst
{"title":"Static local environment capturing and sharing for MR remote collaboration","authors":"Lei Gao, Huidong Bai, R. Lindeman, M. Billinghurst","doi":"10.1145/3132787.3139204","DOIUrl":"https://doi.org/10.1145/3132787.3139204","url":null,"abstract":"We present a Mixed Reality (MR) system that supports entire-scene capturing of the local physical work environment for remote collaboration in a large-scale workspace. By integrating the key-frames captured with an external depth sensor into a single 3D point-cloud data set, our system can reconstruct the entire local physical workspace in the VR world. The remote helper can thus observe the local scene independently of the local user's current head and camera position, and provide gesture guidance even before the local user looks at the target object. We conducted a pilot study to evaluate the usability of the system by comparing it with our previous oriented-view system, which only shared the current camera view together with real-time head orientation data. Our results indicate that this entire-scene capturing and sharing system can significantly increase the remote helper's spatial awareness of the local work environment, especially in a large-scale workspace, and it gained a strong user preference (80%) over the previous system.","PeriodicalId":243902,"journal":{"name":"SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications","volume":"116 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-11-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121575001","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}