{"title":"Mom's tray: real-time dietary monitoring system","authors":"Gyuwon Jung, A. Zarzycki, Ji-Hyun Lee","doi":"10.1145/3132787.3139196","DOIUrl":"https://doi.org/10.1145/3132787.3139196","url":null,"abstract":"Mom's Tray is a dietary monitoring system that integrates a smart tray with embedded sensors, pre-arranged and RFID-tagged food packages, and a mobile app to provide real-time feedback on food ordering and consumption in a school cafeteria setting.","PeriodicalId":243902,"journal":{"name":"SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications","volume":"64 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-11-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114010697","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Challenges in user experience design of image filtering apps","authors":"Mandy Klingbeil, S. Pasewaldt, Amir Semmo, J. Döllner","doi":"10.1145/3132787.3132803","DOIUrl":"https://doi.org/10.1145/3132787.3132803","url":null,"abstract":"Photo filtering apps successfully deliver image-based stylization techniques to a broad audience, in particular in the ubiquitous domain (e.g., smartphones, tablet computers). Interacting with these inherently complex techniques has so far mostly been approached in two different ways: (1) by exposing many (technical) parameters to the user, resulting in a professional application that typically requires expert domain knowledge, or (2) by hiding the complexity via presets that only allow the application of filters but prevent creative expression thereon. In this work, we outline challenges of and present approaches for providing interactive image filtering on mobile devices, focusing on how to make it usable for people in their daily life. This is discussed using the example of BeCasso, a user-centric app for assisted image stylization that targets two user groups: mobile artists and users seeking casual creativity. Through user research and qualitative and quantitative user studies, we identify and outline usability issues that were shown to prevent both user groups from reaching their objectives when using the app. On the one hand, user-group targeting has been improved through an optimized user experience design. On the other hand, multiple levels of control have been implemented to ease interaction and hide the underlying complex technical parameters. Evaluations underline that the presented approach can increase the usability of complex image stylization techniques for mobile apps.","PeriodicalId":243902,"journal":{"name":"SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications","volume":"26 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-11-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117151953","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Mobile previsualization using augmented reality: a use case from film production","authors":"Christian Zimmer, Daniel Drochtert, C. Geiger, Michael Brink, Rolf Mütze","doi":"10.1145/3132787.3132805","DOIUrl":"https://doi.org/10.1145/3132787.3132805","url":null,"abstract":"We present a mobile augmented reality application for the planning and previsualization of film productions. The tool was developed in an iterative design process as a collaboration between a VFX studio and a mixed reality research lab. This paper explains the requirements specification, the interaction design of previsualization techniques, and details of the implementation on a mobile device.","PeriodicalId":243902,"journal":{"name":"SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications","volume":"100 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-11-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125072392","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Hugus","authors":"Angus Yu Chi Yin, Astrid Cheung Oi Lam, C. Wai, H. Wing, Yuki Ho Sin Yi, Christine Fung Lim Chi, Dave Tam Kai Fung","doi":"10.1145/3132787.3139195","DOIUrl":"https://doi.org/10.1145/3132787.3139195","url":null,"abstract":"A long-distance relationship is not always an easy journey to get through. Several problems that come along with long-distance relationships have been identified, including spatial barriers, endless waiting, and direction associating. The project Hugus aims to solve these problems through an interactive approach. The soft, bear-shaped interactive device Hugus is expected to comfort users and maintain stronger connections between them.","PeriodicalId":243902,"journal":{"name":"SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications","volume":"24 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-11-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133673574","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Automated enabling of head mounted display using gaze-depth estimation","authors":"Youngho Lee, Choonsung Shin, Thammathip Piumsomboon, Gun A. Lee, M. Billinghurst","doi":"10.1145/3132787.3139201","DOIUrl":"https://doi.org/10.1145/3132787.3139201","url":null,"abstract":"Recently, global companies have released OST-HMDs (Optical See-through Head Mounted Displays) for Augmented Reality. The main feature of these HMDs is that users can see virtual objects while still seeing real space. However, when a user wants to focus on a real object rather than on virtual content, this feature becomes inconvenient. In this paper, we propose a method to turn the screen of an augmented reality HMD on and off according to the user's gaze. The proposed method uses an eye-tracker attached to the mobile HMD to estimate gaze depth. We feed this data into a neural network to train a learning model. After training is complete, gaze data is input in real time to obtain the predicted gaze distance. Through various experiments, we identify the possibilities and limits of the machine learning algorithms and suggest improvements.","PeriodicalId":243902,"journal":{"name":"SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications","volume":"251 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-11-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128739475","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications","authors":"M. Billinghurst, Witawat Rungjiratananon","doi":"10.1145/3132787","DOIUrl":"https://doi.org/10.1145/3132787","url":null,"abstract":"The SIGGRAPH Asia Symposium on Mobile Graphics and Interactive Applications will give attendees the chance to explore the opportunities and challenges of mobile applications relevant to the global graphics community. The program will cover the development, technology, and marketing of mobile graphics and interactive applications. It will especially highlight novel uses of graphics and interactivity on mobile devices. Attendees can expect to be exposed to the latest in mobile graphics and interactive applications through expert keynote talks, paper presentations, panel discussions, industry case studies, and hands-on demonstrations.","PeriodicalId":243902,"journal":{"name":"SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications","volume":"140 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-11-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125987660","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Mobile augmented reality illustrations that entertain and inform with the hololens","authors":"Christian Zimmer, Michael Bertram, Fabian Büntig, Daniel Drochtert, C. Geiger","doi":"10.1145/3132787.3140546","DOIUrl":"https://doi.org/10.1145/3132787.3140546","url":null,"abstract":"We demonstrate three different mixed reality prototypes using the Microsoft HoloLens. The main focus of the applications is to provide information and entertainment in mixed reality space across different scenarios, while considering the technical capabilities and restrictions of the current HoloLens device. The prototypes allow for several input modalities, including voice, gesture, and spatial input.","PeriodicalId":243902,"journal":{"name":"SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications","volume":"8 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-11-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125366600","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Enhanced engagement with public displays through mobile phone interaction","authors":"Patchara Pattanakimhun, W. Chinthammit, N. Chotikakamthorn","doi":"10.1145/3132787.3139205","DOIUrl":"https://doi.org/10.1145/3132787.3139205","url":null,"abstract":"Public displays are widely used in public spaces for presenting information. The flow of information is largely one-way, passive communication, which makes the level of engagement between viewers and content very difficult to predict. We propose using a mobile application to interact with a public display as a way to enhance users' engagement with its content. In this paper, we present results of a usability test of our proposed mobile-public display application, demonstrating that it can enhance engagement between mobile application users and the content on public displays.","PeriodicalId":243902,"journal":{"name":"SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-11-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125400511","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Multi-scale gestural interaction for augmented reality","authors":"Barrett Ens, A. Quigley, H. Yeo, Pourang Irani, M. Billinghurst","doi":"10.1145/3132787.3132808","DOIUrl":"https://doi.org/10.1145/3132787.3132808","url":null,"abstract":"We present a multi-scale gestural interface for augmented reality applications. With virtual objects, gestural interactions such as pointing and grasping can be convenient and intuitive; however, they are imprecise, socially awkward, and susceptible to fatigue. Our prototype application uses multiple sensors to detect gestures from both arm and hand motions (macro-scale) and finger movements (micro-scale). Micro-gestures can provide precise input through a belt-worn sensor configuration, with the hand in a relaxed posture. We present an application that combines direct manipulation with micro-gestures for precise interaction, beyond the capabilities of direct manipulation alone.","PeriodicalId":243902,"journal":{"name":"SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications","volume":"17 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-11-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114402264","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Exploring enhancements for remote mixed reality collaboration","authors":"Thammathip Piumsomboon, Arindam Dey, Barrett Ens, Youngho Lee, Gun A. Lee, M. Billinghurst","doi":"10.1145/3132787.3139200","DOIUrl":"https://doi.org/10.1145/3132787.3139200","url":null,"abstract":"In this paper, we explore techniques for enhancing remote Mixed Reality (MR) collaboration in terms of communication and interaction. We created CoVAR, an MR system for remote collaboration between Augmented Reality (AR) and Augmented Virtuality (AV) users. Awareness cues and an AV-Snap-to-AR interface were proposed to enhance communication. Collaborative natural interaction and AV-User-Body-Scaling were implemented to enhance interaction. We conducted an exploratory study examining the awareness cues and collaborative gaze, and the results showed the benefits of the proposed techniques for enhancing communication and interaction.","PeriodicalId":243902,"journal":{"name":"SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications","volume":"38 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-11-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127670348","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}