{"title":"Using point-light movement as peripheral visual guidance for scooter navigation","authors":"Hung-Yu Tseng, Rong-Hao Liang, Liwei Chan, Bing-Yu Chen","doi":"10.1145/2735711.2735800","DOIUrl":"https://doi.org/10.1145/2735711.2735800","url":null,"abstract":"This work presents a preliminary study of utilizing point-light movement in scooter drivers' peripheral vision for turn-by-turn navigation. We examine six types of basic 1D point-light movement, and the results suggests several of them can be easily picked up and comprehended by peripheral vision in parallel with the on-going foveal vision task, and can be use to provide effective and distraction-free route-guiding experiences for scooter driving.","PeriodicalId":246615,"journal":{"name":"Proceedings of the 6th Augmented Human International Conference","volume":"31 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-03-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121956762","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"AR-HITOKE: visualizing popularity of brick and mortar shops to support purchase decisions","authors":"Soh Masuko, Ryo Kuroki","doi":"10.1145/2735711.2735804","DOIUrl":"https://doi.org/10.1145/2735711.2735804","url":null,"abstract":"We propose a shopping support system (AR-HITOKE) that visualizes the popularity of brick and mortar shops by aggregating online and offline information using augmented reality technology that can be understood intuitively. In the proposed method, 3D-animated human-shaped icons in queues and user comments are overlaid above a shop's location on a physical map. Popularity is expressed visually by adjusting the queues length depending on offline sales data. We also visualize user comments related to each shop that are extracted from online reviews using our sentiment analysis framework. The proposed method offers new evaluation information for decision making in a physical environment and new online shopping experiences. Through exhibition of the proposed system at an actual event, we found that users are able to recognize the popularity of shops intuitively.","PeriodicalId":246615,"journal":{"name":"Proceedings of the 6th Augmented Human International Conference","volume":"38 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-03-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125821219","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Improving work productivity by controlling the time rate displayed by the virtual clock","authors":"Yuki Ban, Sho Sakurai, Takuji Narumi, T. Tanikawa, M. Hirose","doi":"10.1145/2735711.2735791","DOIUrl":"https://doi.org/10.1145/2735711.2735791","url":null,"abstract":"The main contribution of this paper is establishing the method for improving work productivity unconsciously by controlling the time rate that a virtual clock displays. Recently, it became clear that the work efficiency is influenced by various environmental factors. One of a way to increase work productivity is improving the work rate during certain duration. On the contrary, it is becoming clarified that the time pressure has the potential to enhance the task performance and the work productivity. The approximation of the work rate per certain time and this time pressure are evoked by the time sensation. In this study, we focus on a \"clock\" as a tool, which gives the recognition of time rate and length for everyone mutually. We propose a method to improve a person's work productivity unconsciously by giving an illusion of false sense of the passaged time by a virtual clock that displays the time rate that differ from real one visually. We conducted experiments to investigate the influence of the changes in the displayed virtual time rate on time perception and work efficiency. The experimental results showed that by displaying an the accelerated time rate, it is possible to improve work efficiency with constant time perception.","PeriodicalId":246615,"journal":{"name":"Proceedings of the 6th Augmented Human International Conference","volume":"10 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-03-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125455708","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Superimposed projection of ghosted view on real object with color correction","authors":"Naoto Uekusa, T. Koike","doi":"10.1145/2735711.2735810","DOIUrl":"https://doi.org/10.1145/2735711.2735810","url":null,"abstract":"We describe a spatial augmented reality system that enables superimposed projection of an internal image on a real object with color correction. Our system is a projector-camera system, which consists of a camera, a projector, and a PC. At first, we generate a first projection image from the internal image of CG and a camera image of the real object captured by the camera. Next, we project the first projection image on the real object, and again capture an image of the real object with the internal image. At last, we update the projection image with color correction on CIELUV color space and project the image on the real object. This system will be able to visualize the internal structures on various objects easily.","PeriodicalId":246615,"journal":{"name":"Proceedings of the 6th Augmented Human International Conference","volume":"11 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-03-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124133692","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"LumoSpheres: real-time tracking of flying objects and image projection for a volumetric display","authors":"H. Koike, H. Yamaguchi","doi":"10.1145/2735711.2735824","DOIUrl":"https://doi.org/10.1145/2735711.2735824","url":null,"abstract":"This paper proposes a method for real-time tracking of flying objects and image projection onto them for developing a particle-based volumetric 3D display. The first section describes the concept using high-speed cameras and projectors for a particle-based volumetric 3D display. Our solution suggests a prediction model with kinematic laws and uses Kalman Filters to address latency issues within the projector-camera system. We conducted experiments to show the accuracy of the image projection. We also present an application of our method in entertainment, Digital Juggling.","PeriodicalId":246615,"journal":{"name":"Proceedings of the 6th Augmented Human International Conference","volume":"20 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-03-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127935584","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The kraftwork and the knittstruments: augmenting knitting with sound","authors":"Enrique Encinas, Konstantia Koulidou, Robb Mitchell","doi":"10.1145/2735711.2735833","DOIUrl":"https://doi.org/10.1145/2735711.2735833","url":null,"abstract":"This paper presents a novel example of technological augmentation of a craft practice. By translating the skilled, embodied knowledge of knitting practice into the language of sound, our study explores how audio augmentation of routinized motion patterns affects an individual's awareness of her bodily movements and alters conventional practice. Four different instruments (The Knittstruments: The ThereKnitt, The KnittHat, The Knittomic, and The KraftWork) were designed and tested in four different locations. This research entails cycles of data collection and analysis based on the action and grounded theory methods of noting, coding and memoing. Analysis of the data collected suggests substantial alterations in the knitters performance due to audio feedback at both an individual and group level and improvisation in the process of making. We argue that the usage of Knittstruments can have relevant consequences in the fields of interface design, wearable computing or artistic and musical creation in general and hope to provide a new inspiring venue for designers, artists and knitters to explore.","PeriodicalId":246615,"journal":{"name":"Proceedings of the 6th Augmented Human International Conference","volume":"10 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-03-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126301877","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Vision enhancement: defocus correction via optical see-through head-mounted displays","authors":"Yuta Itoh, G. Klinker","doi":"10.1145/2735711.2735787","DOIUrl":"https://doi.org/10.1145/2735711.2735787","url":null,"abstract":"Vision is our primary, essential sense to perceive the real world. Human beings have been keen to enhance the limit of the eye function by inventing various vision devices such as corrective glasses, sunglasses, telescopes, and night vision goggles. Recently, Optical See-Through Head-Mounted Displays (OST-HMD) have penetrated in the commercial market. While the traditional devices have improved our vision by altering or replacing it, OST-HMDs can augment and mediate it. We believe that future OST-HMDs will dramatically improve our vision capability, combined with wearable sensing systems including image sensors. For taking a step toward this future, this paper investigates Vision Enhancement (VE) techniques via OST-HMDs. We aim at correcting optical defects of human eyes, especially defocus, by overlaying a compensation image on the user's actual view so that the filter cancels the aberration. Our contributions are threefold. Firstly, we formulate our method by taking the optical relationships between OST-HMD and human eye into consideration. Secondly, we demonstrate the method in proof-of-concept experiments. Lastly and most importantly, we provide a thorough analysis of the results including limitations of the current system, potential research issues necessary for realizing practical VE systems, and possible solutions for the issues for future research.","PeriodicalId":246615,"journal":{"name":"Proceedings of the 6th Augmented Human International Conference","volume":"10 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-03-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114257711","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"DogPulse: augmenting the coordination of dog walking through an ambient awareness system at home","authors":"C. Skovgaard, J. Thomsen, N. Verdezoto, Daniel Vestergaard","doi":"10.1145/2735711.2735825","DOIUrl":"https://doi.org/10.1145/2735711.2735825","url":null,"abstract":"This paper presents DogPulse, an ambient awareness system to support the coordination of dog walking among family members at home. DogPulse augments a dog collar and leash set to activate an ambient shape-changing lamp and visualize the last time the dog was taken for a walk. The lamp gradually changes its form and pulsates its lights in order to keep the family members aware of the dog walking activity. We report the iterative prototyping of DogPulse, its implementation and its preliminary evaluation. Based on our initial findings, we present the limitations and lessons learned as well as highlight recommendations for future work.","PeriodicalId":246615,"journal":{"name":"Proceedings of the 6th Augmented Human International Conference","volume":"90 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-03-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115849060","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The mind-window: brain activity visualization using tablet-based AR and EEG for multiple users","authors":"Jonathan Mercier-Ganady, M. Marchal, A. Lécuyer","doi":"10.1145/2735711.2735809","DOIUrl":"https://doi.org/10.1145/2735711.2735809","url":null,"abstract":"In this poster we introduce a novel approach, called the \"Mind-Window\", for real-time visualization of brain activity. The Mind-Window enables one or multiple users to visualize the brain activity of another person as if her skull was transparent. Our approach relies on the use of multiple tablet PCs that the observers can move around the head of the observed person wearing an EEG cap. A 3D virtual brain model is superimposed onto the head of the observed person using augmented reality by tracking a 3D marker placed on top of the head. The EEG cap records the electrical fields emitted by the brain, and they are processed in real-time to update the display of the virtual brain model. Several visualization techniques are proposed such as an interactive cutting plane which can be manipulated with touch-based inputs on the tablet. The Mind-Window could be used for various application purposes such as for Education as teaching tool to learn brain anatomy/activity and EEG features, e.g., electrodes localization, electrical patterns, etc.","PeriodicalId":246615,"journal":{"name":"Proceedings of the 6th Augmented Human International Conference","volume":"26 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-03-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116966868","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Gravitamine spice: a system that changes the perception of eating through virtual weight sensation","authors":"Masaharu Hirose, Karin Iwazaki, Kozue Nojiri, M. Takeda, Yuta Sugiura, M. Inami","doi":"10.1145/2735711.2735795","DOIUrl":"https://doi.org/10.1145/2735711.2735795","url":null,"abstract":"The flavor of food is not just limited to the sense of taste, but also it changes according to the perceived information from other perception such as the auditory, visual, tactile senses, or through individual experiences or cultural background, etc. We proposed \"Gravitamine Spice\", a system that focuses on the cross-modal interaction between our perception; mainly the weight of food we perceived when we carry the utensils. This system consists of a fork and a seasoning called the \"OMOMI\". User can change the weight of the food by sprinkling seasoning onto it. Through this sequence of actions, users can enjoy different dining experiences, which may change the taste of their food or the feeling towards the food when they are chewing it.","PeriodicalId":246615,"journal":{"name":"Proceedings of the 6th Augmented Human International Conference","volume":"23 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-03-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116930531","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}