{"title":"CapacitiveMarker: novel interaction method using visual marker integrated with conductive pattern","authors":"Kohei Ikeda, K. Tsukada","doi":"10.1145/2735711.2735783","DOIUrl":"https://doi.org/10.1145/2735711.2735783","url":null,"abstract":"The visual markers have spatial limitations to require certain distances between a camera and markers. Meanwhile, as capacitive multi-touch displays on mobile devices have become popular, many researchers proposed interaction techniques using conductive patterns and a capacitive display. In this study, we propose a novel visual marker, \"CapacitiveMarker\", which can be recognized both by a camera and capacitive display. The CapacitiveMarker consists of two layered markers: a visual marker printed on a seal and a conductive pattern on a plastic film. We also propose a new interaction method using CapacitiveMarker through applications.","PeriodicalId":246615,"journal":{"name":"Proceedings of the 6th Augmented Human International Conference","volume":"13 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-03-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121049492","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"B-C-invisibility power: introducing optical camouflage based on mental activity in augmented reality","authors":"Jonathan Mercier-Ganady, M. Marchal, A. Lécuyer","doi":"10.1145/2735711.2735835","DOIUrl":"https://doi.org/10.1145/2735711.2735835","url":null,"abstract":"In this paper we introduce a novel and interactive approach for controlling optical camouflage called \"B-C-Invisibility power\". We propose to combine augmented reality and Brain-Computer Interface (BCI) technologies to design a system which somehow provides the \"power of becoming invisible\". Our optical camouflage is obtained on a PC monitor combined with an optical tracking system. A cut out image of the user is computed from a live video stream and superimposed to the prerecorded background image using a transparency effect. The transparency level is controlled by the output of a BCI, making the user able to control her invisibility directly with mental activity. The mental task required to increase/decrease the invisibility is related to a concentration/relaxation state. 
Results from a preliminary study based on a simple video-game inspired by the Harry Potter universe could notably show that, compared to a standard control made with a keyboard, controlling the optical camouflage directly with the BCI could enhance the user experience and the feeling of \"having a super-power\".","PeriodicalId":246615,"journal":{"name":"Proceedings of the 6th Augmented Human International Conference","volume":"47 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-03-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122801519","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"EcoBears: augmenting everyday appliances with symbolic and peripheral feedback","authors":"Nick Nielsen, S. P. Pedersen, Jens A. Sørensen, N. Verdezoto, Nikolai H. Øllegaard","doi":"10.1145/2735711.2735817","DOIUrl":"https://doi.org/10.1145/2735711.2735817","url":null,"abstract":"This paper introduces the EcoBears concept that aims to augment household appliances with functional and aesthetic features to promote their use and longevity of use to prevent their disposal. The EcoBears also aim to support the communication of environmental issues in the home setting. The initial design and implementation of the EcoBears that consist of two bear modules (a mother and her cub) is presented as well as the preliminary concept validation and lessons learned to be considered for future work.","PeriodicalId":246615,"journal":{"name":"Proceedings of the 6th Augmented Human International Conference","volume":"31 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-03-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129559167","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"VISTouch: dynamic three-dimensional connection between multiple mobile devices","authors":"Masasuke Yasumoto, Takehiro Teraoka","doi":"10.1145/2735711.2735823","DOIUrl":"https://doi.org/10.1145/2735711.2735823","url":null,"abstract":"It has become remarkably common recently for people to own multiple mobile devices, although it is still difficult to effectively use them in combination. In this study, we constructed a new system called VISTouch that achieves a new operational capability and increases user interest in mobile devices by enabling multiple devices to be used in combination dynamically and spatially. Using VISTouch, for example, to spatially connect a smart-phone to a horizontally positioned tablet that is displaying a map as viewed from above enables these devices to dynamically obtain the correct relative position. The smart-phone displays images viewed from its position, direction, and angle in real time as a window to show the virtual 3D space. We applied VISTouch to two applications that used detailed information of the relative position in real space between multiple devices. These applications showed the potential improvement in using multiple devices in combination.","PeriodicalId":246615,"journal":{"name":"Proceedings of the 6th Augmented Human International Conference","volume":"2 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-03-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128702064","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Augmented dodgeball: an approach to designing augmented sports","authors":"T. Nojima, Ngoc Phuong, Takahiro Kai, Toshiki Sato, H. Koike","doi":"10.1145/2735711.2735834","DOIUrl":"https://doi.org/10.1145/2735711.2735834","url":null,"abstract":"Ubiquitous computing offers enhanced interactive, human-centric experiences including sporting and fitness-based applications. To enhance this experience further, we consider augmenting dodgeball by adding digital elements to a traditional ball game. To achieve this, an understanding of the game mechanics with participating movable bodies, is required. This paper discusses the design process of a ball--player-centric interface that uses live data acquisition during gameplay for augmented dodgeball, which is presented as an application of augmented sports. Initial prototype testing shows that player detection can be achieved using a low-energy wireless sensor based network such as that used with fitness sensors, and a ball with an embedded sensor together with proximity tagging.","PeriodicalId":246615,"journal":{"name":"Proceedings of the 6th Augmented Human International Conference","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-03-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124357056","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The augmented narrative: toward estimating reader engagement","authors":"K. Kunze, S. Sanchez, Tilman Dingler, Olivier Augereau, K. Kise, M. Inami, T. Terada","doi":"10.1145/2735711.2735814","DOIUrl":"https://doi.org/10.1145/2735711.2735814","url":null,"abstract":"We present the concept of bio-feedback driven computing to design a responsive narrative, which acts according to the readers experience. We explore on how to detect engagement and give our evaluation on the usefulness of different sensor modalities. We find temperature and blink frequency are best to estimate engagement and can classify engaging and non-engaging user-independent without error for a small user sample size (5 users).","PeriodicalId":246615,"journal":{"name":"Proceedings of the 6th Augmented Human International Conference","volume":"8 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-03-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126384878","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Feel & see the globe: a thermal, interactive installation","authors":"Jochen Huber, Hasantha Malavipathirana, Yikun Wang, Xinyu Li, Jody C. Fu, P. Maes, Suranga Nanayakkara","doi":"10.1145/2735711.2735776","DOIUrl":"https://doi.org/10.1145/2735711.2735776","url":null,"abstract":"\"Feel & See the Globe\" is a thermal, interactive installation. The central idea is to map temperature information in regions around the world from prehistoric, modern to futuristic times onto a low fidelity display. The display visually communicates global temperature rates and lets visitors experience the temperature physically through a tangible, thermal artifact. A pertinent educational aim is to inform and teach about global warming.","PeriodicalId":246615,"journal":{"name":"Proceedings of the 6th Augmented Human International Conference","volume":"73 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-03-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132072761","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"ChromaGlove: a wearable haptic feedback device for colour recognition","authors":"Paweł W. Woźniak, Kristina Knaving, M. Obaid, M. G. Carcedo, A. Ünlüer, M. Fjeld","doi":"10.1145/2735711.2735781","DOIUrl":"https://doi.org/10.1145/2735711.2735781","url":null,"abstract":"While colourblindness is a disability that does not prevent those suffering from it from living fruitful lives, it does cause difficulties in everyday life situations such as buying clothes. Users suffering from colourblindness may be helped by designing devices that integrate well with their daily routines. This paper introduces ChromaGlove, a wearable device that converts colour input into haptic output thus enhancing the colour-sensing ability of the user. The device uses variable pulse widths on vibration motor to communicate differences in hue. Data is obtained through an illuminated colour sensor placed on the palm. In the future, we plan to conduct studies that will show how well a haptic glove can be integrated in everyday actions.","PeriodicalId":246615,"journal":{"name":"Proceedings of the 6th Augmented Human International Conference","volume":"69 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-03-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130019148","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Effective napping support system by hypnagogic time estimation based on heart rate sensor","authors":"Daichi Nagata, Yutaka Arakawa, Takatomi Kubo, K. Yasumoto","doi":"10.1145/2735711.2735808","DOIUrl":"https://doi.org/10.1145/2735711.2735808","url":null,"abstract":"In daily life, lack of sleep is one of the main reasons for poor concentration. To support an effective napping, considered as one of good methods for recovering insufficient sleep and enhancing a user's concentration, we propose a hypnagogic time estimation using a heart rate sensor. Because a heart rate sensor has already been common, our method can be used widely and easily in our daily life. Most of existing sleep support systems aim to provide a comfortable wake-up by observing the sleep stage. Unlike these methods, we aim to provide an appropriate sleep duration by estimating a hypnagogic timing. By using various heart rate sensors, existing sleep support systems and 64ch electroencephalography, we tried to find out the relationship between various vital signals and sleep stages during a napping. Finally, we build a hypnagogic time estimation model by using the machine learning technique.","PeriodicalId":246615,"journal":{"name":"Proceedings of the 6th Augmented Human International Conference","volume":"506 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-03-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116329988","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Snow walking: motion-limiting device that reproduces the experience of walking in deep snow","authors":"Tomohiro Yokota, Motohiro Ohtake, Yukihiro Nishimura, Toshiya Yui, Rico Uchikura, Tomoko Hashida","doi":"10.1145/2735711.2735829","DOIUrl":"https://doi.org/10.1145/2735711.2735829","url":null,"abstract":"We propose \"Snow Walking,\" a boot-shaped device that reproduces the experience of walking in deep snow. The main purpose of this study is reproducing the feel of walking in a unique environment that we do not experience daily, particularly one that has depth, such as of deep snow. When you walk in deep snow, you get three feelings: the feel of pulling your foot up from the deep snow, the feel of putting your foot down into the deep snow, and the feel of your feet crunching across the bottom of deep snow. You cannot walk in deep snow easily, and with the system, you get a unique feeling not only on the sole of your foot but as if your entire foot is buried in the snow. We reproduce these feelings by using a slider, electromagnet, vibration speaker, a hook and loop fastener, and potato starch.","PeriodicalId":246615,"journal":{"name":"Proceedings of the 6th Augmented Human International Conference","volume":"17 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-03-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125991959","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}