{"title":"Visualizing water flows with transparent tracer particles for a surround-screen swimming pool","authors":"Shogo Yamashita, Xinlei Zhang, Takashi Miyaki, Shunichi Suwa, J. Rekimoto","doi":"10.1145/3041164.3041171","DOIUrl":"https://doi.org/10.1145/3041164.3041171","url":null,"abstract":"A surround-screen swimming pool can realize various forms of underwater entertainment and enable enhanced swimming training with supplemental visual information during underwater activities. However, one of the big challenges for such an augmented swimming pool is user interaction because the surround screen and water can make existing position-tracking methods unusable. In this paper, we propose a water flow visualization method with transparent tracer particles to enhance interactivity. We used an optical property of clear plastics called birefringence that provides vivid colors on transparent tracer particles when they are between two circular polarization sheets. Tracing objects using cameras in front of a complex background is not a stable method, but this technology enables visible tracer particles on a simple and dark background. For underwater entertainment, the water flow tracing works as a user interface because the transparent tracer particles do not stop users from viewing the images on the screen. For enhanced swimming training, swimmers can view visualized water flow caused by strokes in the augmented swimming pool. From the results of our stability evaluation of water flow tracing, the proposed method is valid even for complex backgrounds. We also conducted a feasibility test of the enhanced swimming training. 
According to the trial, the tracing particles could visualize the water flow caused by the strokes made by a swimmer.","PeriodicalId":210662,"journal":{"name":"Proceedings of the 8th Augmented Human International Conference","volume":"23 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-03-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129454825","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"HandshakAR","authors":"Mihai Bâce, Gábor Sörös, S. Staal, G. Corbellini","doi":"10.1145/3041164.3041203","DOIUrl":"https://doi.org/10.1145/3041164.3041203","url":null,"abstract":"When people are introduced to each other, exchanging contact information happens either via smartphone interactions or via more traditional business cards. Crowded social events make it more challenging to keep track of all the new contacts. We introduce HandshakAR, a novel wearable augmented reality application that enables effortless sharing of digital information. When two people share the same greeting gesture (e.g., shaking hands) and are physically close to each other, their contact information is effortlessly exchanged. There is no instrumentation in the environment required, our approach works on the users' wearable devices. Physical proximity is detected via inaudible acoustic signals, hand gestures are recognized from motion sensors, the communication between devices is handled over Bluetooth, and contact information is displayed on smartglasses. We describe the concept, the design, and an implementation of our system on unmodified wearable devices.","PeriodicalId":210662,"journal":{"name":"Proceedings of the 8th Augmented Human International Conference","volume":"65 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-03-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132277013","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"InSight: a systematic approach to create dynamic human-controller-interactions","authors":"Roger Boldu, Haimo Zhang, J. P. F. Cortés, Sachith Muthukumarana, Suranga Nanayakkara","doi":"10.1145/3041164.3041195","DOIUrl":"https://doi.org/10.1145/3041164.3041195","url":null,"abstract":"We present InSight, an intuitive technique to control smart objects with existing input devices in the environment, while simply looking at them. By leveraging the user's line of sight as a heuristic of gaze and attention, the InSight system directs input focus from input devices to the device that the user is looking at, thus creating an intuitive metaphor: you control the object you are looking at. In this paper, we contribute with technical details of the hardware and software implementation, and a discussion of single user and multi-user interaction possibilities.","PeriodicalId":210662,"journal":{"name":"Proceedings of the 8th Augmented Human International Conference","volume":"16 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-03-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127193375","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"AdaptiVisor: assisting eye adaptation via occlusive optical see-through head-mounted displays","authors":"Yuichi Hiroi, Yuta Itoh, T. Hamasaki, M. Sugimoto","doi":"10.1145/3041164.3041178","DOIUrl":"https://doi.org/10.1145/3041164.3041178","url":null,"abstract":"Brightness adaptation is a fundamental ability in human visual system, and adjusts various levels of darkness and light. While this ability is continuously used, and it can mostly handle sudden lighting changes in the environment, the adaptation could still take several minutes. Moreover, during the adaptation, the color perception changes as well. This slow reactivity and perception change of the eyes could lead to mistakes for tasks performed in dazzling or temporally high-contrast environments such as when driving into the sun or during a welding process. We present AdaptiVisor, a vision augmentation system that assists the brightness adaptation of the eye. Our system selectively modulates the intensity of the light coming into the eyes via occlusion-capable Optical See-Through Head-Mounted Displays (OST-HMD). An integrated camera captures highlights and brightness in the environment via high-dynamic range capture, and our display system selectively dims or enhances part of field of views so that the user would not perceive rapid brightness changes. We build a proof-of-concept system to evaluate the feasibility of the adaptation assistance by combining a transmissive LCD panel and an OST-HMD, and test it with a user-perspective, view-point camera. The evaluation shows that the system decreases the overexposed area in a scene to 1/15th, and enhances the color by reducing majorly underexposed area to half. 
We also include a preliminary user trial and it indicates that the system also works for real eyes for the HMD part and to some extent for the LCD.","PeriodicalId":210662,"journal":{"name":"Proceedings of the 8th Augmented Human International Conference","volume":"25 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-03-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132023335","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"SonicSG: from floating to sounding pixels","authors":"Suranga Nanayakkara, T. Schroepfer, L. Wyse, Aloysius Lian, A. Withana","doi":"10.1145/3041164.3041190","DOIUrl":"https://doi.org/10.1145/3041164.3041190","url":null,"abstract":"SonicSG aimed at fostering a holistic understanding of the ways in which technology is changing our thinking about design in high-density urban city and how its creative use can reflect a sense of place. The project consisted of a large-scale interactive light installation that consisted on 1,800 floating LED lights in the shape of the island nation. These lights were individually addressable through the network and used to generate light and sound effects. The field of light was extended with \"sonified personal pixels\" that were created by the audience through personal mobile devices. These personal pixels generated a light and sound \"texture\" that connected visitors to the light field in the river and to each other. In this paper, we describe the design concept, prototyping and, implementation of as well as the user reactions to this interactive public light installation.","PeriodicalId":210662,"journal":{"name":"Proceedings of the 8th Augmented Human International Conference","volume":"30 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-03-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124579677","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"GrabAmps: grab a wire to sense the current flow","authors":"Don Samitha Elvitigala, R. Peiris, E. Wilhelm, S. Foong, Suranga Nanayakkara","doi":"10.1145/3041164.3041199","DOIUrl":"https://doi.org/10.1145/3041164.3041199","url":null,"abstract":"In this paper, we present GrabAmps, an intuitive interface that allows users to simply grab a bundled cable and get feedback on the AC (Alternating Current) current flow through it. Single phase and three phase AC was estimated with a regression model developed using principles of applied electromagnetism. This regression model is embedded into a standalone glove with a display attached at the rear so that the users can intuitively read the current consumption information. The users may configure the glove to detect AC single phase or AC three phase current using the same sensor setup on the glove. End users such as electrical engineers and electricians who frequently wear gloves during their work can benefit from GrabAmp to identify a wire that is live, or even grab and move along a wire to trace any potential failures. We believe GrabAmps can potentially speed up maintenance processes and monitor equipment more efficiently without downtime, which is increasingly important for data centres and other critical infrastructure.","PeriodicalId":210662,"journal":{"name":"Proceedings of the 8th Augmented Human International Conference","volume":"43 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-03-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129548349","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Towards understanding of play with augmented toys","authors":"P. Sridhar, Suranga Nanayakkara","doi":"10.1145/3041164.3041191","DOIUrl":"https://doi.org/10.1145/3041164.3041191","url":null,"abstract":"This work is directed towards understanding how the transformation of a regular object/traditional toy into an augmented toy may affect the dynamics of play behavior. We present an observational user study with 8 children from kindergarten to understand the play value of SparKubes. SparKubes are stand-alone tangible objects that accept light from one direction and pass it on in another direction. We found that children who were aware of the SparKubes' interactivity features displayed more variety of patterns and showed greater interaction with SparKubes as compared to the control group who were not aware of the features. The play behaviour revealed that SparKubes have constructive play value on the play pyramid and that adding light features changed the patterns of constructions by children. This knowledge opens up an exciting area of research in technology-mediated play and designing augmented toys for children.","PeriodicalId":210662,"journal":{"name":"Proceedings of the 8th Augmented Human International Conference","volume":"25 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-03-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131352796","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Routine++: implementing pre-performance routine in a short time with an artificial success simulator","authors":"Shoichi Tagami, S. Yoshida, Nami Ogawa, Takuji Narumi, T. Tanikawa, M. Hirose","doi":"10.1145/3041164.3041187","DOIUrl":"https://doi.org/10.1145/3041164.3041187","url":null,"abstract":"This study proposes \"Routine++,\" a new technique to implement a pre-performance routine (PPR) in a short period of time by providing a user artificial successful experiences via a simulator. Implementing a PPR, which is the conventional approach for controlling a user's own mental state and improving performance in competitive sport, requires identifying the user's action that results in success. However, implementing PPRs is time-consuming because consistently achieving success in the real world is difficult. Therefore, the proposed technique relates user actions with artificial successes, and lets users perform a PPR in a relatively short time. In this study, we focused on putter golf as a task to evaluate the effectiveness of Routine++ because it requires consistent actions to achieve a good result. We then developed a virtual golf simulator that provides the user false feedback in terms of the user's success at achieving the objectives of golf without a feeling of incongruity. The results of our user study indicate that Routine++ helps to improve the performance of novice golfers playing under pressure. 
Furthermore, with this work we propose the effectiveness of practicing with computer technology to improve not only technical skills like body movements but also mental skills.","PeriodicalId":210662,"journal":{"name":"Proceedings of the 8th Augmented Human International Conference","volume":"144 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-03-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131814132","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Evaluation of unplugged powered suit with pneumatic gel muscles","authors":"Chetan Thakur, Kazunori Ogawa, T. Ikeda, T. Tsuji, Y. Kurita","doi":"10.1145/3041164.3041192","DOIUrl":"https://doi.org/10.1145/3041164.3041192","url":null,"abstract":"Assistive suits are useful in situations such as injury, muscle fatigue, stressful work environment in factories and are also useful for different age groups in these situations. In our research, we use pneumatic gel muscles (PGM) to develop assistive suit to enhance walking gait experience. We focused on assisting the swing phase of the gait cycle as it accounts for higher metabolic costs during gait. The pneumatic gel muscles are actuated with small pumps fitted in the shoe of the contralateral foot of assisted leg. By doing this we can take advantage of dual support phase in gait cycle and provide required power for PGM. This process provides assisting force during swing phase and thus helps user with improved walking experience. To test the effectiveness of this suit we identified muscles with reduced force and activation for assisted gait using Opensim simulation and measured EMG for these muscles during experiment. Similar patterns in both simulations and experimental results were observed.","PeriodicalId":210662,"journal":{"name":"Proceedings of the 8th Augmented Human International Conference","volume":"26 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-03-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131865362","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Distortion in perceived size and body-based scaling in virtual environments","authors":"Nami Ogawa, Takuji Narumi, M. Hirose","doi":"10.1145/3041164.3041204","DOIUrl":"https://doi.org/10.1145/3041164.3041204","url":null,"abstract":"In this paper, we report findings pertaining to the size perception of objects and hands in a Virtual Environment(VE). First, we found that size perception is distorted in a VE and the effect is different between objects and hands. We perceive our virtual hands as larger and objects as smaller in VEs than in real environments(REs). However, when hands interact with objects, our body is used as a metric to scale the apparent sizes of objects in the environment (body-based scaling; BBS). We also found that not only does the size of our hands influence the perceived size of the environment, but the size of familiar-sized objects influences the perceived size of our hands as well. In summary, in contrast to the independent traits of size perception of hands and objects in VEs, we tend to perceive the size based on what we see at first, either hands or objects, when we interact with the objects. These findings provide a benchmark for scale adjustment for interactive scale-sensitive virtual reality applications so as to create perceptually more precise representations of virtual objects and bodies.","PeriodicalId":210662,"journal":{"name":"Proceedings of the 8th Augmented Human International Conference","volume":"317 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-03-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124486984","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}