{"title":"Build your Own!: Open-Source VR Shoes for Unity3D","authors":"J. Reinhardt, E. Lewandowski, Katrin Wolf","doi":"10.1145/3311823.3311852","DOIUrl":"https://doi.org/10.1145/3311823.3311852","url":null,"abstract":"Hand-held controllers enable all kinds of interaction in Virtual Reality (VR), such as object manipulation as well as for locomotion. VR shoes allow using the hand exclusively for naturally manual tasks, such as object manipulation, while locomotion could be realized through feet input -- just like in the physical world. While hand-held VR controllers became standard input devices for consumer VR products, VR shoes are only barely available, and also research on that input modality remains open questions. We contribute here with open-source VR shoes and describe how to build and implement them as Unity3D input device. We hope to support researchers in VR research and practitioners in VR product design to increase usability and natural interaction in VR.","PeriodicalId":433578,"journal":{"name":"Proceedings of the 10th Augmented Human International Conference 2019","volume":"13 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-03-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121263918","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Evaluation of a device reproducing the pseudo-force sensation caused by a clothespin","authors":"Masahiro Miyakami, Takuto Nakamura, H. Kajimoto","doi":"10.1145/3311823.3311837","DOIUrl":"https://doi.org/10.1145/3311823.3311837","url":null,"abstract":"A pseudo-force sensation can be elicited by pinching a finger with a clothespin. When the clothespin is used to pinch the finger from the palm side, a pseudo-force is felt in the direction towards the palm side, and when it is used to pinch the finger from the back side of the hand, the pseudo-force is felt in the extension direction. Here, as a first step to utilizing this phenomenon in human-machine interfaces, we developed a device that reproduces the clothespin phenomenon and confirmed the occurrence rate of the pseudo-force sensation.","PeriodicalId":433578,"journal":{"name":"Proceedings of the 10th Augmented Human International Conference 2019","volume":"89 11","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-03-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131957734","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Estimation of Fingertip Contact Force by Measuring Skin Deformation and Posture with Photo-reflective Sensors","authors":"Ayane Saito, W. Kuno, Wataru Kawai, N. Miyata, Yuta Sugiura","doi":"10.1145/3311823.3311824","DOIUrl":"https://doi.org/10.1145/3311823.3311824","url":null,"abstract":"A wearable device for measuring skin deformation of the fingertip---to obtain contact force when the finger touches an object---was prototyped and experimentally evaluated. The device is attached to the fingertip and uses multiple photo-reflective sensors (PRSs) to measures the distance from the PRSs to the side surface of the fingertip. The sensors do not touch the contact surface between the fingertip and the object; as a result, the contact force is obtained without changing the user's tactile sensation. In addition, the accuracy of estimated contact force was improved by determining the posture of the fingertip by measuring the distance between the fingertip and the contact surface. Based on the prototyped device, a system for estimating three-dimensional contact force on the fingertip was implemented.","PeriodicalId":433578,"journal":{"name":"Proceedings of the 10th Augmented Human International Conference 2019","volume":"39 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-03-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133641673","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Augmented taste of wine by artificial climate room: Influence of temperature and humidity on taste evaluation","authors":"Toshiharu Igarashi, Tatsuya Minagawa, Yoichi Ochiai","doi":"10.1145/3311823.3311871","DOIUrl":"https://doi.org/10.1145/3311823.3311871","url":null,"abstract":"In previous research, there is a augmenting device limited taste influences due to limited contact with utensils. However, in the situation such as enjoying wine while talking with other people and matching cheese with wine, the solution that limits human behaviors must not have been acceptable. So, we focused on changing the temperature and humidity when drinking wine. To study the influence of temperature and humidity on the ingredients and subjective taste of wine, we conducted wine tasting experiments with 16 subjects using an artificial climate room. For the environmental settings, three conditions, i.e., a room temperature of 14°C and humidity of 35%, 17°C and 40% humidity, and 26°C and 40% humidity, were evaluated. In one of the two wines used in the experiment, significant differences in [Color intensity], [Smell development] and [Body] were detected among conditions (p < 0.05). We further investigated changes in the components of the two wines at different temperature conditions (14°C, 17°C, 23°C, and 26°C). Malic acid, protocatechuic acid, gallic acid, and epicatechin were related to temperature in the former wine only. In conclusion, we confirmed that we can change the taste evaluation of wine by adjusting temperature and humidity using the artificial climate room, without attaching the device to human beings themselves. 
This suggests the possibility to serve wine in a more optimal environment if we can identify the type of wine and person's preference.","PeriodicalId":433578,"journal":{"name":"Proceedings of the 10th Augmented Human International Conference 2019","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-03-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114483990","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Let Your World Open: CAVE-based Visualization Methods of Public Virtual Reality towards a Shareable VR Experience","authors":"Akira Ishii, M. Tsuruta, Ippei Suzuki, Shuta Nakamae, Junichi Suzuki, Yoichi Ochiai","doi":"10.1145/3311823.3311860","DOIUrl":"https://doi.org/10.1145/3311823.3311860","url":null,"abstract":"Virtual reality (VR) games are currently becoming part of the public-space entertainment (e.g., VR amusement parks). Therefore, VR games should be attractive for players, as well as for bystanders. Current VR systems are still mostly focused on enhancing the experience of the head-mounted display (HMD) users; thus, bystanders without an HMD cannot enjoy the experience together with the HMD users. We propose the \"ReverseCAVE\": a proof-of-concept prototype for public VR visualization using CAVE-based projection with translucent screens for bystanders toward a shareable VR experience. The screens surround the HMD user and the VR environment is projected onto the screens. This enables the bystanders to see the HMD user and the VR environment simultaneously. We designed and implemented the ReverseCAVE, and evaluated it in terms of the degree of attention, attractiveness, enjoyment, and shareability, assuming that it is used in a public space. 
Thus, we can make the VR world more accessible and enhance the public VR experience of the bystanders via the ReverseCAVE.","PeriodicalId":433578,"journal":{"name":"Proceedings of the 10th Augmented Human International Conference 2019","volume":"10 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-03-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114530293","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"BitoBody","authors":"Erwin Wu, Mistki Piekenbrock, Hideki Koike","doi":"10.1145/3311823.3311855","DOIUrl":"https://doi.org/10.1145/3311823.3311855","url":null,"abstract":"In this research, we propose a novel human body contact detection and projection system with dynamic mesh collider. We use motion capture camera and generated human 3D models to detect the contact between user's bodies. Since it is difficult to update human mesh collider every frame, a special algorithm that divides body meshes into small pieces of polygons to do collision detection is developed and detected hit information will be dynamically projected according to its magnitude of damage. The maximum deviation of damage projection is about 7.9cm under a 240-fps optitrack motion capture system and 12.0cm under a 30-fps Kinect camera. The proposed system can be used in various sports where bodies come in contact and it allows the audience and players to understand the context easier.","PeriodicalId":433578,"journal":{"name":"Proceedings of the 10th Augmented Human International Conference 2019","volume":"117 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-03-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117279563","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Investigating Universal Appliance Control through Wearable Augmented Reality","authors":"Vincent Becker, Felix Rauchenstein, Gábor Sörös","doi":"10.1145/3311823.3311853","DOIUrl":"https://doi.org/10.1145/3311823.3311853","url":null,"abstract":"The number of interconnected devices around us is constantly growing. However, it may become challenging to control all these devices when control interfaces are distributed over mechanical elements, apps, and configuration webpages. We investigate interaction methods for smart devices in augmented reality. The physical objects are augmented with interaction widgets, which are generated on demand and represent the connected devices along with their adjustable parameters. For example, a loudspeaker can be overlaid with a controller widget for its volume. We explore three ways of manipulating the virtual widgets: (a) in-air finger pinching and sliding, (b) whole arm gestures rotating and waving, (c) incorporating physical objects in the surrounding and mapping their movements to the interaction primitives. We compare these methods in a user study with 25 participants and find significant differences in the preference of the users, the speed of executing commands, and the granularity of the type of control.","PeriodicalId":433578,"journal":{"name":"Proceedings of the 10th Augmented Human International Conference 2019","volume":"515 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-03-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116210188","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Design of Enhanced Flashcards for Second Language Vocabulary Learning with Emotional Binaural Narration","authors":"S. Fukushima","doi":"10.1145/3311823.3311867","DOIUrl":"https://doi.org/10.1145/3311823.3311867","url":null,"abstract":"In this paper, we report on the design of a flashcard application with which learners experience the meaning of written words with emotional binaural voice narrations to enhance second language vocabulary learning. Typically, voice used in English vocabulary learning is recorded by a native speaker with no accent, and it aims for accurate pronunciation and clarity. However, the voice can also be flat and monotonous, and it can be difficult for learners to retain the new vocabulary in the semantic memory. Enhancing textual flashcards with emotional narration in the learner's native language helps the retention of new second language vocabulary items in the episodic memory instead of the semantic memory. Further, greater emotionality in the narration reinforces the retention of episodic memory.","PeriodicalId":433578,"journal":{"name":"Proceedings of the 10th Augmented Human International Conference 2019","volume":"24 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-03-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122021619","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"MusiArm","authors":"Kaito Hatakeyama, M. Y. Saraiji, K. Minamizawa","doi":"10.1145/3311823.3311873","DOIUrl":"https://doi.org/10.1145/3311823.3311873","url":null,"abstract":"The emergence of prosthetic limbs where solely focused on substituting the missing limb with an artificial one, in order for the handicap people to manage their daily life independently. Past research on prosthetic hands has mainly focused on prosthesis' function and performance. Few proposals focused on the entertainment aspect of prosthetic hands. In this research, we considered the defective part as a potential margin for freely designing our bodies, and coming up with new use cases beyond the original function of the limb. Thus, we are not aiming to create anthropomorphic designs or functions of the limbs. By fusing the prosthetic hands and musical instruments, we propose a new prosthetic hand called \"MusiArm\" that extends the body part's function to become an instrument. MusiArm concept was developed through the dialogue between the handicapped people, engineers and prosthetists using the physical characteristics of the handicapped people as a \"new value\" that only the handicapped person can possess. We asked handicapped people who cannot play musical instruments, as well as people who do not usually play instruments, to use prototypes we made. 
As a result of the usability tests, using MusiArm, we made a part of the body function as a musical instrument, drawing out the unique expression methods of individuals, and enjoying the performance and clarify the possibility of showing interests.","PeriodicalId":433578,"journal":{"name":"Proceedings of the 10th Augmented Human International Conference 2019","volume":"111 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-03-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122839215","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Brain Computer Interface for Neuro-rehabilitation With Deep Learning Classification and Virtual Reality Feedback","authors":"Tamás Karácsony, J. P. Hansen, H. Iversen, S. Puthusserypady","doi":"10.1145/3311823.3311864","DOIUrl":"https://doi.org/10.1145/3311823.3311864","url":null,"abstract":"Though Motor Imagery (MI) stroke rehabilitation effectively promotes neural reorganization, current therapeutic methods are immeasurable and their repetitiveness can be demotivating. In this work, a real-time electroencephalogram (EEG) based MI-BCI (Brain Computer Interface) system with a virtual reality (VR) game as a motivational feedback has been developed for stroke rehabilitation. If the subject successfully hits one of the targets, it explodes and thus providing feedback on a successfully imagined and virtually executed movement of hands or feet. Novel classification algorithms with deep learning (DL) and convolutional neural network (CNN) architecture with a unique trial onset detection technique was used. Our classifiers performed better than the previous architectures on datasets from PhysioNet offline database. It provided fine classification in the real-time game setting using a 0.5 second 16 channel input for the CNN architectures. Ten participants reported the training to be interesting, fun and immersive. \"It is a bit weird, because it feels like it would be my hands\", was one of the comments from a test person. The VR system induced a slight discomfort and a moderate effort for MI activations was reported. 
We conclude that MI-BCI-VR systems with classifiers based on DL for real-time game applications should be considered for motivating MI stroke rehabilitation.","PeriodicalId":433578,"journal":{"name":"Proceedings of the 10th Augmented Human International Conference 2019","volume":"5 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-03-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123957810","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}