{"title":"Brain Response to Focal Vibro-Tactile Stimulation Prior to Muscle Contraction","authors":"T. Jevtic, A. Zivanovic, R. Loureiro","doi":"10.1109/IE.2016.40","DOIUrl":"https://doi.org/10.1109/IE.2016.40","url":null,"abstract":"This paper presents a single case study from an on-going study evaluating cortical association with the facilitation and management of vibro-tactile stimulation applied prior to voluntary muscle contraction. The study consisted of three repetitions of a relaxation phase, during which vibrations were applied, followed by a contraction phase. EEG and EMG data were collected to determine muscle and brain activation patterns. The EEG analysis of the mu waves during the relaxation-plus-vibration phase seems to indicate sensory cortex activation during focal muscle vibration. With repeated vibrations, an increase in maximal calculated mu power was observed, which could suggest optimization of the muscle fibers prior to contraction. When the contraction is performed, mu waves desynchronize with the movement execution. The analysis of the last relaxation period indicates that the muscle itself facilitates the last contraction locally, possibly due to cortical learning.","PeriodicalId":425456,"journal":{"name":"2016 12th International Conference on Intelligent Environments (IE)","volume":"19 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127455381","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Group Interaction through a Multi-modal Haptic Framework","authors":"Hoang H. Le, M. Loomes, R. Loureiro","doi":"10.1109/IE.2016.18","DOIUrl":"https://doi.org/10.1109/IE.2016.18","url":null,"abstract":"This paper introduces a new haptic-supported software framework that facilitates the setup of different types of group interaction. The framework consists of multiple open-source libraries supporting a range of external devices and services (e.g. Microsoft Kinect, cameras, Arduino controllers, sensors, AR tracking, and remote haptic interaction). To date, three pilot studies have been conducted to test the framework against existing benchmarks from the literature. The benchmarking studies have shown the flexibility and stability of the framework for devising interactive tasks in different social environments, and indicate that the framework has the potential to be applied in the field of socially assistive robotics.","PeriodicalId":425456,"journal":{"name":"2016 12th International Conference on Intelligent Environments (IE)","volume":"17 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130375175","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Qualitative Image Descriptor QIDL+ Applied to Ambient Intelligent Systems","authors":"Zoe Falomir","doi":"10.1109/IE.2016.11","DOIUrl":"https://doi.org/10.1109/IE.2016.11","url":null,"abstract":"A model for obtaining logic descriptions of real digital images in office scenarios is presented in this paper. The QIDL+ descriptor uses qualitative features of shape, colour, topology and location, and it also includes a size feature, which shows its effectiveness in scenes where the points of view of the observer and the camera coincide. QIDL+ is aimed at describing the location of target objects with respect to known or unknown objects in a scene. Known objects in the scene are identified by object detectors, while other object categories are inferred from their qualitative features. Moreover, the logic description provided is shown to be useful for reasoning about spatial locations, and the qualitative features obtained can be included in a narrative description to enhance human-machine interaction. Tests carried out on office desktop scenarios illustrate the performance of the approach, with promising results.","PeriodicalId":425456,"journal":{"name":"2016 12th International Conference on Intelligent Environments (IE)","volume":"17 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121333726","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Mobile Computing with Near Field Communication (NFC) Smart Objects Applied to Archaeological Research","authors":"A. López, Gloria Fernandez, Francisco Burillo","doi":"10.1109/IE.2016.22","DOIUrl":"https://doi.org/10.1109/IE.2016.22","url":null,"abstract":"In this work, a mobile system designed to manage information at an excavation site is presented. A group of wireless devices connected to the telephone network sends the data collected by different users to a common database. At the same time, every archaeological find is integrated into the information system when it is tagged with an NFC transponder that can communicate with mobile devices equipped with an NFC interface. In this way, the identification code attached to the element, together with the information recorded in the memory of the NFC tag, determines which system services start automatically, which tasks are requested of the user, and which actions must be prevented, or at least flagged as undesirable, throughout the archaeological research process. This technique improves archaeological work in several ways. First, it speeds up the process of collecting, saving, updating and duplicating the data associated with every piece of material. Second, smart NFC objects help researchers make decisions about the next steps in the research, because they store the information that determines the needs of every artifact. Third, it reduces human error in transcribing information or in performing incorrect actions with a specific material. This work extends the use of NFC technology and mobile systems to a new field: archaeological research.","PeriodicalId":425456,"journal":{"name":"2016 12th International Conference on Intelligent Environments (IE)","volume":"23 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123022195","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Ubidoo: Embedded Multi-room, Multi-user Tracking","authors":"Benjamin Schmitz, Sven Stamm, Julia Brich","doi":"10.1109/IE.2016.51","DOIUrl":"https://doi.org/10.1109/IE.2016.51","url":null,"abstract":"We present Ubidoo, a home automation system that is able to fuse information from a variety of sensors and use logical inference to trigger appropriate actions based on observed events. The prototype is capable of tracking and identifying multiple users within a domestic setting, resolving the ambiguous situations that occur when users cross paths. The system is applied to keep track of multimedia playback on a user's device, with the capability of rerouting it to the user's current location.","PeriodicalId":425456,"journal":{"name":"2016 12th International Conference on Intelligent Environments (IE)","volume":"24 9","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121018063","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Combining Persuasive Computing and User Centered Design into an Energy Awareness System for Smart Houses","authors":"Thomas Vilarinho, B. Farshchian, Leendert W. M. Wienhofen, Thomas Franang, H. Gulbrandsen","doi":"10.1109/IE.2016.14","DOIUrl":"https://doi.org/10.1109/IE.2016.14","url":null,"abstract":"The environmental impact of fossil fuel usage, together with its limited supply, has been pushing governments, industries and people to seek cleaner and renewable energy supplies and to adopt more sustainable energy habits. While acquiring a photovoltaic (PV) solar panel is a big step in that direction, much can also be accomplished through changes in individual and collective energy consumption behavior; moreover, both strategies can be used together. This paper presents a software prototype capable of increasing awareness of energy consumption in a smart house and supporting a behavior change towards greener consumption habits. The software was developed following the design science research methodology, anchored by the application of User Centered Design and the theories of persuasive computing. The User Interface (UI) was developed over several iterations with both end-users and experts. In this work, we focus on the UI elements created to apply concepts from different behavior change support methods and theories, such as feedback, gamification and social norms, to energy savings and efficiency. The result was well received in an expert evaluation and will now be trialed in two different neighborhoods.","PeriodicalId":425456,"journal":{"name":"2016 12th International Conference on Intelligent Environments (IE)","volume":"3 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114982017","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Smart Surface: RFID-Based Gesture Recognition Using k-Means Algorithm","authors":"Raúl Parada, Prof. Dr. Kamruddin Nur, J. Melià-Seguí, R. Pous","doi":"10.1109/IE.2016.25","DOIUrl":"https://doi.org/10.1109/IE.2016.25","url":null,"abstract":"Older adults may need assistance to perform common activities, such as changing channels on the television with a remote control (e.g. due to hand mobility problems). The Internet of Things (IoT), including Radio Frequency Identification (RFID), interconnects devices to provide a greater variety of services. Together, and by applying intelligence through Machine Learning (ML) techniques, advanced applications can be implemented that improve people's lives. We present the Smart Surface system, which relies on state-of-the-art RFID equipment. It uses the unsupervised machine learning technique k-means clustering to detect simple gestures and trigger actions, in real time and in a non-intrusive way. We implemented and evaluated a prototype of the Smart Surface system, achieving 100% gesture recognition accuracy.","PeriodicalId":425456,"journal":{"name":"2016 12th International Conference on Intelligent Environments (IE)","volume":"310 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125763500","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Virtual/Mixed/Augmented Reality Laboratory Research for the Study of Augmented Human and Human-Machine Systems","authors":"K. Helin, J. Karjalainen, T. Kuula, Nicolas Philippon","doi":"10.1109/IE.2016.35","DOIUrl":"https://doi.org/10.1109/IE.2016.35","url":null,"abstract":"In this work we introduce a new research concept called Augmented Human and consider how it may benefit from prior research efforts. In essence, the paper describes the long research history of a specific Virtual/Mixed/Augmented Reality (VR/MR/AR) laboratory and reflects on how it may be employed as a premise for Augmented Human research and the design of new human-machine systems. The paper briefly describes how the laboratory's foundational research has already been employed for some years in more than a hundred company cases, which have exploited participatory design and human-centered design for human-machine system development. The paper further describes how the latest cases have moved closer to the human boundary level and are thus oriented towards Augmented Human research. This fifth-generation VR/MR/AR/AH laboratory has taken the form of an open cave-like environment with a motion platform, 3D sound, haptics, VR/AR head-mounted displays and physical objects.","PeriodicalId":425456,"journal":{"name":"2016 12th International Conference on Intelligent Environments (IE)","volume":"183 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127913924","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Human Gaze and Focus-of-Attention in Dual Reality Human-Robot Collaboration","authors":"M. Moniri, Fabio Andres Espinosa Valcarcel, Dieter Merkel, Daniel Sonntag","doi":"10.1109/IE.2016.54","DOIUrl":"https://doi.org/10.1109/IE.2016.54","url":null,"abstract":"Human gaze is an important indicator of the direction of visual focus-of-attention. This information can be very useful in human-robot interaction scenarios. This paper describes a research prototype that utilizes the user's visual attention in a collaborative dual reality environment. We add an additional dimension to existing human-robot interaction scenarios and describe a human-robot collaboration scenario in which the involved human participants are in two different physical locations. One user shares the same physical action space as the robot, while the second user monitors the setup and thereby collaborates through a virtual reality system. The proposed research prototype monitors the user's visual attention in both the real and virtual environments. The prototype also provides information in both the virtual and real environments, resulting in a dual reality collaboration scenario. As a result, new human-robot interactions brought about by Industrie 4.0, with novel forms of collaborative factory work, can be constructed.","PeriodicalId":425456,"journal":{"name":"2016 12th International Conference on Intelligent Environments (IE)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130765136","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Affect-Aware Intelligent Environment Using Musical Cues as an Emotion Learning Framework","authors":"D. Saha, B. Bortz, Wei Huang, Thomas L. Martin, R. B. Knapp","doi":"10.1109/IE.2016.39","DOIUrl":"https://doi.org/10.1109/IE.2016.39","url":null,"abstract":"This position paper posits the use of an individual's affective response to musical cues as a means of designing an implicit communication channel between the user and their immediate computing infrastructure in the form of an intelligent environment. Interaction design for a sensor-rich intelligent environment is a challenging problem, often because such pervasive systems are dynamic, with no fixed set of interaction devices or users. However, knowledge of a user's affective responses to known musical cues may provide a learning framework for inferring affective states such as stress or frustration. This, in turn, may be used by the intelligent environment to assess the user's (dis)approval of the services it provides, helping it to refine its services to better suit the user's immediate needs.","PeriodicalId":425456,"journal":{"name":"2016 12th International Conference on Intelligent Environments (IE)","volume":"16 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134570132","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}