{"title":"Session details: NOTION: Human Behaviour Monitoring, Interpretation and Understanding Workshop (1ST Session)","authors":"Ahmad Lotfi","doi":"10.1145/3258053","DOIUrl":"https://doi.org/10.1145/3258053","url":null,"abstract":"","PeriodicalId":359504,"journal":{"name":"Proceedings of the 9th ACM International Conference on PErvasive Technologies Related to Assistive Environments","volume":"19 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-06-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131720898","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"ADAMAAS: Towards Smart Glasses for Mobile and Personalized Action Assistance","authors":"K. Essig, Benjamin Strenge, T. Schack","doi":"10.1145/2910674.2910727","DOIUrl":"https://doi.org/10.1145/2910674.2910727","url":null,"abstract":"In this paper, we describe the assistive system ADAMAAS (Adaptive and Mobile Action Assistance) introducing a new advanced smartglasses technology. The aim of ADAMAAS is to move from stationary status diagnosis systems to a mobile and adaptive action support and monitoring system, which is able to dynamically react in a context-sensitive way to human error (slips and mistakes) and to provide individualized feedback on a transparent virtual plane superimposed on user's field of view. For this purpose ADAMAAS uses advanced technologies like augmented reality (AR), eye tracking, object recognition, and systematic analysis of users' mental representations in long-term memory. Preliminary user tests with disabled participants at an early prototype stage revealed no substantial physical restrictions in the execution of their activities, positive feedback regarding the assistive hints, and that participants could imagine wearing the glasses for long periods of time.","PeriodicalId":359504,"journal":{"name":"Proceedings of the 9th ACM International Conference on PErvasive Technologies Related to Assistive Environments","volume":"103 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-06-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131704929","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Usability of EEG Systems: User Experience Study","authors":"K. Izdebski, A. S. Oliveira, Bryan R. Schlink, Petr Legkov, Silke Kärcher, W. Hairston, Daniel P. Ferris, P. König","doi":"10.1145/2910674.2910714","DOIUrl":"https://doi.org/10.1145/2910674.2910714","url":null,"abstract":"In recent years there was a change in EEG experimental designs - from simple behavior in the lab to complex behavior outside. That change required also an adjustment of EEG systems -- from being static and sensitive to mobile and noise-resistant. The rapid technological development has to balance performance (e.g. number of channels, low impedance contact) with usability (e.g. comfort for the participant, contact pressure, wet/dry electrodes) and mobility (e.g. wiring, weight). This has led to wide variety of designs which differ widely in properties. Here we compare 7 EEG systems with respect to the participant's user experience. Results demonstrate that from perspective of user experience of participants, mobile wet system (Cwet) had the highest score.","PeriodicalId":359504,"journal":{"name":"Proceedings of the 9th ACM International Conference on PErvasive Technologies Related to Assistive Environments","volume":"11 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-06-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117195763","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Enhancing Memory Retention by Increasing Alpha and Decreasing Beta Brainwaves using Music","authors":"Tasnim Makada, Danyal Ozair, Mudassir Mohammed, Cheryl Abellanoza","doi":"10.1145/2910674.2935851","DOIUrl":"https://doi.org/10.1145/2910674.2935851","url":null,"abstract":"Brain waves can aptly define the state of a person's mind. High activity and attention lead to dominant beta waves while relaxation and focus lead to dominant alpha waves in the brain. Alpha state of mind is ideal for learning and memory retention. In our experiment we aim to increase alpha waves and decrease beta waves in a person with the help of music to measure improvement in memory retention. Our hypothesis is that, when a person listens to music which causes relaxation, he is more likely to attain the alpha state of mind and enhance his memory retention ability. To verify this hypothesis, we conducted an experiment on 5 participants. The participants were asked to take a similar quiz twice, under different states of mind. During the experimentation process, the brain activity of the participants was recorded and analyzed using MUSE, an off-the-shelf device for brainwave capturing and analysis.","PeriodicalId":359504,"journal":{"name":"Proceedings of the 9th ACM International Conference on PErvasive Technologies Related to Assistive Environments","volume":"90 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-06-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114401321","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Fusing active orientation models and mid-term audio features for automatic depression estimation","authors":"C. Smailis, N. Sarafianos, Theodoros Giannakopoulos, S. Perantonis","doi":"10.1145/2910674.2935856","DOIUrl":"https://doi.org/10.1145/2910674.2935856","url":null,"abstract":"In this paper, we predict a human's depression level in the BDI-II scale, using facial and voice features. Active orientation models (AOM) and several voice features were extracted from the video and audio modalities. Long-term and mid-term features were computed and a fusion is performed in the feature space. Videos from the Depression Recognition Sub-Challenge of the 2014 Audio-Visual Emotion Challenge and Workshop (AVEC 2014) were used and support vector regression models were trained to predict the depression level. We demonstrated that the fusion of AOMs with audio features leads to better performance compared to individual modalities. The obtained regression results indicate the robustness of the proposed technique, under different settings, as well as an RMSE improvement compared to the AVEC 2014 video baseline.","PeriodicalId":359504,"journal":{"name":"Proceedings of the 9th ACM International Conference on PErvasive Technologies Related to Assistive Environments","volume":"14 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-06-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124107024","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Video Based Fall Detection with Enhanced Motion History Images","authors":"Suad Albawendi, Kofi Appiah, Heather Powell, Ahmad Lotfi","doi":"10.1145/2910674.2935832","DOIUrl":"https://doi.org/10.1145/2910674.2935832","url":null,"abstract":"Computer vision systems offer a new promising solution which can help older people stay at home by providing a secure environment and improve their quality of life. One application area of video surveillance is to analyse human behaviour and detect unusual behaviour. Falls are one of the greatest risks for the elderly living at home. This paper presents a novel approach for detecting falls, based on a combination of motion information and human shape variation. The motion information of a segmented silhouette, when extracted can provide a useful cue for classifying different behaviours. Also, the variation in human shape can used to establish the pose and hence fall events. The approach presented here extracts motion information, use variation in shape and in addition use best-fit approximated ellipse around the human body to further improved the accuracy of falls detection. Result of our approach demonstrates a 20% improvement over motion information only implementations.","PeriodicalId":359504,"journal":{"name":"Proceedings of the 9th ACM International Conference on PErvasive Technologies Related to Assistive Environments","volume":"6 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-06-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129527662","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Real Time Earthquake's Survivor Detection using a Miniaturized LWIR Camera","authors":"P. Agrafiotis, A. Doulamis, G. Athanasiou, A. Amditis","doi":"10.1145/2910674.2935864","DOIUrl":"https://doi.org/10.1145/2910674.2935864","url":null,"abstract":"In this paper a system suitable to perform precise and fast earthquake's survivor detection using a miniaturized Long Wave Infrared (LWIR) camera is described. Main challenge of this work is the detection environment which may be characterized by smoke, dust, rubble and extremely narrow spaces as well as the extremely low resolution of the continuous moving LWIR camera. To this direction the thermal information received by the LWIR camera is exploited. In addition, research is carried out in order to implement feature descriptors for detecting only parts of partly occluded people (arms, etc.) in order to reduce false positive ratio. The proposed system achieves real time earthquake's survivor detection using a miniaturized LWIR camera. The results have been tested and evaluated in real life conditions using two different LWIR cameras for proving the robustness and the accuracy of the developed system.","PeriodicalId":359504,"journal":{"name":"Proceedings of the 9th ACM International Conference on PErvasive Technologies Related to Assistive Environments","volume":"4 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-06-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132428712","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Motion and Force Analysis System for Human Upper-limb Exercises","authors":"Michail Theofanidis, Alexandros Lioulemes, F. Makedon","doi":"10.1145/2910674.2910698","DOIUrl":"https://doi.org/10.1145/2910674.2910698","url":null,"abstract":"This paper describes a novel system that can demonstrate the potential to track and estimate the torques that affect the human arm of an individual that performs rehabilitation exercises with the use of Kinect v2. The system focuses on eliminating the jerky motions captured by the Kinect with the incorporation of robotic mechanics methodologies that have been applied in the field of robotic mechanical design. In order to achieve this results, the system takes full advantage of the dynamic and kinematic formulas that describe the motion rigid bodies. Lastly, a simulation experiment is depicted to demonstrate the results of the system.","PeriodicalId":359504,"journal":{"name":"Proceedings of the 9th ACM International Conference on PErvasive Technologies Related to Assistive Environments","volume":"9 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-06-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133355781","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Session details: Multimodal Interfaces and Human-Computer Interaction","authors":"Oliver Korn","doi":"10.1145/3258056","DOIUrl":"https://doi.org/10.1145/3258056","url":null,"abstract":"","PeriodicalId":359504,"journal":{"name":"Proceedings of the 9th ACM International Conference on PErvasive Technologies Related to Assistive Environments","volume":"135 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-06-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115162891","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Brain-EE: Brain Enjoyment Evaluation using Commercial EEG Headband","authors":"M. Abujelala, Cheryl Abellanoza, Aayush Sharma, F. Makedon","doi":"10.1145/2910674.2910691","DOIUrl":"https://doi.org/10.1145/2910674.2910691","url":null,"abstract":"Previous studies that involve measuring EEG, or electroencephalograms, have mainly been experimentally-driven projects; for instance, EEG has long been used in research to help identify and elucidate our understanding of many neuroscientific, cognitive, and clinical issues (e.g., sleep, seizures, memory). However, advances in technology have made EEG more accessible to the population. This opens up lines for EEG to provide more information about brain activity in everyday life, rather than in a laboratory setting. To take advantage of the technological advances that have allowed for this, we introduce the Brain-EE system, a method for evaluating user engaged enjoyment that uses a commercially available EEG tool (Muse). During testing, fifteen participants engaged in two tasks (playing two different video games via tablet), and their EEG data were recorded. The Brain-EE system supported much of the previous literature on enjoyment; increases in frontal theta activity strongly and reliably predicted which game each individual participant preferred. We hope to develop the Brain-EE system further in order to contribute to a wide variety of applications (e.g., usability testing, clinical or experimental applications, evaluation methods, etc.).","PeriodicalId":359504,"journal":{"name":"Proceedings of the 9th ACM International Conference on PErvasive Technologies Related to Assistive Environments","volume":"31 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-06-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127544080","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}