{"title":"TactileGlove","authors":"Sebastian Günther, Florian Müller, Markus Funk, Jan Kirchner, Niloofar Dezfuli, Max Mühlhäuser","doi":"10.1145/3197768.3197785","DOIUrl":"https://doi.org/10.1145/3197768.3197785","url":null,"abstract":"With recent advances in computing technology, more and more environments are becoming interactive. Traditionally, 2D input and output elements are used to interact with these environments. Recently, however, interaction spaces have expanded into 3D, which enables new possibilities but also creates challenges in assisting users with interaction in 3D space. Usually, the challenge of communicating 3D positions is solved visually. This paper explores a different approach: spatial guidance through vibrotactile instructions. We introduce TactileGlove, a smart glove equipped with vibrotactile actuators that provides spatial guidance in 3D space. We contribute a user study with 15 participants exploring how the number of actuators and the choice of metaphor affect user performance. We found that our participants preferred a Pull metaphor for vibrotactile navigation instructions, and that a higher number of actuators reduces target acquisition time compared to a lower number.","PeriodicalId":130190,"journal":{"name":"Proceedings of the 11th PErvasive Technologies Related to Assistive Environments Conference","volume":"6 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-06-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114641688","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The EU General Data Protection Regulation and its Effects on Designing Assistive Environments","authors":"E. Krempel, J. Beyerer","doi":"10.1145/3197768.3201567","DOIUrl":"https://doi.org/10.1145/3197768.3201567","url":null,"abstract":"On the 25th of May 2018, the EU will start to enforce the General Data Protection Regulation (EU-GDPR)[3]. This new regulation will replace the old Data Protection Act from 1998 and will disrupt common data processing practices. While the new regulation will make it easier to develop systems that comply with data protection laws all over Europe, it will change the way we design technology. With data protection becoming a much more important factor and huge fines for data protection violations, technology vendors will demand systems where data protection was already considered during development. This will force the research community to broaden its perspective and consider how to develop and design systems in a way that complies with data protection. This paper focuses on some of the most important parts of the GDPR for Assistive Environments. Reading the paper will not solve all your privacy-related challenges, but it will help you to know which questions to ask.","PeriodicalId":130190,"journal":{"name":"Proceedings of the 11th PErvasive Technologies Related to Assistive Environments Conference","volume":"39 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-06-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114731338","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An Analysis of Language Impact on Augmented Reality Order Picking Training","authors":"Nela Murauer, Florian Müller, Sebastian Günther, Dominik Schön, Nerina Pflanz, Markus Funk","doi":"10.1145/3197768.3201570","DOIUrl":"https://doi.org/10.1145/3197768.3201570","url":null,"abstract":"Order picking is a difficult and cognitively demanding task. Traditionally, textual instructions help new workers learn different picking routines. However, these textual instructions are sometimes not written in the workers' native languages. In the era of Industry 4.0, where digital functions are finding their way into manufacturing processes, language-independent instructions become possible. Through a user study with 15 participants, we compare textual feedback in the workers' native language, textual feedback written in an unknown foreign language, and visual Augmented Reality (AR) feedback. We found that AR feedback is significantly faster and leads to a lower perceived cognitive load.","PeriodicalId":130190,"journal":{"name":"Proceedings of the 11th PErvasive Technologies Related to Assistive Environments Conference","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-06-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116227538","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"PISS-IoT: Person Identification and Spotting System in an Internet-of-Things Way","authors":"Harish Ram Nambiappan, S. Datta","doi":"10.1145/3197768.3197781","DOIUrl":"https://doi.org/10.1145/3197768.3197781","url":null,"abstract":"Missing persons cases have become one of the most serious issues in many countries around the world. They are a source of worry not only for the relatives of the missing but also for the law enforcement officials who must search for and find them. With this problem in mind, we have created a person identification and spotting system based on the Internet-of-Things concept. Our system uses Android mobile phone cameras and a server running LBPH facial recognition along with an email notification function. All components are interconnected through the Amazon AWS cloud in an Internet-of-Things fashion. This is a unique, novel, and scalable approach compared to previous works: instead of performing face detection and recognition on a single device, our system interconnects face-detecting cameras with a face recognition and verification system running elsewhere. In addition, our system includes an automatic email notification component connected to the identification system through the cloud, which has not been realized in previous works. Experiments were conducted in real time with real subjects at various locations around the university campus. The system performed well: the average time from face detection to notifying the user of the spotted location ranged from 30 seconds to 1 minute and 40 seconds, with accuracy ranging from 92 to 96 percent.","PeriodicalId":130190,"journal":{"name":"Proceedings of the 11th PErvasive Technologies Related to Assistive Environments Conference","volume":"135 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-06-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129478340","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Basic functionality of a prototype wearable assistive soft exoskeleton for people with gait impairments: a case study","authors":"E. Graf, C. Bauer, V. Power, A. D. Eyto, E. Bottenberg, Tommaso Poliero, M. Sposito, D. Scherly, René Henke, C. Pauli, L. Erkens, G. Brinks, L. O’Sullivan, M. Wirz, K. S. Stadler, J. Ortiz","doi":"10.1145/3197768.3197779","DOIUrl":"https://doi.org/10.1145/3197768.3197779","url":null,"abstract":"XoSoft is a soft modular wearable assistive exoskeleton for people with mild to moderate gait impairments. It is currently being developed by a European consortium (www.xosoft.eu) and aims to provide tailored and active lower limb support during ambulation. During development, user-centered design principles were followed in parallel with the aim of providing functional support during gait. A prototype was developed and tested for practicability, usability, comfort and assistive function (summarized as basic functionality) with a potential end user. The prototype consisted of a garment, electromagnetic clutch-controlled elastic bands supporting knee and hip flexion, and a backpack containing the sensor and actuator control of the system. The participant had experienced a stroke and presented with unilateral impairment of the lower and upper extremities. In testing, he donned and doffed the prototype independently as far as possible, and performed walking trials with the system in both active (powered on) and passive (powered off) modes. Afterwards, the participant rated the perceived pressure and various elements of usability. Results highlighted aspects of the system to improve during future phases of XoSoft development, and also identified useful aspects of the prototype design to be maintained. The basic functionality of XoSoft was considered satisfactory, given that this was the first version of a working prototype. The study highlights the benefits of this participatory evaluation design approach in assistive soft robotics development.","PeriodicalId":130190,"journal":{"name":"Proceedings of the 11th PErvasive Technologies Related to Assistive Environments Conference","volume":"92 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131662366","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Towards Deep Learning based Hand Keypoints Detection for Rapid Sequential Movements from RGB Images","authors":"Srujana Gattupalli, Ashwin Ramesh Babu, J. Brady, F. Makedon, V. Athitsos","doi":"10.1145/3197768.3201538","DOIUrl":"https://doi.org/10.1145/3197768.3201538","url":null,"abstract":"Hand keypoint detection and pose estimation have numerous applications in computer vision, but remain an unsolved problem in many respects. One application of hand keypoint detection is performing cognitive assessments of a subject by observing that subject's performance in physical tasks involving rapid finger motion. As part of this work, we introduce a novel hand keypoints benchmark dataset consisting of hand gestures recorded specifically for cognitive behavior monitoring. We explore state-of-the-art methods in hand keypoint detection and provide quantitative evaluations of their performance on our dataset. In the future, these results and our dataset can serve as a useful benchmark for hand keypoint recognition for rapid finger movements.","PeriodicalId":130190,"journal":{"name":"Proceedings of the 11th PErvasive Technologies Related to Assistive Environments Conference","volume":"509 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-04-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115893001","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}