{"title":"Bridging the Skills Gap of Workers in Industry 4.0 by Human Performance Augmentation Tools: Challenges and Roadmap","authors":"Eric Ras, Fridolin Wild, Christoph Stahl, Alexandre Baudet","doi":"10.1145/3056540.3076192","DOIUrl":"https://doi.org/10.1145/3056540.3076192","url":null,"abstract":"Industry 4.0 is a coordinated push for automation in Smart Factories and other Cyber-Physical Systems (CPS). The increasing complexity of frequently changing production environments challenges shop floor workers to perform well. The tasks they work on are getting less routine and ask for continuous knowledge and skills development. For example, the skills portfolio of workers likely requires improved higher-order thinking and decision-making skills. A wide range of research and development efforts already today sets focus on different areas of workplace learning, including performance appraisals, pedagogy and education, technology, and business economics. Bridging the skills gap, however, requires novel user-facing technologies -- such as Augmented Reality (AR) and wearables -- for human performance augmentation to improve efficiency and effectiveness of staff delivered through live guidance. AR branches out beyond mobile apps with 3D-object superimposition for marketing purposes to rather complex use cases delivered by a rapidly growing innovation ecosystem of hard- and software providers collaborating closely with R&D organisations. This paper provides a first shared vision on how AR can tackle four different challenges related to handling complexity in a CPS environment: develop intelligent assistance systems for learning and performance assessment at the workplace, adapt job profiles accordingly, and last but not least to address also the issue of work-life balance. The paper concludes with an outline of a research roadmap.","PeriodicalId":140232,"journal":{"name":"Proceedings of the 10th International Conference on PErvasive Technologies Related to Assistive Environments","volume":"6 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-06-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121407803","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Towards an Augmented Reality and Sensor-Based Assistive System for Assembly Tasks","authors":"J. Wolfartsberger, Jan Zenisek, Mathias Silmbroth, Christoph Sievi","doi":"10.1145/3056540.3064969","DOIUrl":"https://doi.org/10.1145/3056540.3064969","url":null,"abstract":"Assistive systems provide tools to increase workers' productivity for industrial assembly and maintenance tasks. Current research focusses on showing step-by-step instructions in the user's field-of-view using projections or smart glasses. In this paper we present our concept for an assistive system for assembly tasks, which makes use of augmented reality technologies in combination with sensor-based systems to provide context-sensitive feedback. By this means we create an assistive environment that provides information at the right time and furthermore checks the correctness of tasks performed. Our goal is to provide a sensor-based workstation for assembly and maintenance workers that speeds up work and reduces error rates.","PeriodicalId":140232,"journal":{"name":"Proceedings of the 10th International Conference on PErvasive Technologies Related to Assistive Environments","volume":"42 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-06-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121445853","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Specialized Interactive Data Application for EEG- Based Sleep Studies","authors":"G. Panagopoulos, C. Palmer","doi":"10.1145/3056540.3076179","DOIUrl":"https://doi.org/10.1145/3056540.3076179","url":null,"abstract":"In this paper we present an approach for multimodal visualization of a subject's EEG recordings, specialized for sleep studies. A web application was developed and a real test case from asleep study with a clinically-anxious child (age 7, male) were exploited in order to showcase the operation and use of the tool.","PeriodicalId":140232,"journal":{"name":"Proceedings of the 10th International Conference on PErvasive Technologies Related to Assistive Environments","volume":"55 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-06-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117298668","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The effects of projected versus display instructions on productivity, quality and workload in a simulated assembly task","authors":"T. Bosch, R. Könemann, H. Cock, G. V. Rhijn","doi":"10.1145/3056540.3076189","DOIUrl":"https://doi.org/10.1145/3056540.3076189","url":null,"abstract":"In this paper, we describe an experimental study that investigated the effects of electronic work instructions and AR based instructions (projected work instructions) on productivity, product quality and human operator workload. In a simulated assembly task, projected instructions on the work spot and operator guidance picking the correct component lead to significantly higher productivity and quality rates compared to instructions presented on a screen. Remarkably, at these higher levels of performance, the work load on the operator was not increased, but instead significantly decreased.","PeriodicalId":140232,"journal":{"name":"Proceedings of the 10th International Conference on PErvasive Technologies Related to Assistive Environments","volume":"28 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-06-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114668167","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Towards More Robust Automatic Facial Expression Recognition in Smart Environments","authors":"Arne Bernin, Larissa Müller, Sobin Ghose, K. Luck, C. Grecos, Qi Wang, Florian Vogt","doi":"10.1145/3056540.3056546","DOIUrl":"https://doi.org/10.1145/3056540.3056546","url":null,"abstract":"In this paper, we provide insights towards achieving more robust automatic facial expression recognition in smart environments based on our benchmark with three labeled facial expression databases. These databases are selected to test for desktop, 3D and smart environment application scenarios. This work is meant to provide a neutral comparison and guidelines for developers and researchers interested to integrate facial emotion recognition technologies in their applications, understand its limitations and adaptation as well as enhancement strategies. We also introduce and compare three different metrics for finding the primary expression in a time window of a displayed emotion. In addition, we outline facial emotion recognition limitations and enhancements for smart environments and non-frontal setups. By providing our comparison and enhancements we hope to build a bridge from affective computing research and solution providers to application developers that like to enhance new applications by including emotion based user modeling.","PeriodicalId":140232,"journal":{"name":"Proceedings of the 10th International Conference on PErvasive Technologies Related to Assistive Environments","volume":"18 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-06-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127269719","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Safety++: Designing IoT and Wearable Systems for Industrial Safety through a User Centered Design Approach","authors":"Guillermo Bernal, S. Colombo, Mohammed Al Ai Baky, F. Casalegno","doi":"10.1145/3056540.3056557","DOIUrl":"https://doi.org/10.1145/3056540.3056557","url":null,"abstract":"In this paper, we describe Safety++, an IoT ecosystem of connected wearable elements aimed at improving safety in the workplace in the energy industry. Safety is a major concern for energy companies and, despite the large availability of protective equipment and the adoption of strict safety procedures, it still remains a problem. Unsafe behavior is the main cause of incidents and is not addressed by current solutions. We exploit advancements in IoT and wearable computing to design, build and test a platform aimed at improving awareness, peer supervision and emergency fast response in the workplace. We do this by following a user centered design approach, which takes into consideration the user needs and actual experience, in order to increase also the solution's acceptability and adoption. Results demonstrate that focusing on real time feedback, awareness and peer communication in IoT systems can reinforce safe practices and attitudes and lead to a safer environment in energy companies.","PeriodicalId":140232,"journal":{"name":"Proceedings of the 10th International Conference on PErvasive Technologies Related to Assistive Environments","volume":"237 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-06-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124614282","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Intelligent Assistive Robotic Systems for the elderly: Two real-life use cases","authors":"X. Papageorgiou, G. Chalvatzaki, A. Dometios, C. Tzafestas, P. Maragos","doi":"10.1145/3056540.3076184","DOIUrl":"https://doi.org/10.1145/3056540.3076184","url":null,"abstract":"Mobility impairments are prevalent in the elderly population and constitute one of the main causes related to difficulties in performing Activities of Daily Living (ADLs) and consequent reduction of quality of life. When designing a user-friendly assistive device for mobility constrained people, it is important to take into account the diverse spectrum of disabilities, which results into completely different needs to be covered by the device for each specific user. An intelligent adaptive behavior is necessary for the deployment of such systems. Also, elderly people have particular needs in specific case of performing bathing activities, since these tasks require body flexibility. We explore new aspects of assistive living via intelligent assistive robotic systems involving human robot interaction in a natural interface. Our aim is to build assistive robotic systems, in order to increase the independence and safety of these procedures. Towards this end, the expertise of professional carers for walking or bathing sequences and appropriate motions have to be adopted, in order to achieve natural, physical human - robot interaction. Our goal is to report current research work related to the development of two real-life use cases of intelligent robotic systems for elderly aiming to provide user-adaptive and context-aware assistance.","PeriodicalId":140232,"journal":{"name":"Proceedings of the 10th International Conference on PErvasive Technologies Related to Assistive Environments","volume":"29 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-06-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116456045","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A 3D Printable Hand Exoskeleton for the Haptic Exploration of Virtual 3D Scenes","authors":"T. Götzelmann","doi":"10.1145/3056540.3064950","DOIUrl":"https://doi.org/10.1145/3056540.3064950","url":null,"abstract":"Virtual reality is currently experiencing a comeback. A considerable market has developed for VR computer games and educational applications. Some solutions integrate tracked devices which allow users to freely move within a certain space. Virtual 3D model can be visually explored, implemented collision detected allows users to get a feedback for instance by sound or vibration. For research projects there are several approaches which offer to get the actual feedback for the fingers of a hand, when the users virtually touches the surface of a 3D model. However, in the consumer market currently no product is sold which offers this direct feedback for the whole hand. In this paper we introduce a low-cost hand exoskeleton which is usable in conjunction with commodity hardware. It covers each of the five fingers of the user's hand, its design is open-source, low-cost, can be customized and 3D printed by individuals. It aims at improving the haptic perception of users, bases of a popular physical computing platform and is designed to be assembled even by electronically unexperienced users. We show the integration of our lean interface of the wireless exoskeleton into exemplary VR environment and describe a calibration process which is flexible for customizations.","PeriodicalId":140232,"journal":{"name":"Proceedings of the 10th International Conference on PErvasive Technologies Related to Assistive Environments","volume":"39 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-06-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114472125","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Providing Certified Paths for Safe Port Operation: The e-Mariner paradigm","authors":"N. Bakalos, M. Bonazountas, V. Tsiakos, Vasilis Hadjipanos","doi":"10.1145/3056540.3076204","DOIUrl":"https://doi.org/10.1145/3056540.3076204","url":null,"abstract":"In this paper we describe a system that utilizes supervised machine learning over GNSS PVT data to produce \"safe\" paths for maritime port operations. By using meta-knowledge describing the behavior of mobile objects based on specific criteria, and also extend this to address sections of trajectories, we aim at certifying","PeriodicalId":140232,"journal":{"name":"Proceedings of the 10th International Conference on PErvasive Technologies Related to Assistive Environments","volume":"23 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-06-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125984629","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Learning Human Activities for Assisted Living Robotics","authors":"D. Adama, Ahmad Lotfi, C. Langensiepen, Kevin Lee, Pedro Trindade","doi":"10.1145/3056540.3076197","DOIUrl":"https://doi.org/10.1145/3056540.3076197","url":null,"abstract":"Assistive living has gained increased focus in recent years with the increase in elderly population. This has led to a desire for technical solutions to reduce cost. Learning to perform human activities of daily living through the use of assistive technology (especially assistive robots) becomes more important in areas like elderly care. This paper proposes an approach to learning to perform human activities using a method of activity recognition from information obtained from an RGB-D sensor. Key features obtained from clustering and classification of relevant aspects of an activity will be used for learning. Existing approaches to activity recognition still have limitations preventing them from going mainstream. This is part of a project directed towards transfer learning of human activities to enhance human-robot interaction. For test and validation of our method, the CAD-60 human activity data set is used.","PeriodicalId":140232,"journal":{"name":"Proceedings of the 10th International Conference on PErvasive Technologies Related to Assistive Environments","volume":"5 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-06-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121977070","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}