{"title":"Automatic falls detection in hospital-room context","authors":"A. Mecocci, F. Micheli, C. Zoppetti, Andrea Baghini","doi":"10.1109/COGINFOCOM.2016.7804537","DOIUrl":"https://doi.org/10.1109/COGINFOCOM.2016.7804537","url":null,"abstract":"This paper presents a framework for the monitoring of hospitalized people, including fall detection capabilities, using an environmentally mounted depth imaging sensor. The purpose is to characterize the fall event depending on the location of the person when the fall happens. In particular, we distinguish two basic starting conditions: a fall from a standing position (e.g. due to a blood pressure drop) and a fall out of bed (e.g. due to agitation). To achieve this goal, we exploit context information to adaptively extract the person's silhouette and then reliably track the trajectory. If a fall occurs, the system is capable of recognizing the event on the basis of the inferred starting condition. The current implementation has been tested on available online datasets and on a self-made dedicated dataset. In the latter dataset, we have included falls from a standing position and falls out of bed, even in the presence of occlusions.","PeriodicalId":440408,"journal":{"name":"2016 7th IEEE International Conference on Cognitive Infocommunications (CogInfoCom)","volume":"46 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130899446","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Mathability and an animation related to a convex-like property","authors":"A. Gilányi, N. Merentes, Roy Quintero","doi":"10.1109/COGINFOCOM.2016.7804553","DOIUrl":"https://doi.org/10.1109/COGINFOCOM.2016.7804553","url":null,"abstract":"In connection with investigations related to mathability and to applications of computer assisted methods for studying mathematical problems, an animation of the m-convex hull of finite sets of points on the Cartesian plane is presented.","PeriodicalId":440408,"journal":{"name":"2016 7th IEEE International Conference on Cognitive Infocommunications (CogInfoCom)","volume":"107 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133642919","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Uniform dispersal of oblivious mobile robots","authors":"Attila Hideg, L. Blázovics, B. Forstner","doi":"10.1109/COGINFOCOM.2016.7804569","DOIUrl":"https://doi.org/10.1109/COGINFOCOM.2016.7804569","url":null,"abstract":"Research on uniform dispersal focuses on the minimal capabilities mobile robots need to solve the problem. Consider the uniform dispersion problem in an unknown connected space that is decomposed into smaller cells. Autonomous robots are injected through an entry point and have to occupy all the cells while avoiding collisions. These robots are considered \"weak\" in terms of hardware capabilities: they have limited memory, visibility, and communication range. Robots without any persistent storage (i.e. without any memory of past positions or actions) are called oblivious. This paper presents a method that enables oblivious robots to solve the uniform dispersion problem.","PeriodicalId":440408,"journal":{"name":"2016 7th IEEE International Conference on Cognitive Infocommunications (CogInfoCom)","volume":"35 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132305471","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Compassion, empathy and sympathy expression features in affective robotics","authors":"Barbara - Lewandowska-Tomaszczyk, P. Wilson","doi":"10.1109/COGINFOCOM.2016.7804526","DOIUrl":"https://doi.org/10.1109/COGINFOCOM.2016.7804526","url":null,"abstract":"The present paper identifies differences in the expression features of compassion, sympathy and empathy in British English and Polish that need to be tuned accordingly in socially interactive robots to enable them to operate successfully in these cultures. The results showed that English compassion is characterised by more positive valence and more of a desire to act than Polish współczucie. Polish empatia is also characterised by a more negative valence than English empathy, which has a wider range of application. When used in positive contexts, English sympathy corresponds to Polish sympatia; however, it also acquires elements of negative valence in English. The results further showed that although the processes of emotion recognition and expression in robotics must be tuned to culture-specific emotion models, the more explicit patterns of responsiveness (British English for the compassion model in our case) are also recommended for transfer, to make the cognitive and sensory infocommunication more readily interpretable by the interacting agents.","PeriodicalId":440408,"journal":{"name":"2016 7th IEEE International Conference on Cognitive Infocommunications (CogInfoCom)","volume":"21 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114346096","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An unobtrusive sleep monitoring system for the human sleep behaviour understanding","authors":"P. Barsocchi, M. Bianchini, A. Crivello, Davide La Rosa, Filippo Palumbo, F. Scarselli","doi":"10.1109/COGINFOCOM.2016.7804531","DOIUrl":"https://doi.org/10.1109/COGINFOCOM.2016.7804531","url":null,"abstract":"Sleep plays a vital role in good health and well-being throughout our lives. Getting enough quality sleep at the right times can help protect mental and physical health, quality of life, and safety. Emerging wearable devices allow people to measure and keep track of sleep duration, patterns, and quality. Often, these approaches are intrusive and change the user's daily sleep habits. In this paper, we present an unobtrusive approach for the detection of sleep stages and positions. The proposed system overcomes the weaknesses of classic actigraphy-based systems, since it is easy to deploy and is based on inexpensive technology. Unlike actigraphy-based systems, the proposed system is able to detect the bed posture, which is crucial for the prevention of pressure ulcers (i.e. bedsores). Results from our algorithm look promising and show that we can accurately infer sleep duration, sleep positions, and routines with a completely unobtrusive approach.","PeriodicalId":440408,"journal":{"name":"2016 7th IEEE International Conference on Cognitive Infocommunications (CogInfoCom)","volume":"379 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131720702","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The influence of verticality metaphor on moral judgment and intuition","authors":"K. Ścigała, B. Indurkhya","doi":"10.1109/COGINFOCOM.2016.7804550","DOIUrl":"https://doi.org/10.1109/COGINFOCOM.2016.7804550","url":null,"abstract":"Lakoff and Johnson's theory of conceptual metaphor predicts that the notion of verticality is often used as a basis for understanding concepts: for example, physically higher locations are associated with moral goodness, and lower locations with immorality. As the moral dimension plays a crucial role when we judge other people, one would expect the verticality metaphor to also be connected with the moral evaluation of other people. We present two experiments that explore this issue. The results of the first experiment suggest that a morally ambivalent behaviour description is judged more favourably when presented at the top of a page than at the bottom. The second experiment shows that participants are more willing to stop and talk to a volunteer soliciting charity donations after riding up an escalator than after riding down. Together, these results lead to the conclusion that activation of the verticality metaphor influences moral judgment, both in deliberate and conscious evaluation (first experiment) and when the decision is based on first impressions, intuition and automatic reactions (second experiment).","PeriodicalId":440408,"journal":{"name":"2016 7th IEEE International Conference on Cognitive Infocommunications (CogInfoCom)","volume":"113 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115075117","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Cultural heritage in a pocket: Case study “Turku castle in your hand”","authors":"Mika Luimula, N. Trygg","doi":"10.1109/COGINFOCOM.2016.7804524","DOIUrl":"https://doi.org/10.1109/COGINFOCOM.2016.7804524","url":null,"abstract":"Turku University of Applied Sciences has provided game development education since 2009, with focus areas in entertainment games, serious games and gamification. In this paper, we report on the development of the mobile application \"Turku Castle in Your Hand\". The main aim of this project is to design an interactive digital platform that can be adapted to the needs of a public or private institution or an event by simply changing the content, while minigames would require customized solutions. The development of this application builds on the results of our facilitating research, our knowledge of industry needs, and the aim of providing a decisive user experience. Our main focus in this application is to evoke new approaches to the maintenance of cultural heritage by introducing new technologies and evaluating the user experience of applying this platform in diverse real-life settings. The development process has included an international research exchange program, working closely with international experts both in our game lab and in the Middle East.","PeriodicalId":440408,"journal":{"name":"2016 7th IEEE International Conference on Cognitive Infocommunications (CogInfoCom)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129851032","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}