{"title":"Harnessing the quantified self movement for optimal mental health and wellbeing","authors":"Alishia D. Williams","doi":"10.1145/2983576.2983585","DOIUrl":"https://doi.org/10.1145/2983576.2983585","url":null,"abstract":"Treatment innovation in the mental health sector is a major public-health priority. A specific sub-challenge underlying the development of new treatments is the use of digital technologies to support mental health interventions. In addition to the potential benefits of increased access to care and reduced costs to service providers, the implementation of digital technologies could enhance patient engagement by offering greater flexibility than afforded by routine systems of care and through the tailoring of interventions to best suit patients in their own environments. Using self-relevant information extracted from individual's day-to-day lives may be one means to achieve a personalised approach. Wearable technologies provide an unprecedented means to collect information about individuals, which has led to the 'quantified self movement'. As these technologies become seamlessly integrated into people's lives, they also open up new possibilities for research and clinical avenues to promote mental health and wellbeing. The potential synergy between wearable devices makes it possible to track, monitor, and provide immediate feedback to promote behavioural and/or cognitive change in a manner that is currently impossible in traditional mental health settings. Wearable devices with wireless connectivity are capable of transmitting information that can automatically interact with another digital interface, such as a smartphone or smartwatch. In this way, information collected in an individual's environment (e.g., visual capture from a Narrative Clip wearable camera) could lead to immediate targeted action. This combination of features makes wearable devices ideal for mental health research where exposure to systematic cues can be assessed or when compliance to recommendations needs to be monitored. The aim of this talk is to explore the potential applications (and challenges) of wearable devices in mental health contexts, with the hope of stimulating further cross-disciplinary work at the intersection of technology and applied social and behavioural science research.","PeriodicalId":352947,"journal":{"name":"Proceedings of the first Workshop on Lifelogging Tools and Applications","volume":"56 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123063381","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Cued Retrieval of Personal Memories of Social Interactions","authors":"Seyed Ali Bahrainian, F. Crestani","doi":"10.1145/2983576.2983577","DOIUrl":"https://doi.org/10.1145/2983576.2983577","url":null,"abstract":"This paper aims at developing a social interactions summarizer system that firstly summarizes a person's daily social interactions, with the purpose of enhancing his episodic memory and secondly provides various methods for searching in the collected data. The first goal originates from studies that have shown that replaying video or audio recordings of an experience have proved effective in enhancing people's episodic memory. The second goal is based on the fact that with the emergence of wearable devices for lifelogging, every day huge archives of data consisting of images, audio, etc could be generated from a person's life. Over time the growth rate of such archives is so substantially high that manually searching them, even after a few months of data collection, would be cumbersome and virtually impossible. In this study, we present a method for effectively summarizing one's social interactions for enhancing her episodic memory. Our results, reporting work in progress, illustrate that our method is highly effective in augmenting one's episodic memory.","PeriodicalId":352947,"journal":{"name":"Proceedings of the first Workshop on Lifelogging Tools and Applications","volume":"39 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122977746","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Describing Lifelogs with Convolutional Neural Networks: A Comparative Study","authors":"A. Molino, Qianli Xu, Joo-Hwee Lim","doi":"10.1145/2983576.2983579","DOIUrl":"https://doi.org/10.1145/2983576.2983579","url":null,"abstract":"Life-logging technologies, e.g. wearable cameras taking pictures at a fixed interval, can be used as a means of memory preservation (in digital form), caregiver monitoring and even cognitive therapy to train our brains. Yet, such large amount of data needs to be processed and edited to be of use. Automatic summarization of the life-logs into short story boards is a possible solution. But how good are these summaries? Are the selected key-frames informative and representative enough as to be good memory cues? The proposed approach (i) filters uninformative images by analyzing their ratio of edges and (ii) describes the images using the available Convolutional Neural Networks (CNN) models for objects and places with egocentric-driven data augmentation. We perform a comparative study to evaluate different summarization methods in terms of coverage, informativeness and representativeness in two different datasets, both with annotated ground truth and an on-line user study. Results show that filtering uninformative images improves the user satisfaction: users would request to change less frames from the original summary than without filtering. Moreover, the proposed egocentric image descriptor generates more diverse content than the standard cropping strategy used by most CNN-based approaches.","PeriodicalId":352947,"journal":{"name":"Proceedings of the first Workshop on Lifelogging Tools and Applications","volume":"16 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114424570","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"NTCIR-12 Lifelog Data Analytics","authors":"Na Li, C. Gurrin, M. Crane, H. Ruskin","doi":"10.1145/2983576.2983583","DOIUrl":"https://doi.org/10.1145/2983576.2983583","url":null,"abstract":"Lifelogging is the process of automatically, ambiently and digitally recording episodes of one's life experiences. NTCIR-12 Lifelog test collection was initially created, as support for the Information Retrieval (IR) community, to develop new and novel lifelogging retrieval and visualisation systems. In this paper, our goal is organising and analysing the NTCIR-12 Lifelog dataset by using a time series approach to facilitate automatic discovery of repeat events.","PeriodicalId":352947,"journal":{"name":"Proceedings of the first Workshop on Lifelogging Tools and Applications","volume":"17 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114890776","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Session details: Keynote Talk 1","authors":"C. Gurrin","doi":"10.1145/3255951","DOIUrl":"https://doi.org/10.1145/3255951","url":null,"abstract":"","PeriodicalId":352947,"journal":{"name":"Proceedings of the first Workshop on Lifelogging Tools and Applications","volume":"9 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123468514","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Organizing Egocentric Videos for Daily Living Monitoring","authors":"A. Ortis, G. Farinella, V. D'Amico, Luca Addesso, Giovanni Torrisi, S. Battiato","doi":"10.1145/2983576.2983578","DOIUrl":"https://doi.org/10.1145/2983576.2983578","url":null,"abstract":"Egocentric videos are becoming popular since the possibility to observe the scene flow from the user's point of view (First Person Vision). Among the different assistive applications in this context there is the daily living monitoring of a user that is wearing the camera. In this paper we propose a system devoted to automatically organize videos acquired by the user over different days. By employing an unsupervised segmentation, each egocentric video is divided in chapters by considering the visual content. The video segments related to the different days are hence linked to produce graphs which are coherent with respect to the context in which the user acts. Experiments on two different datasets demonstrate the effectiveness of the proposed approach which outperforms the state of the art, both in accuracy and computational time with a good margin.","PeriodicalId":352947,"journal":{"name":"Proceedings of the first Workshop on Lifelogging Tools and Applications","volume":"1983 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128051537","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Personal Information Manager to Capture and Re-Access What We See on Computers","authors":"Zaher Hinbarji, Moohamad Hinbarji, Rami Albatal, C. Gurrin","doi":"10.1145/2983576.2983580","DOIUrl":"https://doi.org/10.1145/2983576.2983580","url":null,"abstract":"Nowadays we live in a world where many of us engage with computers more than humans as a result of spending a major part of our life in front of a range of computing devices. Consequently, it's becoming important to shed more light on our interactions with computing devices, which we see as a special domain of lifelogging (information-lifelogging), where capturing and archiving what we see on our computer screens can be utilised for several useful applications such as user profiling, personalization and memory support. In this work, we present a tool that allows us to passively capture the digital content we see on our screens for later re-access. It can be considered as a type of digital memory that stores user's computer usage to recall a user's information creation and access activities. This has potential to assist users to better achieve their daily tasks by having access to a digital backup where their previous content and experience can be recalled as required.","PeriodicalId":352947,"journal":{"name":"Proceedings of the first Workshop on Lifelogging Tools and Applications","volume":"41 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134286304","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Session details: Keynote Talk 2","authors":"P. Radeva","doi":"10.1145/3255953","DOIUrl":"https://doi.org/10.1145/3255953","url":null,"abstract":"","PeriodicalId":352947,"journal":{"name":"Proceedings of the first Workshop on Lifelogging Tools and Applications","volume":"37 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116137867","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Session details: Oral Session","authors":"Xavier Giró-i-Nieto","doi":"10.1145/3255952","DOIUrl":"https://doi.org/10.1145/3255952","url":null,"abstract":"","PeriodicalId":352947,"journal":{"name":"Proceedings of the first Workshop on Lifelogging Tools and Applications","volume":"139 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122062261","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Prizm: A Wireless Access Point for Proxy-Based Web Lifelogging","authors":"Jimmy J. Lin, Zhucheng Tu, M. Rose, Patrick White","doi":"10.1145/2983576.2983581","DOIUrl":"https://doi.org/10.1145/2983576.2983581","url":null,"abstract":"We present Prizm, a prototype lifelogging device that comprehensively records a user's web activity. Prizm is a wireless access point deployed on a Raspberry Pi that is designed to be a substitute for the user's normal wireless access point. Prizm proxies all HTTP(S) requests from devices connected to it and records all activity it observes. Although this particular design is not entirely novel, there are a few features that are unique to our approach, most notably the physical deployment as a wireless access point. Such a package allows capture of activity from multiple devices, integration with web archiving for preservation, and support for offline operation. This paper describes the design of Prizm, the current status of our project, and future plans.","PeriodicalId":352947,"journal":{"name":"Proceedings of the first Workshop on Lifelogging Tools and Applications","volume":"79 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132478633","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}