{"title":"Video-Recording Your Life: User Perception and Experiences","authors":"Daniel Buschek, M. Spitzer, Florian Alt","doi":"10.1145/2702613.2732743","DOIUrl":"https://doi.org/10.1145/2702613.2732743","url":null,"abstract":"Video recording is becoming an integral part of our daily activities: Action cams and wearable cameras allow us to capture scenes of our daily life effortlessly. This trend generates vast amounts of video material impossible to review manually. However, these recordings also contain a lot of information potentially interesting to the recording individual and to others. Such videos can provide a meaningful summary of the day, serving as a digital extension to the user's human memory. They might also be interesting to others as tutorials (e.g. how to change a flat tyre). As a first step towards this vision, we present a survey assessing the users' view and their video recording behavior. Findings were used to inform the design of a prototype based on off-the-shelf components, which allows users to create meaningful video clips of their daily activities in an automated manner by using their phone and any wearable camera. We conclude with a preliminary, qualitative study showing the feasibility and potential of the approach and sketch future research directions.","PeriodicalId":142786,"journal":{"name":"Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems","volume":"224 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-04-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116165343","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"BeWell: A Sentiment Aggregator for Proactive Community Management","authors":"Andreas Lindner, M. Hall, C. Niemeyer, Simon Caton","doi":"10.1145/2702613.2732787","DOIUrl":"https://doi.org/10.1145/2702613.2732787","url":null,"abstract":"Granular, localized information can be unobtrusively gathered to assess public sentiment as a superior measure of policy impact. This information is already abundant and available via Online Social Media. The missing link is a rigorous, anonymized and open source artefact that gives feedback to stakeholders and constituents. To address this, BeWell, an unobtrusive, low latency multi-resolution measurement for the observation, analysis and modelling of community dynamics, is proposed. To assess communal well-being, 42 Facebook pages of a large public university in Germany are analyzed with a dictionary-based text analytics program, LIWC. We establish the baseline of emotive discourse across the sample, and detect significant campus-wide events in this proof of concept implementation, then discuss future iterations including a community dashboard and a participatory management plan.","PeriodicalId":142786,"journal":{"name":"Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems","volume":"174 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-04-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116553429","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"On the Spot Information in Augmented Reality for Teams in the Security Domain","authors":"S. Lukosch, H. Lukosch, D. Datcu, Marina-Anca Cidotã","doi":"10.1145/2702613.2732879","DOIUrl":"https://doi.org/10.1145/2702613.2732879","url":null,"abstract":"For operational teams in the security domain it is important to quickly and adequately exchange context-related information. This is necessary to develop distributed situational awareness and facilitate collaboration. Currently, information exchange is mainly based on oral communication. Oral communication can be misunderstood or ambiguous. This paper reports on different scenarios from the security domain in which augmented reality (AR) techniques are used to support information exchange. A combination of quantitative and qualitative evaluation showed that AR can improve the distributed situational awareness of a team.","PeriodicalId":142786,"journal":{"name":"Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems","volume":"151 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-04-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122458984","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"First Insights with a Vibrotactile Interface for Children with Multiple Disabilities","authors":"C. Manresa-Yee, A. Morrison, J. J. Muntaner","doi":"10.1145/2702613.2732910","DOIUrl":"https://doi.org/10.1145/2702613.2732910","url":null,"abstract":"Designing and evaluating interactive systems for users with multiple disabilities is a challenge due to their cognitive, sensory, physical and behavioral conditions. Vibrotactile interfaces to motivate users' actions exist for users with hearing and sight impairments, but there are hardly any for users with multiple disabilities. We developed V-Sense, a vibrotactile interface that encourages children with multiple disabilities to move their arms by using vibrations and exploiting the saltation perceptual illusion. In this paper we describe our initial experience evaluating the interface with 5 children for 7 weeks and we discuss the first insights concerning the use of the interface and the difficulties encountered while conducting the evaluation sessions.","PeriodicalId":142786,"journal":{"name":"Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems","volume":"38 11 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-04-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122508983","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Factors Related to Privacy Concerns and Protection Behaviors Regarding Behavioral Advertising","authors":"D. Y. Wohn, Jacob Solomon, Dan Sarkar, Kami Vaniea","doi":"10.1145/2702613.2732722","DOIUrl":"https://doi.org/10.1145/2702613.2732722","url":null,"abstract":"Research on online behavioral advertising has focused on users' attitudes towards sharing and what information they are willing to share. An unexplored area in this domain is how users' knowledge of how to protect their information differs from their self-efficacy about executing privacy protection behavior. The results of a 179-participant online study show that knowledge explains privacy concerns, but self-efficacy explains protection behaviors. Perceived behavioral control was related to both concerns and behavior.","PeriodicalId":142786,"journal":{"name":"Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems","volume":"461 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-04-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123026821","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"BudgetMap: Issue-Driven Navigation for a Government Budget","authors":"N. Kim, C. Lee, Jonghyuk Jung, Eun-Young Ko, Juho Kim, Ji Hee Kim","doi":"10.1145/2702613.2732932","DOIUrl":"https://doi.org/10.1145/2702613.2732932","url":null,"abstract":"We present BudgetMap, an interactive tool for navigating budgets of government programs through a lens of social issues of public interests. Our novel issue-driven approach can complement the traditional budget classification system used by government organizations by addressing time-evolving public interests. BudgetMap elicits the public to tag government programs with social issues by providing active and passive tagging methods. BudgetMap then facilitates visual exploration of the tagged budget metadata. Through a lab study, we show how the design of BudgetMap helps users develop awareness and understanding of budgetary issues while identifying issue-budget links. We also share lessons learned from a preliminary live deployment.","PeriodicalId":142786,"journal":{"name":"Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems","volume":"27 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-04-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114438171","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Single-Pixel Eye Tracking via Patterned Contact Lenses: Design and Evaluation in HCI Domain","authors":"Ioannis Rigas, Oleg V. Komogortsev","doi":"10.1145/2702613.2732745","DOIUrl":"https://doi.org/10.1145/2702613.2732745","url":null,"abstract":"This paper presents a preliminary study of an eye tracking technique suitable for use in devices with low-power consumption demands, e.g. Google Glass. The method uses a patterned contact lens and a single-pixel imaging sensor. Its applicability is explored via a semi-simulated user study, where real eye movements from 50 subjects are used to animate a 3-D graphics replica of an eye wearing a patterned contact lens. An accurate single-pixel camera simulator is used to perform gaze estimation via capturing of the imprinted pattern. The results show the promising potential of the technique in the field of eye tracking and eye gesture recognition.","PeriodicalId":142786,"journal":{"name":"Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems","volume":"58 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-04-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122089053","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Augmenting Affect from Speech with Generative Music","authors":"Gerhard Johann Hagerer, Michael Lux, S. Ehrlich, G. Cheng","doi":"10.1145/2702613.2732792","DOIUrl":"https://doi.org/10.1145/2702613.2732792","url":null,"abstract":"In this work we propose a prototype to improve interpersonal communication of emotions. Therefore music is generated with the same affect as when humans talk on the fly. Emotions in speech are detected and conveyed to music according to music psychological rules. Existing evaluated modules from affective generative music and speech emotion detection, use cases, emotional models and projected evaluations are discussed.","PeriodicalId":142786,"journal":{"name":"Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems","volume":"64 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-04-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129496637","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Research Methods for Child Computer Interaction","authors":"J. Read, Shuli Gilutz","doi":"10.1145/2702613.2706687","DOIUrl":"https://doi.org/10.1145/2702613.2706687","url":null,"abstract":"In this course participants will learn about theory and practice of conducting research in children's HCI. The course is divided into two sessions: basic principles and theory, and best practices. The instructors have multiple years of experience designing, conducting, and analyzing children-computer interaction (CCI) studies, in the UK, USA, and Israel.","PeriodicalId":142786,"journal":{"name":"Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems","volume":"14 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-04-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129832563","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Therapeutic Gaming in Context: Observing Game Use for Brain Injury Rehabilitation","authors":"Jinghui Cheng, C. Putnam","doi":"10.1145/2702613.2732697","DOIUrl":"https://doi.org/10.1145/2702613.2732697","url":null,"abstract":"Video games are often used in brain injury (BI) therapy sessions to help motivate patients to engage in rehabilitation activities. However, very little is known about contexts of game use in real-world rehabilitation settings. In this paper, we explore contexts of commercial game use in BI therapy through observation of inpatient therapy sessions. Based on a systematic analysis of the observation recordings, we found that (1) only 30% of session time was used for gameplay; (2) therapists needed to provide various kinds of cognitive and physical patient support during the play sessions; and (3) therapists adopted multiple strategies to reinforce the therapeutic values of the games. This study is helping us create decision and information sharing tools to support the use and creation of games for BI rehabilitation.","PeriodicalId":142786,"journal":{"name":"Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-04-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128446072","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}