{"title":"Session details: Head-worn displays","authors":"B. Bailey","doi":"10.1145/3251005","DOIUrl":"https://doi.org/10.1145/3251005","url":null,"abstract":"","PeriodicalId":20599,"journal":{"name":"Proceedings of the SIGCHI Conference on Human Factors in Computing Systems","volume":"15 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2014-04-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"74160447","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Session details: Gesture-based interaction","authors":"Jörg Müller","doi":"10.1145/3250990","DOIUrl":"https://doi.org/10.1145/3250990","url":null,"abstract":"","PeriodicalId":20599,"journal":{"name":"Proceedings of the SIGCHI Conference on Human Factors in Computing Systems","volume":"6 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2014-04-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"75773477","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Session details: Interactive surfaces and pervasive displays","authors":"A. Quigley","doi":"10.1145/3250991","DOIUrl":"https://doi.org/10.1145/3250991","url":null,"abstract":"","PeriodicalId":20599,"journal":{"name":"Proceedings of the SIGCHI Conference on Human Factors in Computing Systems","volume":"40 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2014-04-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"74621854","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Finder highlights: field evaluation and design of an augmented file browser","authors":"Stephen Fitchett, A. Cockburn, C. Gutwin","doi":"10.1145/2556288.2557014","DOIUrl":"https://doi.org/10.1145/2556288.2557014","url":null,"abstract":"Navigating to files through a hierarchy is often a slow, laborious, and repetitive task. Recent lab studies showed that file browser interface augmentations, such as Icon Highlights and Search Directed Navigation, have the potential to reduce file retrieval times. However, for this potential to be realised in actual systems, further study is necessary to address two important issues. First, there are important design and implementation challenges in advancing the research prototypes previously evaluated into complete interactive systems that can be used for real work. Second, it is unknown how real users would employ these systems while engaged in actual work; would the potential performance improvements suggested by the earlier lab studies be realised? We therefore describe the design, implementation, and longitudinal field study evaluation of Finder Highlights, a file browser plugin for the OS X 'Finder' that adds support for Icon Highlights and Search Directed Navigation. Study results confirm that the augmentations are effective in reducing real-world file retrieval times, with retrieval times 13% faster when using Finder Highlights compared to the standard tool (10.6 s versus 12.2 s), while also emphasising important differences between lab and field studies. In summary, the paper strongly suggests that large-scale deployment of interface augmentations to file browsers, particularly Icon Highlights, will have a marked effect in improving users' real-world file retrieval.","PeriodicalId":20599,"journal":{"name":"Proceedings of the SIGCHI Conference on Human Factors in Computing Systems","volume":"64 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2014-04-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"74797367","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Staccato social support in mobile health applications","authors":"Phil Adams, E. Baumer, Geri Gay","doi":"10.1145/2556288.2557297","DOIUrl":"https://doi.org/10.1145/2556288.2557297","url":null,"abstract":"Social support plays an important role in health systems. While significant work has explored the role of social support in CMC environments, less analysis has considered social support in mobile health systems. This paper describes socially supportive messages in VERA, a mobile application for sharing health decisions and behaviors. The short and bursty interactions in social awareness streams [36] afford a particular style of social support, for which we offer the label staccato social support. Results indicate that, in comparison to previous work, staccato social support is characterized by a greater prevalence of esteem support, which builds respect and confidence. We further note the presence of 'following up', a positive behavior that contributes to supportive interactions, likely via social pressure and accountability [7,38]. These findings suggest design recommendations to developers of mobile social support systems and contribute to understanding technologically mediated social support for health.","PeriodicalId":20599,"journal":{"name":"Proceedings of the SIGCHI Conference on Human Factors in Computing Systems","volume":"54 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2014-04-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"77378421","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Addressing the subtleties in dementia care: pre-study & evaluation of a GPS monitoring system","authors":"Lin Wan, Claudia Müller, V. Wulf, D. Randall","doi":"10.1145/2556288.2557307","DOIUrl":"https://doi.org/10.1145/2556288.2557307","url":null,"abstract":"In this work we present a user-centered development process for a GPS-based monitoring system to be used in dementia care. Our research covers a full design process including a qualitative-empirical pre-study, the prototyping process and the investigation of long-term appropriation processes of the stable prototypes in three different practice environments. Specifically, we deal with the problem of 'wandering' by persons suffering from late-phase dementia. Although GPS tracking is not a novel technological objective, the usage of those systems in dementia care remains very low. The paper therefore takes a socio-technical stance on development and appropriation of GPS technology in dementia care and assesses the practical and ideological issues surrounding care to understand why. We additionally provide design research in two different settings, familial and institutional care, and report on the design of a GPS-based tracking system reflecting these considerations. What comes to the fore is the need for ICT to reflect complex organizational, ideological and practical issues that form part of a moral universe where sensitivity is crucial.","PeriodicalId":20599,"journal":{"name":"Proceedings of the SIGCHI Conference on Human Factors in Computing Systems","volume":"19 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2014-04-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"77703288","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Expressive touch: studying tapping force on tabletops","authors":"E. Pedersen, K. Hornbæk","doi":"10.1145/2556288.2557019","DOIUrl":"https://doi.org/10.1145/2556288.2557019","url":null,"abstract":"This paper investigates users' ability to perform force-sensitive tapping and explores its potential as an input modality in touch-based systems. We study force-sensitive tapping using Expressive Touch, a tabletop interface that infers tapping force from the sound waves created by the users' finger upon impact. The first part of the paper describes the implementation details of Expressive Touch and shows how existing tabletop interfaces can be augmented to reliably detect tapping force across the entire surface. The second part of the paper reports on the results of three studies of force-sensitive tapping. First, we use a classic psychophysic task to gain insights into participants' perception of tapping force (Study 1). Results show that although participants tap with different absolute tapping forces, they have a similar perception of relative tapping force. Second, we investigate participants' ability to control tapping force (Study 2) and find that users can produce two force levels with 99% accuracy. For six levels of force, accuracy drops to 58%. Third, we investigate the usability of force tapping by studying participants' reactions to seven force-sensitive touch applications (Study 3).","PeriodicalId":20599,"journal":{"name":"Proceedings of the SIGCHI Conference on Human Factors in Computing Systems","volume":"26 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2014-04-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"80201411","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Towards automatic experimentation of educational knowledge","authors":"Yun-En Liu, Travis Mandel, E. Brunskill, Zoran Popovic","doi":"10.1145/2556288.2557392","DOIUrl":"https://doi.org/10.1145/2556288.2557392","url":null,"abstract":"We present a general automatic experimentation and hypothesis generation framework that utilizes a large set of users to explore the effects of different parts of an intervention parameter space on any objective function. We also incorporate importance sampling, allowing us to run these automatic experiments even if we cannot give out the exact intervention distributions that we want. To show the utility of this framework, we present an implementation in the domain of fractions and numberlines, using an online educational game as the source of players. Our system is able to automatically explore the parameter space and generate hypotheses about what types of numberlines lead to maximal short-term transfer; testing on a separate dataset shows the most promising hypotheses are valid. We briefly discuss our results in the context of the wider educational literature, showing that one of our results is not explained by current research on multiple fraction representations, thus proving our ability to generate potentially interesting hypotheses to test.","PeriodicalId":20599,"journal":{"name":"Proceedings of the SIGCHI Conference on Human Factors in Computing Systems","volume":"9 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2014-04-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"79363207","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Visual recognition in museum guide apps: do visitors want it?","authors":"Leonard Wein","doi":"10.1145/2556288.2557270","DOIUrl":"https://doi.org/10.1145/2556288.2557270","url":null,"abstract":"In this paper, visual recognition (VisRec) is evaluated as a method to access background information on artworks in mobile museum guide applications (apps) by means of a field experiment. While museums and previous research have explored technical aspects, it is unclear whether visitors actually want to use VisRec. A prototype featuring VisRec, QR codes and number codes was developed and assessed with a usability study in two museums (N=89). The prototype confirms the efficacy of the recently introduced ORB-algorithm for VisRec. Compared to previous literature, the results highlight the context-dependency of perceived usability and variability in the importance of usability factors. The results reveal a clear preference for VisRec among participants (53%); only 14% preferred QR codes. Ease of use, enjoyability and distance are identified as the main factors. This provides strong evidence to further explore the potential of VisRec to improve visitors' museum experiences.","PeriodicalId":20599,"journal":{"name":"Proceedings of the SIGCHI Conference on Human Factors in Computing Systems","volume":"5 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2014-04-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"81880818","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Novice use of a predictive human performance modeling tool to produce UI recommendations","authors":"Kyung Wha Hong, R. Amant","doi":"10.1145/2556288.2556972","DOIUrl":"https://doi.org/10.1145/2556288.2556972","url":null,"abstract":"This note describes two studies of the use of a performance modeling tool, CogTool, for making recommendations to improve a user interface. The first study replicates findings by Bonnie John [7]: the rates at which novice modelers made correct recommendations (88.1%) and supported them (68.2%) are close to the values in John's study (91.7% and 75.1%, respectively). A follow-on study of novice modelers on the same task without CogTool produced sig-nificantly lower values. CogTool improves the UI design recommendations made by novices.","PeriodicalId":20599,"journal":{"name":"Proceedings of the SIGCHI Conference on Human Factors in Computing Systems","volume":"29 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2014-04-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"84508057","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}