Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems: Latest Publications

Understanding Users' Capability to Transfer Information between Mixed and Virtual Reality: Position Estimation across Modalities and Perspectives
Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. Pub Date: 2018-04-21. DOI: 10.1145/3173574.3173937
J. Roo, J. Basset, Pierre-Antoine Cinquin, M. Hachet
Abstract: Mixed Reality systems combine physical and digital worlds, with great potential for the future of HCI. It is possible to design systems that support flexible degrees of virtuality by combining complementary technologies. In order for such systems to succeed, users must be able to create unified mental models out of heterogeneous representations. In this paper, we present two studies focusing on the users' accuracy on heterogeneous systems using Spatial Augmented Reality (SAR) and immersive Virtual Reality (VR) displays, and combining viewpoints (egocentric and exocentric). The results show robust estimation capabilities across conditions and viewpoints.
Citations: 11
Knowing You, Seeing Me: Investigating User Preferences in Drone-Human Acknowledgement
Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. Pub Date: 2018-04-21. DOI: 10.1145/3173574.3173939
Walther Jensen, Simon Hansen, H. Knoche
Abstract: In the past, human proxemics research has poorly predicted human-robot interaction distances. This paper presents three studies on drone gestures to acknowledge human presence and clarify suitable acknowledging distances. We evaluated four drone gestures based on non-verbal human greetings. The gestures included orienting towards the counterpart and salutation gestures. We tested these individually and in combination to create a feeling of acknowledgement in people. Our users preferred being acknowledged from two meters away, but gestures were also effective from four meters. Rotating the drone towards the user elicited a higher degree of acknowledgement than not rotating it. We conclude with a set of design guidelines for drone gestures.
Citations: 32
Whiskers: Exploring the Use of Ultrasonic Haptic Cues on the Face
Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. Pub Date: 2018-04-21. DOI: 10.1145/3173574.3174232
Hyunjae Gil, Hyungki Son, Jin Ryong Kim, Ian Oakley
Abstract: Haptic cues are a valuable feedback mechanism for smart glasses. Prior work has shown how they can support navigation, deliver notifications and cue targets. However, a focus on actuation technologies such as mechanical tactors or fans has restricted the scope of research to a small number of cues presented at fixed locations. To move beyond this limitation, we explore perception of in-air ultrasonic haptic cues on the face. We present two studies examining the fundamental properties of localization, duration and movement perception on three facial sites suitable for use with glasses: the cheek, the center of the forehead, and above the eyebrow. The center of the forehead led to optimal performance, with a localization error of 3.77 mm and accurate duration (80%) and movement perception (87%). We apply these findings in a study delivering eight different ultrasonic notifications and report mean recognition rates of up to 92.4% (peak: 98.6%). We close with design recommendations for ultrasonic haptic cues on the face.
Citations: 28
Frames and Slants in Titles of Visualizations on Controversial Topics
Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. Pub Date: 2018-04-21. DOI: 10.1145/3173574.3174012
H. Kong, Zhicheng Liu, Karrie Karahalios
Abstract: Slanted framing in news article titles induces bias and influences recall. While recent studies found that viewers focus extensively on titles when reading visualizations, the impact of titles in visualization remains underexplored. We study frames in visualization titles, and how the slanted framing of titles and the viewer's pre-existing attitude impact recall, perception of bias, and change of attitude. When asked to compose visualization titles, people used five existing news frames, an open-ended frame, and a statistics frame. We found that the slant of the title influenced the perceived main message of a visualization, with viewers deriving opposing messages from the same visualization. The results did not show any significant effect on attitude change. We highlight the danger of subtle statistics frames and viewers' unwarranted conviction of the neutrality of visualizations. Finally, we present a design implication for the generation of visualization titles and one for the viewing of titles.
Citations: 72
Fostering Commonfare. Infrastructuring Autonomous Social Collaboration
Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. Pub Date: 2018-04-21. DOI: 10.1145/3173574.3174026
Peter Lyle, Mariacristina Sciannamblo, Maurizio Teli
Abstract: Recently, HCI scholars have started questioning the relationship between computing and political economy, with both general analyses of such relationships and specific design cases describing design interventions. This paper contributes to this stream of reflections, and argues that IT designers and HCI scholars can critically engage with the contemporary phase of capitalism by infrastructuring the emergence of new institutional forms of autonomous social collaboration through IT projects. More specifically, we discuss strategies and tactics that are available for IT designers embracing an activist agenda while infrastructuring autonomous social collaborations. We draw on empirical data from an H2020 EU-funded project, Commonfare, that seeks to foster the emergence of alternative forms of welfare provision rooted in social collaboration. In this context, we discuss how the necessary multiple relations that unfold in a project with such ambitions shape both the language and the technologies of the project itself.
Citations: 26
How “Wide Walls” Can Increase Engagement: Evidence From a Natural Experiment in Scratch
Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. Pub Date: 2018-04-21. DOI: 10.1145/3173574.3173935
Sayamindu Dasgupta, Benjamin Mako Hill
Abstract: A core aim for designing constructionist learning systems and toolkits is enabling "wide walls", a metaphor used to describe supporting a diverse range of creative outcomes. Ensuring that a toolkit affords learners a broad design space is a common approach to achieving wide walls. We use econometric methods to provide an empirical test of the wide walls theory through a natural experiment in the Scratch online community. We estimate the causal effect of a policy change that gave a large number of Scratch users access to a more powerful version of Scratch data structures, effectively widening the walls for learners. We show that access to and use of these more powerful new data structures caused learners to use data structures more frequently. Our findings provide support for the theory that wide walls can increase engagement and learning.
Citations: 18
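The abstract above mentions estimating the causal effect of a policy change in a natural experiment but does not give the exact econometric specification. As a rough illustration of how such an effect can be estimated, here is a minimal difference-in-differences sketch in Python; the arrays, the variable names, and the difference-in-differences approach itself are assumptions for illustration, not the authors' actual analysis.

```python
import numpy as np

# Hypothetical per-user counts of data-structure use, before and after the
# policy change, for users who gained access ("treated") and users who did
# not ("control"). All numbers are made up for illustration only.
rng = np.random.default_rng(0)
treated_pre  = rng.poisson(2.0, size=500)   # treated users, before the change
treated_post = rng.poisson(3.5, size=500)   # treated users, after the change
control_pre  = rng.poisson(2.0, size=500)   # control users, before
control_post = rng.poisson(2.3, size=500)   # control users, after

# Difference-in-differences: the change in the treated group minus the change
# in the control group, which nets out time trends shared by both groups.
did = ((treated_post.mean() - treated_pre.mean())
       - (control_post.mean() - control_pre.mean()))
print(f"Estimated effect of the policy change: {did:.2f} uses per user")
```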
Investigating Perceptual Congruence between Data and Display Dimensions in Sonification
Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. Pub Date: 2018-04-21. DOI: 10.1145/3173574.3174185
Jamie Ferguson, S. Brewster
Abstract: The relationships between sounds and their perceived meaning and connotations are complex, making auditory perception an important factor to consider when designing sonification systems. Listeners often have a mental model of how a data variable should sound during sonification, and this model is not considered in most data:sound mappings. This can lead to mappings that are difficult to use and can cause confusion. To investigate this issue, we conducted a magnitude estimation experiment to map how roughness, noise and pitch relate to the perceived magnitude of stress, error and danger. These parameters were chosen due to previous findings which suggest perceptual congruency between these auditory sensations and conceptual variables. Results from this experiment show that polarity and scaling preference are dependent on the data:sound mapping. This work provides polarity and scaling values that may be directly utilised by sonification designers to improve auditory displays in areas such as accessible and mobile computing, process monitoring and biofeedback.
Citations: 15
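To make the notion of a data:sound mapping with a polarity and a scaling term concrete, here is a small sketch that maps a normalized data value (e.g., "danger") to pitch. The function name, frequency range, and exponent are illustrative placeholders; the specific polarity and scaling values reported in the paper are not given in the abstract.

```python
import numpy as np

def map_to_pitch(value, polarity=+1, exponent=1.0,
                 f_min=200.0, f_max=2000.0):
    """Map a normalized data value (0..1) to a frequency in Hz.

    polarity: +1 means larger data values map to higher pitch, -1 reverses it.
    exponent: a power-law scaling term (placeholder; the paper reports
              mapping-specific values not listed in the abstract).
    """
    v = np.clip(value, 0.0, 1.0) ** exponent
    if polarity < 0:
        v = 1.0 - v
    return f_min + v * (f_max - f_min)

# Example: map rising "danger" values to pitch with positive polarity.
for danger in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"danger={danger:.2f} -> {map_to_pitch(danger, +1, 0.8):.0f} Hz")
```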
NavigaTone: Seamlessly Embedding Navigation Cues in Mobile Music Listening
Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. Pub Date: 2018-04-21. DOI: 10.1145/3173574.3174211
Florian Heller, Johannes Schöning
Abstract: As humans, we have the natural capability of localizing the origin of sounds. Spatial audio rendering leverages this skill by applying special filters to recorded audio to create the impression that a sound emanates from a certain position in physical space. A main application for spatial audio on mobile devices is to provide non-visual navigation cues. Current systems require users to either listen to artificial beacon sounds, or the entire audio source (e.g., a song) is repositioned in space, which impacts the listening experience. We present NavigaTone, a system that takes advantage of multi-track recordings and provides directional cues by moving a single track in the auditory space. While minimizing the impact of the navigation component on the listening experience, a user study showed that participants could localize sources as well as with stereo panning, while the listening experience was rated to be closer to common music listening.
Citations: 23
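As a rough sketch of how a single track can be steered toward a navigation target, the snippet below applies constant-power stereo panning to one stem based on the target's bearing relative to the listener. NavigaTone itself uses full spatial audio rendering, so this simplified panning, along with the function and variable names, is an assumption for illustration only.

```python
import numpy as np

def pan_track(samples, relative_bearing_deg):
    """Place a mono track in the stereo field toward a navigation target.

    relative_bearing_deg: target direction relative to the listener's heading,
    negative = left, positive = right. Constant-power panning is a rough
    stand-in here, not the paper's spatial rendering technique.
    """
    theta = np.clip(relative_bearing_deg, -90.0, 90.0)      # clamp to +/-90 deg
    pan = np.radians((theta + 90.0) / 2.0)                   # 0 = hard left, pi/2 = hard right
    left_gain, right_gain = np.cos(pan), np.sin(pan)
    return np.stack([samples * left_gain, samples * right_gain], axis=-1)

# Example: a 1 kHz tone panned toward a target 45 degrees to the listener's right.
t = np.linspace(0, 1.0, 44100, endpoint=False)
voice_stem = 0.5 * np.sin(2 * np.pi * 1000 * t)   # stand-in for one multi-track stem
stereo = pan_track(voice_stem, 45.0)
print(stereo.shape)                                # (44100, 2)
```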
CatAR: A Novel Stereoscopic Augmented Reality Cataract Surgery Training System with Dexterous Instruments Tracking Technology
Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. Pub Date: 2018-04-21. DOI: 10.1145/3173574.3174039
Yu-Hsuan Huang, Hao-Yu Chang, Wan-ling Yang, Yu-Kai Chiu, Tzu-Chieh Yu, Pei-Hsuan Tsai, M. Ouhyoung
Abstract: We propose CatAR, a novel stereoscopic augmented reality (AR) cataract surgery training system. It provides dexterous instrument tracking using a specially designed infrared optical system with two cameras and one reflective marker. The tracking accuracy at the instrument tip is 20 µm, much higher than in previous simulators. Moreover, our system allows trainees to use and to see real surgical instruments while practicing. Five training modules with 31 parameters were designed, and 28 participants were enrolled to conduct efficacy and validity tests. The results revealed significant differences between novice and experienced surgeons. Improvements in surgical skills after practicing with CatAR were also significant.
Citations: 9
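The abstract states that the marker is tracked with two infrared cameras but does not describe the reconstruction pipeline. A standard way to recover a 3D marker position from two calibrated views is linear (DLT) triangulation; the sketch below shows that textbook method with made-up camera parameters and should not be read as CatAR's actual implementation.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two calibrated cameras.

    P1, P2: 3x4 camera projection matrices.
    x1, x2: (u, v) pixel coordinates of the marker in each image.
    This is the generic two-view method, not necessarily CatAR's pipeline.
    """
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]                      # homogeneous -> Euclidean

# Toy example with two hypothetical cameras 10 cm apart along x.
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.1], [0], [0]])])
point = np.array([0.02, 0.01, 0.5, 1.0])     # marker 50 cm in front of camera 1
x1 = P1 @ point; x1 = x1[:2] / x1[2]
x2 = P2 @ point; x2 = x2[:2] / x2[2]
print(triangulate(P1, P2, x1, x2))           # ~[0.02, 0.01, 0.5]
```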
"You Can Always Do Better!": The Impact of Social Proof on Participant Response Bias
Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. Pub Date: 2018-04-21. DOI: 10.1145/3173574.3174126
Aditya Vashistha, F. Okeke, Richard J. Anderson, Nicola Dell
Abstract: Evaluations of technological artifacts in HCI4D contexts are known to suffer from high levels of participant response bias, where participants only provide positive feedback that they think will please the researcher. This paper describes a practical, low-cost intervention that uses the concept of social proof to influence participant response bias and successfully elicit critical feedback from study participants. We subtly exposed participants to feedback that they perceived to be provided by people 'like them', and experimentally controlled the tone and content of the feedback to provide either positive, negative, or no social proof. We then measured how participants' quantitative and qualitative evaluations of an HCI artifact changed based on the feedback to which they were exposed. We conducted two controlled experiments: an online experiment with 245 MTurk workers and a field experiment with 63 women in rural India. Our findings reveal significant differences between participants in the positive, negative, and no social proof conditions, both online and in the field. Participants in the negative condition provided lower ratings and a greater amount of critical feedback, while participants in the positive condition provided higher ratings and a greater amount of positive feedback. Taken together, our findings demonstrate that social proof is a practical and generalizable technique that could be used by HCI researchers to influence participant response bias in a wide range of contexts and domains.
Citations: 16