Towards Better User Studies in Computer Graphics and Vision

IF: 3.8 | JCR: Q2 | Computer Science, Interdisciplinary Applications
Z. Bylinskii, L. Herman, Aaron Hertzmann, Stefanie Hutka, Yile Zhang
Foundations and Trends in Computer Graphics and Vision, vol. 106, no. 1, pp. 201-252
Published: 2022-06-23 (Journal Article)
DOI: 10.1561/0600000106
Citations: 8

Abstract

Online crowdsourcing platforms have made it increasingly easy to perform evaluations of algorithm outputs with survey questions like "which image is better, A or B?", leading to their proliferation in vision and graphics research papers. Results of these studies are often used as quantitative evidence in support of a paper's contributions. On the one hand, we argue that, when conducted hastily as an afterthought, such studies lead to an increase in uninformative and potentially misleading conclusions. On the other hand, in these same communities, user research is underutilized in driving project direction and forecasting user needs and reception. We call for increased attention to both the design and reporting of user studies in computer vision and graphics papers, towards (1) improved replicability and (2) improved project direction. Together with this call, we offer an overview of methodologies from user experience research (UXR), human-computer interaction (HCI), and applied perception to increase exposure to the available methodologies and best practices. We discuss foundational user research methods (e.g., needfinding) that are presently underutilized in computer vision and graphics research but can provide valuable project direction. We provide further pointers to the literature for readers interested in exploring other UXR methodologies. Finally, we describe broader open issues and recommendations for the research community.
Source Journal

Foundations and Trends in Computer Graphics and Vision
CiteScore: 31.20
Self-citation rate: 0.00%
Articles published: 1
Journal Description: The growth in all aspects of research in the last decade has led to a multitude of new publications and an exponential increase in published research. Finding a way through the excellent existing literature and keeping up to date has become a major time-consuming problem. Electronic publishing has given researchers instant access to more articles than ever before. But which articles are the essential ones that should be read to understand and keep abreast with developments of any topic? To address this problem, Foundations and Trends® in Computer Graphics and Vision publishes high-quality survey and tutorial monographs of the field. Each issue of Foundations and Trends® in Computer Graphics and Vision comprises a 50-100 page monograph written by research leaders in the field. Monographs that give tutorial coverage of subjects, research retrospectives, and survey papers that offer state-of-the-art reviews fall within the scope of the journal.