Towards Designing Unbiased Replication Studies in Information Visualization

P. Sukumar, Ronald A. Metoyer
{"title":"在信息可视化中设计无偏复制研究","authors":"P. Sukumar, Ronald A. Metoyer","doi":"10.1109/BELIV.2018.8634261","DOIUrl":null,"url":null,"abstract":"Experimenter bias and expectancy effects have been well studied in the social sciences and even in human-computer interaction. They refer to the nonideal study-design choices made by experimenters which can unfairly influence the outcomes of their studies. While these biases need to be considered when designing any empirical study, they can be particularly significant in the context of replication studies which can stray from the studies being replicated in only a few admissible ways. Although there are general guidelines for making valid, unbiased choices in each of the several steps in experimental design, making such choices when conducting replication studies has not been well explored.We reviewed 16 replication studies in information visualization published in four top venues between 2008 to present to characterize how the study designs of the replication studies differed from those of the studies they replicated. We present our characterization categories which include the prevalence of crowdsourcing, and the commonly-found replication types and study-design differences. We draw guidelines based on these categories towards helping researchers make meaningful and unbiased decisions when designing replication studies. Our paper presents the first steps in gaining a larger understanding of this topic and contributes to the ongoing efforts of encouraging researchers to conduct and publish more replication studies in information visualization.","PeriodicalId":269472,"journal":{"name":"2018 IEEE Evaluation and Beyond - Methodological Approaches for Visualization (BELIV)","volume":"79 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2018-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":"{\"title\":\"Towards Designing Unbiased Replication Studies in Information Visualization\",\"authors\":\"P. Sukumar, Ronald A. Metoyer\",\"doi\":\"10.1109/BELIV.2018.8634261\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Experimenter bias and expectancy effects have been well studied in the social sciences and even in human-computer interaction. They refer to the nonideal study-design choices made by experimenters which can unfairly influence the outcomes of their studies. While these biases need to be considered when designing any empirical study, they can be particularly significant in the context of replication studies which can stray from the studies being replicated in only a few admissible ways. Although there are general guidelines for making valid, unbiased choices in each of the several steps in experimental design, making such choices when conducting replication studies has not been well explored.We reviewed 16 replication studies in information visualization published in four top venues between 2008 to present to characterize how the study designs of the replication studies differed from those of the studies they replicated. We present our characterization categories which include the prevalence of crowdsourcing, and the commonly-found replication types and study-design differences. We draw guidelines based on these categories towards helping researchers make meaningful and unbiased decisions when designing replication studies. 
Our paper presents the first steps in gaining a larger understanding of this topic and contributes to the ongoing efforts of encouraging researchers to conduct and publish more replication studies in information visualization.\",\"PeriodicalId\":269472,\"journal\":{\"name\":\"2018 IEEE Evaluation and Beyond - Methodological Approaches for Visualization (BELIV)\",\"volume\":\"79 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2018-10-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"4\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2018 IEEE Evaluation and Beyond - Methodological Approaches for Visualization (BELIV)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/BELIV.2018.8634261\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2018 IEEE Evaluation and Beyond - Methodological Approaches for Visualization (BELIV)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/BELIV.2018.8634261","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 4

Abstract

Experimenter bias and expectancy effects have been well studied in the social sciences and even in human-computer interaction. They refer to the non-ideal study-design choices made by experimenters that can unfairly influence the outcomes of their studies. While these biases need to be considered when designing any empirical study, they can be particularly significant in the context of replication studies, which may stray from the studies being replicated in only a few admissible ways. Although there are general guidelines for making valid, unbiased choices in each of the several steps of experimental design, making such choices when conducting replication studies has not been well explored. We reviewed 16 replication studies in information visualization published in four top venues from 2008 to the present to characterize how the study designs of the replication studies differed from those of the studies they replicated. We present our characterization categories, which include the prevalence of crowdsourcing, the commonly found replication types, and study-design differences. We draw guidelines from these categories to help researchers make meaningful and unbiased decisions when designing replication studies. Our paper presents the first steps toward a larger understanding of this topic and contributes to the ongoing efforts to encourage researchers to conduct and publish more replication studies in information visualization.