{"title":"Field experiment methodology for pair analytics","authors":"Linda T. Kaastra, Brian D. Fisher","doi":"10.1145/2669557.2669572","DOIUrl":"https://doi.org/10.1145/2669557.2669572","url":null,"abstract":"This paper describes a qualitative research methodology developed for experimental studies of collaborative visual analysis. In much of this work we build upon Herbert H. Clark's Joint Activity Theory to infer cognitive processes from field experiments testing collaborative decision making over data. As is true of any methodology, it provides the underlying conceptual structure and analytic processes that can be adapted by other researchers to devise their own studies and analyze their results. Our focus is on the collaborative use of visual information systems for aircraft safety analysis; however, the methods can be (and have been) extended to other tasks and analysts.","PeriodicalId":179584,"journal":{"name":"Proceedings of the Fifth Workshop on Beyond Time and Errors: Novel Evaluation Methods for Visualization","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2014-11-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127968639","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Evaluation methodology for comparing memory and communication of analytic processes in visual analytics","authors":"E. Ragan, J. Goodall","doi":"10.1145/2669557.2669563","DOIUrl":"https://doi.org/10.1145/2669557.2669563","url":null,"abstract":"Provenance tools can help capture and represent the history of analytic processes. In addition to supporting analytic performance, provenance tools can be used to support memory of the process and communication of the steps to others. Objective evaluation methods are needed to evaluate how well provenance tools support analysts' memory and communication of analytic processes. In this paper, we present several methods for the evaluation of process memory, and we discuss the advantages and limitations of each. We discuss methods for determining a baseline process for comparison, and we describe various methods that can be used to elicit memory of an analysis for evaluation. Additionally, we discuss methods for conducting quantitative and qualitative analyses of process memory. We discuss the methodology in the context of a case study in using the evaluation methods for a user study. By organizing possible memory evaluation methods and providing a meta-analysis of the potential benefits and drawbacks of different approaches, this paper can inform study design and encourage objective evaluation of process memory and communication.","PeriodicalId":179584,"journal":{"name":"Proceedings of the Fifth Workshop on Beyond Time and Errors: Novel Evaluation Methods for Visualization","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2014-11-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128068611","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Oopsy-daisy: failure stories in quantitative evaluation studies for visualizations","authors":"Sung-Hee Kim, Ji Soo Yi, N. Elmqvist","doi":"10.1145/2669557.2669576","DOIUrl":"https://doi.org/10.1145/2669557.2669576","url":null,"abstract":"Designing, conducting, and interpreting evaluation studies with human participants is challenging. While researchers in cognitive psychology, social science, and human-computer interaction view competence in evaluation study methodology as a key job skill, it is only recently that visualization researchers have begun to feel the need to learn this skill as well. Acquiring such competence is a lengthy and difficult process fraught with much trial and error. Recent work on patterns for visualization evaluation is now providing much-needed best practices for how to evaluate a visualization technique with human participants. However, negative examples of evaluation methods that fail, yield no usable results, or simply do not work are still missing, mainly because of the difficulty and lack of incentive for publishing negative results or failed research. In this paper, we take the position that there are many good ideas with the best intentions for how to evaluate a visualization tool that simply do not work. We call upon the community to help collect these negative examples in order to show the other side of the coin: what not to do when trying to evaluate visualization.","PeriodicalId":179584,"journal":{"name":"Proceedings of the Fifth Workshop on Beyond Time and Errors: Novel Evaluation Methods for Visualization","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2014-11-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123727678","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Navigating reductionism and holism in evaluation","authors":"M. Correll, E. Alexander, Danielle Albers, Alper Sarikaya, Michael Gleicher","doi":"10.1145/2669557.2669577","DOIUrl":"https://doi.org/10.1145/2669557.2669577","url":null,"abstract":"In this position paper, we enumerate two approaches to the evaluation of visualizations which are associated with two approaches to knowledge formation in science: reductionism, which holds that the understanding of complex phenomena is based on the understanding of simpler components; and holism, which states that complex phenomena have characteristics more than the sum of their parts and must be understood as complete, irreducible units. While we believe that each approach has benefits for evaluating visualizations, we claim that strict adherence to one perspective or the other can make it difficult to generate a full evaluative picture of visualization tools and techniques. We argue for movement between and among these perspectives in order to generate knowledge that is both grounded (i.e. its constituent parts work) and validated (i.e. the whole operates correctly). We conclude with examples of techniques which we believe represent movements of this sort from our own work, highlighting areas where we have both \"built up\" reductionist techniques into larger contexts, and \"broken down\" holistic techniques to create generalizable knowledge.","PeriodicalId":179584,"journal":{"name":"Proceedings of the Fifth Workshop on Beyond Time and Errors: Novel Evaluation Methods for Visualization","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2014-11-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128163835","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Proceedings of the Fifth Workshop on Beyond Time and Errors: Novel Evaluation Methods for Visualization","authors":"Heidi Lam, Petra Isenberg, Tobias Isenberg, M. Sedlmair","doi":"10.1145/2669557","DOIUrl":"https://doi.org/10.1145/2669557","url":null,"abstract":"Visualization has shown its ability to produce powerful tools for analyzing, understanding, and communicating data and making it accessible for several different tasks and purposes. The impact of visualization on everyday work and personal lives is demonstrated by many success stories---such as the increasing prevalence of Tableau, the interactive visualizations produced by the New York Times, or toolkits like VTK/Paraview, to name just a few. A large community of casual and professional users is increasingly consuming and producing both interactive and static visualizations. While interactive visualizations move from research into practice at an increasing rate, it still remains an important challenge to find appropriate methods to evaluate their utility and usability. There is a growing need in the community to develop special approaches and metrics for evaluation at all stages of the development life cycle that address specific needs in visualization. This need is reflected, for example, in the increasing number of papers on visualization evaluation---not just at BELIV but also in other venues such as the IEEE VIS conferences and EuroVis. The goal of the BELIV workshop is to continue to provide a dedicated event for discussing visualization evaluation and to spread the word on alternative and novel evaluation methods and methodologies in our community.","PeriodicalId":179584,"journal":{"name":"Proceedings of the Fifth Workshop on Beyond Time and Errors: Novel Evaluation Methods for Visualization","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2014-11-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130862559","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Visualizing dimensionally-reduced data: interviews with analysts and a characterization of task sequences","authors":"M. Brehmer, M. Sedlmair, S. Ingram, T. Munzner","doi":"10.1145/2669557.2669559","DOIUrl":"https://doi.org/10.1145/2669557.2669559","url":null,"abstract":"We characterize five task sequences related to visualizing dimensionally-reduced data, drawing from data collected from interviews with ten data analysts spanning six application domains, and from our understanding of the technique literature. Our characterization of visualization task sequences for dimensionally-reduced data fills a gap created by the abundance of proposed techniques and tools that combine high-dimensional data analysis, dimensionality reduction, and visualization, and is intended to be used in the design and evaluation of future techniques and tools. We discuss implications for the evaluation of existing work practices, for the design of controlled experiments, and for the analysis of post-deployment field observations.","PeriodicalId":179584,"journal":{"name":"Proceedings of the Fifth Workshop on Beyond Time and Errors: Novel Evaluation Methods for Visualization","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2014-11-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129390445","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Evaluating visual analytics with eye tracking","authors":"K. Kurzhals, Brian D. Fisher, Michael Burch, D. Weiskopf","doi":"10.1145/2669557.2669560","DOIUrl":"https://doi.org/10.1145/2669557.2669560","url":null,"abstract":"The application of eye tracking for the evaluation of humans' viewing behavior is a common approach in psychological research. So far, the use of this technique for the evaluation of visual analytics and visualization is less prominent. We investigate recent scientific publications from the main visualization and visual analytics conferences and journals that include an evaluation by eye tracking. Furthermore, we provide an overview of evaluation goals that can be achieved by eye tracking and state-of-the-art analysis techniques for eye tracking data. Ideally, visual analytics leads to a mixed-initiative cognitive system where the mechanism of distribution is the interaction of the user with visualization environments. Therefore, we also include a discussion of cognitive approaches and models to include the user in the evaluation process. Based on our review of the current use of eye tracking evaluation in our field and the cognitive theory, we propose directions for future research on evaluation methodology, leading to the grand challenge of developing an evaluation approach to the mixed-initiative cognitive system of visual analytics.","PeriodicalId":179584,"journal":{"name":"Proceedings of the Fifth Workshop on Beyond Time and Errors: Novel Evaluation Methods for Visualization","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2014-11-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132221769","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Considerations for characterizing domain problems","authors":"Kirsten M. Winters, D. Lach, J. Cushing","doi":"10.1145/2669557.2669573","DOIUrl":"https://doi.org/10.1145/2669557.2669573","url":null,"abstract":"The nested blocks and guidelines model is a useful template for creating design and evaluation criteria, because it aligns design to need [17]. Characterizing the outermost block of the nested model---the domain problem---is challenging, mainly due to the nature of contemporary inquiries in various domains, which are dynamic and, by definition, difficult to problematize. We offer here our emerging conceptual framework, based on the central question in our research study---what visualization works for whom and in which situation---to consider when characterizing the outermost block, the domain problem, of the nested model [18].","PeriodicalId":179584,"journal":{"name":"Proceedings of the Fifth Workshop on Beyond Time and Errors: Novel Evaluation Methods for Visualization","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2014-11-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127660091","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Experiences and challenges with evaluation methods in practice: a case study","authors":"Simone Kriglstein, M. Pohl, Nikolaus Suchy, J. Gärtner, T. Gschwandtner, S. Miksch","doi":"10.1145/2669557.2669571","DOIUrl":"https://doi.org/10.1145/2669557.2669571","url":null,"abstract":"The development of information visualizations for companies poses specific challenges, especially for evaluation processes. It is advisable to test these visualizations under realistic circumstances. Because of various constraints, this can be quite difficult. In this paper, we discuss three different methods which can be used to conduct evaluations in companies. These methods are appropriate for different stages in the software life cycle (design phase, development, deployment) and reflect an iterative approach to evaluation. Based on an overview of available evaluation methods, we argue that this combination of fairly lightweight methods is especially appropriate for evaluations of information visualizations in companies. These methods complement each other and emphasize different aspects of the evaluation. Based on this case study, we try to generalize the lessons learned from conducting evaluations in this context.","PeriodicalId":179584,"journal":{"name":"Proceedings of the Fifth Workshop on Beyond Time and Errors: Novel Evaluation Methods for Visualization","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2014-11-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127895155","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Evaluating user behavior and strategy during visual exploration","authors":"K. Reda, Andrew E. Johnson, J. Leigh, M. Papka","doi":"10.1145/2669557.2669575","DOIUrl":"https://doi.org/10.1145/2669557.2669575","url":null,"abstract":"Visualization practitioners have traditionally focused on evaluating the outcome of the visual analytic process, as opposed to studying how that process unfolds. Since user strategy would likely influence the outcome of visual analysis and the nature of insights acquired, it is important to understand how the analytic behavior of users is shaped by variations in the design of the visualization interface. This paper presents a technique for evaluating user behavior in exploratory visual analysis scenarios. We characterize visual exploration as a fluid activity involving transitions between mental and interaction states. We show how micro-patterns in these transitions can be captured and analyzed quantitatively to reveal differences in the exploratory behavior of users, given variations in the visualization interface.","PeriodicalId":179584,"journal":{"name":"Proceedings of the Fifth Workshop on Beyond Time and Errors: Novel Evaluation Methods for Visualization","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2014-11-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124012851","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}