Designing the evaluation of operator-enabled interactive data exploration in VALIDE

Authors: Yogendra Patil, S. Amer-Yahia, S. Subramanian
DOI: 10.1145/3546930.3547509 (https://doi.org/10.1145/3546930.3547509)
Venue: Proceedings of the Workshop on Human-In-the-Loop Data Analytics (HILDA)
Publication date: 2022-06-12
Citations: 1
Abstract
Interactive Data Exploration (IDE) systems facilitate the understanding of large datasets by providing high-level, easy-to-use operators. Compared to traditional querying systems, where users must express each query explicitly, IDE systems allow users to perform expressive data exploration following the click-select-execute paradigm. Today, no full-fledged evaluation framework exists for operator-enabled IDE. Most previous work relies either on implicitly logging user actions to compute quantitative metrics or on running user studies to collect explicit feedback. Hence, there is a pressing need for an evaluation framework that collects and compares quantitative human feedback alongside system- and data-centric evaluations. In this paper, we develop VALIDE, a preliminary design of a unified framework consisting of a methodology and metrics for IDE systems. VALIDE combines research from database benchmarking and human-computer interaction and will be demonstrated with a real IDE system.
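The implicit-logging approach mentioned in the abstract can be sketched minimally as follows. This is an illustrative assumption, not VALIDE's actual design: the names `ActionLog`, `operator_diversity`, and `session_length` are hypothetical, and the two metrics shown (distinct operators used, elapsed session time) are generic examples of the quantitative metrics such logs enable.

```python
from dataclasses import dataclass, field

@dataclass
class ActionLog:
    """Hypothetical sketch: implicitly records operator invocations
    during an IDE session as (operator, timestamp) pairs."""
    actions: list = field(default_factory=list)

    def record(self, operator: str, timestamp: float) -> None:
        # Called by the IDE on each click-select-execute action.
        self.actions.append((operator, timestamp))

    def operator_diversity(self) -> int:
        """Number of distinct operators used -- a crude proxy for
        exploration breadth."""
        return len({op for op, _ in self.actions})

    def session_length(self) -> float:
        """Elapsed time between the first and last recorded action."""
        if len(self.actions) < 2:
            return 0.0
        return self.actions[-1][1] - self.actions[0][1]

# Example session: three actions over two distinct operators.
log = ActionLog()
log.record("select", 0.0)
log.record("filter", 2.5)
log.record("select", 4.0)
print(log.operator_diversity())  # 2
print(log.session_length())      # 4.0
```

Such implicitly collected metrics capture *what* users did but not *why*, which is precisely the gap the abstract argues a unified framework must close by pairing them with explicit human feedback.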