Title: Understanding two graphical visualizations from observer's pupillary responses and neural network
Authors: Md. Zakir Hossain, Tom Gedeon, Atiqul Islam
DOI: 10.1145/3292147.3292187
Published in: Proceedings of the 30th Australian Conference on Computer-Human Interaction, 2018-12-04
Citations: 2
Abstract
This paper investigates observers' pupillary responses while they viewed two graphical visualizations (circular and organizational). The visualizations are snapshots of the kind of data used to check the degree of compliance with corporate governance best practice. Six very similar questions were asked of 24 observers for each visualization. In particular, we developed a neural-network-based classification model to identify which of the two visualizations an observer was viewing from temporal features of their pupillary responses. We then tested whether each observer's understanding of the two visualizations was reflected more accurately in their unconscious pupillary responses or in their conscious verbal responses to the relevant questions. We found that observers were physiologically 96.5% and 95.1% accurate, and verbally 80.6% and 81.3% accurate, for the circular and organizational visualizations, respectively.
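The abstract does not specify the network architecture or the exact temporal features used. As an illustration only, the sketch below shows one plausible pipeline under assumed details: synthetic pupil-diameter traces for the two visualization classes, simple temporal descriptors (mean, standard deviation, linear slope), and a minimal one-hidden-layer network trained with full-batch gradient descent. All trace shapes, feature choices, and hyperparameters here are hypothetical, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_trace(label, n=60):
    # Hypothetical pupil-diameter trace (mm): class 1 is assumed to
    # dilate more and to drift upward over the viewing period.
    base = 3.0 + (0.8 if label == 1 else 0.0)
    t = np.linspace(0, 1, n)
    return base + 0.3 * t * label + rng.normal(0, 0.05, n)

def temporal_features(trace):
    # Simple temporal descriptors: mean level, variability, linear slope.
    t = np.arange(len(trace))
    slope = np.polyfit(t, trace, 1)[0]
    return np.array([trace.mean(), trace.std(), slope])

# Build a small synthetic dataset: 50 traces per visualization class.
X, y = [], []
for label in (0, 1):
    for _ in range(50):
        X.append(temporal_features(make_trace(label)))
        y.append(label)
X, y = np.array(X), np.array(y)
X = (X - X.mean(0)) / (X.std(0) + 1e-9)  # standardize each feature

# Minimal one-hidden-layer network (3 -> 8 -> 1, tanh / sigmoid).
W1 = rng.normal(0, 0.5, (3, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, 8); b2 = 0.0
lr = 0.5
for _ in range(500):
    h = np.tanh(X @ W1 + b1)
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))   # sigmoid output
    g = (p - y) / len(y)                       # gradient of BCE wrt logit
    gW2, gb2 = h.T @ g, g.sum()
    gh = np.outer(g, W2) * (1 - h**2)          # backprop through tanh
    gW1, gb1 = X.T @ gh, gh.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

acc = float(((p > 0.5) == y).mean())
print(f"training accuracy: {acc:.3f}")
```

Because the synthetic classes are well separated, the toy network fits the training set almost perfectly; the paper's reported 96.5% and 95.1% physiological accuracies come from its real experimental data, not from anything like this sketch.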