Interactive Attention Model Explorer for Natural Language Processing Tasks with Unbalanced Data Sizes
Zhihang Dong, Tongshuang Sherry Wu, Sicheng Song, M. Zhang
2020 IEEE Pacific Visualization Symposium (PacificVis), June 2020
DOI: 10.1109/PacificVis48177.2020.1031
Citations: 8
Abstract
Conventional attention visualization tools compromise either readability or the information conveyed when documents are lengthy, especially when these documents have imbalanced sizes. Our work strives toward a more intuitive visualization for a subset of Natural Language Processing tasks in which attention is mapped between documents of imbalanced sizes. We extend the flow map visualization to enhance the readability of attention-augmented documents. Through interaction, our design enables semantic filtering that helps users prioritize important tokens and meaningful matchings for in-depth exploration. Case studies and informal user studies in machine comprehension show that our visualization effectively helps users gain an initial understanding of what their models are "paying attention to." We discuss how the work can be extended to other domains and plugged into more end-to-end systems for model error analysis.