{"title":"Exploring the interplay of visual and haptic modalities in a pattern-matching task","authors":"Katie Seaborn, B. Riecke, A. Antle","doi":"10.1109/HAVE.2010.5623997","DOIUrl":null,"url":null,"abstract":"It is not well understood how working memory deals with coupled haptic and visual presentation modes. Present theoretical understandings of human cognition indicate that these modes are processed by the visuospatial sketchpad. If this is accurate, then there may be no efficiency in distributing information between the haptic and visual modalities in situations of visual overload [1]. However, this needs to be empirically explored. In this paper, we describe an evaluation of human performance in a pattern-matching task involving a fingertip interface that can present both haptic and visual information. Our purpose was to explore the interplay of visual and haptic processing in working memory, in particular how presentation mode affects performance. We designed a comparative study involving a pattern-matching task. Users were presented with a sequence of two patterns through different modalities using a fingertip interface and were asked to differentiate between them. While no significant difference was found between the visual and visual+haptic presentation modes, the results indicate a strong partiality for the coupling of visual and haptic modalities. This suggests that working memory is not hampered by using both visual and haptic channels, and that recall may be strengthened by dual-coding both visual and haptic modes.","PeriodicalId":361251,"journal":{"name":"2010 IEEE International Symposium on Haptic Audio Visual Environments and Games","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2010-11-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"5","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2010 IEEE International Symposium on Haptic Audio Visual Environments and Games","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/HAVE.2010.5623997","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 5
Abstract
It is not well understood how working memory handles coupled haptic and visual presentation modes. Current theoretical accounts of human cognition suggest that both modes are processed by the visuospatial sketchpad. If this is accurate, there may be no efficiency gain in distributing information between the haptic and visual modalities in situations of visual overload [1]; however, this needs to be explored empirically. In this paper, we describe an evaluation of human performance in a pattern-matching task involving a fingertip interface that can present both haptic and visual information. Our purpose was to explore the interplay of visual and haptic processing in working memory, in particular how presentation mode affects performance. In our comparative study, users were presented with a sequence of two patterns through different modalities using the fingertip interface and were asked to differentiate between them. While no significant performance difference was found between the visual and visual+haptic presentation modes, the results indicate a strong preference for the coupled visual and haptic presentation. This suggests that working memory is not hampered by using both visual and haptic channels, and that recall may be strengthened by dual-coding information in visual and haptic modes.
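
To make the trial structure described above concrete, below is a minimal, purely illustrative Python sketch of a same/different pattern-matching block. Everything in it is an assumption made for illustration: the 3x3 binary patterns, the single-cell difference between non-matching pairs, the fixed-accuracy simulated observer, and all function names are hypothetical stand-ins, not the authors' actual apparatus, stimuli, or software.

import random

# Presentation conditions compared in the abstract: visual alone versus
# coupled visual+haptic output on the fingertip interface.
MODES = ["visual", "visual+haptic"]

def make_pattern(size=3):
    # A random binary pattern; a 3x3 grid of on/off cells is an assumed
    # stand-in for whatever stimuli the fingertip interface rendered.
    return tuple(random.randint(0, 1) for _ in range(size * size))

def make_trial(match_probability=0.5):
    # One trial: two patterns presented in sequence, either identical or
    # differing in exactly one cell. Returns both patterns and ground truth.
    first = make_pattern()
    if random.random() < match_probability:
        second = first
    else:
        cells = list(first)
        cells[random.randrange(len(cells))] ^= 1  # flip a single cell
        second = tuple(cells)
    return first, second, first == second

def simulated_response(is_match, accuracy=0.8):
    # Toy observer that answers correctly with a fixed probability. In the
    # real study this would be the participant's same/different judgment.
    return is_match if random.random() < accuracy else not is_match

def run_block(mode, n_trials=20):
    # One block of same/different trials; returns proportion correct.
    # The mode argument only labels the block here, since the toy observer
    # is insensitive to presentation condition.
    trials = [make_trial() for _ in range(n_trials)]
    correct = sum(simulated_response(m) == m for _, _, m in trials)
    return correct / n_trials

if __name__ == "__main__":
    for mode in MODES:
        print(f"{mode}: proportion correct = {run_block(mode):.2f}")

Note that because the toy observer ignores the presentation mode, both conditions hover around the same accuracy; the study's actual finding (no significant performance difference, but a preference for the coupled modes) came from participant data, not from any simulation.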