Modeling Rational Adaptation of Visual Search to Hierarchical Structures
Saku Sourulahti, Christian P Janssen, Jussi PP Jokinen
arXiv - CS - Human-Computer Interaction, published 2024-09-13
DOI: arxiv-2409.08967 (https://doi.org/arxiv-2409.08967)
Citations: 0
Abstract
Efficient attention deployment in visual search is limited by human visual memory, yet this limitation can be offset by exploiting the environment's structure. This paper introduces a computational cognitive model that simulates how the human visual system uses visual hierarchies to prevent refixations in sequential attention deployment. The model adopts computational rationality, positing behaviors as adaptations to cognitive constraints and environmental structures. In contrast to earlier models that predict search performance for hierarchical information, our model does not include predefined assumptions about particular search strategies. Instead, the model's search strategy emerges from adaptation to the environment through reinforcement learning. In an experiment with human participants, we test the model's prediction that structured environments reduce visual search times compared to random, unstructured layouts. The model's predictions correspond well with human search performance across various set sizes for both structured and unstructured visual layouts. Our work improves understanding of the adaptive nature of visual search in hierarchically structured environments and informs the design of optimized search spaces.
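To make the abstract's core argument concrete, below is a minimal Monte Carlo sketch, not the authors' model, of why hierarchical grouping can offset limited visual memory. It compares serial search over an unstructured layout, where items that fall out of a capacity-limited memory get refixated, against a grouped layout where memory only has to track which groups have already been exhausted. The set size, group count, and memory capacity are hypothetical illustration values, and unlike the paper's model, whose strategy emerges through reinforcement learning, both search strategies here are hard-coded purely for illustration.

```python
import random
from collections import deque


def flat_search(n_items: int, capacity: int) -> int:
    """Serial search over an unstructured layout with capacity-limited memory.

    Only the last `capacity` fixated items are retained; older items are
    forgotten and may be refixated. Returns fixations needed to find the target.
    """
    target = random.randrange(n_items)
    memory = deque(maxlen=capacity)
    fixations = 0
    while True:
        # The target is never stored, so the candidate pool is never empty.
        item = random.choice([i for i in range(n_items) if i not in memory])
        fixations += 1
        if item == target:
            return fixations
        memory.append(item)  # oldest remembered item drops out when full


def grouped_search(n_groups: int, group_size: int, capacity: int) -> int:
    """Serial search over a hierarchically grouped layout.

    Groups are inspected one at a time, so memory only needs to hold which
    groups were exhausted, not individual items.
    """
    target_group = random.randrange(n_groups)
    memory = deque(maxlen=capacity)
    fixations = 0
    while True:
        g = random.choice([x for x in range(n_groups) if x not in memory])
        if g == target_group:
            # Target sits at a uniformly random position within its group.
            return fixations + random.randrange(group_size) + 1
        fixations += group_size  # whole non-target group inspected
        memory.append(g)


if __name__ == "__main__":
    N_TRIALS, N_ITEMS, N_GROUPS, CAPACITY = 20_000, 24, 6, 4
    flat = sum(flat_search(N_ITEMS, CAPACITY) for _ in range(N_TRIALS)) / N_TRIALS
    grouped = sum(grouped_search(N_GROUPS, N_ITEMS // N_GROUPS, CAPACITY)
                  for _ in range(N_TRIALS)) / N_TRIALS
    print(f"mean fixations, unstructured: {flat:.1f}")
    print(f"mean fixations, grouped:      {grouped:.1f}")
```

Running this sketch, the grouped layout requires noticeably fewer fixations, because a handful of groups fits within the limited memory while two dozen individual items do not. That is the intuition behind the abstract's claim that environmental structure offsets memory limits, though the paper's contribution is that an adapted strategy of this kind emerges from reinforcement learning rather than being assumed.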