Familiarity-taxis: A bilateral approach to view-based snapshot navigation

IF 1.2 · CAS Tier 4 (Computer Science) · JCR Q4, Computer Science, Artificial Intelligence
Fabian Steinbeck, Efstathios Kagioulis, Alex D. M. Dewar, A. Philippides, Thomas Nowotny, Paul Graham
Citations: 0

Abstract

Many insects use view-based navigation, or snapshot matching, to return to familiar locations or to navigate routes. This relies on egocentric memories being matched to current views of the world. Previous snapshot navigation algorithms have used full panoramic vision to compare memorised images with query images and so establish a measure of familiarity, which leads to a recovery of the original heading direction from when the snapshot was taken. Many aspects of insect sensory systems are lateralised, with steering derived from the comparison of left and right signals, like a classic Braitenberg vehicle. Here, we investigate whether view-based route navigation can be implemented using bilateral visual familiarity comparisons. We found that the difference in familiarity between estimates from the left and right fields of view can be used as a steering signal to recover the original heading direction. This finding extends across many different sizes of field of view and visual resolutions. In insects, steering computations are implemented in a brain region called the Lateral Accessory Lobe (LAL), within the Central Complex. In a simple simulation, we use a spiking neural network (SNN) model of the LAL to give an existence proof of how bilateral visual familiarity could drive a search for a visually defined goal.
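The bilateral idea described in the abstract can be sketched in a few lines: score the left and right fields of view separately against stored route snapshots, and use the difference in familiarity as a Braitenberg-style steering signal. The sketch below is a minimal illustration, not the authors' implementation; it assumes a simple "perfect-memory" familiarity measure (negative RMS pixel difference to the best-matching stored snapshot), and the function names and panorama layout are hypothetical.

```python
import numpy as np

def familiarity(view, memories):
    # Familiarity as the negative RMS pixel difference to the
    # best-matching stored snapshot: 0 = perfect match,
    # more negative = less familiar. (Illustrative choice only.)
    diffs = [np.sqrt(np.mean((view - m) ** 2)) for m in memories]
    return -min(diffs)

def bilateral_steering(panorama, memory_panoramas, fov=90):
    """Steering signal from separate left/right familiarity estimates.

    `panorama` and each stored memory are (height, 360) arrays with one
    column per degree of azimuth and the current heading at column 180.
    Positive output = left field more familiar = turn left; the sign
    convention and geometry here are assumptions for the sketch.
    """
    l0, l1 = 180 - fov, 180   # left field of view
    r0, r1 = 180, 180 + fov   # right field of view
    f_left = familiarity(panorama[:, l0:l1],
                         [m[:, l0:l1] for m in memory_panoramas])
    f_right = familiarity(panorama[:, r0:r1],
                          [m[:, r0:r1] for m in memory_panoramas])
    return f_left - f_right

rng = np.random.default_rng(0)
memory = rng.random((10, 360))          # one stored route snapshot
aligned = memory.copy()                 # agent facing the memorised heading
rotated = np.roll(memory, 15, axis=1)   # heading off by 15 degrees

print(bilateral_steering(aligned, [memory]))           # 0.0: on course
print(bilateral_steering(rotated, [memory]) != 0.0)    # corrective signal
```

When the agent faces the memorised heading, both fields match perfectly and the signal is zero; a rotated view degrades the two fields unequally, producing a nonzero corrective signal. The paper's contribution is showing that this left/right difference reliably recovers the original heading across fields of view and resolutions, and that an SNN model of the LAL can turn such a signal into steering.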
Source journal: Adaptive Behavior (Engineering & Technology — Computer Science: Artificial Intelligence)
CiteScore: 4.30
Self-citation rate: 18.80%
Articles per year: 34
Review time: >12 weeks
About the journal: _Adaptive Behavior_ publishes articles on adaptive behaviour in living organisms and autonomous artificial systems. The official journal of the _International Society of Adaptive Behavior_, it addresses topics such as perception and motor control, embodied cognition, learning and evolution, neural mechanisms, artificial intelligence, behavioral sequences, motivation and emotion, characterization of environments, decision making, collective and social behavior, navigation, foraging, communication and signalling. Print ISSN: 1059-7123