Mimicking fly motion tracking and fixation behaviors with a hybrid visual neural network

Qinbing Fu, Shigang Yue
{"title":"基于混合视觉神经网络的苍蝇运动跟踪与注视行为模拟","authors":"Qinbing Fu, Shigang Yue","doi":"10.1109/ROBIO.2017.8324652","DOIUrl":null,"url":null,"abstract":"How do animals like insects perceive meaningful visual motion cues involving directional and locational information of moving objects in visual clutter accurately and efficiently? In this paper, with respect to latest biological research progress made in underlying motion detection circuitry in the fly's preliminary visual system, we conduct a novel hybrid visual neural network, combining the functionality of two bio-plausible, namely the motion and the position pathways, for mimicking motion tracking and fixation behaviors. This modeling study extends a former direction selective neurons model to the higher level of behavior. The motivated algorithms can be used to guide a system that extracts location information of moving objects in a scene regardless of background clutter, using entirely low-level visual processing. We tested it against translational movements in synthetic and real-world scenes. The results demonstrated the following contributions: (1) The proposed computational structure fulfills the characteristics of a putative signal tuning map of the fly's physiology. (2) It also satisfies a biological implication that visual fixation behaviors could be simply tuned via the position pathway; nevertheless, the motion-detecting pathway improves the tracking precision. (3) Contrary to segmentation and registration based computer vision techniques, its computational simplicity benefits the building of neuromorphic visual sensor for robots.","PeriodicalId":197159,"journal":{"name":"2017 IEEE International Conference on Robotics and Biomimetics (ROBIO)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-12-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"12","resultStr":"{\"title\":\"Mimicking fly motion tracking and fixation behaviors with a hybrid visual neural network\",\"authors\":\"Qinbing Fu, Shigang Yue\",\"doi\":\"10.1109/ROBIO.2017.8324652\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"How do animals like insects perceive meaningful visual motion cues involving directional and locational information of moving objects in visual clutter accurately and efficiently? In this paper, with respect to latest biological research progress made in underlying motion detection circuitry in the fly's preliminary visual system, we conduct a novel hybrid visual neural network, combining the functionality of two bio-plausible, namely the motion and the position pathways, for mimicking motion tracking and fixation behaviors. This modeling study extends a former direction selective neurons model to the higher level of behavior. The motivated algorithms can be used to guide a system that extracts location information of moving objects in a scene regardless of background clutter, using entirely low-level visual processing. We tested it against translational movements in synthetic and real-world scenes. The results demonstrated the following contributions: (1) The proposed computational structure fulfills the characteristics of a putative signal tuning map of the fly's physiology. (2) It also satisfies a biological implication that visual fixation behaviors could be simply tuned via the position pathway; nevertheless, the motion-detecting pathway improves the tracking precision. 
(3) Contrary to segmentation and registration based computer vision techniques, its computational simplicity benefits the building of neuromorphic visual sensor for robots.\",\"PeriodicalId\":197159,\"journal\":{\"name\":\"2017 IEEE International Conference on Robotics and Biomimetics (ROBIO)\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2017-12-08\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"12\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2017 IEEE International Conference on Robotics and Biomimetics (ROBIO)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ROBIO.2017.8324652\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2017 IEEE International Conference on Robotics and Biomimetics (ROBIO)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ROBIO.2017.8324652","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 12

Abstract

How do animals such as insects accurately and efficiently perceive meaningful visual motion cues, including the direction and location of moving objects, in visual clutter? In this paper, drawing on the latest biological findings on the motion detection circuitry underlying the fly's preliminary visual system, we construct a novel hybrid visual neural network that combines the functionality of two bio-plausible pathways, namely the motion pathway and the position pathway, to mimic motion tracking and fixation behaviors. This modeling study extends a former direction selective neurons model to the higher level of behavior. The proposed algorithms can guide a system that extracts the location of moving objects in a scene regardless of background clutter, using entirely low-level visual processing. We tested the model against translational movements in synthetic and real-world scenes. The results demonstrate the following contributions: (1) the proposed computational structure matches the characteristics of a putative signal tuning map in fly physiology; (2) it also supports the biological implication that visual fixation behaviors could be tuned via the position pathway alone, while the motion-detecting pathway improves tracking precision; (3) in contrast to segmentation- and registration-based computer vision techniques, its computational simplicity benefits the building of neuromorphic visual sensors for robots.
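The abstract describes a two-pathway architecture: a position pathway that localizes the moving target and a motion pathway that contributes direction-selective information to refine tracking, with the fused output driving fixation. As a rough illustration of how such a fusion could be wired up, the Python sketch below pairs a correlator-style (Hassenstein-Reichardt-like) motion estimate with a simple luminance-change position cue and turns them into a horizontal gaze correction. This is a minimal sketch under those assumptions; the function names, gains, and toy detectors are invented for illustration and are not the authors' model, which extends a direction selective neurons model as stated above.

```python
# Illustrative sketch only (not the authors' implementation): a toy two-pathway
# tracker in the spirit of the abstract, fusing a correlator-style motion signal
# with a luminance-change position cue. All names and gains are assumptions.
import numpy as np

def motion_pathway(prev_frame, curr_frame, shift=1):
    """Correlator-style horizontal motion signal per image column.

    Positive values suggest rightward motion, negative values leftward.
    """
    # Correlate the delayed (previous) frame with a spatially shifted current
    # frame, then subtract the mirror-symmetric arm (opponent subtraction).
    rightward = prev_frame[:, :-shift] * curr_frame[:, shift:]
    leftward = prev_frame[:, shift:] * curr_frame[:, :-shift]
    return (rightward - leftward).sum(axis=0)

def position_pathway(prev_frame, curr_frame):
    """Crude position cue: column-wise energy of luminance change."""
    return np.abs(curr_frame - prev_frame).sum(axis=0)

def fixation_command(prev_frame, curr_frame, gain_pos=0.1, gain_mot=0.05):
    """Fuse both pathways into a horizontal gaze correction that keeps the
    moving target near the image centre (fixation)."""
    pos = position_pathway(prev_frame, curr_frame)
    target_x = int(np.argmax(pos))                  # position pathway: where is it?
    mot = motion_pathway(prev_frame, curr_frame)
    lo, hi = max(0, target_x - 3), min(mot.size, target_x + 4)
    drift = float(mot[lo:hi].sum())                 # motion pathway: which way is it going?
    centre = prev_frame.shape[1] / 2.0
    # The position error drives fixation; the motion signal adds a small
    # predictive correction in the target's direction of travel.
    return gain_pos * (target_x - centre) + gain_mot * drift, target_x

if __name__ == "__main__":
    H, W = 48, 64
    f0 = np.zeros((H, W))
    f0[20:28, 10:14] = 1.0                          # bright bar at columns 10-13
    f1 = np.zeros((H, W))
    f1[20:28, 12:16] = 1.0                          # same bar shifted 2 px rightward
    cmd, x = fixation_command(f0, f1)
    print(f"estimated target column: {x}, gaze correction: {cmd:+.3f}")
```

Run as-is, the script reports the bar's column and a leftward gaze correction (toward the off-centre target) tempered by a small rightward predictive term from the motion signal, loosely mirroring the abstract's point that the position pathway alone can drive fixation while the motion pathway refines tracking.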