Temporal information as top-down context in binocular disparity detection

M. Solgi, J. Weng
{"title":"Temporal information as top-down context in binocular disparity detection","authors":"M. Solgi, J. Weng","doi":"10.1109/DEVLRN.2009.5175533","DOIUrl":null,"url":null,"abstract":"Recently, it has been shown that motor initiated context through top-down connections boosts the performance of network models in object recognition applications. Moreover, models of the 6-layer architecture of the laminar cortex have been shown to have computational advantage over single-layer models of the cortex. In this study, we present a temporal model of the laminar cortex that applies expectation feedback signals as top-down temporal context in a binocular network supervised to learn disparities. The work reported here shows that the 6-layer architecture drastically reduced the disparity detection error by as much as 7 times with context enabled. Top-down context reduced the error by a factor of 2 in the same 6-layer architecture. For the first time, an end-to-end model inspired by the 6-layer architecture with emergent binocular representation has reached a sub-pixel accuracy in the challenging problem of binocular disparity detection from natural images. In addition, our model demonstrates biologically-plausible gradually changing topographic maps; the representation of disparity sensitivity changes smoothly along the cortex.","PeriodicalId":192225,"journal":{"name":"2009 IEEE 8th International Conference on Development and Learning","volume":"151 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2009-06-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2009 IEEE 8th International Conference on Development and Learning","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/DEVLRN.2009.5175533","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 2

Abstract

Recently, it has been shown that motor-initiated context delivered through top-down connections boosts the performance of network models in object recognition applications. Moreover, models of the 6-layer architecture of the laminar cortex have been shown to have a computational advantage over single-layer models of the cortex. In this study, we present a temporal model of the laminar cortex that applies expectation feedback signals as top-down temporal context in a binocular network supervised to learn disparities. The work reported here shows that, with context enabled, the 6-layer architecture drastically reduced the disparity detection error, by a factor of up to 7. Top-down context reduced the error by a factor of 2 within the same 6-layer architecture. For the first time, an end-to-end model inspired by the 6-layer architecture, with emergent binocular representation, has reached sub-pixel accuracy on the challenging problem of binocular disparity detection from natural images. In addition, our model exhibits biologically plausible, gradually changing topographic maps: the representation of disparity sensitivity changes smoothly along the cortex.
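To make the role of top-down temporal context concrete, the sketch below trains a toy one-layer network on synthetic stereo patches whose disparity drifts slowly over time, feeding the previous time step's disparity output back into the internal layer alongside the bottom-up binocular input. The patch sizes, top-k competition, and Hebbian-style updates are illustrative assumptions only; this is not the authors' 6-layer laminar implementation, just a minimal NumPy sketch of the feedback-as-context idea.

import numpy as np

# Hypothetical sizes; the paper does not specify network dimensions here.
PATCH, N_NEURONS, N_DISP = 20, 40, 11
rng = np.random.default_rng(0)

def stereo_pair(d):
    """Toy stereo input: a random 1-D texture and a copy shifted by d pixels."""
    base = rng.standard_normal(PATCH + N_DISP)
    return np.concatenate([base[:PATCH], base[d:d + PATCH]])

def normalize_rows(W):
    return W / (np.linalg.norm(W, axis=1, keepdims=True) + 1e-9)

# Bottom-up (input -> internal), top-down (previous output -> internal),
# and motor (internal -> disparity output) weights.
W_bu = normalize_rows(rng.standard_normal((N_NEURONS, 2 * PATCH)))
W_td = normalize_rows(rng.standard_normal((N_NEURONS, N_DISP)))
W_mo = normalize_rows(rng.standard_normal((N_DISP, N_NEURONS)))

lr, alpha = 0.1, 0.5          # learning rate; weight of the top-down context
prev_out = np.zeros(N_DISP)   # motor output from the previous time step

for step in range(20000):
    d = (step // 10) % N_DISP            # disparity changes slowly over time
    x = stereo_pair(d)

    # Internal layer: bottom-up response plus top-down temporal context,
    # followed by a crude top-3 competition (stand-in for lateral inhibition).
    y = np.maximum(W_bu @ x + alpha * (W_td @ prev_out), 0.0)
    kth = np.sort(y)[-3]
    y = np.where(y >= kth, y, 0.0)
    y /= np.linalg.norm(y) + 1e-9

    target = np.eye(N_DISP)[d]           # supervised disparity signal

    # Hebbian-style outer-product updates, renormalized after each step.
    W_bu = normalize_rows(W_bu + lr * np.outer(y, x))
    W_td = normalize_rows(W_td + lr * np.outer(y, target))
    W_mo = normalize_rows(W_mo + lr * np.outer(target, y))

    prev_out = target                    # teacher output is the context during training

# Test: the network now feeds back its *own* estimate as top-down context.
correct, prev_out = 0, np.zeros(N_DISP)
for step in range(1100):
    d = (step // 10) % N_DISP
    x = stereo_pair(d)
    y = np.maximum(W_bu @ x + alpha * (W_td @ prev_out), 0.0)
    guess = int(np.argmax(W_mo @ y))
    prev_out = np.eye(N_DISP)[guess]
    correct += guess == d
print(f"toy accuracy with top-down temporal context: {correct / 1100:.2f}")

Setting alpha to 0 in the test loop removes the temporal context and lets one compare against the purely bottom-up case, which is the kind of comparison the abstract reports (though with a far richer 6-layer model and natural images rather than this synthetic setup).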