Applying the efficient coding principle to understand encoding of multisensory and multimodality sensory signals

Impact Factor 1.5 · CAS Region 4 (Psychology) · JCR Q4 (Neurosciences)
Li Zhaoping
Journal: Vision Research, Volume 226, Article 108489
DOI: 10.1016/j.visres.2024.108489
Published: 2024-11-26
URL: https://www.sciencedirect.com/science/article/pii/S0042698924001330
Citations: 0

Abstract

Sensory neurons often encode multisensory or multimodal signals. For example, many medial superior temporal (MST) neurons are tuned to the heading direction of self-motion based on both visual (optic flow) signals and vestibular signals. Middle temporal (MT) cortical neurons are tuned to object depth from signals of two visual modalities: motion parallax and binocular disparity. An MST neuron's preferred heading directions from the different senses can be congruent (matched) or opposite to each other. Similarly, the preferred depths of an MT neuron from the two modalities are congruent in some neurons and opposite in others. While the congruent tuning appears natural for cue integration, the function of the opposite tuning has been puzzling. This paper explains these tunings from the efficient coding principle, according to which sensory encoding extracts as much sensory information as possible while minimizing neural cost. It extends previous applications of this principle to understanding neural receptive fields in the retina and the primary visual cortex, particularly the multimodal encoding of cone signals or binocular signals. The congruent and opposite sensory signals that excite the congruent and opposite neurons, respectively, are the decorrelated sensory components that provide a general-purpose, efficient representation of sensory inputs before task-specific object segmentation and recognition. The framework can be extended to encode signals from more than two sensory sources, e.g., from three cone types. It also predicts a wider tuning width for the opposite than for the congruent neurons, predicts neurons that are neither congruent nor opposite, and predicts how neural receptive fields adapt to statistical changes of sensory environments.
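The decorrelation idea at the heart of the abstract can be sketched numerically: two correlated sensory inputs (e.g., signals from the two eyes, or visual and vestibular heading cues) are transformed into a sum ("congruent") channel and a difference ("opposite") channel, which are decorrelated. This is a minimal illustrative sketch, not code from the paper; the signal model (a shared source plus independent noise) is a hypothetical example.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical correlated inputs: a common underlying source plus
# independent noise in each channel.
shared = rng.normal(size=n)
s1 = shared + 0.5 * rng.normal(size=n)  # e.g., visual heading signal
s2 = shared + 0.5 * rng.normal(size=n)  # e.g., vestibular heading signal

# For symmetric two-channel statistics, the decorrelating (principal)
# axes are the sum and difference directions: the high-variance
# "congruent" channel and the low-variance "opposite" channel.
congruent = (s1 + s2) / np.sqrt(2)
opposite = (s1 - s2) / np.sqrt(2)

r_in = np.corrcoef(s1, s2)[0, 1]                 # strongly correlated inputs
r_out = np.corrcoef(congruent, opposite)[0, 1]   # decorrelated outputs
print(f"input correlation:  {r_in:.2f}")
print(f"output correlation: {r_out:.2f}")
```

With these noise levels the inputs are correlated at about 0.8, while the congruent and opposite channels are correlated at about 0; the efficient coding principle then allocates neural resources to each decorrelated channel according to its signal power.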
Source journal: Vision Research (Medicine – Neurosciences)
CiteScore: 3.70
Self-citation rate: 16.70%
Articles per year: 111
Review time: 66 days
Journal description: Vision Research is a journal devoted to the functional aspects of human, vertebrate and invertebrate vision and publishes experimental and observational studies, reviews, and theoretical and computational analyses. Vision Research also publishes clinical studies relevant to normal visual function and basic research relevant to visual dysfunction or its clinical investigation. Functional aspects of vision are interpreted broadly, ranging from molecular and cellular function to perception and behavior. Detailed descriptions are encouraged, but enough introductory background should be included for non-specialists. Theoretical and computational papers should give a sense of order to the facts or point to new verifiable observations. Papers dealing with questions in the history of vision science should stress the development of ideas in the field.