Estimating How Sounds Modulate Orientation Representation in the Primary Visual Cortex Using Shallow Neural Networks.

IF 3.1 | CAS Q4 (Medicine) | JCR Q2 NEUROSCIENCES
Frontiers in Systems Neuroscience | Pub Date: 2022-05-09 | eCollection Date: 2022-01-01 | DOI: 10.3389/fnsys.2022.869705
John P McClure, O Batuhan Erkat, Julien Corbo, Pierre-Olivier Polack
Citations: 3

Abstract


Audiovisual perception results from the interaction between visual and auditory processing. Hence, presenting auditory and visual inputs simultaneously usually improves the accuracy of the unimodal percepts, but can also lead to audiovisual illusions. Cross-talk between visual and auditory inputs during sensory processing was recently shown to occur as early as in the primary visual cortex (V1). In a previous study, we demonstrated that sounds improve the representation of the orientation of visual stimuli in the naïve mouse V1 by promoting the recruitment of neurons better tuned to the orientation and direction of the visual stimulus. However, we did not test whether this type of modulation was still present when the auditory and visual stimuli were both behaviorally relevant. To determine the effect of sounds on active visual processing, we performed calcium imaging in V1 while mice were performing an audiovisual task. We then compared the representations of the task stimulus orientations in the unimodal visual and audiovisual contexts using shallow neural networks (SNNs). SNNs were chosen because of the biological plausibility of their computational structure and the possibility of identifying, post hoc, the biological neurons having the strongest influence on the classification decision. We first showed that SNNs can categorize the activity of V1 neurons evoked by drifting gratings of 12 different orientations. Then, we demonstrated using the connection weight approach that SNN training assigns the largest computational weight to the V1 neurons having the best orientation and direction selectivity. Finally, we showed that it is possible to use SNNs to determine how V1 neurons represent the orientations of stimuli that do not belong to the set of orientations used for SNN training. Once the SNN approach was established, we replicated the previous finding that sounds improve orientation representation in the V1 of naïve mice. Then, we showed that, in mice performing an audiovisual detection task, task tones improve the representation of the visual cues associated with the reward while deteriorating the representation of non-rewarded cues. Altogether, our results suggest that the direction of sound modulation in V1 depends on the behavioral relevance of the visual cue.
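The pipeline the abstract describes (training a shallow network to classify orientation from population activity, then using the connection weight approach to rank the recorded neurons by influence) can be sketched as follows. This is a minimal illustration on synthetic data, not the authors' actual analysis: the network size, learning rate, number of training steps, and the synthetic "tuned neuron" construction are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for trial-wise V1 population responses:
# 240 trials, 60 neurons, 12 orientation classes. Neurons 0..11 are
# "tuned": neuron k responds more strongly on trials of orientation k.
n_trials, n_neurons, n_classes, n_hidden = 240, 60, 12, 24
labels = rng.integers(0, n_classes, n_trials)
X = rng.normal(0.0, 1.0, (n_trials, n_neurons))
for k in range(n_classes):
    X[labels == k, k] += 3.0
Y = np.eye(n_classes)[labels]                 # one-hot targets

# Shallow network: one tanh hidden layer, softmax output.
W1 = rng.normal(0.0, 0.1, (n_neurons, n_hidden))
W2 = rng.normal(0.0, 0.1, (n_hidden, n_classes))

def forward(X, W1, W2):
    H = np.tanh(X @ W1)
    Z = H @ W2
    Z = Z - Z.max(axis=1, keepdims=True)      # numerical stability
    P = np.exp(Z)
    return H, P / P.sum(axis=1, keepdims=True)

# Plain gradient descent on the cross-entropy loss.
for _ in range(1000):
    H, P = forward(X, W1, W2)
    dZ = (P - Y) / n_trials                   # gradient at the softmax
    dH = dZ @ W2.T * (1.0 - H ** 2)           # backprop through tanh
    W2 -= 1.0 * (H.T @ dZ)
    W1 -= 1.0 * (X.T @ dH)

_, P = forward(X, W1, W2)
accuracy = float((P.argmax(axis=1) == labels).mean())

# Connection weight approach (Olden & Jackson): the influence of input
# neuron i on class k is the sum over hidden units of W1[i, h] * W2[h, k].
cw = W1 @ W2                                  # (n_neurons, n_classes)
importance = np.abs(cw).sum(axis=1)           # aggregate influence per neuron
top_neurons = np.argsort(importance)[::-1][:n_classes]

print(f"classification accuracy: {accuracy:.2f}")
print("most influential neurons:", sorted(top_neurons.tolist()))
```

Because the connection weight approach collapses the trained network into a per-input influence score, the ranking can be mapped back onto the imaged neurons; in this synthetic setup, the tuned neurons (indices 0-11) should dominate the top ranks, mirroring the paper's finding that the largest weights land on the most selective V1 neurons.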

Journal: Frontiers in Systems Neuroscience (Neuroscience: Developmental Neuroscience)
CiteScore: 6.00
Self-citation rate: 3.30%
Articles per year: 144
Review time: 14 weeks
Journal description: Frontiers in Systems Neuroscience publishes rigorously peer-reviewed research that advances our understanding of whole systems of the brain, including those involved in sensation, movement, learning and memory, attention, reward, decision-making, reasoning, executive functions, and emotions.