HaDi MaBouDi, Mark Roper, Marie-Geneviève Guiraud, Mikko Juusola, Lars Chittka, James A R Marshall
{"title":"主动视觉的神经形态模型显示了小叶神经元的时空编码如何有助于蜜蜂的模式识别。","authors":"HaDi MaBouDi, Mark Roper, Marie-Geneviève Guiraud, Mikko Juusola, Lars Chittka, James A R Marshall","doi":"10.7554/eLife.89929","DOIUrl":null,"url":null,"abstract":"<p><p>Bees' remarkable visual learning abilities make them ideal for studying active information acquisition and representation. Here, we develop a biologically inspired model to examine how flight behaviours during visual scanning shape neural representation in the insect brain, exploring the interplay between scanning behaviour, neural connectivity, and visual encoding efficiency. Incorporating non-associative learning-adaptive changes without reinforcement-and exposing the model to sequential natural images during scanning, we obtain results that closely match neurobiological observations. Active scanning and non-associative learning dynamically shape neural activity, optimising information flow and representation. Lobula neurons, crucial for visual integration, self-organise into orientation-selective cells with sparse, decorrelated responses to orthogonal bar movements. They encode a range of orientations, biased by input speed and contrast, suggesting co-evolution with scanning behaviour to enhance visual representation and support efficient coding. To assess the significance of this spatiotemporal coding, we extend the model with circuitry analogous to the mushroom body, a region linked to associative learning. The model demonstrates robust performance in pattern recognition, implying a similar encoding mechanism in insects. 
Integrating behavioural, neurobiological, and computational insights, this study highlights how spatiotemporal coding in the lobula efficiently compresses visual features, offering broader insights into active vision strategies and bio-inspired automation.</p>","PeriodicalId":11640,"journal":{"name":"eLife","volume":"14 ","pages":""},"PeriodicalIF":6.4000,"publicationDate":"2025-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12212919/pdf/","citationCount":"0","resultStr":"{\"title\":\"A neuromorphic model of active vision shows how spatiotemporal encoding in lobula neurons can aid pattern recognition in bees.\",\"authors\":\"HaDi MaBouDi, Mark Roper, Marie-Geneviève Guiraud, Mikko Juusola, Lars Chittka, James A R Marshall\",\"doi\":\"10.7554/eLife.89929\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>Bees' remarkable visual learning abilities make them ideal for studying active information acquisition and representation. Here, we develop a biologically inspired model to examine how flight behaviours during visual scanning shape neural representation in the insect brain, exploring the interplay between scanning behaviour, neural connectivity, and visual encoding efficiency. Incorporating non-associative learning-adaptive changes without reinforcement-and exposing the model to sequential natural images during scanning, we obtain results that closely match neurobiological observations. Active scanning and non-associative learning dynamically shape neural activity, optimising information flow and representation. Lobula neurons, crucial for visual integration, self-organise into orientation-selective cells with sparse, decorrelated responses to orthogonal bar movements. They encode a range of orientations, biased by input speed and contrast, suggesting co-evolution with scanning behaviour to enhance visual representation and support efficient coding. 
To assess the significance of this spatiotemporal coding, we extend the model with circuitry analogous to the mushroom body, a region linked to associative learning. The model demonstrates robust performance in pattern recognition, implying a similar encoding mechanism in insects. Integrating behavioural, neurobiological, and computational insights, this study highlights how spatiotemporal coding in the lobula efficiently compresses visual features, offering broader insights into active vision strategies and bio-inspired automation.</p>\",\"PeriodicalId\":11640,\"journal\":{\"name\":\"eLife\",\"volume\":\"14 \",\"pages\":\"\"},\"PeriodicalIF\":6.4000,\"publicationDate\":\"2025-07-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12212919/pdf/\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"eLife\",\"FirstCategoryId\":\"99\",\"ListUrlMain\":\"https://doi.org/10.7554/eLife.89929\",\"RegionNum\":1,\"RegionCategory\":\"生物学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"BIOLOGY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"eLife","FirstCategoryId":"99","ListUrlMain":"https://doi.org/10.7554/eLife.89929","RegionNum":1,"RegionCategory":"生物学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"BIOLOGY","Score":null,"Total":0}
A neuromorphic model of active vision shows how spatiotemporal encoding in lobula neurons can aid pattern recognition in bees.
Bees' remarkable visual learning abilities make them ideal for studying active information acquisition and representation. Here, we develop a biologically inspired model to examine how flight behaviours during visual scanning shape neural representation in the insect brain, exploring the interplay between scanning behaviour, neural connectivity, and visual encoding efficiency. Incorporating non-associative learning (adaptive changes without reinforcement) and exposing the model to sequential natural images during scanning, we obtain results that closely match neurobiological observations. Active scanning and non-associative learning dynamically shape neural activity, optimising information flow and representation. Lobula neurons, crucial for visual integration, self-organise into orientation-selective cells with sparse, decorrelated responses to orthogonal bar movements. They encode a range of orientations, biased by input speed and contrast, suggesting co-evolution with scanning behaviour to enhance visual representation and support efficient coding. To assess the significance of this spatiotemporal coding, we extend the model with circuitry analogous to the mushroom body, a region linked to associative learning. The model demonstrates robust performance in pattern recognition, implying a similar encoding mechanism in insects. Integrating behavioural, neurobiological, and computational insights, this study highlights how spatiotemporal coding in the lobula efficiently compresses visual features, offering broader insights into active vision strategies and bio-inspired automation.
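To give a concrete flavour of "non-associative learning" as the abstract uses the term (adaptive change without any reinforcement signal), the following is a minimal illustrative sketch, not the paper's actual model: a single linear neuron trained with Oja's rule, a reinforcement-free Hebbian update under which the neuron self-organises to become selective for the dominant direction of variance in its input stream. All names and parameters here are assumptions chosen for the demonstration.

```python
import numpy as np

# Illustrative sketch only (NOT the published model): Oja's rule is one
# classic example of non-associative, reinforcement-free plasticity.
rng = np.random.default_rng(0)

# Toy "scanned" input: 2-D samples with most variance along one axis,
# standing in for an oriented visual feature repeatedly swept past the eye.
n_samples = 5000
data = rng.normal(size=(n_samples, 2)) * np.array([3.0, 0.5])

w = rng.normal(size=2)  # synaptic weights, random initialisation
lr = 0.01               # learning rate

for x in data:
    y = w @ x                   # neuron's linear response
    w += lr * y * (x - y * w)   # Oja's rule: Hebbian term plus decay

# Oja's rule implicitly normalises the weights (||w|| approaches 1) and
# aligns them with the high-variance input direction, so the neuron ends
# up selective for that "orientation" without any reward signal.
print(np.linalg.norm(w))
print(abs(w[0]) > abs(w[1]))
```

The point of the sketch is only that unsupervised exposure to structured input can, by itself, produce selective, tuned units; the paper's lobula model uses a richer circuit and spatiotemporal input, but the underlying principle of learning without reinforcement is the same.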
About the journal:
eLife is a not-for-profit, peer-reviewed, open-access scientific journal specializing in the biomedical and life sciences. It is known for its selective publication process and publishes a variety of article types, including:
Research Articles: Detailed reports of original research findings.
Short Reports: Concise presentations of significant findings that do not warrant a full-length research article.
Tools and Resources: Descriptions of new tools, technologies, or resources that facilitate scientific research.
Research Advances: Brief reports on significant scientific advancements that have immediate implications for the field.
Scientific Correspondence: Short communications that comment on or provide additional information related to published articles.
Review Articles: Comprehensive overviews of a specific topic or field within the life sciences.