{"title":"A Bio-Inspired Learning Dendritic Motion Detection Framework with Direction-Selective Horizontal Cells.","authors":"Tianqi Chen, Yuki Todo, Zhiyu Qiu, Yuxiao Hua, Hiroki Sugiura, Zheng Tang","doi":"10.3390/biomimetics10050286","DOIUrl":null,"url":null,"abstract":"<p><p>Motion direction detection is an essential task for both computer vision and neuroscience. Inspired by the biological theory of the human visual system, we proposed a learnable horizontal-cell-based dendritic neuron model (HCdM) that captures motion direction with high efficiency while remaining highly robust. Unlike present deep learning models, which rely on extension of computation and extraction of global features, the HCdM mimics the localized processing of dendritic neurons, enabling efficient motion feature integration. Through synaptic learning that prunes unnecessary parts, our model maintains high accuracy in noised images, particularly against salt-and-pepper noise. Experimental results show that the HCdM reached over 99.5% test accuracy, maintained robust performance under 10% salt-and-pepper noise, and achieved cross-dataset generalization exceeding 80% in certain conditions. Comparisons with state-of-the-art (SOTA) models like vision transformers (ViTs) and convolutional neural networks (CNNs) demonstrate the HCdM's robustness and efficiency. Additionally, in contrast to previous artificial visual systems (AVSs), our findings suggest that lateral geniculate nucleus (LGN) structures, though present in biological vision, may not be essential for motion direction detection. This insight provides a new direction for bio-inspired computational models. Future research will focus on hybridizing the HCdM with SOTA models that perform well on complex visual scenes to enhance its adaptability.</p>","PeriodicalId":8907,"journal":{"name":"Biomimetics","volume":"10 5","pages":""},"PeriodicalIF":3.4000,"publicationDate":"2025-05-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12109067/pdf/","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Biomimetics","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.3390/biomimetics10050286","RegionNum":3,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, MULTIDISCIPLINARY","Score":null,"Total":0}
Abstract
Motion direction detection is an essential task for both computer vision and neuroscience. Inspired by the biological theory of the human visual system, we propose a learnable horizontal-cell-based dendritic neuron model (HCdM) that captures motion direction with high efficiency while remaining highly robust. Unlike current deep learning models, which rely on extensive computation and global feature extraction, the HCdM mimics the localized processing of dendritic neurons, enabling efficient integration of motion features. Through synaptic learning that prunes redundant connections, our model maintains high accuracy on noisy images, particularly under salt-and-pepper noise. Experimental results show that the HCdM reached over 99.5% test accuracy, maintained robust performance under 10% salt-and-pepper noise, and achieved cross-dataset generalization exceeding 80% in certain conditions. Comparisons with state-of-the-art (SOTA) models such as vision transformers (ViTs) and convolutional neural networks (CNNs) demonstrate the HCdM's robustness and efficiency. Additionally, in contrast to previous artificial visual systems (AVSs), our findings suggest that lateral geniculate nucleus (LGN) structures, though present in biological vision, may not be essential for motion direction detection. This insight provides a new direction for bio-inspired computational models. Future research will focus on hybridizing the HCdM with SOTA models that perform well on complex visual scenes to enhance its adaptability.
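To make the "localized dendritic processing" idea concrete, the sketch below shows a generic dendritic neuron forward pass in the classical synapse → dendrite → membrane → soma arrangement used in prior dendritic neuron model work by this group. It is only an illustrative assumption of that standard structure: the HCdM's horizontal-cell wiring, direction selectivity, and learned synaptic pruning are not reproduced, and all class, function, and parameter names are hypothetical.

```python
import numpy as np

def sigmoid(x, k=5.0, theta=0.0):
    """Logistic connection function with steepness k and threshold theta."""
    return 1.0 / (1.0 + np.exp(-k * (x - theta)))

class DendriticNeuron:
    """Minimal generic dendritic neuron: sigmoidal synapses feed multiplicative
    dendritic branches, branch outputs are summed on the membrane, and the soma
    applies a final sigmoid. Parameters here are illustrative, not the HCdM's."""

    def __init__(self, n_inputs, n_branches, rng=None):
        rng = np.random.default_rng(rng)
        # One learnable weight w and threshold q per (branch, input) synapse.
        self.w = rng.normal(0.0, 1.0, size=(n_branches, n_inputs))
        self.q = rng.normal(0.0, 1.0, size=(n_branches, n_inputs))

    def forward(self, x):
        # Synaptic layer: each input connects to each branch through a sigmoid.
        y = sigmoid(self.w * x[None, :] - self.q)   # shape (n_branches, n_inputs)
        # Dendritic layer: multiplicative (AND-like) interaction along a branch;
        # a synapse driven toward 0 effectively prunes its branch.
        z = np.prod(y, axis=1)                       # shape (n_branches,)
        # Membrane layer: sum branch outputs; soma: final sigmoid decision.
        v = np.sum(z)
        return sigmoid(v, k=10.0, theta=0.5)

# Example: one neuron reading an 8-pixel local receptive field (hypothetical sizes).
neuron = DendriticNeuron(n_inputs=8, n_branches=4, rng=0)
patch = np.array([0, 1, 1, 0, 0, 1, 0, 1], dtype=float)
print(neuron.forward(patch))
```

The multiplicative dendritic branches are what make pruning natural: once training drives a synapse's sigmoid output toward a constant 1 it can be removed without changing the branch product, which is broadly the kind of structural simplification the abstract attributes to the HCdM's synaptic learning.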