Advancing neural computation: experimental validation and optimization of dendritic learning in feedforward tree networks.

Seyed-Ali Sadegh-Zadeh, Pooya Hazegh
American journal of neurodegenerative disease, 13(5): 49-69. Published 2024-12-25 (eCollection 2024). DOI: 10.62347/FIQW7087. Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11751443/pdf/
Objectives: This study explores the capabilities of dendritic learning in feedforward tree networks (FFTNs) compared with traditional synaptic plasticity models, particularly on digit recognition tasks using the MNIST dataset.
Methods: We employed FFTNs with nonlinear dendritic segment amplification and Hebbian learning rules to enhance computational efficiency. The MNIST dataset, consisting of 70,000 images of handwritten digits, was used for training and testing. Key performance metrics, including accuracy, precision, recall, and F1-score, were analysed.
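To make the Methods concrete, the sketch below shows one way a Hebbian update on a single nonlinearly amplified dendritic segment could be written. The tanh amplification, the learning rate, and the toy input dimensions are illustrative assumptions for this sketch, not the authors' exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def segment_output(w, x):
    # Linear synaptic drive into the segment, followed by a nonlinear
    # amplification of the local dendritic sum (tanh assumed here).
    return np.tanh(w @ x)

def hebbian_update(w, x, lr=0.01):
    # Hebbian rule: weight change proportional to the product of
    # presynaptic input x and postsynaptic segment activity y.
    y = segment_output(w, x)
    return w + lr * y * x

w = rng.normal(scale=0.1, size=4)      # synaptic weights on one segment
x = np.array([1.0, 0.0, 0.5, -0.5])    # one presynaptic activity pattern
w_new = hebbian_update(w, x)
```

Repeated presentation of the same pattern under this rule aligns the weights with the input, so the segment's response to that pattern strengthens, which is the qualitative behaviour a Hebbian dendritic model relies on.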
Results: The dendritic models significantly outperformed synaptic plasticity-based models across all metrics. Specifically, the dendritic learning framework achieved a test accuracy of 91%, compared to 88% for synaptic models, demonstrating superior performance in digit classification.
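The reported comparison rests on standard classification metrics. For a binary case they reduce to simple functions of the confusion counts, as in the illustrative helper below (multi-class settings such as MNIST typically average per-class scores; this helper is not from the paper):

```python
def classification_metrics(y_true, y_pred):
    # Confusion counts for the positive class (label 1).
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return accuracy, precision, recall, f1
```

For example, `classification_metrics([1, 1, 0, 0, 1], [1, 0, 0, 1, 1])` gives accuracy 0.6 and precision, recall, and F1 of 2/3 each.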
Conclusions: Dendritic learning offers a more powerful computational framework by closely mimicking biological neural processes, providing enhanced learning efficiency and scalability. These findings have important implications for advancing both artificial intelligence systems and computational neuroscience.