{"title":"Dendritic Neural Network: A Novel Extension of Dendritic Neuron Model","authors":"Cheng Tang;Junkai Ji;Yuki Todo;Atsushi Shimada;Weiping Ding;Akimasa Hirata","doi":"10.1109/TETCI.2024.3367819","DOIUrl":null,"url":null,"abstract":"The conventional dendritic neuron model (DNM) is a single-neuron model inspired by biological dendritic neurons that has been applied successfully in various fields. However, an increasing number of input features results in inefficient learning and gradient vanishing problems in the DNM. Thus, the DNM struggles to handle more complex tasks, including multiclass classification and multivariate time-series forecasting problems. In this study, we extended the conventional DNM to overcome these limitations. In the proposed dendritic neural network (DNN), the flexibility of both synapses and dendritic branches is considered and formulated, which can improve the model's nonlinear capabilities on high-dimensional problems. Then, multiple output layers are stacked to accommodate the various loss functions of complex tasks, and a dropout mechanism is implemented to realize a better balance between the underfitting and overfitting problems, which enhances the network's generalizability. The performance and computational efficiency of the proposed DNN compared to state-of-the-art machine learning algorithms were verified on 10 multiclass classification and 2 high-dimensional binary classification datasets. The experimental results demonstrate that the proposed DNN is a promising and practical neural network architecture.","PeriodicalId":13135,"journal":{"name":"IEEE Transactions on Emerging Topics in Computational Intelligence","volume":"8 3","pages":"2228-2239"},"PeriodicalIF":5.3000,"publicationDate":"2024-03-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10460122","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Emerging Topics in Computational Intelligence","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10460122/","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0
Abstract
The conventional dendritic neuron model (DNM) is a single-neuron model inspired by biological dendritic neurons that has been applied successfully in various fields. However, an increasing number of input features results in inefficient learning and gradient vanishing problems in the DNM. Thus, the DNM struggles to handle more complex tasks, including multiclass classification and multivariate time-series forecasting problems. In this study, we extended the conventional DNM to overcome these limitations. In the proposed dendritic neural network (DNN), the flexibility of both synapses and dendritic branches is considered and formulated, which can improve the model's nonlinear capabilities on high-dimensional problems. Then, multiple output layers are stacked to accommodate the various loss functions of complex tasks, and a dropout mechanism is implemented to realize a better balance between the underfitting and overfitting problems, which enhances the network's generalizability. The performance and computational efficiency of the proposed DNN were verified against state-of-the-art machine learning algorithms on 10 multiclass classification and 2 high-dimensional binary classification datasets. The experimental results demonstrate that the proposed DNN is a promising and practical neural network architecture.
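The abstract describes the DNM's synapse-dendrite-membrane-soma structure and the DNN's branch-level dropout and stacked output layers. The following Python sketch illustrates the conventional DNM forward pass that the paper extends, with the dropout and multi-output extensions indicated in comments; the class name, the steepness and threshold values, and the exact placement of dropout and the softmax output are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

# Minimal sketch of a dendritic neuron, assuming the standard DNM formulation:
# synaptic layer (sigmoid) -> dendritic layer (product) -> membrane (sum) -> soma (sigmoid).
class DendriticNeuron:
    def __init__(self, n_inputs, n_branches, k=5.0, rng=None):
        self.rng = rng or np.random.default_rng(0)
        self.k = k                                          # synaptic steepness (assumed value)
        self.w = self.rng.normal(size=(n_branches, n_inputs))  # synaptic weights
        self.q = self.rng.normal(size=(n_branches, n_inputs))  # synaptic thresholds

    def forward(self, x, drop_rate=0.0, training=False):
        # Synaptic layer: sigmoid connection of each input feature to each branch.
        y = 1.0 / (1.0 + np.exp(-self.k * (self.w * x - self.q)))
        # Dendritic layer: multiplicative (AND-like) interaction along each branch.
        z = np.prod(y, axis=1)
        # Branch dropout (DNN extension, sketched): randomly silence branches during training.
        if training and drop_rate > 0.0:
            mask = self.rng.random(z.shape) >= drop_rate
            z = z * mask / (1.0 - drop_rate)
        # Membrane layer: sum branch outputs; soma: sigmoid activation (threshold assumed).
        v = z.sum()
        return 1.0 / (1.0 + np.exp(-self.k * (v - 0.5)))

# For multiclass tasks, the DNN stacks one output unit per class and can apply a
# softmax over their soma outputs (sketched; the paper's exact output layers may differ).
def softmax(o):
    e = np.exp(o - np.max(o))
    return e / e.sum()

# Example: one neuron with 4 inputs and 3 dendritic branches on a random sample.
neuron = DendriticNeuron(n_inputs=4, n_branches=3)
print(neuron.forward(np.random.default_rng(1).random(4), drop_rate=0.5, training=True))
```

In this sketch the multiplicative dendritic layer is what gives the model its nonlinear feature interactions, and the branch-level dropout corresponds to the abstract's mechanism for balancing underfitting and overfitting.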
About the journal:
The IEEE Transactions on Emerging Topics in Computational Intelligence (TETCI) publishes original articles on emerging aspects of computational intelligence, including theory, applications, and surveys.
TETCI is an electronic-only publication that publishes six issues per year.
Authors are encouraged to submit manuscripts on any emerging topic in computational intelligence, especially nature-inspired computing topics not covered by other IEEE Computational Intelligence Society journals. A few illustrative examples are glial cell networks, computational neuroscience, brain-computer interfaces, ambient intelligence, non-fuzzy computing with words, artificial life, cultural learning, artificial endocrine networks, social reasoning, artificial hormone networks, and computational intelligence for the IoT and Smart-X technologies.