{"title":"迈向新一代,生物学上可信的深度神经网络学习","authors":"Anirudh Apparaju, Ognjen Arandjelovíc","doi":"10.3390/sci4040046","DOIUrl":null,"url":null,"abstract":"Artificial neural networks in their various different forms convincingly dominate machine learning of the present day. Nevertheless, the manner in which these networks are trained, in particular by using end-to-end backpropagation, presents a major limitation in practice and hampers research, and raises questions with regard to the very fundamentals of the learning algorithm design. Motivated by these challenges and the contrast between the phenomenology of biological (natural) neural networks that artificial ones are inspired by and the learning processes underlying the former, there has been an increasing amount of research on the design of biologically plausible means of training artificial neural networks. In this paper we (i) describe a biologically plausible learning method that takes advantage of various biological processes, such as Hebbian synaptic plasticity, and includes both supervised and unsupervised elements, (ii) conduct a series of experiments aimed at elucidating the advantages and disadvantages of the described biologically plausible learning as compared with end-to-end backpropagation, and (iii) discuss the findings which should serve as a means of illuminating the algorithmic fundamentals of interest and directing future research. Among our findings is the greater resilience of biologically plausible learning to data scarcity, which conforms to our expectations, but also its lesser robustness to additive, zero mean Gaussian noise.","PeriodicalId":10987,"journal":{"name":"Decis. Sci.","volume":"152 1 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2022-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Towards New Generation, Biologically Plausible Deep Neural Network Learning\",\"authors\":\"Anirudh Apparaju, Ognjen Arandjelovíc\",\"doi\":\"10.3390/sci4040046\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Artificial neural networks in their various different forms convincingly dominate machine learning of the present day. Nevertheless, the manner in which these networks are trained, in particular by using end-to-end backpropagation, presents a major limitation in practice and hampers research, and raises questions with regard to the very fundamentals of the learning algorithm design. Motivated by these challenges and the contrast between the phenomenology of biological (natural) neural networks that artificial ones are inspired by and the learning processes underlying the former, there has been an increasing amount of research on the design of biologically plausible means of training artificial neural networks. In this paper we (i) describe a biologically plausible learning method that takes advantage of various biological processes, such as Hebbian synaptic plasticity, and includes both supervised and unsupervised elements, (ii) conduct a series of experiments aimed at elucidating the advantages and disadvantages of the described biologically plausible learning as compared with end-to-end backpropagation, and (iii) discuss the findings which should serve as a means of illuminating the algorithmic fundamentals of interest and directing future research. 
Among our findings is the greater resilience of biologically plausible learning to data scarcity, which conforms to our expectations, but also its lesser robustness to additive, zero mean Gaussian noise.\",\"PeriodicalId\":10987,\"journal\":{\"name\":\"Decis. Sci.\",\"volume\":\"152 1 1\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-12-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Decis. Sci.\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.3390/sci4040046\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Decis. Sci.","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.3390/sci4040046","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Towards New Generation, Biologically Plausible Deep Neural Network Learning
Artificial neural networks, in their various forms, convincingly dominate present-day machine learning. Nevertheless, the manner in which these networks are trained, in particular by end-to-end backpropagation, presents a major limitation in practice, hampers research, and raises questions about the very fundamentals of learning algorithm design. Motivated by these challenges, and by the contrast between the phenomenology of the biological (natural) neural networks that inspired artificial ones and the learning processes underlying them, there has been a growing body of research on the design of biologically plausible means of training artificial neural networks. In this paper we (i) describe a biologically plausible learning method that takes advantage of various biological processes, such as Hebbian synaptic plasticity, and includes both supervised and unsupervised elements, (ii) conduct a series of experiments aimed at elucidating the advantages and disadvantages of the described biologically plausible learning as compared with end-to-end backpropagation, and (iii) discuss the findings, which should serve to illuminate the algorithmic fundamentals of interest and to direct future research. Among our findings is the greater resilience of biologically plausible learning to data scarcity, which conforms to our expectations, but also its lesser robustness to additive, zero-mean Gaussian noise.
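For orientation only, the short sketch below illustrates the kind of mechanism the abstract refers to: a generic, unsupervised Hebbian weight update (with Oja-style normalisation to keep the weights bounded) and an input perturbed by additive, zero-mean Gaussian noise of the sort used in the robustness comparison. This is a minimal illustration of Hebbian plasticity in general, not the authors' method; the layer sizes, learning rate eta, and the helper hebbian_step are assumptions introduced here for the example.

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative single linear layer; dimensions and learning rate are assumed.
    n_in, n_out, eta = 64, 10, 0.01
    W = rng.normal(scale=0.1, size=(n_out, n_in))

    def hebbian_step(W, x, eta):
        """One unsupervised Hebbian update: strengthen weights between
        co-active pre-synaptic inputs x and post-synaptic outputs y."""
        y = W @ x                            # post-synaptic activations
        dW = eta * np.outer(y, x)            # Hebb's rule: dW_ij is proportional to y_i * x_j
        W = W + dW
        # Oja-style row normalisation keeps the weights from growing without bound
        W /= np.linalg.norm(W, axis=1, keepdims=True)
        return W

    # Robustness probe of the kind mentioned above: additive, zero-mean Gaussian noise on the input
    x_clean = rng.random(n_in)
    x_noisy = x_clean + rng.normal(loc=0.0, scale=0.1, size=n_in)
    W = hebbian_step(W, x_noisy, eta)

Unlike end-to-end backpropagation, an update of this local kind depends only on the activities at the two ends of a connection, which is the property usually cited when such rules are called biologically plausible.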