Title: Homeostatic development of dynamic neural fields
Authors: Claudius Gläser, F. Joublin, C. Goerick
Venue: 2008 7th IEEE International Conference on Development and Learning
Publication date: 2008-10-10
DOI: 10.1109/DEVLRN.2008.4640816 (https://doi.org/10.1109/DEVLRN.2008.4640816)
Citations: 8
Abstract
Dynamic neural field theory has become a popular technique for modeling the spatio-temporal evolution of activity within the cortex. When using neural fields, the right balance between excitation and inhibition within the field is crucial for stable operation. Finding this balance is a severe problem, particularly in the face of experience-driven changes in synaptic strengths. Homeostatic plasticity, where the objective for each unit is to reach some target firing rate, seems to counteract this problem. Here we present a recurrent neural network model composed of excitatory and inhibitory units which can self-organize via a learning regime incorporating Hebbian plasticity, homeostatic synaptic scaling, and self-regulatory changes in the intrinsic excitability of neurons. Furthermore, we do not define a neural field topology by a fixed lateral connectivity; rather, we learn the lateral connections as well.
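The interplay the abstract describes can be illustrated with a minimal sketch: pure Hebbian plasticity is unstable on its own, while multiplicative homeostatic scaling rescales all of a unit's weights together so its average firing rate drifts toward a set point. This is a generic rate-based toy, not the paper's model; all constants (learning rates, target rate, nonlinearity) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in = 20
w = rng.uniform(0.0, 0.1, n_in)   # afferent weights of a single unit
target_rate = 0.5                  # homeostatic set point (illustrative)
eta_hebb = 0.001                   # Hebbian learning rate
eta_scale = 0.1                    # strength of homeostatic scaling
tau_avg = 0.05                     # smoothing factor for the running rate estimate
rate_avg = 0.0

def activation(x):
    # Rectified, saturating nonlinearity standing in for a firing-rate function
    return np.tanh(np.maximum(x, 0.0))

for step in range(2000):
    x = rng.uniform(0.0, 1.0, n_in)        # random input pattern
    r = activation(w @ x)                   # unit firing rate
    rate_avg += tau_avg * (r - rate_avg)    # running average of the rate
    w += eta_hebb * r * x                   # Hebbian growth (destabilizing alone)
    # Multiplicative synaptic scaling: grow or shrink ALL weights together
    # so the average rate approaches the target firing rate.
    w *= 1.0 + eta_scale * (target_rate - rate_avg)
    w = np.clip(w, 0.0, None)               # keep afferent weights excitatory
```

After training, `rate_avg` settles close to `target_rate` even though the Hebbian term alone would drive the rate toward saturation; the paper's full model additionally regulates intrinsic excitability and learns the lateral connectivity, which this sketch omits.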