Synaptic scaling as an essential component of Hebbian learning

Vasilisa Y. Stepasyuk, V. A. Makarov, S. Lobov, V. Kazantsev

2022 6th Scientific School Dynamics of Complex Networks and their Applications (DCNA), 2022-09-14. DOI: 10.1109/DCNA56428.2022.9923054 (https://doi.org/10.1109/DCNA56428.2022.9923054)
Citations: 1
Abstract
Hebbian plasticity is a prominent learning mechanism in brain neural networks. However, its formal definition, based on the time-matching of pre- and postsynaptic activity, can lead to saturation of synaptic weights. On the one hand, the so-called forgetting function formally bounds the synaptic weights, but its biological basis remains unclear. On the other hand, biological neurons exhibit homeostatic plasticity, in particular synaptic scaling, which lets a neuron control (scale) the synaptic effectiveness across its synapses. This work proposes a mathematical model of Hebbian learning with synaptic scaling in a spiking neuron. Numerical simulations show that this biologically justified model behaves similarly to the standard model with the forgetting function. We illustrate the results on a test-bed problem of learning frequency patterns with a high-dimensional neuron.
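To make the two weight-bounding mechanisms concrete, here is a minimal rate-based sketch in Python. It is not the paper's spiking model: it contrasts a Hebbian update bounded by a forgetting term against a pure Hebbian update kept in check by multiplicative synaptic scaling toward a target activity. All parameter names and values (eta, gamma, tau_scale, r_target) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

n_inputs = 100      # high-dimensional input, echoing the paper's test bed
eta = 0.001         # Hebbian learning rate (assumed value)
gamma = 0.001       # forgetting rate (assumed value)
tau_scale = 100.0   # time constant of homeostatic scaling (assumed value)
r_target = 1.0      # target postsynaptic activity for scaling (assumed value)

def step_forgetting(w, x):
    """Hebbian update with an activity-gated forgetting term that bounds w."""
    y = w @ x                                # postsynaptic activity (rate proxy)
    return w + eta * y * x - gamma * y * w   # Hebb term + forgetting term

def step_scaling(w, x):
    """Pure Hebbian update followed by multiplicative synaptic scaling."""
    y = w @ x
    w = w + eta * y * x                      # unbounded Hebbian term
    w *= 1.0 + (r_target - y) / tau_scale    # slow scaling toward target activity
    return w

w_forget = rng.uniform(0.0, 0.1, n_inputs)
w_scale = w_forget.copy()
pattern = rng.uniform(0.0, 1.0, n_inputs)    # fixed input "frequency pattern"

for _ in range(5000):
    x = pattern * rng.uniform(0.5, 1.5, n_inputs)  # noisy presentation
    w_forget = step_forgetting(w_forget, x)
    w_scale = step_scaling(w_scale, x)

print("forgetting: mean=%.3f max=%.3f" % (w_forget.mean(), w_forget.max()))
print("scaling:    mean=%.3f max=%.3f" % (w_scale.mean(), w_scale.max()))
```

In both variants the weights remain bounded; the scaling variant achieves this without an explicit decay term, which is the rough intuition behind the equivalence the abstract reports between the biologically motivated model and the standard forgetting-function model.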