{"title":"CoLaNET -- 用于分类的柱状分层结构尖峰神经网络","authors":"Mikhail Kiselev","doi":"arxiv-2409.01230","DOIUrl":null,"url":null,"abstract":"In the present paper, I describe a spiking neural network (SNN) architecture\nwhich, can be used in wide range of supervised learning classification tasks.\nIt is assumed, that all participating signals (the classified object\ndescription, correct class label and SNN decision) have spiking nature. The\ndistinctive feature of this architecture is a combination of prototypical\nnetwork structures corresponding to different classes and significantly\ndistinctive instances of one class (=columns) and functionally differing\npopulations of neurons inside columns (=layers). The other distinctive feature\nis a novel combination of anti-Hebbian and dopamine-modulated plasticity. The\nplasticity rules are local and do not use the backpropagation principle.\nBesides that, as in my previous studies, I was guided by the requirement that\nthe all neuron/plasticity models should be easily implemented on modern\nneurochips. I illustrate the high performance of my network on the MNIST\nbenchmark.","PeriodicalId":501347,"journal":{"name":"arXiv - CS - Neural and Evolutionary Computing","volume":"23 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-09-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"CoLaNET -- A Spiking Neural Network with Columnar Layered Architecture for Classification\",\"authors\":\"Mikhail Kiselev\",\"doi\":\"arxiv-2409.01230\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In the present paper, I describe a spiking neural network (SNN) architecture\\nwhich, can be used in wide range of supervised learning classification tasks.\\nIt is assumed, that all participating signals (the classified object\\ndescription, correct class label and SNN decision) have spiking nature. The\\ndistinctive feature of this architecture is a combination of prototypical\\nnetwork structures corresponding to different classes and significantly\\ndistinctive instances of one class (=columns) and functionally differing\\npopulations of neurons inside columns (=layers). The other distinctive feature\\nis a novel combination of anti-Hebbian and dopamine-modulated plasticity. The\\nplasticity rules are local and do not use the backpropagation principle.\\nBesides that, as in my previous studies, I was guided by the requirement that\\nthe all neuron/plasticity models should be easily implemented on modern\\nneurochips. 
I illustrate the high performance of my network on the MNIST\\nbenchmark.\",\"PeriodicalId\":501347,\"journal\":{\"name\":\"arXiv - CS - Neural and Evolutionary Computing\",\"volume\":\"23 1\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-09-02\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"arXiv - CS - Neural and Evolutionary Computing\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/arxiv-2409.01230\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - CS - Neural and Evolutionary Computing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.01230","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
CoLaNET -- A Spiking Neural Network with Columnar Layered Architecture for Classification
In the present paper, I describe a spiking neural network (SNN) architecture which can be used in a wide range of supervised-learning classification tasks. It is assumed that all participating signals (the description of the classified object, the correct class label, and the SNN decision) are of spiking nature. A distinctive feature of this architecture is the combination of prototypical network structures corresponding to different classes and to significantly distinct instances of one class (= columns) with functionally differing populations of neurons inside the columns (= layers). Another distinctive feature is a novel combination of anti-Hebbian and dopamine-modulated plasticity. The plasticity rules are local and do not use the backpropagation principle. Besides that, as in my previous studies, I was guided by the requirement that all neuron and plasticity models should be easily implementable on modern neurochips. I illustrate the high performance of the network on the MNIST benchmark.
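
The abstract names the two plasticity mechanisms but gives no equations, so the following is only a rough Python/NumPy sketch of how a columnar SNN with a local anti-Hebbian term and a dopamine- (reward-) modulated Hebbian term might be put together. All names (Column, lr_hebb, the winner-take-all dopamine signal), the layer sizes, and the concrete update formula are my own illustrative assumptions, not the actual CoLaNET model.

# Illustrative sketch only: a columnar SNN with local, reward-modulated
# (dopamine-like) and anti-Hebbian plasticity. The rule, sizes and names
# below are assumptions made for this sketch, not CoLaNET's actual model.

import numpy as np

rng = np.random.default_rng(0)


class Column:
    """One column: a small population of leaky integrate-and-fire neurons
    representing one class (or one distinctive instance of a class)."""

    def __init__(self, n_inputs, n_neurons, threshold=1.0, decay=0.9):
        self.w = rng.uniform(0.0, 0.1, size=(n_neurons, n_inputs))
        self.v = np.zeros(n_neurons)           # membrane potentials
        self.threshold = threshold
        self.decay = decay

    def step(self, input_spikes):
        """Advance one time step; return a boolean output-spike vector."""
        self.v = self.decay * self.v + self.w @ input_spikes
        out_spikes = self.v >= self.threshold
        self.v[out_spikes] = 0.0               # reset the neurons that fired
        return out_spikes

    def plasticity(self, input_spikes, out_spikes, dopamine,
                   lr_hebb=0.01, lr_anti=0.005):
        """Purely local update: a dopamine-gated Hebbian term plus an
        unconditional anti-Hebbian (decorrelating) term."""
        coactivity = np.outer(out_spikes, input_spikes).astype(float)
        self.w += lr_hebb * dopamine * coactivity   # reward-modulated strengthening
        self.w -= lr_anti * coactivity              # anti-Hebbian weakening
        np.clip(self.w, 0.0, 1.0, out=self.w)


# Toy usage: ten columns (one per MNIST digit class) driven by Poisson-coded
# input; dopamine is positive when the most active column matches the label.
n_inputs, n_classes = 784, 10
columns = [Column(n_inputs, n_neurons=8) for _ in range(n_classes)]

image = rng.random(n_inputs)        # stand-in for normalized pixel intensities
label = 3
for t in range(50):
    input_spikes = (rng.random(n_inputs) < 0.1 * image).astype(float)
    outputs = [col.step(input_spikes) for col in columns]
    winner = int(np.argmax([out.sum() for out in outputs]))
    dopamine = 1.0 if winner == label else -0.2     # reward or mild punishment
    for col, out in zip(columns, outputs):
        col.plasticity(input_spikes, out, dopamine)

In the paper itself each column additionally contains several functionally different layers of neurons; the sketch above collapses each column into a single layer purely for brevity.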