Temporal sequence recognition in a self-organizing recurrent network
Enea Ceolini, Daniel Neil, T. Delbrück, Shih-Chii Liu
2016 Second International Conference on Event-based Control, Communication, and Signal Processing (EBCCSP), 13 June 2016. DOI: 10.1109/EBCCSP.2016.7605258
A major challenge for reservoir-based Recurrent Neural Networks (RNNs) is optimizing the connection weights within the network so that its performance is optimal for the intended temporal sequence recognition task. One particular RNN, the Self-Organizing Recurrent Network (SORN), avoids the mathematical normalization required after each initialization. Instead, three types of cortical plasticity mechanisms optimize the weights within the network during the initial part of training. The success of this unsupervised training method was demonstrated on temporal sequences that use input symbols with a binary encoding and that activate only one input pool in each time step. This work extends the analysis to different types of symbol encoding, including methods that activate multiple input pools and that use encoding levels that are not strictly binary but analog in nature. Preliminary results show that the SORN model classifies temporal sequences using these symbol encodings well, and that the advantages of this network over a static network in a classification task are retained.
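The abstract does not name the three plasticity mechanisms, but in the SORN literature (Lazar et al., 2009), on which this work builds, they are spike-timing-dependent plasticity (STDP), synaptic normalization, and intrinsic plasticity. Below is a minimal NumPy sketch of that formulation, not the paper's own code; the network size, learning rates, sparsity, and the omission of the inhibitory population are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes and rates; not taken from the paper.
N_E = 200          # excitatory reservoir units
ETA_STDP = 0.004   # STDP learning rate
ETA_IP = 0.01      # intrinsic-plasticity learning rate
H_IP = 0.1         # target mean firing rate per unit

# Sparse, non-negative recurrent weights and per-unit thresholds.
W_EE = rng.random((N_E, N_E)) * (rng.random((N_E, N_E)) < 0.05)
np.fill_diagonal(W_EE, 0.0)
T_E = rng.uniform(0.0, 0.5, N_E)
x_prev = (rng.random(N_E) < H_IP).astype(float)

def sorn_step(u_ext):
    """One time step: binary unit update, then the three plasticity rules."""
    global W_EE, T_E, x_prev
    # Binary threshold update driven by recurrence and external input
    # (the inhibitory population of the full model is omitted for brevity).
    x = (W_EE @ x_prev + u_ext - T_E > 0).astype(float)

    # 1) STDP: strengthen pre-before-post pairings, weaken the reverse,
    #    applied only to synapses that already exist.
    dW = ETA_STDP * (np.outer(x, x_prev) - np.outer(x_prev, x))
    W_EE = np.clip(W_EE + dW * (W_EE > 0), 0.0, 1.0)

    # 2) Synaptic normalization: incoming weights of each unit sum to one.
    row_sums = W_EE.sum(axis=1, keepdims=True)
    W_EE = np.divide(W_EE, row_sums, out=W_EE, where=row_sums > 0)

    # 3) Intrinsic plasticity: shift each threshold so that the unit's
    #    firing rate drifts toward the target H_IP.
    T_E += ETA_IP * (x - H_IP)

    x_prev = x
    return x

# Drive the reservoir for a few self-organization steps, here with no input.
for _ in range(10):
    x = sorn_step(np.zeros(N_E))
```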
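The contrast between the original binary, single-pool encoding and the extended multi-pool, analog encodings can also be sketched. The pool size, alphabet size, and the dictionary interface of `encode_analog` below are hypothetical choices made purely for illustration; only the structural difference between the two schemes is taken from the abstract.

```python
import numpy as np

N_POOLS = 4  # assumed alphabet size: one input pool per symbol
N_U = 10     # assumed number of input units per pool

def encode_binary(symbol: int) -> np.ndarray:
    """Original scheme: exactly one pool fully active (binary levels)."""
    u = np.zeros(N_POOLS * N_U)
    u[symbol * N_U:(symbol + 1) * N_U] = 1.0
    return u

def encode_analog(levels: dict[int, float]) -> np.ndarray:
    """Extended scheme: several pools active at once, each at an analog
    level in [0, 1] rather than a strictly binary value."""
    u = np.zeros(N_POOLS * N_U)
    for pool, level in levels.items():
        u[pool * N_U:(pool + 1) * N_U] = level
    return u

# Symbol 2 alone (binary) vs. pools 0 and 3 active at analog levels.
u_bin = encode_binary(2)
u_mix = encode_analog({0: 0.3, 3: 0.8})
```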