{"title":"An analysis of Dynamic Cortex Memory networks","authors":"S. Otte, A. Zell, M. Liwicki","doi":"10.1109/IJCNN.2015.7280753","DOIUrl":null,"url":null,"abstract":"The recently introduced Dynamic Cortex Memory (DCM) is an extension of the Long Short Term Memory (LSTM) providing a systematic inter-gate connection infrastructure. In this paper the behavior of DCM networks is studied in more detail and their potential in the field of gradient-based sequence learning is investigated. Hereby, DCM networks are analyzed regarding particular key features of neural signal processing systems, namely, their robustness to noise and their ability of time warping. Throughout all experiments we show that DCMs converge faster and yield better results than LSTMs. Hereby, DCM networks require overall less weights than pure LSTM networks to achieve the same or even better results. Besides, a promising neurally implemented just-in-time online signal filter approach is presented, which is latency-free and still provides an accurate filtering performance much better than conventional low-pass filters. We also show that the neural networks can do explicit time warping even better than the Dynamic Time Warping (DTW) algorithm, which is a specialized method developed for this task.","PeriodicalId":6539,"journal":{"name":"2015 International Joint Conference on Neural Networks (IJCNN)","volume":"2 1","pages":"1-8"},"PeriodicalIF":0.0000,"publicationDate":"2015-07-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"8","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2015 International Joint Conference on Neural Networks (IJCNN)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IJCNN.2015.7280753","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 8
Abstract
The recently introduced Dynamic Cortex Memory (DCM) is an extension of the Long Short-Term Memory (LSTM) that provides a systematic inter-gate connection infrastructure. In this paper, the behavior of DCM networks is studied in more detail and their potential for gradient-based sequence learning is investigated. In particular, DCM networks are analyzed with respect to key properties of neural signal processing systems, namely their robustness to noise and their ability to perform time warping. Across all experiments we show that DCMs converge faster and yield better results than LSTMs; moreover, DCM networks require fewer weights overall than pure LSTM networks to achieve the same or even better results. In addition, a promising neurally implemented just-in-time online signal filtering approach is presented, which is latency-free and still achieves filtering accuracy far beyond that of conventional low-pass filters. We also show that the neural networks can perform explicit time warping even better than the Dynamic Time Warping (DTW) algorithm, a specialized method developed for exactly this task.
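The abstract's central architectural idea, inter-gate connections inside an LSTM cell, can be illustrated with a minimal sketch. The following is a hypothetical, simplified single-cell version assuming each gate additionally receives the previous time step's activations of all three gates through a small weight matrix `V`; the class name, the weight layout, and the exact wiring are illustrative assumptions, not the paper's precise formulation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class DCMCellSketch:
    """Sketch of a DCM-style cell: a standard LSTM cell whose gates also
    receive the previous step's gate activations (inter-gate connections).
    Hypothetical illustration only, not the paper's exact formulation."""

    def __init__(self, input_size, seed=0):
        rng = np.random.default_rng(seed)
        # Input weights and biases for the input (i), forget (f) and
        # output (o) gates and the cell candidate (z), as in an LSTM.
        self.W = {g: rng.normal(0.0, 0.1, input_size) for g in "ifoz"}
        self.b = {g: 0.0 for g in "ifoz"}
        # Assumed inter-gate weight matrix: each of the three gates sees
        # all three previous gate activations.
        self.V = rng.normal(0.0, 0.1, (3, 3))
        self.c = 0.0                   # cell state
        self.g_prev = np.zeros(3)      # previous [i, f, o] activations

    def step(self, x):
        # Gate pre-activations: the usual input-driven term plus the
        # inter-gate term V @ (previous gate activations).
        pre = np.array([self.W[g] @ x + self.b[g] for g in "ifo"])
        i, f, o = sigmoid(pre + self.V @ self.g_prev)
        z = np.tanh(self.W["z"] @ x + self.b["z"])   # cell candidate
        self.c = f * self.c + i * z                  # LSTM state update
        self.g_prev = np.array([i, f, o])
        return o * np.tanh(self.c)                   # cell output
```

Setting `V` to zero recovers an ordinary LSTM cell, which makes the role of the extra inter-gate weights easy to isolate in experiments.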
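For reference, the DTW baseline the networks are compared against is the classic dynamic-programming sequence alignment. A minimal version for 1-D sequences is sketched below; the absolute-difference local cost is an assumption for illustration, as any local distance can be used.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic O(n*m) dynamic-programming DTW between two 1-D sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Each cell extends the cheapest of the three admissible
            # warping-path predecessors (match, insertion, deletion).
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Example: a sine and a nonlinearly time-warped copy align at low cost.
t = np.linspace(0.0, 2.0 * np.pi, 60)
print(dtw_distance(np.sin(t), np.sin(np.sqrt(2.0 * np.pi * t))))
```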