Performance Analysis of Sparse Matrix Representation in Hierarchical Temporal Memory for Sequence Modeling

Impact Factor: 0.9 · JCR Q4 · Telecommunications
Csongor Pilinszki-Nagy, Bálint Gyires-Tóth
Infocommunications Journal, 2020. DOI: 10.36244/icj.2020.2.6
Citations: 3

Abstract

Hierarchical Temporal Memory (HTM) is a special type of artificial neural network (ANN) that differs from the widely used approaches. It is suited to modeling sequential data (including time series) efficiently. The network implements a variable-order sequence memory, is trained by Hebbian learning, and all of its activations are binary and sparse. The network consists of four separable units. First, the encoder layer translates the numerical input into sparse binary vectors. The Spatial Pooler performs normalization and models the spatial features of the encoded input. The Temporal Memory is responsible for learning the Spatial Pooler's normalized output sequences. Finally, the decoder takes the Temporal Memory's outputs and translates them to the target. The connections in the network are also sparse, which requires prudent design and implementation. In this paper a sparse matrix implementation is elaborated and compared to the dense implementation. Furthermore, the HTM's performance is evaluated in terms of accuracy, speed, and memory complexity, and compared to the deep neural network-based LSTM (Long Short-Term Memory).
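The ideas in the abstract — a scalar encoder producing sparse binary vectors, and a sparse matrix representation saving memory over a dense one — can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: `encode_scalar` and its parameters (`n_bits`, `n_active`) are hypothetical simplifications of an HTM-style encoder, and the comparison uses SciPy's generic CSR format rather than the authors' sparse structure.

```python
import numpy as np
from scipy import sparse

def encode_scalar(value, min_val=0.0, max_val=100.0, n_bits=400, n_active=21):
    """Hypothetical scalar encoder: activates a contiguous run of
    n_active bits whose position tracks the value, yielding a sparse
    binary vector (a simplified HTM-style encoding)."""
    span = n_bits - n_active
    start = int(round((value - min_val) / (max_val - min_val) * span))
    vec = np.zeros(n_bits, dtype=np.uint8)
    vec[start:start + n_active] = 1
    return vec

# Encode a short input sequence and stack the codes into a matrix.
codes = np.stack([encode_scalar(v) for v in [10.0, 11.0, 50.0, 90.0]])

# The same binary matrix in dense and sparse (CSR) representation.
dense = codes
csr = sparse.csr_matrix(codes)

dense_bytes = dense.nbytes
csr_bytes = csr.data.nbytes + csr.indices.nbytes + csr.indptr.nbytes

print(f"activation density: {csr.nnz / dense.size:.3f}")
print(f"dense bytes: {dense_bytes}, CSR bytes: {csr_bytes}")
```

With roughly 5% of bits active, the CSR form already stores fewer bytes than the dense array; the gap widens as the matrices grow, which motivates the sparse implementation the paper analyzes.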
Source journal: Infocommunications Journal (Telecommunications)
CiteScore: 1.90
Self-citation rate: 27.30%