Performance of a hierarchical temporal memory network in noisy sequence learning

Daniel E. Padilla, R. Brinkworth, M. McDonnell
2013 IEEE International Conference on Computational Intelligence and Cybernetics (CYBERNETICSCOM), December 2013
DOI: 10.1109/CYBERNETICSCOM.2013.6865779
As neurobiological evidence points to the neocortex as the brain region chiefly involved in high-level cognitive functions, an innovative model of neocortical information processing has recently been proposed. Based on a simplified model of a neocortical neuron, and inspired by experimental evidence of neocortical organisation, the Hierarchical Temporal Memory (HTM) model attempts not only to understand intelligence but also to build learning machines. This paper focuses on analysing HTM's ability for online, adaptive learning of sequences. In particular, we seek to determine whether the approach is robust to noise in its inputs, and to compare and contrast its performance and attributes with those of an alternative Hidden Markov Model (HMM) approach. We reproduce a version of an HTM network and apply it to a visual pattern recognition task under various learning conditions. Our first set of experiments explores the HTM network's capability to learn repetitive patterns and sequences of patterns within random data streams. Further experimentation assesses the network's learning performance in terms of inference and prediction under different noise conditions. HTM results are compared with those of an HMM trained on the same tasks. Online learning performance results demonstrate the HTM's capacity to make use of context to generate stronger predictions, while results on robustness to noise reveal an ability to cope with noisy environments. However, our comparisons also highlight a significant way in which HTM differs from HMM: HTM generates predicted observations rather than hidden states, and each observation is a sparse distributed representation.
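To make the contrast in the last sentence concrete: an HMM maintains a posterior over *hidden states* and must push that posterior through the emission model to obtain a prediction of the next observation, whereas an HTM emits a predicted observation (a sparse distributed representation) directly. The following toy sketch — not the paper's implementation; the transition, emission, and initial matrices are invented for illustration — shows the HMM side of the comparison, predicting the next symbol of a discrete sequence with the standard forward algorithm.

```python
def forward_predict(A, B, pi, obs):
    """Return the distribution over the next observation of a discrete HMM.

    A:   transition matrix, A[i][j] = P(state j | state i)
    B:   emission matrix,   B[i][k] = P(symbol k | state i)
    pi:  initial state distribution
    obs: list of observed symbol indices seen so far
    """
    n_states = len(pi)
    # Forward pass: alpha[i] is proportional to P(obs so far, current state i)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n_states)]
    for o in obs[1:]:
        alpha = [sum(alpha[i] * A[i][j] for i in range(n_states)) * B[j][o]
                 for j in range(n_states)]
    # Normalise to the posterior over the current hidden state
    total = sum(alpha)
    post = [a / total for a in alpha]
    # One transition step gives the predicted next-state distribution...
    next_state = [sum(post[i] * A[i][j] for i in range(n_states))
                  for j in range(n_states)]
    # ...which must then be mapped through the emissions to predict a symbol
    n_symbols = len(B[0])
    return [sum(next_state[j] * B[j][k] for j in range(n_states))
            for k in range(n_symbols)]

# Two sticky hidden states; state 0 mostly emits symbol 0, state 1 symbol 1.
A = [[0.9, 0.1], [0.1, 0.9]]
B = [[0.8, 0.2], [0.2, 0.8]]
pi = [0.5, 0.5]
pred = forward_predict(A, B, pi, [0, 0, 0])  # prediction favours symbol 0
```

After a run of symbol 0 the posterior concentrates on state 0, so the predicted distribution favours symbol 0 — but note that the prediction is a dense probability vector over symbols, derived indirectly via hidden states, unlike the HTM's directly generated sparse representation.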