Open the Black Box of Recurrent Neural Network by Decoding the Internal Dynamics

Jiacheng Tang, Hao Yin, Qi Kang
DOI: 10.1109/ICNSC55942.2022.10004061
Published in: 2022 IEEE International Conference on Networking, Sensing and Control (ICNSC)
Publication date: 2022-12-15
Citations: 0

Abstract

As neural networks have developed, model complexity has grown far beyond expectation. The number of neurons in a network keeps increasing, and the black-box problem urgently needs to be addressed. Although existing techniques can record the internal dynamics of hidden neurons, the high dimensionality and complexity of the recorded data make them hard to interpret. This paper introduces Tensor Component Analysis (TCA) to extract low-dimensional information from the internal dynamics of recurrent neural networks (RNNs). The proposed method extracts three interrelated factors: neuron factors, temporal factors, and input factors, to decode the forward propagation. This paper designs a variety of experiments to analyze the activity of RNNs, and the low-dimensional factors are used to explain the model's decisions. The experiments show the broad applicability of TCA, which can accurately find functional clusters of neurons and predict most of the classifications.
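The TCA described in the abstract is, at its core, a rank-R CP (CANDECOMP/PARAFAC) decomposition of a third-order activity tensor (neurons × time × inputs/trials) into the three named factor sets. The sketch below, a minimal illustration assuming an alternating-least-squares fit and numpy conventions rather than the paper's exact implementation, shows how such neuron, temporal, and input factors could be recovered:

```python
import numpy as np

def unfold(X, mode):
    """Matricize tensor X along the given mode (mode-n unfolding)."""
    return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

def khatri_rao(A, B):
    """Column-wise Kronecker product of A (I x R) and B (J x R) -> (I*J x R)."""
    R = A.shape[1]
    return (A[:, None, :] * B[None, :, :]).reshape(-1, R)

def tca(X, rank, n_iter=500, seed=0):
    """Illustrative CP/TCA fit by alternating least squares (ALS).

    X has shape (neurons, time, inputs); returns factor matrices
    N, T, K such that X[i, t, k] ~= sum_r N[i, r] * T[t, r] * K[k, r].
    """
    rng = np.random.default_rng(seed)
    factors = [rng.standard_normal((s, rank)) for s in X.shape]
    for _ in range(n_iter):
        for mode in range(3):
            A, B = [factors[m] for m in range(3) if m != mode]
            # Solve the least-squares subproblem for this mode; the
            # Gram matrix of a Khatri-Rao product is a Hadamard product.
            kr = khatri_rao(A, B)
            gram = (A.T @ A) * (B.T @ B)
            factors[mode] = unfold(X, mode) @ kr @ np.linalg.pinv(gram)
    return factors

# Usage: fit a synthetic rank-2 activity tensor and reconstruct it.
rng = np.random.default_rng(1)
N0 = rng.standard_normal((8, 2))    # neuron factors
T0 = rng.standard_normal((10, 2))   # temporal factors
K0 = rng.standard_normal((6, 2))    # input factors
X = np.einsum('ir,tr,kr->itk', N0, T0, K0)
N, T, K = tca(X, rank=2)
X_hat = np.einsum('ir,tr,kr->itk', N, T, K)
rel_err = np.linalg.norm(X - X_hat) / np.linalg.norm(X)
```

In this view, each component r couples a spatial pattern over neurons, a time course, and a loading over inputs, which is what lets the low-dimensional factors "decode" the forward propagation: functional clusters appear as groups of neurons sharing large weights in the same component.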