An algebraic generalization for graph and tensor-based neural networks
Ethan C. Jackson, J. Hughes, Mark Daley, M. Winter
2017 IEEE Conference on Computational Intelligence in Bioinformatics and Computational Biology (CIBCB), August 2017
DOI: 10.1109/CIBCB.2017.8058548
Despite significant effort, there is currently no formal or de facto standard framework or format for constructing, representing, or manipulating general neural networks. In computational neuroscience, there have been some attempts to formalize connectionist notations and generative operations for neural networks, including Connection Set Algebra, but none are truly formal or general. In computational intelligence (CI), though the use of linear algebra and tensor-based models is widespread, graph-based frameworks are also popular, and there is a lack of tools supporting the transfer of information between systems. To address these gaps, we exploited existing results about the connection between linear and relation algebras to define a concise, formal algebraic framework that generalizes graph- and tensor-based neural networks. For simplicity and compatibility, this framework is purposefully defined as a minimal extension of linear algebra. We demonstrate the merits of this approach first by defining new operations for network composition, along with proofs of their most important properties. An implementation of the algebraic framework is presented and applied to create an instance of an artificial neural network that is compatible with both graph- and tensor-based CI frameworks. The result is an algebraic framework for neural networks that generalizes the formats used in at least two systems, together with an example implementation.
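The graph/tensor duality the abstract builds on can be made concrete with a small sketch. The code below is a minimal illustration only, not the paper's actual framework or operations: it represents the same weighted network as a graph (edge list) and as a tensor (weight matrix), converts losslessly between the two, and shows that sequential composition of linear networks corresponds to matrix multiplication. All function names here (`edges_to_matrix`, `matrix_to_edges`, `compose`) are hypothetical, chosen for this example.

```python
def edges_to_matrix(edges, n):
    """Tensor view: build an n x n weight matrix W with W[i][j] = weight of edge i -> j."""
    W = [[0.0] * n for _ in range(n)]
    for src, dst, w in edges:
        W[src][dst] = w
    return W

def matrix_to_edges(W):
    """Graph view: recover the edge list from a weight matrix, dropping absent (zero) edges."""
    return [(i, j, W[i][j])
            for i in range(len(W))
            for j in range(len(W[i]))
            if W[i][j] != 0.0]

def compose(W1, W2):
    """Sequential composition of two linear networks as a matrix product (W1 then W2)."""
    n, m, k = len(W1), len(W2), len(W2[0])
    return [[sum(W1[i][t] * W2[t][j] for t in range(m)) for j in range(k)]
            for i in range(n)]

# A 3-neuron network: edges 0 -> 2 and 1 -> 2 with distinct weights.
edges = [(0, 2, 0.5), (1, 2, -1.0)]
W = edges_to_matrix(edges, 3)
assert sorted(matrix_to_edges(W)) == sorted(edges)  # round trip is lossless
```

The round-trip assertion is the point: because no information is lost in either direction, a network authored in a graph-based CI system can be handed to a tensor-based one and back, which is the kind of interoperability gap the paper's algebraic framework formalizes.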