{"title":"Self-Attention Based Semantic Decomposition in Vector Symbolic Architectures","authors":"Calvin Yeung, Prathyush Poduval, Mohsen Imani","doi":"arxiv-2403.13218","DOIUrl":null,"url":null,"abstract":"Vector Symbolic Architectures (VSAs) have emerged as a novel framework for\nenabling interpretable machine learning algorithms equipped with the ability to\nreason and explain their decision processes. The basic idea is to represent\ndiscrete information through high dimensional random vectors. Complex data\nstructures can be built up with operations over vectors such as the \"binding\"\noperation involving element-wise vector multiplication, which associates data\ntogether. The reverse task of decomposing the associated elements is a\ncombinatorially hard task, with an exponentially large search space. The main\nalgorithm for performing this search is the resonator network, inspired by\nHopfield network-based memory search operations. In this work, we introduce a new variant of the resonator network, based on\nself-attention based update rules in the iterative search problem. This update\nrule, based on the Hopfield network with log-sum-exp energy function and\nnorm-bounded states, is shown to substantially improve the performance and rate\nof convergence. As a result, our algorithm enables a larger capacity for\nassociative memory, enabling applications in many tasks like perception based\npattern recognition, scene decomposition, and object reasoning. We substantiate\nour algorithm with a thorough evaluation and comparisons to baselines.","PeriodicalId":501033,"journal":{"name":"arXiv - CS - Symbolic Computation","volume":"42 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - CS - Symbolic Computation","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2403.13218","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Vector Symbolic Architectures (VSAs) have emerged as a framework for building interpretable machine learning algorithms that can reason about and explain their decision processes. The basic idea is to represent discrete information with high-dimensional random vectors. Complex data structures are built through operations over these vectors, such as the "binding" operation (element-wise vector multiplication), which associates pieces of data together. The reverse task of decomposing a bound vector into its constituent elements is combinatorially hard, with an exponentially large search space.
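To make the setup concrete, here is a minimal sketch of binding and brute-force decomposition (not code from the paper; the dimensionality, codebook sizes, and variable names are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # hypervector dimensionality

# Three codebooks (e.g., color, shape, position), each holding 8 candidate
# bipolar {-1, +1} hypervectors; random hypervectors are nearly orthogonal
# in high dimensions.
codebooks = [rng.choice([-1, 1], size=(8, D)) for _ in range(3)]

# Bind one vector from each codebook via element-wise multiplication.
x1, x2, x3 = codebooks[0][2], codebooks[1][5], codebooks[2][1]
s = x1 * x2 * x3  # the composite vector

# Decomposition must recover the indices (2, 5, 1) from s alone. Brute force
# scores every combination by similarity to s -- 8**3 here, and M**F in
# general for F factors drawn from codebooks of M vectors each.
best = max(
    ((i, j, k) for i in range(8) for j in range(8) for k in range(8)),
    key=lambda idx: s @ (codebooks[0][idx[0]] * codebooks[1][idx[1]] * codebooks[2][idx[2]]),
)
print(best)  # (2, 5, 1)
```

The exhaustive scan above is exactly the exponential search the abstract refers to; the resonator network described next avoids it.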
The main algorithm for performing this search is the resonator network, whose iterative update is inspired by memory retrieval in Hopfield networks.
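For reference, here is a minimal sketch of the standard resonator-network iteration for bipolar hypervectors, the baseline this work builds on (function names and the iteration cap are our own choices, not the paper's):

```python
import numpy as np

def resonator(s, codebooks, iters=100):
    """Classic resonator-network search (a sketch of the standard
    algorithm, not the paper's variant). Each factor estimate starts as
    the superposition of its codebook and is iteratively cleaned up."""
    estimates = [np.sign(A.sum(axis=0)) for A in codebooks]
    for _ in range(iters):
        for i, A in enumerate(codebooks):
            # Unbind all other current estimates from s (for bipolar
            # vectors, unbinding is the same element-wise multiplication).
            residual = s.copy()
            for j, e in enumerate(estimates):
                if j != i:
                    residual = residual * e
            # Clean up by projecting onto codebook i and re-binarizing.
            estimates[i] = np.sign(A.T @ (A @ residual))
    # Read out the best-matching codebook index for each factor.
    return [int(np.argmax(A @ e)) for A, e in zip(codebooks, estimates)]
```

Each factor's estimate is refined in turn: unbinding the other estimates from `s` leaves a noisy version of that factor, and the codebook projection cleans it up. With the codebooks from the sketch above, this typically recovers the bound indices within a few iterations.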
In this work, we introduce a new variant of the resonator network that uses self-attention-based update rules in the iterative search. This update rule, derived from the Hopfield network with a log-sum-exp energy function and norm-bounded states, substantially improves both performance and the rate of convergence.
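A log-sum-exp energy with norm-bounded states corresponds to a softmax-attention retrieval rule, as in modern Hopfield networks, in place of the sign-based cleanup. A hedged sketch of what one such cleanup step could look like (the inverse temperature `beta`, the scaling, and the norm bound are illustrative assumptions, not the paper's exact formulation):

```python
import numpy as np

def softmax(z):
    z = z - z.max()  # numerically stable softmax
    e = np.exp(z)
    return e / e.sum()

def attention_cleanup(A, v, beta=8.0, norm_bound=None):
    """One attention-style cleanup step: retrieve from codebook A (M x D)
    via softmax attention over pattern similarities -- the retrieval rule
    of Hopfield networks with a log-sum-exp energy. Illustrative only."""
    attn = softmax(beta * (A @ v) / np.sqrt(A.shape[1]))  # weights over codebook rows
    new = A.T @ attn                                      # convex combination of patterns
    if norm_bound is not None:                            # keep the state norm bounded
        n = np.linalg.norm(new)
        if n > norm_bound:
            new = new * (norm_bound / n)
    return new
```

Substituting a step like this for the `np.sign(A.T @ (A @ residual))` cleanup in the resonator loop yields a soft, attention-weighted estimate rather than a hard bipolar one, which is the kind of change the abstract credits with faster convergence and higher capacity.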
As a result, our algorithm provides a larger associative-memory capacity, enabling applications in tasks such as perception-based pattern recognition, scene decomposition, and object reasoning. We substantiate our algorithm with a thorough evaluation and comparisons against baselines.