{"title":"Neuromorphic Downsampling of Event-Based Camera Output","authors":"Charles Rizzo, C. Schuman, J. Plank","doi":"10.1145/3584954.3584962","DOIUrl":"https://doi.org/10.1145/3584954.3584962","url":null,"abstract":"In this work, we address the problem of training a neuromorphic agent to work on data from event-based cameras. Although event-based camera data is much sparser than standard video frames, the sheer number of events can make the observation space too complex to effectively train an agent. We construct multiple neuromorphic networks that downsample the camera data so as to make training more effective. We then perform a case study of training an agent to play the Atari Pong game by converting each frame to events and downsampling them. The final network combines both the downsampling and the agent. We discuss some practical considerations as well.","PeriodicalId":375527,"journal":{"name":"Proceedings of the 2023 Annual Neuro-Inspired Computational Elements Conference","volume":"412 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-04-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126690937","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Translation and Scale Invariance for Event-Based Object tracking","authors":"Jens Egholm Pedersen, Raghav Singhal, J. Conradt","doi":"10.1145/3584954.3584996","DOIUrl":"https://doi.org/10.1145/3584954.3584996","url":null,"abstract":"Without temporal averaging, such as rate codes, it remains challenging to train spiking neural networks for temporal regression tasks. In this work, we present a novel method to accurately predict spatial coordinates from event data with a fully spiking convolutional neural network (SCNN) without temporal averaging. Our method performs on-par with artificial neural networks (ANN) of similar complexity. Additionally, we demonstrate faster convergence in half the time using translation- and scale-invariant receptive fields. To permit comparison with conventional frame-based ANNs, we base our results on a simulated event-based dataset with an unrealistic high density. Therefore, we hypothesize that our method significantly outperform ANNs in settings with lower event density, as seen in real-life event-based data. Our model is fully spiking and can be ported directly to neuromorphic hardware.","PeriodicalId":375527,"journal":{"name":"Proceedings of the 2023 Annual Neuro-Inspired Computational Elements Conference","volume":"29 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-04-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127828188","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Spiking LCA in a Neural Circuit with Dictionary Learning and Synaptic Normalization","authors":"Diego Chavez Arana, Alpha Renner, A. Sornborger","doi":"10.1145/3584954.3584968","DOIUrl":"https://doi.org/10.1145/3584954.3584968","url":null,"abstract":"The Locally Competitive Algorithm (LCA) [17, 18] was put forward as a model of primary visual cortex [14, 17] and has been used extensively as a sparse coding algorithm for multivariate data. LCA has seen implementations on neuromorphic processors, including IBM’s TrueNorth processor [10], and Intel’s neuromorphic research processor, Loihi, which show that it can be very efficient with respect to the power resources it consumes [8]. When combined with dictionary learning [13], the LCA algorithm encounters synaptic instability [24], where, as a synapse’s strength grows, its activity increases, further enhancing synaptic strength, leading to a runaway condition, where synapses become saturated [3, 15]. A number of approaches have been suggested to stabilize this phenomenon [1, 2, 5, 7, 12]. Previous work demonstrated that, by extending the cost function used to generate LCA updates, synaptic normalization could be achieved, eliminating synaptic runaway [7]. It was also shown that the resulting algorithm could be implemented in a firing rate model [7]. Here, we implement a probabilistic approximation to this firing rate model as a spiking LCA algorithm that includes dictionary learning and synaptic normalization. The algorithm is based on a synfire-gated synfire chain-based information control network in concert with Hebbian synapses [16, 19]. We show that this algorithm results in correct classification on numeric data taken from the MNIST dataset. LA-UR-22-33004","PeriodicalId":375527,"journal":{"name":"Proceedings of the 2023 Annual Neuro-Inspired Computational Elements Conference","volume":"26 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-04-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132873242","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Demonstration of neuromorphic sequence learning on a memristive array","authors":"Sebastian Siegel, Tobias Ziegler, Younes Bouhadjar, T. Tetzlaff, R. Waser, R. Dittmann, D. Wouters","doi":"10.1145/3584954.3585000","DOIUrl":"https://doi.org/10.1145/3584954.3585000","url":null,"abstract":"Sequence learning and prediction are considered principle computations performed by biological brains. Machine learning algorithms solve this type of task, but they require large amounts of training data and a substantial energy budget. An approach to overcome these issues and enable sequence learning with brain-like performance is neuromorphic hardware with brain-inspired learning algorithms. The Hierarchical Temporal Memory (HTM) is an algorithm inspired by the working principles of the neocortex and is able to learn and predict continuous sequences of elements. In a previous study, we showed that memristive devices, an emerging non-volatile memory technology, that is considered for energy efficient neuromorphic hardware, can be used as synapses in a biologically plausible version of the temporal memory algorithm of the HTM model. We subsequently presented a simulation study of an analog-mixed signal memristive hardware architecture that can implement the temporal learning algorithm. This architecture, which we refer to as MemSpikingTM, is based on a memristive crossbar array and a control circuitry implementing the neurons and the learning mechanism. In the study presented here, we demonstrate the functionality of the MemSpikingTM algorithm on a real memristive crossbar array, taped out in a commercially available 130nm CMOS technology node co-integrated with HfO based memristive devices. We explain the algorithm and the functionality of the crossbar array and peripheral circuitry and finally demonstrate context-dependent sequence learning using high-order sequences.","PeriodicalId":375527,"journal":{"name":"Proceedings of the 2023 Annual Neuro-Inspired Computational Elements Conference","volume":"41 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-04-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125171782","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Additive manufacture of polymeric organometallic ferroelectric diodes (POMFeDs) for structural neuromorphic hardware","authors":"Davin Browner, S. Sareh, Paul Anderson","doi":"10.1145/3584954.3584998","DOIUrl":"https://doi.org/10.1145/3584954.3584998","url":null,"abstract":"Hardware design and implementation for online machine learning applications is complicated by a number of facets of conventional artificial neural networks (ANN), e.g. deep neural networks (DNNs), such as reliance on atemporal locality, offline learning using large datasets, potential difficulties in transfer from model to substrates, and issues with processing of noisy sensory data using energy-efficient and asynchronous information processing modalities. Analog or mixed-signal spiking neural networks (SNNs) have promise for lower power, temporally localised, and stimuli selective sensing and inference but are difficult fabricate at low cost. Investigation of beyond-CMOS alternative organic substrates may be worthwhile for development of unconventional neuromorphic hardware with pseudo-spiking dynamics for structural electronics integration in bio-signal processing and robotics. Here, polymeric organometallic ferroelectric diodes (POMFeDs) are introduced for development of printable ferroelectric in-sensor SNNs.","PeriodicalId":375527,"journal":{"name":"Proceedings of the 2023 Annual Neuro-Inspired Computational Elements Conference","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-04-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125790710","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Impact of Noisy Input on Evolved Spiking Neural Networks for Neuromorphic Systems","authors":"Karan P. Patel, Catherine D. Schuman","doi":"10.1145/3584954.3584969","DOIUrl":"https://doi.org/10.1145/3584954.3584969","url":null,"abstract":"In this work we leverage a simple spiking neuromorphic processor and an evolutionary-based training method to train and test networks in classification and control applications with noise injection in order to explore the resilience and robustness of spiking neural networks on neuromorphic systems. Through our implementation, we were able to observe that injecting noise within the training phase produces more robust networks that are more resilient to noise within the testing phase. Compared to the performance of other popular classifiers on simple data classification tasks, SNNs perform behind nearest neighbors and linear SVM, and above decision trees and traditional neural networks, with respect to performance in the presence of input noise.","PeriodicalId":375527,"journal":{"name":"Proceedings of the 2023 Annual Neuro-Inspired Computational Elements Conference","volume":"213 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-04-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134057083","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"SIFT-ONN: SIFT Feature Detection Algorithm Employing ONNs for Edge Detection","authors":"Madeleine Abernot, S. Gauthier, T. Gonos, A. Todri-Sanial","doi":"10.1145/3584954.3584999","DOIUrl":"https://doi.org/10.1145/3584954.3584999","url":null,"abstract":"Mobile robot navigation tasks can be applied in various domains, such as in space, underwater, and transportation industries, among others. In navigation, robots analyze their environment from sensors and navigate safely up to target points by avoiding obstacles. Numerous methods exist to perform each navigation task. In this work, we focus on robot localization based on feature extraction algorithms using images as sensory data. ORB, and SURF are state-of-the-art algorithms for feature-based robot localization thanks to their fast computation time, even if ORB lacks precision. SIFT is state-of-the-art for high precision feature detection but it is slow and not compatible with real-time robotic applications. Thus, in our work, we explore how to speed up SIFT algorithm for real-time robot localization by employing an unconventional computing paradigm with oscillatory neural networks (ONNs). We present a hybrid SIFT-ONN algorithm that replaces the computation of Difference of Gaussian in SIFT with ONNs by performing image edge detection. We report on SIFT-ONN algorithm performances, which are similar to the state-of-the-art ORB algorithm.","PeriodicalId":375527,"journal":{"name":"Proceedings of the 2023 Annual Neuro-Inspired Computational Elements Conference","volume":"24 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-04-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131762133","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Exploring Information-Theoretic Criteria to Accelerate the Tuning of Neuromorphic Level-Crossing ADCs","authors":"A. Safa, Jonah Van Assche, C. Frenkel, A. Bourdoux, F. Catthoor, G. Gielen","doi":"10.1145/3584954.3584994","DOIUrl":"https://doi.org/10.1145/3584954.3584994","url":null,"abstract":"Level-crossing analog-to-digital converters (LC-ADCs) are neuromorphic, event-driven data converters that are gaining much attention for resource-constrained applications where intelligent sensing must be provided at the extreme edge, with tight energy and area budgets. LC-ADCs translate real-world analog signals (such as ECG, EEG, etc.) into sparse spiking signals, providing significant data bandwidth reduction and inducing savings of up to two orders of magnitude in area and energy consumption at the system level compared to the use of conventional ADCs. In addition, the spiking nature of LC-ADCs make their use a natural choice for ultra-low-power, event-driven spiking neural networks (SNNs). Still, the compressed nature of LC-ADC spiking signals can jeopardize the performance of downstream tasks such as signal classification accuracy, which is highly sensitive to the LC-ADC tuning parameters. In this paper, we explore the use of popular information criteria found in model selection theory for the tuning of the LC-ADC parameters. We experimentally demonstrate that information metrics such as the Bayesian, Akaike and corrected Akaike criteria can be used to tune the LC-ADC parameters in order to maximize downstream SNN classification accuracy. We conduct our experiments using both full-resolution weights and 4-bit quantized SNNs, on two different bio-signal classification tasks. We believe that our findings can accelerate the tuning of LC-ADC parameters without resorting to computationally-expensive grid searches that require many SNN training passes.","PeriodicalId":375527,"journal":{"name":"Proceedings of the 2023 Annual Neuro-Inspired Computational Elements Conference","volume":"17 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-04-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125026870","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Shunting Inhibition as a Neural-Inspired Mechanism for Multiplication in Neuromorphic Architectures","authors":"Frances S. Chance, S. Cardwell","doi":"10.1145/3584954.3584965","DOIUrl":"https://doi.org/10.1145/3584954.3584965","url":null,"abstract":"Shunting inhibition is a potential mechanism by which biological systems multiply two time-varying signals, most recently proposed in single neurons of the fly visual system. Our work demonstrates this effect in a biological neuron model and the equivalent circuit in neuromorphic hardware modeling dendrites. We present a multi-compartment neuromorphic dendritic model that produces a multiplication-like effect using the shunting inhibition mechanism by varying leakage along the dendritic cable. Dendritic computation in neuromorphic architectures has the potential to increase complexity in single neurons and reduce the energy footprint for neural networks by enabling computation in the interconnect.","PeriodicalId":375527,"journal":{"name":"Proceedings of the 2023 Annual Neuro-Inspired Computational Elements Conference","volume":"22 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-04-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130491256","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Modeling Coordinate Transformations in the Dragonfly Nervous System","authors":"Claire Plunkett, Frances S. Chance","doi":"10.1145/3584954.3584959","DOIUrl":"https://doi.org/10.1145/3584954.3584959","url":null,"abstract":"Coordinate transformations are a fundamental operation that must be performed by any animal relying upon sensory information to interact with the external world. We present a neural network model that performs a coordinate transformation from the dragonfly eye’s frame of reference to the body’s frame of reference while hunting. We demonstrate that the model successfully calculates turns required for interception, and discuss how future work will compare our model with biological dragonfly neural circuitry and guide neural-inspired neuromorphic implementations of coordinate transformations.","PeriodicalId":375527,"journal":{"name":"Proceedings of the 2023 Annual Neuro-Inspired Computational Elements Conference","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-04-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130644467","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}