Spike context: A neuromorphic descriptor for pattern recognition

Bharath Ramesh, Ngoc Anh Le Thi, G. Orchard, C. Xiang
{"title":"Spike context: A neuromorphic descriptor for pattern recognition","authors":"Bharath Ramesh, Ngoc Anh Le Thi, G. Orchard, C. Xiang","doi":"10.1109/BIOCAS.2017.8325188","DOIUrl":null,"url":null,"abstract":"Although the neuromorphic vision community has developed useful event-based descriptors in the recent past, a robust general purpose descriptor that can handle scale and rotation variations has been elusive to achieve. This is partly because event cameras do not output frames at fixed intervals (like standard cameras) that are easy to work with, but an asynchronous sequence of spikes at microsecond to millisecond time resolutions. In this paper, we present Spike Context, a spatio-temporal neuromorphic descriptor that is inspired by the distribution of photo-receptors in the primate fovea. To demonstrate the effectiveness of the spike context descriptors, they are employed as semi-local features in the bag-of-features classification framework. In the first set of experiments on the N-MNIST dataset, we obtained very high results compared to existing works. In addition, we outperformed the state-of-the-art algorithms on the smaller MNIST-DVS dataset. Finally, we demonstrate the ability of the descriptor in handling scale variations by using the leave-one-scale-out protocol on the MNIST-DVS dataset.","PeriodicalId":361477,"journal":{"name":"2017 IEEE Biomedical Circuits and Systems Conference (BioCAS)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"5","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2017 IEEE Biomedical Circuits and Systems Conference (BioCAS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/BIOCAS.2017.8325188","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 5

Abstract

Although the neuromorphic vision community has developed useful event-based descriptors in the recent past, a robust, general-purpose descriptor that can handle scale and rotation variations has remained elusive. This is partly because event cameras do not output frames at fixed intervals (like standard cameras), which are easy to work with, but an asynchronous sequence of spikes at microsecond-to-millisecond time resolutions. In this paper, we present Spike Context, a spatio-temporal neuromorphic descriptor inspired by the distribution of photoreceptors in the primate fovea. To demonstrate the effectiveness of the spike context descriptors, they are employed as semi-local features in the bag-of-features classification framework. In the first set of experiments, on the N-MNIST dataset, we obtained very high classification accuracy compared to existing works. In addition, we outperformed state-of-the-art algorithms on the smaller MNIST-DVS dataset. Finally, we demonstrate the descriptor's ability to handle scale variations using the leave-one-scale-out protocol on the MNIST-DVS dataset.
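The abstract only summarizes the descriptor, so the sketch below is an illustrative reconstruction rather than the paper's exact formulation: a log-polar, foveally inspired spatio-temporal histogram accumulated around a reference spike. The event field names and every parameter (radius, ring/wedge/time-bin counts, window length) are assumptions made for illustration.

```python
import numpy as np

# An event-camera recording is an asynchronous stream of spikes, commonly
# stored as arrays of (x, y, t, p): pixel address, timestamp (microseconds),
# and polarity. N-MNIST and MNIST-DVS both use this address-event format.
# Here `events` is assumed to be a dict (or structured array) of such arrays.
def log_polar_spike_descriptor(events, center, radius=16.0,
                               n_rings=4, n_wedges=8, n_time_bins=2,
                               time_window=50e3):
    """Hypothetical sketch of a foveally inspired descriptor: a histogram of
    neighbouring spikes over log-spaced radial rings, angular wedges, and
    coarse time bins, centred on one event. Not the paper's formulation.
    """
    cx, cy, ct = center
    dx = events["x"] - cx
    dy = events["y"] - cy
    dt = events["t"] - ct

    # Keep spikes inside the spatial radius and the temporal window.
    r = np.hypot(dx, dy)
    keep = (r > 0) & (r < radius) & (np.abs(dt) < time_window)
    r, dx, dy, dt = r[keep], dx[keep], dy[keep], dt[keep]

    # Log-spaced rings mimic the fovea: fine sampling near the centre,
    # coarse sampling in the periphery.
    edges = np.linspace(0.0, np.log1p(radius), n_rings + 1)
    ring = np.digitize(np.log1p(r), edges) - 1
    wedge = ((np.arctan2(dy, dx) + np.pi) / (2 * np.pi) * n_wedges).astype(int) % n_wedges
    tbin = ((dt + time_window) / (2 * time_window) * n_time_bins).astype(int)
    tbin = np.clip(tbin, 0, n_time_bins - 1)

    # Accumulate the 3D histogram and flatten to an L2-normalised vector.
    hist = np.zeros((n_rings, n_wedges, n_time_bins))
    np.add.at(hist, (np.clip(ring, 0, n_rings - 1), wedge, tbin), 1.0)
    desc = hist.ravel()
    norm = np.linalg.norm(desc)
    return desc / norm if norm > 0 else desc
```

In use, `center` would typically range over each event (or a subsampled set) in a recording, producing one semi-local descriptor per reference spike.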
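The abstract states only that the descriptors serve as semi-local features in a bag-of-features framework. A minimal sketch of that standard pipeline follows; the codebook size and the linear SVM are illustrative choices, not the paper's.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import LinearSVC

# Minimal bag-of-features sketch, assuming each recording has already been
# reduced to a set of descriptor vectors (e.g. by the function above).
def build_codebook(train_descriptor_sets, k=150, seed=0):
    stacked = np.vstack(train_descriptor_sets)   # pool descriptors over samples
    return KMeans(n_clusters=k, random_state=seed, n_init=10).fit(stacked)

def bof_histogram(descriptors, codebook):
    words = codebook.predict(descriptors)        # nearest visual word per descriptor
    hist = np.bincount(words, minlength=codebook.n_clusters).astype(float)
    return hist / max(hist.sum(), 1.0)           # normalised word-count histogram

def train_classifier(train_descriptor_sets, labels, codebook):
    X = np.array([bof_histogram(d, codebook) for d in train_descriptor_sets])
    return LinearSVC().fit(X, labels)
```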
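For the leave-one-scale-out protocol, MNIST-DVS provides recordings at three scales (scale4, scale8, scale16); the sketch below trains on two scales and tests on the held-out one, reusing the hypothetical helpers above. The data layout (dicts keyed by scale) is an assumption for illustration.

```python
import numpy as np

# Reuses build_codebook, train_classifier, bof_histogram from the sketch above.
def leave_one_scale_out(data_by_scale, labels_by_scale, scales=("4", "8", "16")):
    results = {}
    for held_out in scales:
        train_sets, train_labels = [], []
        for s in scales:
            if s != held_out:
                train_sets += data_by_scale[s]
                train_labels += labels_by_scale[s]
        codebook = build_codebook(train_sets)
        clf = train_classifier(train_sets, train_labels, codebook)
        X_test = np.array([bof_histogram(d, codebook)
                           for d in data_by_scale[held_out]])
        results[held_out] = clf.score(X_test, labels_by_scale[held_out])
    return results  # accuracy per held-out scale
```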