Xavier Lagorce, Cedric Meyer, S. Ieng, David Filliat, R. Benosman
2014 IEEE Biomedical Circuits and Systems Conference (BioCAS) Proceedings · 11 December 2014 · doi:10.1109/BioCAS.2014.6981681 · Citations: 1
Live demonstration: Neuromorphic event-based multi-kernel algorithm for high speed visual features tracking
This demo presents a method for visual tracking using the output of an asynchronous neuromorphic event-based camera. The approach is itself event-based and thus suited to the scene-driven properties of these sensors. The method can track multiple visual features in real time at rates of several hundred kilohertz. It adapts to scene content, combining the spatial and temporal correlations of events in an asynchronous iterative framework. Various kernels are used to track features from incoming events, such as Gaussians, Gabor functions, combinations of Gabor functions, and arbitrary hand-crafted kernels with very weak constraints. The proposed feature-tracking method handles feature variations in position, scale, and orientation. The tracking performance is evaluated experimentally for each kernel to demonstrate the robustness of the proposed solution.
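To give a feel for the idea, the following is a minimal, hypothetical sketch of kernel-weighted event-driven tracking, not the authors' algorithm: each incoming camera event pulls a tracker's position estimate toward itself, weighted by a Gaussian kernel centred on the current estimate, so nearby (correlated) events dominate the update while distant events are effectively ignored. All names, parameters, and the update rule here are illustrative assumptions.

```python
import math
import random

def gaussian_kernel(dx, dy, sigma):
    """2-D Gaussian weight for an event at offset (dx, dy) from the tracker."""
    return math.exp(-(dx * dx + dy * dy) / (2.0 * sigma * sigma))

class KernelTracker:
    """Toy event-driven tracker (illustrative sketch, not the paper's method).

    Each event updates the position estimate incrementally, so the tracker
    runs asynchronously at the event rate rather than at a frame rate.
    """

    def __init__(self, x, y, sigma=5.0, alpha=0.1):
        self.x, self.y = float(x), float(y)
        self.sigma = sigma  # spatial extent of the kernel (pixels) - assumed value
        self.alpha = alpha  # per-event update gain - assumed value

    def update(self, ex, ey):
        # Weight the event by its kernel response at the current estimate:
        # nearby events move the tracker strongly, distant ones barely at all.
        w = gaussian_kernel(ex - self.x, ey - self.y, self.sigma)
        self.x += self.alpha * w * (ex - self.x)
        self.y += self.alpha * w * (ey - self.y)
        return w

# Feed a burst of synthetic events clustered around (20, 20);
# the tracker starts at (15, 15) and drifts onto the cluster.
tracker = KernelTracker(15, 15)
random.seed(0)
for _ in range(500):
    ex = 20 + random.gauss(0, 1.5)
    ey = 20 + random.gauss(0, 1.5)
    tracker.update(ex, ey)
print(tracker.x, tracker.y)
```

Swapping `gaussian_kernel` for a Gabor function or a hand-crafted kernel changes only the weighting, which is one way to read the "multi-kernel" aspect described in the abstract; handling scale and orientation variations would additionally require updating kernel parameters, which this sketch omits.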