{"title":"简单高效的基于峰值的mlGeNN机器学习","authors":"James C. Knight, T. Nowotny","doi":"10.1145/3584954.3585001","DOIUrl":null,"url":null,"abstract":"Intuitive and easy to use application programming interfaces such as Keras have played a large part in the rapid acceleration of machine learning with artificial neural networks. Building on our recent works translating ANNs to SNNs and directly training classifiers with e-prop, we here present the mlGeNN interface as an easy way to define, train and test spiking neural networks on our efficient GPU based GeNN framework. We illustrate the use of mlGeNN by investigating the performance of a number of one and two layer recurrent spiking neural networks trained to recognise hand gestures from the DVS gesture dataset with the e-prop learning rule. We find that not only is mlGeNN vastly more convenient to use than the lower level PyGeNN interface, the new freedom to effortlessly and rapidly prototype different network architectures also gave us an unprecedented overview over how e-prop compares to other recently published results on the DVS gesture dataset across architectural details.","PeriodicalId":375527,"journal":{"name":"Proceedings of the 2023 Annual Neuro-Inspired Computational Elements Conference","volume":"290 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-04-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":"{\"title\":\"Easy and efficient spike-based Machine Learning with mlGeNN\",\"authors\":\"James C. Knight, T. Nowotny\",\"doi\":\"10.1145/3584954.3585001\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Intuitive and easy to use application programming interfaces such as Keras have played a large part in the rapid acceleration of machine learning with artificial neural networks. Building on our recent works translating ANNs to SNNs and directly training classifiers with e-prop, we here present the mlGeNN interface as an easy way to define, train and test spiking neural networks on our efficient GPU based GeNN framework. We illustrate the use of mlGeNN by investigating the performance of a number of one and two layer recurrent spiking neural networks trained to recognise hand gestures from the DVS gesture dataset with the e-prop learning rule. 
We find that not only is mlGeNN vastly more convenient to use than the lower level PyGeNN interface, the new freedom to effortlessly and rapidly prototype different network architectures also gave us an unprecedented overview over how e-prop compares to other recently published results on the DVS gesture dataset across architectural details.\",\"PeriodicalId\":375527,\"journal\":{\"name\":\"Proceedings of the 2023 Annual Neuro-Inspired Computational Elements Conference\",\"volume\":\"290 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-04-11\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"3\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the 2023 Annual Neuro-Inspired Computational Elements Conference\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3584954.3585001\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2023 Annual Neuro-Inspired Computational Elements Conference","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3584954.3585001","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Intuitive and easy-to-use application programming interfaces such as Keras have played a large part in the rapid acceleration of machine learning with artificial neural networks (ANNs). Building on our recent work on translating ANNs to spiking neural networks (SNNs) and on directly training classifiers with e-prop, we present the mlGeNN interface as an easy way to define, train and test spiking neural networks on our efficient GPU-based GeNN framework. We illustrate the use of mlGeNN by investigating the performance of a number of one- and two-layer recurrent spiking neural networks trained with the e-prop learning rule to recognise hand gestures from the DVS gesture dataset. We find that not only is mlGeNN vastly more convenient to use than the lower-level PyGeNN interface, but the freedom to effortlessly and rapidly prototype different network architectures also gives us an unprecedented overview of how e-prop compares, across architectural details, to other recently published results on the DVS gesture dataset.
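To give a concrete flavour of the Keras-like workflow the abstract describes, here is a minimal sketch of how a one-layer recurrent SNN classifier for the DVS gesture task might be defined and trained with e-prop. This is an illustration only: the class names, parameter names and the load_dvs_gesture_training_data helper are assumptions made for this sketch and may not match the actual mlGeNN API; the paper and the GeNN/mlGeNN documentation remain the authoritative reference.

```python
# Minimal sketch (not the verified mlGeNN API): all class, argument and helper
# names below are assumptions used purely to illustrate the Keras-like style
# of defining, compiling and training a recurrent SNN with e-prop on GeNN.
from ml_genn import Connection, Network, Population             # assumed imports
from ml_genn.connectivity import Dense
from ml_genn.neurons import LeakyIntegrate, LeakyIntegrateFire, SpikeInput
from ml_genn.compilers import EPropCompiler

NUM_INPUT = 32 * 32 * 2    # assumed downsampled DVS resolution, two polarity channels
NUM_HIDDEN = 256           # size of the recurrent hidden layer
NUM_CLASSES = 11           # the DVS gesture dataset has 11 gesture classes

network = Network()
with network:
    # Event-based input population fed with DVS gesture spike trains
    inputs = Population(SpikeInput(max_spikes=50_000), NUM_INPUT)
    # Recurrently connected hidden layer of leaky integrate-and-fire neurons
    hidden = Population(LeakyIntegrateFire(tau_mem=20.0), NUM_HIDDEN)
    # Non-spiking readout population, one unit per gesture class
    outputs = Population(LeakyIntegrate(tau_mem=20.0, readout="avg_var"), NUM_CLASSES)

    Connection(inputs, hidden, Dense())    # feed-forward input weights
    Connection(hidden, hidden, Dense())    # recurrent weights
    Connection(hidden, outputs, Dense())   # readout weights

# The e-prop learning rule is applied by a compiler that generates
# efficient GPU simulation and training code through the GeNN framework
compiler = EPropCompiler(example_timesteps=1500,
                         losses="sparse_categorical_crossentropy",
                         optimiser="adam", batch_size=32)
compiled_net = compiler.compile(network)

# Hypothetical data-loading helper; in practice spike trains and labels
# would come from a DVS gesture loader such as tonic's DVSGesture dataset
spike_trains, labels = load_dvs_gesture_training_data()

with compiled_net:
    compiled_net.train({inputs: spike_trains}, {outputs: labels}, num_epochs=50)
```

The point of the sketch is the shape of the workflow: a declarative network definition, a compiler object that encapsulates the learning rule, and a short training call, in contrast to the hand-written model and simulation-loop code required when using the lower-level PyGeNN interface directly.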