Learning algorithms for a specific configuration of the quantron
S. Montigny, Richard Labib
The 2011 International Joint Conference on Neural Networks, published 2011-10-03
DOI: 10.1109/IJCNN.2011.6033271
The quantron is a new artificial neuron model capable of solving nonlinear classification problems, for which an efficient learning algorithm has yet to be developed. Using surrogate potentials, constraints on some parameters, and an infinite number of potentials, we obtain analytical expressions involving ceiling functions for the activation function of the quantron. We then show how to retrieve the parameters of a neuron from the images it produces.
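The abstract does not give the closed form of these expressions, but the general idea of an activation built from a ceiling function can be sketched as follows. This is a minimal illustrative toy, not the paper's derivation: the parameter names `w` (weight) and `theta` (threshold) and the specific formula are assumptions made for the example.

```python
import math

def ceiling_activation(x: float, theta: float = 1.0, w: float = 0.5) -> int:
    """Toy activation involving a ceiling function.

    Hypothetical sketch only: the quantron paper derives analytical
    expressions involving ceiling functions, but their exact form is
    not stated in the abstract, so `w` and `theta` here are
    illustrative placeholders.
    """
    # The ceiling maps a continuous weighted input to a discrete level,
    # producing the staircase-like response a ceiling function induces.
    return math.ceil(w * x / theta)
```

The point of the sketch is only that a ceiling function turns a smooth weighted input into a piecewise-constant, integer-valued response, which is what makes analytical expressions of this kind tractable to invert when recovering parameters.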