{"title":"玻尔兹曼机器学习算法的数学理论","authors":"H. Sussmann","doi":"10.1109/IJCNN.1989.118278","DOIUrl":null,"url":null,"abstract":"The author analyzes a version of a well-known learning algorithm for Boltzmann machines, based on the usual alternation between learning and hallucinating phases. He outlines the rigorous proof that, for suitable choices of the parameters, the evolution of the weights follows very closely, with very high probability, an integral trajectory of the gradient of the likelihood function whose global maxima are exactly the desired weight patterns.<<ETX>>","PeriodicalId":199877,"journal":{"name":"International 1989 Joint Conference on Neural Networks","volume":"33 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1989-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"The mathematical theory of learning algorithms for Boltzmann machines\",\"authors\":\"H. Sussmann\",\"doi\":\"10.1109/IJCNN.1989.118278\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The author analyzes a version of a well-known learning algorithm for Boltzmann machines, based on the usual alternation between learning and hallucinating phases. He outlines the rigorous proof that, for suitable choices of the parameters, the evolution of the weights follows very closely, with very high probability, an integral trajectory of the gradient of the likelihood function whose global maxima are exactly the desired weight patterns.<<ETX>>\",\"PeriodicalId\":199877,\"journal\":{\"name\":\"International 1989 Joint Conference on Neural Networks\",\"volume\":\"33 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"1989-12-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"International 1989 Joint Conference on Neural Networks\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/IJCNN.1989.118278\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"International 1989 Joint Conference on Neural Networks","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IJCNN.1989.118278","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
The mathematical theory of learning algorithms for Boltzmann machines
The author analyzes a version of the well-known learning algorithm for Boltzmann machines, based on the usual alternation between a learning (clamped) phase and a hallucinating (free-running) phase. He outlines a rigorous proof that, for suitable choices of the parameters, the evolution of the weights follows very closely, with very high probability, an integral trajectory of the gradient of the likelihood function, whose global maxima are exactly the desired weight patterns.
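
The two-phase rule the abstract refers to is the classical Boltzmann machine update: each weight is raised in proportion to the difference between the pairwise correlations measured with the units clamped to the training patterns (the learning phase) and those measured with the network sampling freely (the hallucinating phase), i.e. dL/dw_ij = <s_i s_j>_clamped - <s_i s_j>_free, which is a stochastic estimate of the likelihood gradient whose integral curves the paper studies. Below is a minimal illustrative sketch of that rule; the fully visible network, its size, the learning rate, and the sampling counts are assumptions made for the example, not details taken from the paper.

# Minimal sketch of the two-phase Boltzmann machine learning rule,
# assuming a small fully visible network trained by Gibbs sampling.
# All numerical choices here are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_units = 6                       # fully visible network (assumption)
W = np.zeros((n_units, n_units))  # symmetric weights, zero diagonal
eta = 0.05                        # learning rate (illustrative)

def gibbs_step(s, W):
    """One free-running Gibbs sweep at temperature 1: each unit is
    resampled from P(s_i = 1 | rest) = sigmoid(sum_j w_ij s_j)."""
    for i in range(len(s)):
        p_on = 1.0 / (1.0 + np.exp(-W[i] @ s))
        s[i] = 1.0 if rng.random() < p_on else 0.0
    return s

def correlations(samples):
    """Estimate the second moments <s_i s_j> from sampled states."""
    S = np.array(samples)
    return S.T @ S / len(S)

# Toy data: the "desired patterns" the weights should encode.
data = [np.array(p, dtype=float) for p in
        ([1, 1, 0, 0, 1, 1], [0, 0, 1, 1, 0, 0])]

for step in range(200):
    # Learning (clamped) phase: with all units fixed to the data,
    # <s_i s_j>_clamped is just the empirical correlation.
    pos = correlations(data)

    # Hallucinating (free-running) phase: sample from the model.
    s = rng.integers(0, 2, n_units).astype(float)
    neg_samples = []
    for _ in range(50):
        s = gibbs_step(s, W)
        neg_samples.append(s.copy())
    neg = correlations(neg_samples)

    # Stochastic ascent along the likelihood gradient:
    # dL/dw_ij = <s_i s_j>_clamped - <s_i s_j>_free
    W += eta * (pos - neg)
    np.fill_diagonal(W, 0.0)      # no self-connections

In the paper's setting, this noisy update is shown to track an integral trajectory of the exact gradient with high probability when the parameters (learning rate and sampling schedule) are chosen suitably.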