{"title":"近似牛顿算法用于Ising模型推理,加快了收敛速度,实现了最优性能,避免了过拟合","authors":"U. Ferrari","doi":"10.1103/PhysRevE.94.023301","DOIUrl":null,"url":null,"abstract":"Inverse problems consist in inferring parameters of model distributions that are able to fit properly chosen features of experimental data-sets. The Inverse Ising problem specifically consists of searching for the maximal entropy distribution reproducing frequencies and correlations of a binary data-set. In order to solve this task, we propose an algorithm that takes advantage of the provided by the data knowledge of the log-likelihood function around the solution. We show that the present algorithm is faster than standard gradient ascent methods. Moreover, by looking at the algorithm convergence as a stochastic process, we properly define over-fitting and we show how the present algorithm avoids it by construction.","PeriodicalId":8438,"journal":{"name":"arXiv: Disordered Systems and Neural Networks","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2015-07-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"18","resultStr":"{\"title\":\"Approximated Newton Algorithm for the Ising Model Inference Speeds Up Convergence, Performs Optimally and Avoids Over-fitting\",\"authors\":\"U. Ferrari\",\"doi\":\"10.1103/PhysRevE.94.023301\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Inverse problems consist in inferring parameters of model distributions that are able to fit properly chosen features of experimental data-sets. The Inverse Ising problem specifically consists of searching for the maximal entropy distribution reproducing frequencies and correlations of a binary data-set. In order to solve this task, we propose an algorithm that takes advantage of the provided by the data knowledge of the log-likelihood function around the solution. We show that the present algorithm is faster than standard gradient ascent methods. Moreover, by looking at the algorithm convergence as a stochastic process, we properly define over-fitting and we show how the present algorithm avoids it by construction.\",\"PeriodicalId\":8438,\"journal\":{\"name\":\"arXiv: Disordered Systems and Neural Networks\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2015-07-15\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"18\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"arXiv: Disordered Systems and Neural Networks\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1103/PhysRevE.94.023301\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv: Disordered Systems and Neural Networks","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1103/PhysRevE.94.023301","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Approximated Newton Algorithm for the Ising Model Inference Speeds Up Convergence, Performs Optimally and Avoids Over-fitting
Inverse problems consist of inferring the parameters of model distributions that fit properly chosen features of experimental data sets. The inverse Ising problem, specifically, consists of searching for the maximum-entropy distribution that reproduces the frequencies and pairwise correlations of a binary data set. To solve this task, we propose an algorithm that takes advantage of the knowledge of the log-likelihood function around the solution that is provided by the data. We show that the present algorithm is faster than standard gradient-ascent methods. Moreover, by viewing the convergence of the algorithm as a stochastic process, we properly define over-fitting and show how the present algorithm avoids it by construction.
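To illustrate the general idea, the sketch below implements an approximate-Newton scheme for a small maximum-entropy (Ising) model: the Hessian of the log-likelihood is the model covariance of the sufficient statistics, and near the solution it can be approximated by the covariance estimated once from the data and used as a fixed preconditioner. This is only a minimal toy illustration of that data-driven preconditioning idea; the function names, the exact enumeration over states, and the damping factor are assumptions for the example and may differ from the algorithm described in the paper.

```python
import numpy as np
from itertools import product

def sufficient_stats(spins):
    """Stack single-spin values and upper-triangular pair products."""
    n = spins.shape[-1]
    iu = np.triu_indices(n, k=1)
    pairs = spins[..., iu[0]] * spins[..., iu[1]]
    return np.concatenate([spins, pairs], axis=-1)

def model_averages(params, n):
    """Exact model averages of the statistics by enumerating all 2^n states
    (feasible only for small n; the paper's setting would use sampling)."""
    states = np.array(list(product([-1, 1], repeat=n)), dtype=float)
    phi = sufficient_stats(states)            # (2^n, d)
    logw = phi @ params                       # log of unnormalised weights
    w = np.exp(logw - logw.max())
    w /= w.sum()
    return w @ phi, phi

def fit_ising(data, n_steps=200, eta=0.5, ridge=1e-3):
    n = data.shape[1]
    phi_data = sufficient_stats(data)
    target = phi_data.mean(axis=0)            # empirical frequencies/correlations
    # Data estimate of the covariance of the statistics: a fixed, data-provided
    # approximation to the log-likelihood Hessian, inverted once (hypothetical
    # choice for this sketch; a ridge term keeps it well conditioned).
    cov_data = np.cov(phi_data, rowvar=False) + ridge * np.eye(phi_data.shape[1])
    precond = np.linalg.inv(cov_data)
    params = np.zeros(phi_data.shape[1])
    for _ in range(n_steps):
        mean_model, _ = model_averages(params, n)
        grad = target - mean_model            # gradient of the log-likelihood
        params += eta * precond @ grad        # damped Newton-like step
    return params

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = rng.choice([-1.0, 1.0], size=(500, 5))   # toy binary data set
    params = fit_ising(data)
    print(params[:5])                          # inferred local fields
```

Compared with plain gradient ascent, the only change is the fixed preconditioner multiplying the gradient, which is what allows larger, better-scaled steps near the solution.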