{"title":"2 Ill-posed Inverse Problem Solution and the Maximum Entropy Principle","authors":"S. Bwanakare","doi":"10.1515/9783110550443-005","DOIUrl":null,"url":null,"abstract":"As explained in the introduction, many economic relationships are characterized by indeterminacy. This may be because of long-range feedback and complex correlations between source and targets, thus rendering causal relationships more difficult to investigate. In this part of the work, the formal definition of the inverse problem will be discussed. A Moore-Penrose approach will be presented for solving this kind of problem and its limits will be stressed. The next step will be to present the concept of the maximum entropy principle in the context of the Gibbs-Shannon model. Extensions of the model by Jaynes and Kullback-Leibler will be presented and a generalisation of the model will be implemented to take into account random disturbance. The next step will concern the non-ergodic form of entropy known in the literature of thermodynamics as non-extensive entropy or non-additive statistics. There will be a focus on Tsallis entropy, and its main properties will be presented in the context of information theory. To establish a footing in the context of real world problems, non-extensive entropy will be generalized and then random disturbances will be introduced into the model. This part of the work will be concluded with the proposition of a statistical inference in the context of information theory.","PeriodicalId":133118,"journal":{"name":"Non-Extensive Entropy Econometrics for Low Frequency Series","volume":"88 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-01-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Non-Extensive Entropy Econometrics for Low Frequency Series","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1515/9783110550443-005","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
As explained in the introduction, many economic relationships are characterized by indeterminacy. This may be due to long-range feedback and complex correlations between sources and targets, which make causal relationships more difficult to investigate. In this part of the work, the formal definition of the inverse problem is discussed. A Moore-Penrose approach to solving this kind of problem is presented and its limitations are stressed. The next step is to present the concept of the maximum entropy principle in the context of the Gibbs-Shannon model. Extensions of the model by Jaynes and Kullback-Leibler are presented, and a generalization of the model is implemented to take random disturbances into account. The next step concerns the non-ergodic form of entropy known in the thermodynamics literature as non-extensive entropy or non-additive statistics. The focus is on Tsallis entropy, whose main properties are presented in the context of information theory. To establish a footing in real-world problems, non-extensive entropy is then generalized and random disturbances are introduced into the model. This part of the work concludes with a proposal for statistical inference in the context of information theory.
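To make the contrast described above concrete, the following is a minimal, self-contained Python sketch (not taken from the chapter) of an underdetermined inverse problem y = Xp, with the unknown p treated as a probability vector. It compares the Moore-Penrose pseudoinverse, which returns the minimum-norm solution and need not respect non-negativity or adding-up, with a Jaynes-style maximum-entropy solution based on the Gibbs-Shannon entropy. The data, dimensions, and solver choice are illustrative assumptions, not the chapter's own specification.

```python
# Sketch: Moore-Penrose vs. maximum-entropy recovery of a probability vector
# from too few moment conditions (an ill-posed / underdetermined system).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Two moment conditions, five unknown probabilities: more unknowns than data.
X = rng.uniform(0.0, 1.0, size=(2, 5))
p_true = np.array([0.10, 0.30, 0.20, 0.25, 0.15])
y = X @ p_true

# 1) Moore-Penrose pseudoinverse: the minimum Euclidean-norm solution,
#    which may be negative and need not sum to one.
p_mp = np.linalg.pinv(X) @ y

# 2) Maximum entropy principle: among all p consistent with the data,
#    choose the one maximizing H(p) = -sum_i p_i ln p_i.
def neg_entropy(p):
    p = np.clip(p, 1e-12, None)       # guard against log(0)
    return np.sum(p * np.log(p))      # minimize -H(p)

constraints = [
    {"type": "eq", "fun": lambda p: X @ p - y},    # data (moment) constraints
    {"type": "eq", "fun": lambda p: p.sum() - 1},  # adding-up constraint
]
bounds = [(0.0, 1.0)] * 5
p0 = np.full(5, 0.2)                   # uniform starting point

res = minimize(neg_entropy, p0, bounds=bounds, constraints=constraints)

print("Moore-Penrose solution:", np.round(p_mp, 4))
print("Max-entropy solution  :", np.round(res.x, 4))
```

For the non-extensive case discussed later in the chapter, the Gibbs-Shannon objective above would be replaced by the Tsallis entropy S_q(p) = (1 - \sum_i p_i^q)/(q - 1), which recovers the Shannon form in the limit q -> 1; the constraint structure of the sketch stays the same.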