{"title":"Maximum Entropy","authors":"J. Harte","doi":"10.1002/047174882x.ch12","DOIUrl":"https://doi.org/10.1002/047174882x.ch12","url":null,"abstract":"A major goal of ecology is to predict patterns and changes in the abundance, distribution, and energetics of individuals and species in ecosystems. The maximum entropy theory of ecology (METE) predicts the functional forms and parameter values describing the central metrics of macroecology, including the distribution of abundances over all the species, metabolic rates over all individuals, spatial aggregation of individuals within species, and the dependence of species diversity on areas of habitat. In METE, the maximum entropy inference procedure is implemented using the constraints imposed by a few macroscopic state variables, including the number of species, total abundance, and total metabolic rate in an ecological community. Although the theory adequately predicts pervasive empirical patterns in relatively static ecosystems, there is mounting evidence that in ecosystems in which the state variables are changing rapidly, many of the predictions of METE systematically fail. Here we discuss the underlying logic and predictions of the static theory and then describe progress toward achieving a dynamic theory (DynaMETE) of macroecology capable of describing ecosystems undergoing rapid change as a result of disturbance. An emphasis throughout is on the tension between, and reconciliation of, two legitimate perspectives on ecology: that of the natural historian who studies the uniqueness of every ecosystem and that of the theorist seeking unification and generality.","PeriodicalId":161177,"journal":{"name":"Advances in Info-Metrics","volume":"73 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132691720","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Estimating Macroeconomic Uncertainty and Discord","authors":"K. Lahiri, Wuwei Wang","doi":"10.1093/oso/9780190636685.003.0011","DOIUrl":"https://doi.org/10.1093/oso/9780190636685.003.0011","url":null,"abstract":"We apply generalized beta and triangular distributions to histograms from the Survey of Professional Forecasters (SPF) to estimate forecast uncertainty, shocks, and discord using an information framework, and we compare these with moment-based estimates. We find that these two approaches produce analogous results, except in cases where the underlying densities deviate significantly from normality. Even though the Shannon entropy is more inclusive of different facets of a forecast density, we find that with SPF forecasts it is largely driven by the variance of the densities. We use Jensen–Shannon information to measure ex ante “news” or “uncertainty shocks” in real time, and we find that this “news” is closely related to revisions in forecast means, is countercyclical, and raises uncertainty. Using standard vector autoregression analysis, we confirm that uncertainty affects the real sector of the economy negatively.","PeriodicalId":161177,"journal":{"name":"Advances in Info-Metrics","volume":"91 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129137466","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Reduced Perplexity","authors":"Kenric P. Nelson","doi":"10.1093/oso/9780190636685.003.0012","DOIUrl":"https://doi.org/10.1093/oso/9780190636685.003.0012","url":null,"abstract":"This chapter introduces a simple, intuitive approach to the assessment of probabilistic inferences. The Shannon information metrics are translated to the probability domain. The translation shows that the negative logarithmic score and the geometric mean are equivalent measures of the accuracy of a probabilistic inference. The geometric mean of forecasted probabilities is thus a measure of forecast accuracy and represents the central tendency of the forecasts. The reciprocal of the geometric mean is referred to as the perplexity and defines the number of independent choices needed to resolve the uncertainty. The assessment method introduced in this chapter is intended to reduce the ‘qualitative’ perplexity relative to the potpourri of scoring rules currently used to evaluate machine learning and other probabilistic algorithms. Utilization of this assessment will provide insight into designing algorithms with reduced ‘quantitative’ perplexity and thus improved accuracy of probabilistic forecasts. The translation of information metrics to the probability domain incorporates the generalized entropy functions developed by Rényi and Tsallis. Both generalizations translate to the weighted generalized mean. The generalized mean of probabilistic forecasts forms a spectrum of performance metrics referred to as a Risk Profile. The arithmetic mean is used to measure the decisiveness, while the –2/3 mean is used to measure the robustness.","PeriodicalId":161177,"journal":{"name":"Advances in Info-Metrics","volume":"28 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-03-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133416156","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}