A Fisher Information Theory of Aesthetic Preference for Complexity
Sébastien Berquet, Hassan Aleem, Norberto M Grzywacz
Entropy, vol. 26, no. 11, published 2024-10-24. DOI: 10.3390/e26110901
Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11593017/pdf/
Citations: 0
Abstract
When evaluating sensory stimuli, people tend to prefer those with not too little or not too much complexity. A recent theoretical proposal for this phenomenon is that preference has a direct link to the Observed Fisher Information that a stimulus carries about the environment. To make this theory complete, one must specify the model that the brain has about complexities in the world. Here, we develop this model by first obtaining the distributions of three indices of complexity measured as normalized Shannon Entropy in real-world images from seven environments. We then search for a parametric model that accounts for these distributions. Finally, we measure the Observed Fisher Information that each image has about the parameters of this model. The results show that with few exceptions, the distributions of image complexities are unimodal, have negative skewness, and are leptokurtotic. Moreover, the sign and magnitude of the skewness varies systematically with the location of the mode. After investigating tens of models for these distributions, we show that the Logit-Losev function, a generalization of the hyperbolic-secant distribution, fits them well. The Observed Fisher Information for this model shows the inverted-U-shape behavior of complexity preference. Finally, we discuss ways to test our Fisher-Information theory.
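The abstract measures image complexity as normalized Shannon entropy. As a minimal illustrative sketch (not the authors' code, and ignoring their specific choice of three complexity indices), the normalized entropy of a grayscale image's intensity histogram can be computed as the entropy in bits divided by its maximum possible value, log2 of the number of intensity levels:

```python
import math
from collections import Counter

def normalized_entropy(pixels, levels=256):
    """Shannon entropy of the intensity histogram, scaled to [0, 1]."""
    counts = Counter(pixels)
    n = len(pixels)
    # Empty bins contribute nothing (0 * log 0 = 0 by convention).
    H = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return H / math.log2(levels)  # divide by the maximum attainable entropy

# A constant image has zero complexity; a perfectly uniform
# intensity distribution has maximal (normalized) complexity.
flat = [128] * 4096
uniform = list(range(256)) * 16
print(normalized_entropy(flat))     # 0.0
print(normalized_entropy(uniform))  # 1.0
```

Real photographs fall between these extremes, which is what makes the shape of their complexity distribution across environments an empirical question.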
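The quantity at the heart of the theory, Observed Fisher Information, is the negative second derivative of the log-likelihood with respect to a model parameter, evaluated at the data. A hedged numerical sketch follows, using a Gaussian stand-in model (the paper's Logit-Losev distribution is not reproduced here) and a central finite difference:

```python
import math

def gaussian_loglik(mu, data, sigma=1.0):
    """Log-likelihood of data under N(mu, sigma^2)."""
    return sum(-0.5 * ((x - mu) / sigma) ** 2
               - math.log(sigma * math.sqrt(2 * math.pi)) for x in data)

def observed_fisher_info(loglik, theta, h=1e-4):
    """-d^2/dtheta^2 log L(theta), via central finite differences."""
    return -(loglik(theta + h) - 2 * loglik(theta) + loglik(theta - h)) / h ** 2

data = [0.2, -0.1, 0.4, 0.0, 0.3]
mle = sum(data) / len(data)  # MLE of mu is the sample mean
info = observed_fisher_info(lambda m: gaussian_loglik(m, data), mle)
# For a Gaussian with sigma = 1, the observed information about mu
# equals n, the sample size (here 5).
```

In the paper's setting, the stimulus-specific information about the parameters of the brain's complexity model plays this role, and its dependence on stimulus complexity produces the inverted-U preference curve.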
About the journal:
Entropy (ISSN 1099-4300), an international and interdisciplinary journal of entropy and information studies, publishes reviews, regular research papers, and short notes. Its aim is to encourage scientists to publish their theoretical and experimental work in as much detail as possible; there is no restriction on paper length. Where computations or experiments are involved, sufficient detail must be provided so that the results can be reproduced.