Entropy, information and energy
S. Tiwari
Semiconductor Physics Quantum Electronics & Optoelectronics
DOI: 10.1093/oso/9780198759867.003.0002
Published: 2020-09-24
Citations: 0
Abstract
Chapter 2 draws out the links between entropy and energy through their shared connection to information. Probabilities, the statistical tool for reasoning when there are unknowns, connect to information as well as to the various forms of entropy. Entropy is a variable introduced to characterize circumstances involving unknowns, and Boltzmann entropy, von Neumann entropy, Shannon entropy and others can all be viewed from this common standpoint. The chapter broadens the discussion to include Fisher entropy, a measure that stresses locality, and the principle of minimum negentropy (or maximum entropy), showing how a variety of physical descriptions, such as the Schrödinger equation, diffusion equations and Maxwell-Boltzmann distributions, can be seen from a probabilistic, information-centric perspective.
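As a concrete illustration of the probabilistic viewpoint the abstract describes, the sketch below computes the Shannon entropy of a discrete distribution and shows that, absent any constraint beyond normalization, the uniform distribution maximizes it, which is the maximum-entropy (minimum-negentropy) principle in its simplest form. The function name and the example distributions are illustrative choices, not taken from the chapter itself.

```python
import math

def shannon_entropy(probs, base=2):
    """Shannon entropy H = -sum_i p_i * log(p_i) of a discrete distribution."""
    if not math.isclose(sum(probs), 1.0, rel_tol=1e-9):
        raise ValueError("probabilities must sum to 1")
    # Terms with p = 0 contribute nothing (lim p->0 of p*log p is 0).
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin carries 1 bit of uncertainty; a biased coin carries less.
fair = shannon_entropy([0.5, 0.5])     # 1.0 bit
biased = shannon_entropy([0.9, 0.1])   # about 0.469 bits

# Among all distributions over n outcomes, the uniform one maximizes
# entropy: the unconstrained maximum-entropy distribution.
uniform4 = shannon_entropy([0.25] * 4)  # 2.0 bits = log2(4)
```

The same functional form, with different constraints imposed during maximization, recovers familiar physical distributions such as Maxwell-Boltzmann, which is the thread the chapter develops.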