It is true that any physical process can be used to process information, when properly encoded. It is also true that scientific theories, in the end, are models that can capture only those aspects of nature that can be tested experimentally, under suitable circumstances: the information extracted by the experiment. Therefore the claim that information plays an essential role in physical theories has a valid basis. Comments like "It is wrong, moreover, to regard this or that physical quantity as sitting out there with this or that numerical value" and "the information thus solicited makes physics and comes in bits" by Wheeler (1989), or "I am proposing that the ultimate form of the implementable laws of physics requires only operations available (in principle) in our actual universe" by Landauer, go in this direction.

But it is also true that the data requires context in order to be understood: we need to know what the subject of our experiment is, how to prepare it and how to collect the data. The art of experimental science is contained neither in the mathematical description nor in the data collected. As information, in the information-theoretic sense, requires that context to become intelligible, it cannot play a primary role. Moreover, the fact that information can represent anything is true for any mathematical object: a real number may represent mass, color in the frequency spectrum, the total money supply, the half-life of an isotope, a probability, and so on. The use of information in physics, then, does not warrant a fundamental change of perspective in what constitutes a physical object, as some physicists have claimed.

This paper aims to give a crisp characterization of the Shannon entropy that is intuitive and precise. The general idea is that it measures the variability of the elements within a distribution, a general concept applicable to many branches of science and independent of the notion of entropy in thermodynamics and statistical mechanics. Therefore we will use the term "Shannon variability," leaving "entropy" to the physical concept.

We will show that the expression −∑_i p_i log p_i is the only linear indicator of variability. This expression quantifies the variability of the elements within the distribution, the variety of values one finds. In particular, if one element a of the distribution p_i, carrying weight p_a, is refined into sub-elements with conditional distribution q_j, the variability of the resulting distribution r_k satisfies H(r_k) = H(p_i) + p_a H(q_j). These requirements are the same ones Shannon put forth for his expression (Shannon 1948), from which he showed that the only possible choice is H(p_i) = −∑_i p_i log(p_i). The expression also measures the number of questions one must ask to identify an element of the distribution, linking its use to information theory. The precise meaning of this variability is context dependent, as the choices of the elements and binning are not fixed and the meaning of the weights depends on what the distribution is describing.

We will study how the expression works over continuous variables, and show how phase space is special in that it leaves the Shannon variability invariant under change of coordinates. We will then turn to statistical mechanics and apply Shannon variability in different ways to recover connections to the Boltzmann, Gibbs and von Neumann entropies.
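As a quick illustration of the two formulas above (a minimal sketch, not code from the paper; the distributions p and q and the choice of which element to split are arbitrary picks for the example), the following Python computes the Shannon variability of a discrete distribution and numerically checks the grouping property H(r_k) = H(p_i) + p_a H(q_j):

```python
import math

def shannon_variability(p, base=2):
    """H(p) = -sum_i p_i log(p_i); terms with p_i = 0 contribute 0."""
    return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

# Coarse distribution over three elements.
p = [0.5, 0.3, 0.2]
# Refine the first element (weight p_a = 0.5) into sub-elements
# with conditional distribution q.
p_a, q = p[0], [0.6, 0.4]
# Refined distribution r: the split element contributes p_a * q_j.
r = [p_a * qj for qj in q] + p[1:]

lhs = shannon_variability(r)
rhs = shannon_variability(p) + p_a * shannon_variability(q)
print(f"H(r) = {lhs:.6f}, H(p) + p_a H(q) = {rhs:.6f}")
assert math.isclose(lhs, rhs)  # grouping property holds

# The "number of questions" reading: a uniform distribution over
# 8 elements needs log2(8) = 3 yes/no questions to single one out.
print(shannon_variability([1 / 8] * 8))  # -> 3.0
```

With the chosen numbers both sides evaluate to about 1.970951 bits, and the uniform case reproduces the familiar three-question count.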
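For the continuous case mentioned in the outline, a standard change-of-variables computation (textbook material, sketched here rather than quoted from the paper) shows why phase space is special:

```latex
% Continuous Shannon variability and its behavior under a change of
% coordinates y = f(x); q is the transformed density.
\[
  H[p] = -\int p(x)\,\log p(x)\,\mathrm{d}x,
  \qquad
  q(y) = p(x)\left|\det\frac{\partial x}{\partial y}\right|.
\]
% Substituting q into H picks up the expected log-Jacobian of f:
\[
  H[q] = H[p] + \int p(x)\,\log\left|\det\frac{\partial y}{\partial x}\right|\mathrm{d}x.
\]
% Canonical transformations of phase space have unit Jacobian
% (Liouville's theorem), so the correction vanishes and H is invariant.
```

In general coordinates the expression thus shifts by the expected log-Jacobian, while on phase space the unit Jacobian of canonical transformations makes the Shannon variability coordinate independent.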