New Foundations for Information Theory: Further Developments of Logical Entropy

Publisher
Springer International Publishing
Copyright
© The Author(s), under exclusive license to Springer Nature Switzerland AG 2021
ISBN
978-3-030-86551-1
Pages
39–72
DOI
10.1007/978-3-030-86552-8_4

Abstract

This chapter develops the multivariate (i.e., three or more variables) entropies. The Shannon mutual information is negative in the standard probability-theory example of three random variables that are pairwise independent but not mutually independent. When the values of a random variable carry metrical data (e.g., a real-valued variable), there is a natural notion of metrical logical entropy, and it equals twice the variance, which makes the connection with basic concepts of statistics. The twice-variance formula shows how to extend logical entropy to continuous random variables. Boltzmann entropy is analyzed to show that Shannon entropy arises in statistical mechanics only as a numerical approximation with the attractive property of analytical tractability. Edwin Jaynes's Method of MaxEntropy uses the maximization of Shannon entropy to generalize the indifference principle: when other constraints rule out the uniform distribution, Jaynes recommends choosing the distribution that maximizes the Shannon entropy. Maximizing logical entropy instead yields a different distribution. Which solution is best? The maximum logical entropy solution is the closest to the uniform distribution in the ordinary Euclidean sense of distance. The chapter ends with the transition to coding theory.
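
The two most concrete claims above can be checked numerically. The following is a minimal Python sketch, not taken from the chapter: the XOR construction is the standard example of pairwise but not mutual independence that the abstract alludes to, and the sample distribution in the second part is an arbitrary illustrative choice. It computes the triadic Shannon mutual information by inclusion-exclusion (it comes out to -1 bit) and verifies that the double sum of probability-weighted squared differences equals twice the variance.

```python
import itertools
import math

# Standard example: X, Y fair independent bits, Z = X XOR Y.
# Each pair of variables is independent, but the three are not mutually independent.
outcomes = [(x, y, x ^ y) for x in (0, 1) for y in (0, 1)]
p_xyz = {o: 0.25 for o in outcomes}  # each (x, y, z) triple has probability 1/4

def marginal(joint, idxs):
    """Marginal distribution over the coordinates listed in idxs."""
    m = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in idxs)
        m[key] = m.get(key, 0.0) + p
    return m

def H(dist):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Triadic (interaction) mutual information by inclusion-exclusion:
# I(X;Y;Z) = H(X)+H(Y)+H(Z) - H(X,Y) - H(X,Z) - H(Y,Z) + H(X,Y,Z)
singles = sum(H(marginal(p_xyz, [i])) for i in range(3))
pairs = sum(H(marginal(p_xyz, list(ij))) for ij in itertools.combinations(range(3), 2))
triple = H(p_xyz)
print(f"Shannon triadic mutual information I(X;Y;Z) = {singles - pairs + triple:+.3f} bits")  # -1.000

# Metrical logical entropy as twice the variance: for a real-valued X with
# probabilities p_i on values x_i, sum_{i,j} p_i p_j (x_i - x_j)^2 = 2 Var(X).
values = [1.0, 2.0, 5.0]   # illustrative values, not from the chapter
probs = [0.2, 0.5, 0.3]
double_sum = sum(pi * pj * (xi - xj) ** 2
                 for pi, xi in zip(probs, values)
                 for pj, xj in zip(probs, values))
mean = sum(p * x for p, x in zip(probs, values))
var = sum(p * (x - mean) ** 2 for p, x in zip(probs, values))
print(f"sum_ij p_i p_j (x_i - x_j)^2 = {double_sum:.4f},  2*Var(X) = {2 * var:.4f}")
```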

Published: Sep 2, 2021

Keywords: Multivariate joint distributions; Negative Shannon mutual information; Countable distributions; Metrical logical entropy; Variance; Boltzmann entropy; The MaxEntropy method; Coding theory; Rooted trees
