This chapter develops the multivariate (three or more variables) entropies. Shannon mutual information can be negative: in the standard probability-theory example of three random variables that are pairwise independent but not mutually independent, the triple mutual information is negative. When the values of a random variable carry metrical data (e.g., a real-valued variable), there is a natural notion of metrical logical entropy, and it equals twice the variance, which connects logical entropy to basic concepts of statistics. The twice-variance formula also shows how to extend logical entropy to continuous random variables. An analysis of Boltzmann entropy shows that Shannon entropy arises in statistical mechanics only as a numerical approximation with the attractive property of analytical tractability. Edwin Jaynes's method of maximum entropy (MaxEnt) generalizes the indifference principle: when other constraints rule out the uniform distribution, Jaynes recommends choosing the distribution that maximizes the Shannon entropy. Maximizing logical entropy instead yields a different distribution. Which solution is best? The maximum-logical-entropy solution is the closest to the uniform distribution in the ordinary Euclidean sense of distance. The chapter ends with the transition to coding theory.
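The negative triple mutual information mentioned above can be checked directly. A minimal sketch of the standard example (two fair coins and their XOR), using the inclusion-exclusion convention I(X;Y;Z) = H(X)+H(Y)+H(Z) - H(X,Y)-H(X,Z)-H(Y,Z) + H(X,Y,Z):

```python
import math
from itertools import product

# Two fair coins X, Y and their XOR Z = X ^ Y: any two of the three
# variables are independent, but the three are not mutually independent.
outcomes = [(x, y, x ^ y) for x, y in product([0, 1], repeat=2)]
p = {o: 0.25 for o in outcomes}  # uniform over the four possible triples

def H(*idx):
    """Joint Shannon entropy (in bits) of the variables at positions idx."""
    marg = {}
    for o, pr in p.items():
        key = tuple(o[i] for i in idx)
        marg[key] = marg.get(key, 0.0) + pr
    return -sum(pr * math.log2(pr) for pr in marg.values() if pr > 0)

# Triple mutual information by inclusion-exclusion:
I3 = (H(0) + H(1) + H(2)
      - H(0, 1) - H(0, 2) - H(1, 2)
      + H(0, 1, 2))
print(I3)  # -1.0 bit
```

Each pair of variables is jointly uniform on four values (2 bits) while each single variable carries 1 bit and the full triple only 2 bits, so the inclusion-exclusion sum comes out to 3 - 6 + 2 = -1 bit.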
Published: Sep 2, 2021
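The twice-variance identity can also be checked numerically. A minimal sketch, assuming metrical logical entropy is taken as the expected squared difference of two independent draws, i.e. the double sum over p_i p_j (x_i - x_j)^2; the particular values and probabilities below are illustrative, not from the chapter:

```python
# A real-valued random variable X with a finite distribution.
values = [1.0, 2.0, 5.0, 7.0]   # illustrative values
probs  = [0.1, 0.4, 0.3, 0.2]   # illustrative probabilities

# Metrical logical entropy as the expected squared difference of
# two independent draws: sum_{i,j} p_i p_j (x_i - x_j)^2.
mh = sum(pi * pj * (xi - xj) ** 2
         for pi, xi in zip(probs, values)
         for pj, xj in zip(probs, values))

# Ordinary variance of X.
mean = sum(p * x for p, x in zip(probs, values))
var  = sum(p * (x - mean) ** 2 for p, x in zip(probs, values))

print(mh, 2 * var)  # the two numbers agree
```

The identity is just E[(X - X')^2] = 2 Var(X) for X' an independent copy of X, which is why the twice-variance formula extends naturally to continuous random variables.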
Keywords: Multivariate joint distributions; Negative Shannon mutual information; Countable distributions; Metrical logical entropy; Variance; Boltzmann entropy; The MaxEnt method; Coding theory; Rooted trees
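The MaxEnt-versus-logical-entropy comparison can be illustrated on a small example. This is a sketch under stated assumptions: the support {1, 2, 3}, the mean constraint 2.5, and the bisection search for the MaxEnt multiplier are all illustrative choices, not taken from the chapter. Maximizing logical entropy 1 - sum(p_i^2) under linear constraints is equivalent to minimizing sum(p_i^2), whose Lagrange conditions make p_i affine in x_i:

```python
import math

x = [1, 2, 3]          # support of the distribution (toy example)
m = 2.5                # prescribed mean, ruling out the uniform distribution
u = [1/3, 1/3, 1/3]    # the uniform distribution

# --- Maximum logical entropy: minimize sum p_i^2 subject to the mean.
# The Lagrange conditions force p_i = a + b*x_i; solve the two linear
# constraints sum p_i = 1 and sum x_i p_i = m for a and b.
n, sx, sxx = len(x), sum(x), sum(v * v for v in x)
b = (m * n - sx) / (sxx * n - sx * sx)
a = (1 - b * sx) / n
p_log = [a + b * v for v in x]

# --- MaxEnt: p_i proportional to exp(lam*x_i); bisect on lam for the mean.
def mean_at(lam):
    w = [math.exp(lam * v) for v in x]
    return sum(v * wi for v, wi in zip(x, w)) / sum(w)

lo, hi = -10.0, 10.0
for _ in range(100):
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if mean_at(mid) < m else (lo, mid)
lam = (lo + hi) / 2
w = [math.exp(lam * v) for v in x]
p_max = [wi / sum(w) for wi in w]

# Euclidean distance to the uniform distribution.
dist = lambda p: math.sqrt(sum((pi - ui) ** 2 for pi, ui in zip(p, u)))
print(dist(p_log), dist(p_max))  # the logical-entropy solution is closer
```

Both distributions satisfy the mean constraint, but the maximum-logical-entropy solution is the Euclidean projection of the uniform distribution onto the constraint set, so it lies closer to uniform than the MaxEnt (exponential-family) solution.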