This chapter develops the basic notion of Shannon entropy and its interpretation in terms of distinctions: the minimum average number of yes-or-no questions that must be answered to distinguish all the “messages.” Shannon entropy is thus also a quantitative indicator of information-as-distinctions, and, accordingly, a “dit-bit transform” is defined that turns any simple, joint, conditional, or mutual logical entropy into the corresponding notion of Shannon entropy. One delicate point is that while logical entropy is always a non-negative measure in the sense of measure theory (indeed, a probability measure), we will later see that for three or more random variables the Shannon mutual information can be negative. This means that Shannon entropy can in general be characterized only as a signed measure, i.e., a measure that can take on negative values.
Published: Sep 2, 2021
Keywords: Shannon entropy; Logical entropy; Dit-bit transform; Measure; Venn diagrams
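
To make the dit-bit transform concrete, here is a minimal Python sketch. It is an illustration, not code from the chapter, and it assumes the standard definitions: logical entropy h(p) = sum_i p_i (1 - p_i), the probability that two independent draws from p yield distinct outcomes (a “dit”), and Shannon entropy H(p) = sum_i p_i log2(1/p_i). The transform substitutes log2(1/p_i) for each term 1 - p_i in the logical-entropy sum; the function names below are illustrative.

# Minimal sketch of the dit-bit transform (illustrative; assumes the
# standard definitions above, not code taken from the chapter).
from math import log2

def entropy_from_terms(p, term):
    """Generic sum  sum_i p_i * term(p_i)  over a probability vector p."""
    return sum(pi * term(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.25]

# Logical entropy: each outcome contributes its "dit count" 1 - p_i,
# the probability that a second independent draw differs from it.
h = entropy_from_terms(p, lambda pi: 1 - pi)

# Dit-bit transform: replace 1 - p_i with log2(1/p_i), the number of
# yes-or-no questions (bits) needed to single out outcome i.
H = entropy_from_terms(p, lambda pi: log2(1 / pi))

print(h)  # 0.625 -> logical entropy, equal to 1 - sum(p_i^2)
print(H)  # 1.5   -> Shannon entropy in bits

The same term-by-term substitution applied to the joint, conditional, and mutual logical entropies yields the corresponding Shannon notions, which is the sense in which the transform covers all four cases named in the abstract.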
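
The signed-measure point can likewise be checked numerically. A standard witness (an assumption of this sketch, not an example taken from the chapter) is the XOR distribution: X and Y independent fair bits and Z = X XOR Y, for which the three-way Shannon mutual information, computed by inclusion-exclusion over joint entropies, comes out to -1 bit.

# Sketch (illustrative): for three variables the Shannon mutual information
# I(X;Y;Z) can be negative. Witness: X, Y independent fair bits, Z = X XOR Y.
from itertools import product
from math import log2

# Joint distribution over (x, y, z): four equally likely triples.
joint = {(x, y, x ^ y): 0.25 for x, y in product([0, 1], repeat=2)}

def H(indices):
    """Shannon entropy of the marginal on the given coordinate indices."""
    marginal = {}
    for outcome, prob in joint.items():
        key = tuple(outcome[i] for i in indices)
        marginal[key] = marginal.get(key, 0.0) + prob
    return -sum(p * log2(p) for p in marginal.values() if p > 0)

# Inclusion-exclusion: I(X;Y;Z) = H(X)+H(Y)+H(Z)
#                                - H(X,Y) - H(X,Z) - H(Y,Z) + H(X,Y,Z)
I3 = (H([0]) + H([1]) + H([2])
      - H([0, 1]) - H([0, 2]) - H([1, 2])
      + H([0, 1, 2]))
print(I3)  # -1.0: a negative "overlap", so Shannon entropy is only a signed measure

Here every pair of variables is independent (pairwise mutual information 0), yet any two of them determine the third, which is what drives the three-way term negative. Logical entropy, being a probability measure, never takes negative values in this way.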