New Foundations for Information Theory: The Relationship Between Logical Entropy and Shannon Entropy

Publisher: Springer International Publishing
Copyright: © The Author(s), under exclusive license to Springer Nature Switzerland AG 2021
ISBN: 978-3-030-86551-1
Pages: 15–22
DOI: 10.1007/978-3-030-86552-8_2

Abstract

This chapter develops the basic notion of Shannon entropy and its interpretation in terms of distinctions, i.e., as the minimum average number of yes-or-no questions that must be answered to distinguish all the “messages.” Shannon entropy is thus also a quantitative indicator of information-as-distinctions, and, accordingly, a “dit-bit transform” is defined that turns any simple, joint, conditional, or mutual logical entropy into the corresponding notion of Shannon entropy. One delicate point is that while logical entropy is always a non-negative measure in the sense of measure theory (indeed, a probability measure), for three or more random variables the Shannon mutual information can be negative. This means that Shannon entropy can in general be characterized only as a signed measure, i.e., a measure that can take on negative values.
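Both claims in the abstract can be checked numerically. The sketch below is illustrative code, not taken from the chapter; the function names logical_entropy, shannon_entropy, and H are our own, and it assumes the standard formulas h(p) = 1 - Σ p_i² (logical entropy) and H(p) = Σ p_i log₂(1/p_i) (Shannon entropy). The dit-bit transform amounts to replacing each (1 - p_i) in h(p) = Σ p_i (1 - p_i) with log₂(1/p_i). The second half computes the triple mutual information of two independent fair bits and their XOR by inclusion-exclusion, which comes out to -1 bit.

```python
import math
from itertools import product
from collections import Counter

def logical_entropy(p):
    """Logical entropy h(p) = sum_i p_i*(1 - p_i) = 1 - sum_i p_i^2."""
    return sum(pi * (1 - pi) for pi in p)

def shannon_entropy(p):
    """Shannon entropy H(p) = sum_i p_i*log2(1/p_i), in bits.
    Dit-bit transform: replace (1 - p_i) in h(p) with log2(1/p_i)."""
    return sum(pi * math.log2(1 / pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.25]
print(logical_entropy(p))   # 0.625 (dits)
print(shannon_entropy(p))   # 1.5 (bits)

# Signed-measure example: X, Y independent fair bits, Z = X XOR Y.
joint = Counter()
for x, y in product([0, 1], repeat=2):
    joint[(x, y, x ^ y)] += 0.25

def H(axes):
    """Shannon entropy of the marginal of `joint` on the given coordinates."""
    marg = Counter()
    for outcome, pr in joint.items():
        marg[tuple(outcome[i] for i in axes)] += pr
    return shannon_entropy(marg.values())

# Triple mutual information I(X;Y;Z) by inclusion-exclusion:
I3 = (H([0]) + H([1]) + H([2])
      - H([0, 1]) - H([0, 2]) - H([1, 2])
      + H([0, 1, 2]))
print(I3)  # -1.0 bit
```

The last print shows I(X;Y;Z) = -1 bit: the Venn-diagram regions of Shannon entropy still add up correctly by inclusion-exclusion, but the innermost three-way region carries a negative value, which is exactly the signed-measure behavior the chapter describes.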

Published: Sep 2, 2021

Keywords: Shannon entropy; Logical entropy; Dit-bit transform; Measure; Venn diagrams
