New Foundations for Information Theory: Logical Entropy

Publisher
Springer International Publishing
Copyright
© The Author(s), under exclusive license to Springer Nature Switzerland AG 2021
ISBN
978-3-030-86551-1
Pages
1–13
DOI
10.1007/978-3-030-86552-8_1

Abstract

This book presents a new foundation for information theory where the notion of information is defined in terms of distinctions, differences, distinguishability, and diversity. The direct measure is logical entropy which is the quantitative measure of the distinctions made by a partition. Shannon entropy is a transform or re-quantification of logical entropy for Claude Shannon’s “mathematical theory of communications.” The interpretation of the logical entropy of a partition is the two-draw probability of getting a distinction of the partition (a pair of elements distinguished by the partition) so it realizes a dictum of Gian-Carlo Rota:

$$\frac{\text{Probability}}{\text{Subsets}} \approx \frac{\text{Information}}{\text{Partitions}}.$$

Andrei Kolmogorov suggested that information should be defined independently of probability, so logical entropy is first defined in terms of the set of distinctions of a partition and then a probability measure on the set defines the quantitative version of logical entropy. We give a history of the logical entropy formula that goes back to Corrado Gini’s 1912 “index of mutability” and has been rediscovered many times.
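
As a reading aid (not part of the chapter itself), the following is a minimal Python sketch of the two quantities the abstract contrasts: for a partition with block probabilities p_i, the logical entropy is the two-draw probability of getting a distinction, 1 − Σ p_i², while the Shannon entropy is −Σ p_i log₂ p_i. The example partition and all helper names are illustrative assumptions, not taken from the book.

```python
from math import log2

def logical_entropy(block_probs):
    # Two-draw probability of a distinction: 1 - sum of squared block probabilities.
    return 1.0 - sum(p * p for p in block_probs)

def shannon_entropy(block_probs):
    # Shannon entropy in bits: -sum p_i log2(p_i); blocks with p = 0 contribute nothing.
    return -sum(p * log2(p) for p in block_probs if p > 0)

def logical_entropy_by_pairs(partition, n):
    # Equivalent count over the set of distinctions: the fraction of ordered pairs
    # (u, v) that land in different blocks, assuming equiprobable elements 1..n.
    block_of = {u: i for i, block in enumerate(partition) for u in block}
    distinctions = sum(
        1
        for u in range(1, n + 1)
        for v in range(1, n + 1)
        if block_of[u] != block_of[v]
    )
    return distinctions / (n * n)

if __name__ == "__main__":
    # Hypothetical partition of U = {1, 2, 3, 4} into blocks {1, 2}, {3}, {4}.
    partition = [{1, 2}, {3}, {4}]
    probs = [len(block) / 4 for block in partition]        # [0.5, 0.25, 0.25]
    print(logical_entropy(probs))                          # 0.625
    print(logical_entropy_by_pairs(partition, 4))          # 0.625, same value
    print(shannon_entropy(probs))                          # 1.5 bits
```

The agreement of the first two outputs illustrates the abstract's point that the probability formula 1 − Σ p_i² is just the quantitative version of counting distinctions (pairs separated by the partition).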

Published: Sep 2, 2021

Keywords: Information-as-distinctions; Logical entropy; History of the formula
