New foundations for information theory : logical entropy and Shannon entropy / David Ellerman.
2021
Q370 .E45 2021
Concurrent users
Unlimited
Authorized users
Authorized users
Document Delivery Supplied
Can lend chapters, not whole ebooks
Details
Title
New foundations for information theory : logical entropy and Shannon entropy / David Ellerman.
Author
Ellerman, David
ISBN
9783030865528 (electronic bk.)
3030865525 (electronic bk.)
9783030865511
3030865517
Published
Cham : Springer, [2021]
Copyright
©2021
Language
English
Description
1 online resource : illustrations
Item Number
10.1007/978-3-030-86552-8 doi
Call Number
Q370 .E45 2021
Dewey Decimal Classification
003/.54
Summary
This monograph offers a new foundation for information theory based on the notion of information-as-distinctions, which is directly measured by logical entropy, and on its re-quantification as Shannon entropy, the fundamental concept for the theory of coding and communications. Information is based on distinctions, differences, distinguishability, and diversity. Information sets are defined that express the distinctions made by a partition, e.g., the inverse-image partition of a random variable, so they represent the pre-probability notion of information. Logical entropy is then a probability measure on the information sets: the probability that, on two independent trials, a distinction or "dit" of the partition will be obtained. The formula for logical entropy is a new derivation of an old formula that goes back to the early twentieth century and has been re-derived many times in different contexts. Because logical entropy is a probability measure, all the compound notions of joint, conditional, and mutual logical entropy are immediate. The Shannon entropy (which is not defined as a measure in the sense of measure theory) and its compound notions are then derived from a non-linear dit-to-bit transform that re-quantifies the distinctions of a random variable in terms of bits, so the Shannon entropy is the average number of binary distinctions, or bits, necessary to make all the distinctions of the random variable. Using a linearization method, all the set concepts in this logical information theory extend naturally to vector spaces in general, and to Hilbert spaces in particular, yielding a quantum logical information theory that provides the natural measure of the distinctions made in quantum measurement. Relatively short but dense in content, this work can serve as a reference for researchers and graduate students working in information theory, maximum entropy methods in physics, engineering, and statistics, and for all those with a special interest in a new approach to quantum information theory.
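In standard notation for a finite probability distribution $p = (p_1, \ldots, p_n)$ (notation assumed here, not given in the record), the formulas the summary refers to are, roughly:

$h(p) = \sum_i p_i (1 - p_i) = 1 - \sum_i p_i^2 \quad \text{(logical entropy)}$

$H(p) = \sum_i p_i \log_2 \frac{1}{p_i} \quad \text{(Shannon entropy)}$

The dit-to-bit transform mentioned above replaces each factor $1 - p_i$ in $h(p)$ by $\log_2(1/p_i)$, yielding $H(p)$; each formula is thus an average, of dit-probabilities in the logical case and of bit-counts in the Shannon case.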
Bibliography, etc. Note
Includes bibliographical references and index.
Access Note
Access limited to authorized users.
Digital File Characteristics
text file
PDF
Source of Description
Online resource; title from PDF title page (SpringerLink, viewed November 12, 2021).
Series
SpringerBriefs in philosophy, ISSN 2211-4556
Available in Other Form
Print version: 9783030865511
Table of Contents
Logical entropy
The relationship between logical entropy and Shannon entropy
The compound notions for logical and Shannon entropies
Further developments of logical entropy
Logical Quantum Information Theory
Conclusion
Appendix: Introduction to the logic of partitions.