Information theory : three theorems of Claude Shannon / Antoine Chambert-Loir.
2023
Concurrent users
Unlimited
Authorized users
Document Delivery Supplied
Can lend chapters, not whole ebooks
Details
Title
Information theory : three theorems of Claude Shannon / Antoine Chambert-Loir.
ISBN
9783031215612 (electronic bk.)
3031215613 (electronic bk.)
3031215605
9783031215605
Publication Details
Cham, Switzerland : Springer, 2023.
Language
English
Description
1 online resource
Item Number
10.1007/978-3-031-21561-2 doi
Call Number
Q360
Dewey Decimal Classification
003/.54
Summary
This book provides an introduction to information theory, focussing on Shannon's three foundational theorems of 1948–1949. Shannon's first two theorems, based on the notion of entropy in probability theory, specify the extent to which a message can be compressed for fast transmission and how errors introduced by poor transmission can be corrected. The third theorem, using Fourier theory, ensures that a signal can be reconstructed from a sufficiently fine sampling of it. These three theorems constitute the roadmap of the book. The first chapter studies the entropy of a discrete random variable and related notions. The second chapter, on compression and error correction, introduces the concept of coding, proves the existence of optimal codes and good codes (Shannon's first theorem), and shows how information can be transmitted in the presence of noise (Shannon's second theorem). The third chapter proves the sampling theorem (Shannon's third theorem) and looks at its connections with other results, such as the Poisson summation formula. Finally, there is a discussion of the uncertainty principle in information theory. Featuring a good supply of exercises (with solutions) and an introductory chapter covering the prerequisites, this text stems from lectures given to mathematics and computer science students at the beginning graduate level.
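For orientation, the entropy studied in the first chapter is the standard Shannon entropy of a discrete random variable X taking values with probabilities p_1, ..., p_n, usually written in LaTeX notation as
H(X) = -\sum_{i=1}^{n} p_i \log_2 p_i
where the base-2 logarithm expresses the result in bits; Shannon's first theorem states that such a source cannot, on average, be losslessly compressed below H(X) bits per symbol.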
Bibliography, etc. Note
Includes bibliographical references and index.
Access Note
Access limited to authorized users.
Source of Description
Online resource; title from PDF title page (SpringerLink, viewed March 22, 2023).
Series
Unitext. Matematica per il 3+2, 2038-5757
Unitext ; v. 144.
Available in Other Form
Print version: 9783031215605
Table of Contents
Elements of Theory of Probability
Entropy and Mutual Information
Coding
Sampling
Solutions to Exercises
Bibliography
Notation
Index.