001461138 000__ 03654cam\a22005777a\4500 001461138 001__ 1461138 001461138 003__ OCoLC 001461138 005__ 20230502014306.0 001461138 006__ m\\\\\o\\d\\\\\\\\ 001461138 007__ cr\un\nnnunnun 001461138 008__ 230320s2023\\\\sz\\\\\\ob\\\\001\0\eng\d 001461138 020__ $$a9783031215612$$q(electronic bk.) 001461138 020__ $$a3031215613$$q(electronic bk.) 001461138 020__ $$z3031215605 001461138 020__ $$z9783031215605 001461138 0247_ $$a10.1007/978-3-031-21561-2$$2doi 001461138 035__ $$aSP(OCoLC)1373338488 001461138 040__ $$aYDX$$beng$$cYDX$$dGW5XE$$dOCLCF 001461138 049__ $$aISEA 001461138 050_4 $$aQ360 001461138 08204 $$a003/.54$$223/eng/20230322 001461138 1001_ $$aChambert-Loir, Antoine,$$eauthor. 001461138 24510 $$aInformation theory :$$bthree theorems of Claude Shannon /$$cAntoine Chambert-Loir. 001461138 260__ $$aCham, Switzerland :$$bSpringer,$$c2023. 001461138 300__ $$a1 online resource 001461138 4901_ $$aLa Matematica per il 3+2,$$x2038-5757 001461138 4901_ $$aUnitext ;$$vv. 144 001461138 504__ $$aIncludes bibliographical references and index. 001461138 5050_ $$aElements of Theory of Probability -- Entropy and Mutual Information -- Coding -- Sampling -- Solutions to Exercises -- Bibliography -- Notation -- Index. 001461138 506__ $$aAccess limited to authorized users. 001461138 520__ $$aThis book provides an introduction to information theory, focussing on Shannon's three foundational theorems of 1948–1949. Shannon's first two theorems, based on the notion of entropy in probability theory, specify the extent to which a message can be compressed for fast transmission and how to erase errors associated with poor transmission. The third theorem, using Fourier theory, ensures that a signal can be reconstructed from a sufficiently fine sampling of it. These three theorems constitute the roadmap of the book. The first chapter studies the entropy of a discrete random variable and related notions. 
The second chapter, on compression and error correcting, introduces the concept of coding, proves the existence of optimal codes and good codes (Shannon's first theorem), and shows how information can be transmitted in the presence of noise (Shannon's second theorem). The third chapter proves the sampling theorem (Shannon's third theorem) and looks at its connections with other results, such as the Poisson summation formula. Finally, there is a discussion of the uncertainty principle in information theory. Featuring a good supply of exercises (with solutions), and an introductory chapter covering the prerequisites, this text stems from lectures given to mathematics/computer science students at the beginning graduate level. 001461138 588__ $$aOnline resource; title from PDF title page (SpringerLink, viewed March 22, 2023). 001461138 650_0 $$aInformation theory. 001461138 655_0 $$aElectronic books. 001461138 77608 $$iPrint version: $$z3031215605$$z9783031215605$$w(OCoLC)1347781524 001461138 830_0 $$aUnitext.$$pMatematica per il 3+2,$$x2038-5757 001461138 830_0 $$aUnitext ;$$vv. 144. 001461138 852__ $$bebk 001461138 85640 $$3Springer Nature$$uhttps://univsouthin.idm.oclc.org/login?url=https://link.springer.com/10.1007/978-3-031-21561-2$$zOnline Access$$91397441.1 001461138 909CO $$ooai:library.usi.edu:1461138$$pGLOBAL_SET 001461138 980__ $$aBIB 001461138 980__ $$aEBOOK 001461138 982__ $$aEbook 001461138 983__ $$aOnline 001461138 994__ $$a92$$bISE