Table of Contents
1. Information theory
1.1 Information measurement
1.2 Requirements for an information metric
2. Sources of information
2.1 Source coding
2.2 Extension of a memoryless discrete source
2.3 Prefix codes
2.4 The information unit
3. Source coding
3.1 Types of source codes
3.2 Construction of instantaneous codes
3.3 Kraft inequality
3.4 Huffman code
4. Information transmission
4.1 The concept of information theory
4.2 Joint information measurement
4.3 Conditional entropy
4.4 Model for a communication channel
4.5 Noiseless channel
4.6 Channel with independent output and input
4.7 Relations between the entropies
4.8 Mutual information
4.9 Channel capacity
5. Multiple access systems
5.1 Introduction
5.2 The Gaussian multiple access channel
5.3 The Gaussian channel with Rayleigh fading
5.4 The noncooperative multiple access channel
5.5 Multiple access in a dynamic environment
5.6 Analysis of the capacity for a Markovian multiple access channel
6. Code division multiple access
6.1 Introduction
6.2 Fundamentals of spread spectrum signals
6.3 Performance analysis of CDMA systems
6.4 Sequence design
7. The capacity of a CDMA system
7.1 Introduction
7.2 Analysis of a CDMA system with a fixed number of users and small SNR
7.3 CDMA system with a fixed number of users and high SNR
7.4 A tight bound on the capacity of a CDMA system
8. Theoretical cryptography
8.1 Introduction
8.2 Cryptographic aspects of computer networks
8.3 Principles of cryptography
8.4 Information theoretical aspects of cryptography
8.5 Mutual information for cryptosystems
Appendix A. Probability theory
Set theory and measure
Basic probability theory
Random variables
References
About the author
Index.