
Week | Course Content | Course Video
Overview: The philosophy behind information theory
Watch Online
Chapter 1: Introduction
Watch Online
Appendix A: Overview of Suprema and Limits
A.1 Supremum and maximum
A.2 Infimum and minimum
A.3 Boundedness and suprema operations
A.4 Sequences and their limits
Watch Online
Appendix A: Overview of Suprema and Limits
Review A.1-A.4
A.5 Equivalence
Watch Online
Appendix B: Overview of Probability and Random Processes
B.1 Probability space
B.2 Random variables and random processes
Watch Online
Appendix B: Overview of Probability and Random Processes
B.3 Statistical properties of random sources
Watch Online
Appendix B: Overview of Probability and Random Processes
B.5 Ergodicity and law of large numbers
B.6 Central limit theorem
B.7 Convexity, concavity and Jensen’s inequality
B.8 Lagrange multipliers technique and KKT conditions
Watch Online
Chapter 2: Information Measures for Discrete Systems
2.1.1 Self-information
2.1.2 Entropy
2.1.3 Properties of entropy
2.1.4 Joint entropy and conditional entropy
2.1.5 Properties of joint and conditional entropy
2.2 Mutual information
2.2.1 Properties of mutual information
2.3 Properties of entropy and mutual information for multiple random variables
Watch Online
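The quantities in 2.1-2.2 can all be computed directly from a joint distribution. A minimal sketch in Python (the 2x2 joint distribution below is a made-up example, not from the course):

```python
import math

def entropy(p):
    """Shannon entropy in bits of a probability vector p (2.1.2)."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# A made-up 2x2 joint distribution P(X, Y), just for illustration.
joint = [[0.4, 0.1],
         [0.1, 0.4]]

px = [sum(row) for row in joint]                       # marginal P(X)
py = [sum(row[j] for row in joint) for j in range(2)]  # marginal P(Y)

h_x = entropy(px)                                  # H(X) = 1 bit here
h_xy = entropy([v for row in joint for v in row])  # joint entropy H(X, Y)
h_y_given_x = h_xy - h_x                           # chain rule: H(Y|X) = H(X,Y) - H(X)
i_xy = entropy(py) - h_y_given_x                   # I(X;Y) = H(Y) - H(Y|X)
```

For this joint distribution the mutual information comes out positive, as it must, since X and Y are correlated.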
Chapter 2: Information Measures for Discrete Systems
2.4 Data processing inequality
2.5 Fano’s inequality
Watch Online
Chapter 2: Information Measures for Discrete Systems
2.6 Divergence and variational distance
Watch Online
Chapter 2: Information Measures for Discrete Systems
2.7 Convexity/concavity of information measures
2.8 Fundamentals of hypothesis testing
2.9 Rényi’s information measures
Watch Online
Chapter 3: Lossless Data Compression
3.1 Principles of data compression
3.2.1 Block codes for DMS
3.2.2 Block codes for stationary ergodic sources
Watch Online
Chapter 3: Lossless Data Compression
3.3 Variable-length codes for lossless data compression
3.3.1 Non-singular codes and uniquely decodable codes
3.3.2 Prefix or instantaneous codes
Watch Online
Chapter 3: Lossless Data Compression
3.3.3 Examples of binary prefix codes
Watch Online
Chapter 3: Lossless Data Compression
3.3.4 Universal lossless variable-length codes
Watch Online
Chapter 4: Data Transmission and Channel Capacity
4.3 Block codes for data transmission over DMCs (1/3)
Watch Online
Chapter 4: Data Transmission and Channel Capacity
4.3 Block codes for data transmission over DMCs (2/3)
Watch Online
Chapter 4: Data Transmission and Channel Capacity
4.3 Block codes for data transmission over DMCs (3/3)
Watch Online
Chapter 4: Data Transmission and Channel Capacity
4.5 Calculating channel capacity
4.5.1 Symmetric, weakly symmetric, and quasi-symmetric channels
4.5.2 Karush-Kuhn-Tucker condition for channel capacity
Watch Online
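The capacity calculations of 4.5 also have a standard numerical counterpart, the Blahut-Arimoto algorithm; the sketch below is illustrative and not the course's own presentation. For the binary symmetric channel with crossover probability 0.1, the iteration should approach the known capacity 1 - H_b(0.1), about 0.531 bits.

```python
import math

def blahut_arimoto(W, iters=200):
    """Capacity (in bits) of a DMC with transition matrix W[x][y] = P(y|x),
    estimated by the Blahut-Arimoto alternating-maximization iteration."""
    nx, ny = len(W), len(W[0])
    p = [1.0 / nx] * nx                          # start from the uniform input
    for _ in range(iters):
        # Backward channel: q(x|y) proportional to p(x) * W(y|x)
        q = [[p[x] * W[x][y] for x in range(nx)] for y in range(ny)]
        for y in range(ny):
            s = sum(q[y])
            q[y] = [v / s for v in q[y]]
        # Input update: p(x) proportional to exp(sum_y W(y|x) * ln q(x|y))
        r = [math.exp(sum(W[x][y] * math.log(q[y][x])
                          for y in range(ny) if W[x][y] > 0))
             for x in range(nx)]
        z = sum(r)
        p = [v / z for v in r]
    # Evaluate I(X;Y) at the final input distribution
    cap = 0.0
    for y in range(ny):
        py = sum(p[x] * W[x][y] for x in range(nx))
        for x in range(nx):
            if p[x] > 0 and W[x][y] > 0:
                cap += p[x] * W[x][y] * math.log2(W[x][y] / py)
    return cap

# Binary symmetric channel with crossover 0.1: capacity = 1 - H_b(0.1)
bsc = [[0.9, 0.1], [0.1, 0.9]]
cap = blahut_arimoto(bsc)
```

For symmetric channels like the BSC the uniform input is already optimal, so the iteration converges immediately; the loop matters for asymmetric channels.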
Chapter 4: Data Transmission and Channel Capacity
4.4 Example of polar codes for the BEC
Watch Online
Chapter 4: Data Transmission and Channel Capacity
4.6 Lossless joint source-channel coding and Shannon’s separation principle
Watch Online
Chapter 5: Differential Entropy and Gaussian Channels
5.1 Differential entropy
Watch Online
Chapter 5: Differential Entropy and Gaussian Channels
5.1 Differential entropy
5.2 Joint and conditional differential entropy, divergence, and mutual information
Watch Online
Chapter 5: Differential Entropy and Gaussian Channels
5.3 AEP for continuous memoryless sources
5.4 Capacity for discrete memoryless Gaussian channels
Watch Online
Chapter 5: Differential Entropy and Gaussian Channels
5.5 Capacity of uncorrelated parallel Gaussian channels
Watch Online
Chapter 5: Differential Entropy and Gaussian Channels
5.6 Capacity of correlated parallel Gaussian channels
5.7 Non-Gaussian discrete-time memoryless channels
5.8 Capacity of band-limited white Gaussian channel
Watch Online
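The parallel-Gaussian-channel capacities in 5.5-5.6 are achieved by water-filling the power budget over the subchannels. A minimal sketch (the noise variances and power budget below are illustrative numbers only):

```python
def water_fill(noise, power):
    """Water-filling over parallel Gaussian channels: allocate
    P_i = max(mu - N_i, 0), with the water level mu chosen by bisection
    so that the allocations sum to the total power budget."""
    lo, hi = 0.0, max(noise) + power
    for _ in range(100):
        mu = 0.5 * (lo + hi)
        if sum(max(mu - n, 0.0) for n in noise) > power:
            hi = mu          # water level too high: using more than the budget
        else:
            lo = mu
    return [max(mu - n, 0.0) for n in noise]

# Two subchannels with noise variances 1 and 2, total power 3:
# the water level settles at mu = 3, giving allocations [2, 1].
alloc = water_fill([1.0, 2.0], 3.0)
```

The better (lower-noise) subchannel receives more power, and a subchannel whose noise exceeds the water level gets none at all.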
Chapter 6: Lossy Data Compression and Transmission
6.1.1 Motivation
6.1.2 Distortion measures
6.1.3 Frequently used distortion measures
6.2 Fixed-length lossy data compression
Watch Online
Chapter 6: Lossy Data Compression and Transmission
6.3 Rate-distortion theorem
AEP for distortion typical set
Shannon’s lossy source coding theorem
Watch Online
Chapter 6: Lossy Data Compression and Transmission
6.4 Calculation of the rate-distortion function
6.4.2 Rate-distortion function for the squared error distortion
Watch Online
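The squared-error case in 6.4.2 has the classic closed form for a memoryless Gaussian source of variance sigma^2: R(D) = (1/2) log2(sigma^2 / D) for 0 < D <= sigma^2, and 0 beyond. A one-function sketch:

```python
import math

def rate_distortion_gaussian(var, d):
    """R(D) in bits per sample for a memoryless Gaussian source with
    variance var under squared-error distortion:
    (1/2) log2(var / D) on (0, var], and 0 for D >= var."""
    return 0.5 * math.log2(var / d) if d < var else 0.0

# Unit-variance source, allowed distortion 0.25: R = 0.5 * log2(4) = 1 bit/sample.
rate = rate_distortion_gaussian(1.0, 0.25)
```

Halving the distortion costs exactly half a bit per sample, which is the usual 6 dB-per-bit rule of thumb read off this formula.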
Chapter 6: Lossy Data Compression and Transmission
6.5 Lossy joint source-channel coding theorem
6.6 Shannon limit of communication systems (1/2)
Watch Online
Chapter 6: Lossy Data Compression and Transmission
6.6 Shannon limit of communication systems (2/2)
Watch Online
Preface and Introduction
Chapter 1: Generalized Information Measures for Arbitrary Systems with Memory
Watch Online
Chapter 2: General Data Compression Theorems
Watch Online
Chapter 3: Measure of Randomness for Stochastic Processes
Watch Online
Chapter 4: Channel Coding Theorems and Approximations of Output Statistics for Arbitrary Channels
Watch Online