Information Theory - Academic Year 107 (2018-2019)

This course is provided by the Department of Electrical Engineering, National Yang Ming Chiao Tung University.


Textbook:
Fady Alajaji and Po-Ning Chen, An Introduction to Single-User Information Theory, Springer, July 2018.

To get the full benefit of the course, please purchase the textbook!

Instructor: Prof. Po-Ning Chen, Department of Electrical Engineering
Credits: 3
Academic year: 107 (2018-2019)
Intended audience: Graduate students
Prerequisites: A basic understanding of probability, real analysis, and stochastic processes will be helpful for this course.
Course materials: lecture videos, syllabus, course calendar, lecture notes

Weekly lecture videos (each lecture is available to watch online or as an MP4 download):

Overview: The philosophy behind information theory

Chapter 1: Introduction

Appendix A: Overview on Suprema and Limits
A.1 Supremum and maximum
A.2 Infimum and minimum
A.3 Boundedness and suprema operations
A.4 Sequences and their limits

Appendix A: Overview on Suprema and Limits
Review of A.1-A.4
A.5 Equivalence

Appendix B: Overview in Probability and Random Processes
B.1 Probability space
B.2 Random variables and random processes

Appendix B: Overview in Probability and Random Processes
B.3 Statistical properties of random sources

Appendix B: Overview in Probability and Random Processes
B.5 Ergodicity and law of large numbers
B.6 Central limit theorem
B.7 Convexity, concavity and Jensen's inequality
B.8 Lagrange multipliers technique and KKT conditions
Chapter 2: Information Measures for Discrete Systems
2.1.1 Self-information
2.1.2 Entropy
2.1.3 Properties of entropy
2.1.4 Joint entropy and conditional entropy
2.1.5 Properties of joint and conditional entropy
2.2 Mutual information
2.2.1 Properties of mutual information

Chapter 2: Information Measures for Discrete Systems
2.4 Data processing inequality
2.5 Fano's inequality

Chapter 2: Information Measures for Discrete Systems
2.6 Divergence and variational distance

Chapter 2: Information Measures for Discrete Systems
2.7 Convexity/concavity of information measures
2.8 Fundamentals of hypothesis testing
2.9 Rényi's information measures

Chapter 3: Lossless Data Compression
3.1 Principles of data compression
3.2.1 Block codes for DMS
3.2.2 Block codes for stationary ergodic sources

Chapter 3: Lossless Data Compression
3.3 Variable-length codes for lossless data compression
3.3.1 Non-singular codes and uniquely decodable codes
3.3.2 Prefix or instantaneous codes

Chapter 3: Lossless Data Compression
3.3.3 Examples of binary prefix codes

Chapter 3: Lossless Data Compression
3.3.4 Universal lossless variable-length codes
Chapter 4: Data Transmission and Channel Capacity
4.3 Block codes for data transmission over DMCs (1/3)

Chapter 4: Data Transmission and Channel Capacity
4.3 Block codes for data transmission over DMCs (2/3)

Chapter 4: Data Transmission and Channel Capacity
4.3 Block codes for data transmission over DMCs (3/3)

Chapter 4: Data Transmission and Channel Capacity
4.5 Calculating channel capacity
4.5.1 Symmetric, weakly symmetric, and quasi-symmetric channels
4.5.2 Karush-Kuhn-Tucker conditions for channel capacity

Chapter 4: Data Transmission and Channel Capacity
4.4 Example of polar codes for the BEC

Chapter 4: Data Transmission and Channel Capacity
4.6 Lossless joint source-channel coding and Shannon's separation principle

Chapter 5: Differential Entropy and Gaussian Channels
5.1 Differential entropy

Chapter 5: Differential Entropy and Gaussian Channels
5.1 Differential entropy (continued)
5.2 Joint and conditional differential entropy, divergence and mutual information

Chapter 5: Differential Entropy and Gaussian Channels
5.3 AEP for continuous memoryless sources
5.4 Capacity of discrete-time memoryless Gaussian channels

Chapter 5: Differential Entropy and Gaussian Channels
5.5 Capacity of uncorrelated parallel Gaussian channels

Chapter 5: Differential Entropy and Gaussian Channels
5.6 Capacity of correlated parallel Gaussian channels
5.7 Non-Gaussian discrete-time memoryless channels
5.8 Capacity of the band-limited white Gaussian channel
Chapter 6: Lossy Data Compression and Transmission
6.1.1 Motivation
6.1.2 Distortion measures
6.1.3 Frequently used distortion measures
6.2 Fixed-length lossy data compression

Chapter 6: Lossy Data Compression and Transmission
6.3 Rate-distortion theorem
AEP for the distortion typical set
Shannon's lossy source coding theorem

Chapter 6: Lossy Data Compression and Transmission
6.4 Calculation of the rate-distortion function
6.4.2 Rate-distortion function for the squared error distortion

Chapter 6: Lossy Data Compression and Transmission
6.5 Lossy joint source-channel coding theorem
6.6 Shannon limit of communication systems (1/2)

Chapter 6: Lossy Data Compression and Transmission
6.6 Shannon limit of communication systems (2/2)

Preface and Introduction
Chapter 1: Generalized Information Measures for Arbitrary Systems with Memory

Chapter 2: General Data Compression Theorems

Chapter 3: Measure of Randomness for Stochastic Processes

Chapter 4: Channel Coding Theorems and Approximations of Output Statistics for Arbitrary Channels

Course Objectives

The purpose of this course is to present a concise, yet mathematically rigorous, introduction to the main pillars of information theory. It thus naturally focuses on the foundational concepts and indispensable results of the subject for single-user systems, where a single data source or message needs to be reliably processed and communicated over a noiseless or noisy point-to-point channel.

The first part of this course comprises six core chapters with accompanying problems, emphasizing the key topics of information measures, lossless and lossy data compression, channel coding, and joint source-channel coding. Two appendices cover necessary and supplementary material in real analysis, probability, and stochastic processes.

The second part of the course covers advanced topics concerning the information-theoretic limits of discrete-time single-user stochastic systems with arbitrary statistical memory (i.e., systems that are not necessarily stationary, ergodic, or information stable).

 

Course Bibliography

Fady Alajaji and Po-Ning Chen, An Introduction to Single-User Information Theory, Springer, July 2018.

The following is a list of recommended references:
1. A Student’s Guide to Coding and Information Theory, Stefan M. Moser and Po-Ning Chen, Cambridge University Press, January 2012.
2. Elements of Information Theory, Thomas M. Cover and Joy A. Thomas, 2nd edition, John Wiley & Sons, Inc., July 2006.
3. A First Course in Information Theory (Information Technology: Transmission, Processing, and Storage), Raymond W. Yeung, Plenum Pub Corp., May 2002.
4. Principles and Practice of Information Theory, Richard E. Blahut, Addison-Wesley, 1987.
5. Information Theory and Reliable Communication, Robert G. Gallager, John Wiley & Sons, 1968.
6. Information Theory, Robert B. Ash, Dover Publications, Inc., 1965.
7. Mathematical Foundations of Information Theory, A. I. Khinchin, Dover Publications, Inc., 1957.

 

Grading

Item  Percentage
Midterm exam  50%
Final exam  50%

The course calendar below provides the schedule of topics and exam dates.

Week  Date  Topics

Week 1 (2019/02/22)
  • A coherent introduction to the primary principles of single-user information theory
Week 2 (2019/03/01)
  • Holiday
Week 3 (2019/03/08)
  • Overview of suprema, limits, probability, and random processes (e.g., random variables, statistical properties of random processes, Markov chains, convergence of sequences of random variables, ergodicity and laws of large numbers, the central limit theorem, concavity and convexity, Jensen's inequality, Lagrange multipliers, and the Karush–Kuhn–Tucker (KKT) conditions for constrained optimization problems)
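Many of the inequalities reviewed this week are easy to probe numerically. The following sketch (an illustrative check, not part of the official course materials) verifies Jensen's inequality E[f(X)] >= f(E[X]) for the convex function f(x) = x^2, where the gap is exactly the variance of X:

```python
import random

# Jensen's inequality: for convex f, E[f(X)] >= f(E[X]).
# With f(x) = x^2, the gap E[X^2] - (E[X])^2 equals Var(X) >= 0.
random.seed(0)
xs = [random.uniform(0.0, 10.0) for _ in range(100_000)]

mean = sum(xs) / len(xs)                 # empirical E[X]
e_f = sum(x * x for x in xs) / len(xs)   # empirical E[f(X)]
f_e = mean * mean                        # f applied to empirical E[X]

assert e_f >= f_e   # Jensen's inequality (gap = empirical variance >= 0)
```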
Week 4 (2019/03/15)
  • Information measures for discrete systems and their properties (self-information, entropy, mutual information and divergence, data processing theorem, Fano’s inequality, Pinsker’s inequality, simple hypothesis testing, Neyman–Pearson lemma, Chernoff–Stein lemma, and Rényi’s information measures) 
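These definitions are straightforward to check numerically; the short sketch below (illustrative only, with a made-up joint distribution) computes entropies and mutual information via the identity I(X;Y) = H(X) + H(Y) - H(X,Y):

```python
import math

def entropy(pmf):
    """Shannon entropy in bits; terms with zero probability contribute 0."""
    return -sum(p * math.log2(p) for p in pmf if p > 0)

# A made-up joint pmf for (X, Y) on {0,1} x {0,1}
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal pmfs of X and Y
p_x = [sum(p for (x, _), p in joint.items() if x == v) for v in (0, 1)]
p_y = [sum(p for (_, y), p in joint.items() if y == v) for v in (0, 1)]

h_xy = entropy(joint.values())
# I(X;Y) = H(X) + H(Y) - H(X,Y); nonnegative, zero iff X and Y independent
i_xy = entropy(p_x) + entropy(p_y) - h_xy
```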
Week 5 (2019/03/22)
  • Fundamentals of lossless source coding (i.e., data compression): discrete memoryless sources, fixed-length (block) codes for asymptotically lossless compression, AEP, fixed-length source coding theorems for memoryless and stationary ergodic sources, entropy rate and redundancy, variable-length codes for lossless compression, variable-length source coding theorems for memoryless and stationary sources, prefix codes, Kraft inequality, Huffman codes, Shannon–Fano–Elias codes, and Lempel–Ziv codes 
Week 6 (2019/03/29)
  • Fundamentals of lossless source coding (i.e., data compression): discrete memoryless sources, fixed-length (block) codes for asymptotically lossless compression, AEP, fixed-length source coding theorems for memoryless and stationary ergodic sources, entropy rate and redundancy, variable-length codes for lossless compression, variable-length source coding theorems for memoryless and stationary sources, prefix codes, Kraft inequality, Huffman codes, Shannon–Fano–Elias codes, and Lempel–Ziv codes
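As a concrete companion to the Huffman-coding and Kraft-inequality material, here is a small sketch (an illustrative implementation, with a made-up pmf) that builds a binary Huffman code and checks that the Kraft sum is at most 1 and that the average codeword length lies within one bit of the entropy:

```python
import heapq
import math

def huffman_lengths(probs):
    """Codeword lengths of an optimal binary Huffman code for the given pmf."""
    # Heap entries: (subtree probability, unique tiebreak id, member symbols)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    uid = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:          # each merge adds one bit to every member
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, uid, s1 + s2))
        uid += 1
    return lengths

probs = [0.4, 0.2, 0.2, 0.1, 0.1]          # made-up source pmf
lengths = huffman_lengths(probs)
avg_len = sum(p * l for p, l in zip(probs, lengths))
kraft_sum = sum(2.0 ** -l for l in lengths)
H = -sum(p * math.log2(p) for p in probs)  # source entropy

# Kraft inequality holds, and H <= average length < H + 1
assert kraft_sum <= 1.0 + 1e-12 and H <= avg_len < H + 1
```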
Week 7 (2019/04/05)
  • Holiday
Week 8 (2019/04/12)
  • Fundamentals of channel coding: discrete memoryless channels, block codes for data transmission, channel capacity, coding theorem for discrete memoryless channels, calculation of channel capacity, channels with symmetric structures, lossless joint source–channel coding, and Shannon’s separation principle
Week 9 (2019/04/19)
  • Fundamentals of channel coding: discrete memoryless channels, block codes for data transmission, channel capacity, coding theorem for discrete memoryless channels, calculation of channel capacity, channels with symmetric structures, lossless joint source–channel coding, and Shannon’s separation principle
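For channels with symmetric structure the capacity has a closed form; the sketch below (illustrative, with an arbitrarily chosen crossover probability) checks the binary symmetric channel formula C = 1 - h2(eps) against a brute-force maximization of I(X;Y) over the input distribution:

```python
import math

def h2(p):
    """Binary entropy function, in bits."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_mutual_info(p, eps):
    """I(X;Y) for input X ~ Bernoulli(p) through a BSC(eps): H(Y) - H(Y|X)."""
    q = p * (1 - eps) + (1 - p) * eps   # output probability P(Y = 1)
    return h2(q) - h2(eps)

eps = 0.11                              # arbitrary crossover probability
# Capacity is the maximum of I(X;Y) over input distributions;
# a grid search recovers the closed form C = 1 - h2(eps) at p = 1/2.
best = max(bsc_mutual_info(p / 1000.0, eps) for p in range(1001))
closed_form = 1.0 - h2(eps)
```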
Week 10 (2019/04/26)
  • Midterm exam
Week 11 (2019/05/03)
  • Information measures for continuous alphabet systems and Gaussian channels: differential entropy, mutual information and divergence, AEP for continuous memoryless sources, capacity and channel coding theorem of discrete-time memoryless Gaussian channels, capacity of uncorrelated parallel Gaussian channels and the water-filling principle, capacity of correlated Gaussian channels, non-Gaussian discrete-time memoryless channels, and capacity of band-limited (continuous-time) white Gaussian channels
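The optimal power allocation for uncorrelated parallel Gaussian channels is the water-filling solution P_i = max(0, mu - N_i), with the water level mu set so the allocations sum to the power budget. A minimal sketch (illustrative, with made-up noise variances) using bisection on mu:

```python
import math

def water_filling(noise, total_power, tol=1e-12):
    """Water-filling over parallel Gaussian channels with noise variances
    `noise`: P_i = max(0, mu - N_i), with mu found by bisection so that
    the allocations sum to total_power."""
    lo, hi = min(noise), max(noise) + total_power
    while hi - lo > tol:
        mu = (lo + hi) / 2
        if sum(max(0.0, mu - n) for n in noise) > total_power:
            hi = mu
        else:
            lo = mu
    mu = (lo + hi) / 2
    return [max(0.0, mu - n) for n in noise]

noise = [1.0, 4.0]                       # made-up noise variances
P = water_filling(noise, total_power=5.0)
# Resulting capacity: sum of 0.5 * log2(1 + P_i / N_i) bits per channel use
C = sum(0.5 * math.log2(1 + p / n) for p, n in zip(P, noise))
```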
Week 12 (2019/05/10)
  • Fundamentals of lossy source coding and joint source–channel coding: distortion measures, rate–distortion theorem for memoryless sources, rate–distortion theorem for stationary ergodic sources, rate–distortion function and its properties, rate–distortion function for memoryless Gaussian sources, lossy joint source–channel coding theorem, and Shannon limit of communication systems
Week 13 (2019/05/17)
  • Fundamentals of lossy source coding and joint source–channel coding: distortion measures, rate–distortion theorem for memoryless sources, rate–distortion theorem for stationary ergodic sources, rate–distortion function and its properties, rate–distortion function for memoryless Gaussian sources, lossy joint source–channel coding theorem, and Shannon limit of communication systems
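One tractable instance of the rate–distortion function: a Bernoulli(p) source under the Hamming distortion has R(D) = h2(p) - h2(D) for 0 <= D < min(p, 1 - p), and R(D) = 0 beyond that. A short sketch (illustrative values only):

```python
import math

def h2(p):
    """Binary entropy function, in bits."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def rate_distortion_binary(p, D):
    """R(D) of a Bernoulli(p) source under Hamming distortion:
    h2(p) - h2(D) for 0 <= D < min(p, 1 - p), and 0 beyond that."""
    if D >= min(p, 1 - p):
        return 0.0
    return h2(p) - h2(D)

# At D = 0 lossless compression is required: R(0) = h2(p).
# As D grows the required rate drops, reaching 0 at D = min(p, 1 - p).
r = rate_distortion_binary(0.3, 0.1)
```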
Week 14 (2019/05/24)
  • General information measures: the information spectrum and quantile, and their properties
Week 15 (2019/05/31)
  • Advanced topics in lossless data compression: the fixed-length lossless data compression theorem for arbitrary sources, and the variable-length lossless data compression theorem for arbitrary sources
Week 16 (2019/06/07)
  • Holiday

Lecture Notes

Chapter (each available as a PDF download):
Overview: The philosophy behind information theory, and Chapter 1: Introduction
Appendix A: Overview on Suprema and Limits
Appendix B: Overview in Probability and Random Processes
Chapter 2: Information Measures for Discrete Systems
Chapter 3: Lossless Data Compression
Chapter 4: Data Transmission and Channel Capacity
Chapter 5: Differential Entropy and Gaussian Channels
Chapter 6: Lossy Data Compression and Transmission
Preface and Introduction
Chapter 1: Generalized Information Measures for Arbitrary Systems with Memory
Chapter 2: General Data Compression Theorems
Chapter 3: Measure of Randomness for Stochastic Processes
Chapter 4: Channel Coding Theorems and Approximations of Output Statistics for Arbitrary Channels