Information Theory - Academic Year 107


This course is provided by the Department of Electrical Engineering, National Yang Ming Chiao Tung University (國立陽明交通大學電機工程學系).

The purpose of this course is to present a concise, yet mathematically rigorous, introduction to the main pillars of information theory. It thus naturally focuses on the foundational concepts and indispensable results of the subject for single-user systems, where a single data source or message needs to be reliably processed and communicated over a noiseless or noisy point-to-point channel.

The first part of this course comprises six meticulously developed core chapters with accompanying problems, emphasizing the key topics of information measures, lossless and lossy data compression, channel coding, and joint source-channel coding. Two appendices covering necessary and supplementary material in real analysis and in probability and stochastic processes are also included.

The second part of the course covers advanced topics concerning the information-theoretic limits of discrete-time single-user stochastic systems with arbitrary statistical memory (i.e., systems that are not necessarily stationary, ergodic, or information stable).

Textbook:
Fady Alajaji and Po-Ning Chen, An Introduction to Single-User Information Theory, Springer, July 2018.

To get the most out of the course, please purchase the textbook!

Instructor: Prof. Po-Ning Chen (陳伯寧), Department of Electrical Engineering
Credits: 3
Academic Year: 107
Intended Audience: Graduate students
Prerequisites: A basic understanding of probability, real analysis, and stochastic processes will be helpful for this course.
Course Materials: Lecture videos, course syllabus, course calendar, lecture notes

 

Course Objectives

The purpose of this course is to present a concise, yet mathematically rigorous, introduction to the main pillars of information theory. It thus naturally focuses on the foundational concepts and indispensable results of the subject for single-user systems, where a single data source or message needs to be reliably processed and communicated over a noiseless or noisy point-to-point channel.

The first part of this course comprises six meticulously developed core chapters with accompanying problems, emphasizing the key topics of information measures, lossless and lossy data compression, channel coding, and joint source-channel coding. Two appendices covering necessary and supplementary material in real analysis and in probability and stochastic processes are also included.

The second part of the course covers advanced topics concerning the information-theoretic limits of discrete-time single-user stochastic systems with arbitrary statistical memory (i.e., systems that are not necessarily stationary, ergodic, or information stable).

 

Course Bibliography

Fady Alajaji and Po-Ning Chen, An Introduction to Single-User Information Theory, Springer, July 2018.

The following is a list of recommended references:
1. A Student’s Guide to Coding and Information Theory, Stefan M. Moser and Po-Ning Chen, Cambridge University Press, January 2012.
2. Elements of Information Theory, Thomas M. Cover and Joy A. Thomas, 2nd edition, John Wiley & Sons, Inc., July 2006.
3. A First Course in Information Theory (Information Technology: Transmission, Processing, and Storage), Raymond W. Yeung, Plenum Pub Corp., May 2002.
4. Principles and Practice of Information Theory, Richard E. Blahut, Addison-Wesley, 1987.
5. Information Theory and Reliable Communication, Robert G. Gallager, John Wiley & Sons, 1968.
6. Information Theory, Robert B. Ash, Dover Publications, Inc., 1965.
7. Mathematical Foundations of Information Theory, A. I. Khinchin, Dover Publications, Inc., 1957.

 

Grading

Item | Percentage
Midterm Exam | 50%
Final Exam | 50%

The course calendar below lists the weekly schedule and exam dates for reference.

Week (Date) | Topics

Week 1 (2019/02/22)
  • A coherent introduction to the primary principles of single-user information theory
Week 2 (2019/03/01)
  • Holiday
Week 3 (2019/03/08)
  • Overview of suprema, limits, probability, and random processes (e.g., random variables, statistical properties of random processes, Markov chains, convergence of sequences of random variables, ergodicity and laws of large numbers, central limit theorem, concavity and convexity, Jensen’s inequality, Lagrange multipliers, and the Karush–Kuhn–Tucker (KKT) conditions for constrained optimization problems)
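Two of the mathematical tools reviewed this week, Jensen's inequality and the law of large numbers, can be checked numerically. The following sketch (an illustration, not part of the course materials) simulates a fair die:

```python
import random

random.seed(0)

# Jensen's inequality: for a convex f, E[f(X)] >= f(E[X]).
# Here f(x) = x^2 and X is uniform over {1, ..., 6} (a fair die).
samples = [random.randint(1, 6) for _ in range(100_000)]
mean_of_square = sum(x * x for x in samples) / len(samples)
square_of_mean = (sum(samples) / len(samples)) ** 2
assert mean_of_square >= square_of_mean  # ~91/6 = 15.17 vs 3.5^2 = 12.25

# Weak law of large numbers: the sample mean approaches E[X] = 3.5.
sample_mean = sum(samples) / len(samples)
assert abs(sample_mean - 3.5) < 0.05
```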
Week 4 (2019/03/15)
  • Information measures for discrete systems and their properties (self-information, entropy, mutual information and divergence, data processing theorem, Fano’s inequality, Pinsker’s inequality, simple hypothesis testing, Neyman–Pearson lemma, Chernoff–Stein lemma, and Rényi’s information measures) 
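The discrete information measures covered this week can be computed directly from probability mass functions. The following sketch (an illustration, not code from the course) implements entropy, divergence, and mutual information in bits:

```python
from math import log2

def entropy(p):
    """Shannon entropy H(X) in bits of a probability vector p."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)

def divergence(p, q):
    """Kullback-Leibler divergence D(p||q) in bits (assumes p << q)."""
    return sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def mutual_information(joint):
    """I(X;Y) = D(P_XY || P_X P_Y) for a joint pmf given as a 2-D list."""
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    return sum(
        joint[i][j] * log2(joint[i][j] / (px[i] * py[j]))
        for i in range(len(px)) for j in range(len(py))
        if joint[i][j] > 0
    )

# A fair bit carries 1 bit of entropy; a biased one carries less.
assert entropy([0.5, 0.5]) == 1.0
assert entropy([0.9, 0.1]) < 1.0

# For independent X and Y, I(X;Y) = 0; for X = Y, I(X;Y) = H(X).
independent = [[0.25, 0.25], [0.25, 0.25]]
identical = [[0.5, 0.0], [0.0, 0.5]]
assert abs(mutual_information(independent)) < 1e-12
assert abs(mutual_information(identical) - 1.0) < 1e-12
```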
Week 5 (2019/03/22)
  • Fundamentals of lossless source coding (i.e., data compression): discrete memoryless sources, fixed-length (block) codes for asymptotically lossless compression, AEP, fixed-length source coding theorems for memoryless and stationary ergodic sources, entropy rate and redundancy, variable-length codes for lossless compression, variable-length source coding theorems for memoryless and stationary sources, prefix codes, Kraft inequality, Huffman codes, Shannon–Fano–Elias codes, and Lempel–Ziv codes 
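As an illustration of these topics (not taken from the textbook), the following sketch builds a binary Huffman code for an assumed four-symbol source, verifies the Kraft inequality, and checks that the average codeword length is within one bit of the source entropy:

```python
import heapq
from math import log2

def huffman_code(freqs):
    """Build a binary Huffman code for a dict {symbol: probability}.
    Returns {symbol: codeword string}."""
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(freqs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        # Merge the two least probable subtrees, prefixing 0 and 1.
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return heap[0][2]

def kraft_sum(code):
    """Sum of 2^{-len(w)}; at most 1 for any prefix code (Kraft inequality)."""
    return sum(2 ** -len(w) for w in code.values())

probs = {"a": 0.4, "b": 0.3, "c": 0.2, "d": 0.1}  # assumed example source
code = huffman_code(probs)
avg_len = sum(probs[s] * len(code[s]) for s in probs)
h = -sum(p * log2(p) for p in probs.values())

assert kraft_sum(code) <= 1.0
assert h <= avg_len < h + 1  # Huffman is within 1 bit of the entropy
```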
Week 6 (2019/03/29)
  • Fundamentals of lossless source coding (i.e., data compression): discrete memoryless sources, fixed-length (block) codes for asymptotically lossless compression, AEP, fixed-length source coding theorems for memoryless and stationary ergodic sources, entropy rate and redundancy, variable-length codes for lossless compression, variable-length source coding theorems for memoryless and stationary sources, prefix codes, Kraft inequality, Huffman codes, Shannon–Fano–Elias codes, and Lempel–Ziv codes
Week 7 (2019/04/05)
  • Holiday
Week 8 (2019/04/12)
  • Fundamentals of channel coding: discrete memoryless channels, block codes for data transmission, channel capacity, coding theorem for discrete memoryless channels, calculation of channel capacity, channels with symmetric structures, lossless joint source–channel coding, and Shannon’s separation principle
Week 9 (2019/04/19)
  • Fundamentals of channel coding: discrete memoryless channels, block codes for data transmission, channel capacity, coding theorem for discrete memoryless channels, calculation of channel capacity, channels with symmetric structures, lossless joint source–channel coding, and Shannon’s separation principle
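The capacity calculations mentioned above can be carried out numerically with the Blahut-Arimoto algorithm (named here as an illustration; the course may also derive capacities analytically). The sketch below recovers the closed-form capacity 1 - h2(0.1) of a binary symmetric channel:

```python
from math import log2

def blahut_arimoto(W, iters=500):
    """Capacity (bits/use) of a DMC with transition matrix W[x][y],
    computed by the Blahut-Arimoto alternating-maximization algorithm."""
    n, m = len(W), len(W[0])
    p = [1.0 / n] * n  # start from the uniform input distribution
    for _ in range(iters):
        # Output distribution induced by the current input distribution.
        q = [sum(p[x] * W[x][y] for x in range(n)) for y in range(m)]
        # Multiplicative update: p[x] proportional to p[x] * 2^{D(W(.|x)||q)}.
        c = [
            p[x] * 2 ** sum(
                W[x][y] * log2(W[x][y] / q[y]) for y in range(m) if W[x][y] > 0
            )
            for x in range(n)
        ]
        total = sum(c)
        p = [ci / total for ci in c]
    q = [sum(p[x] * W[x][y] for x in range(n)) for y in range(m)]
    return sum(
        p[x] * W[x][y] * log2(W[x][y] / q[y])
        for x in range(n) for y in range(m) if W[x][y] > 0
    )

def h2(p):
    """Binary entropy function in bits."""
    return -p * log2(p) - (1 - p) * log2(1 - p)

# Binary symmetric channel with crossover probability 0.1: C = 1 - h2(0.1).
bsc = [[0.9, 0.1], [0.1, 0.9]]
assert abs(blahut_arimoto(bsc) - (1 - h2(0.1))) < 1e-9
```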
Week 10 (2019/04/26)
  • Midterm Exam
Week 11 (2019/05/03)
  • Information measures for continuous alphabet systems and Gaussian channels: differential entropy, mutual information and divergence, AEP for continuous memoryless sources, capacity and channel coding theorem of discrete-time memoryless Gaussian channels, capacity of uncorrelated parallel Gaussian channels and the water-filling principle, capacity of correlated Gaussian channels, non-Gaussian discrete-time memoryless channels, and capacity of band-limited (continuous-time) white Gaussian channels
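The water-filling principle mentioned above can be implemented by bisecting on the water level. The following sketch (an illustration with assumed noise variances, not course code) allocates a power budget over three parallel Gaussian channels:

```python
from math import log2

def water_filling(noise, total_power, tol=1e-12):
    """Optimal power allocation over uncorrelated parallel Gaussian channels.
    noise[i] is the noise variance of channel i; returns (powers, level)
    with powers[i] = max(level - noise[i], 0) and sum(powers) = total_power."""
    lo, hi = min(noise), max(noise) + total_power
    while hi - lo > tol:  # bisection on the water level
        level = (lo + hi) / 2
        used = sum(max(level - n, 0.0) for n in noise)
        if used > total_power:
            hi = level
        else:
            lo = level
    level = (lo + hi) / 2
    return [max(level - n, 0.0) for n in noise], level

noise = [1.0, 4.0, 9.0]  # assumed noise variances for illustration
powers, level = water_filling(noise, total_power=10.0)
assert abs(sum(powers) - 10.0) < 1e-6

# Resulting capacity in bits per channel use: sum of 0.5*log2(1 + P_i/N_i).
capacity = sum(0.5 * log2(1 + p / n) for p, n in zip(powers, noise))
assert capacity > 0
```

With this budget the water level settles at 7.5, so the noisiest channel (variance 9) receives no power at all.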
Week 12 (2019/05/10)
  • Fundamentals of lossy source coding and joint source–channel coding: distortion measures, rate–distortion theorem for memoryless sources, rate–distortion theorem for stationary ergodic sources, rate–distortion function and its properties, rate–distortion function for memoryless Gaussian sources, lossy joint source–channel coding theorem, and Shannon limit of communication systems
Week 13 (2019/05/17)
  • Fundamentals of lossy source coding and joint source–channel coding: distortion measures, rate–distortion theorem for memoryless sources, rate–distortion theorem for stationary ergodic sources, rate–distortion function and its properties, rate–distortion function for memoryless Gaussian sources, lossy joint source–channel coding theorem, and Shannon limit of communication systems
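For a memoryless Gaussian source under squared-error distortion, the rate-distortion function has the closed form R(D) = max(0, (1/2) log2(sigma^2 / D)); the sketch below (illustrative only, not course code) evaluates it:

```python
from math import log2

def gaussian_rd(variance, D):
    """Rate-distortion function (bits/sample) of a memoryless Gaussian
    source with the given variance under squared-error distortion."""
    return 0.0 if D >= variance else 0.5 * log2(variance / D)

# Halving the allowed distortion costs an extra half bit per sample.
assert abs(gaussian_rd(1.0, 0.50) - 0.5) < 1e-12
assert abs(gaussian_rd(1.0, 0.25) - 1.0) < 1e-12

# Beyond the source variance, zero rate suffices (reproduce the mean).
assert gaussian_rd(1.0, 2.0) == 0.0
```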
Week 14 (2019/05/24)
  • General information measures: the information spectrum and the quantile, and their properties
Week 15 (2019/05/31)
  • Advanced topics in lossless data compression: fixed-length and variable-length lossless data compression theorems for arbitrary sources
Week 16 (2019/06/07)
  • Holiday

Lecture Notes

Chapter | Download
Overview: The philosophy behind information theory; Chapter 1 Introduction | PDF
Appendix A Overview on Suprema and Limits | PDF
Appendix B Overview in Probability and Random Processes | PDF
Chapter 2 Information Measures for Discrete Systems | PDF
Chapter 3 Lossless Data Compression | PDF
Chapter 4 Data Transmission and Channel Capacity | PDF
Chapter 5 Differential Entropy and Gaussian Channels | PDF
Chapter 6 Lossy Data Compression and Transmission | PDF
Preface and Introduction | PDF
Chapter 1 Generalized Information Measures for Arbitrary Systems with Memory | PDF
Chapter 2 General Data Compression Theorems | PDF
Chapter 3 Measure of Randomness for Stochastic Processes | PDF
Chapter 4 Channel Coding Theorems and Approximations of Output Statistics for Arbitrary Channels | PDF