This course is provided by the Department of Electrical Engineering, National Yang Ming Chiao Tung University.
Textbook:
Fady Alajaji and Po-Ning Chen, An Introduction to Single-User Information Theory, Springer, July 2018.
For the best learning experience, please purchase the textbook!
Instructor | Prof. Po-Ning Chen, Department of Electrical Engineering |
---|---|
Credits | 3 |
Academic Year | 2018–2019 (academic year 107) |
Intended Audience | Graduate students |
Prerequisites | A basic understanding of probability, real analysis, and stochastic processes will be helpful for this course. |
Course Materials | Course videos, syllabus, course calendar, lecture notes |
Week | Course Content | Video | Download |
---|---|---|---|
Overview: The philosophy behind information theory | Watch Online | Download MP4 | |
Chapter 1: Introduction | Watch Online | Download MP4 | |
Appendix A: Overview on Suprema and Limits A.1 Supremum and maximum A.2 Infimum and minimum A.3 Boundedness and suprema operations A.4 Sequences and their limits | Watch Online | Download MP4 | |
Appendix A: Overview on Suprema and Limits Review of A.1–A.4 A.5 Equivalence | Watch Online | Download MP4 | |
Appendix B: Overview in Probability and Random Processes B.1 Probability space B.2 Random variables and random processes | Watch Online | Download MP4 | |
Appendix B: Overview in Probability and Random Processes B.3 Statistical properties of random sources | Watch Online | Download MP4 | |
Appendix B: Overview in Probability and Random Processes B.5 Ergodicity and law of large numbers B.6 Central limit theorem B.7 Convexity, concavity and Jensen's inequality B.8 Lagrange multipliers technique and KKT conditions | Watch Online | Download MP4 | |
Chapter 2: Information Measures for Discrete Systems 2.1.1 Self-information 2.1.2 Entropy 2.1.3 Properties of entropy 2.1.4 Joint entropy and conditional entropy 2.1.5 Properties of joint and conditional entropy 2.2 Mutual information 2.2.1 Properties of mutual information | Watch Online | Download MP4 | |
Chapter 2: Information Measures for Discrete Systems 2.4 Data processing inequality 2.5 Fano's inequality | Watch Online | Download MP4 | |
Chapter 2: Information Measures for Discrete Systems 2.6 Divergence and variational distance | Watch Online | Download MP4 | |
Chapter 2: Information Measures for Discrete Systems 2.7 Convexity/concavity of information measures 2.8 Fundamentals of hypothesis testing 2.9 Rényi's information measures | Watch Online | Download MP4 | |
Chapter 3: Lossless Data Compression 3.1 Principles of data compression 3.2.1 Block codes for DMS 3.2.2 Block Codes for Stationary Ergodic Sources | Watch Online | Download MP4 | |
Chapter 3: Lossless Data Compression 3.3 Variable-Length Codes for Lossless Data Compression 3.3.1 Non-singular Codes and Uniquely Decodable Codes 3.3.2 Prefix or Instantaneous Codes | Watch Online | Download MP4 | |
Chapter 3: Lossless Data Compression 3.3.3 Examples of Binary Prefix Codes | Watch Online | Download MP4 | |
Chapter 3: Lossless Data Compression 3.3.4 Universal Lossless Variable-Length Codes | Watch Online | Download MP4 | |
Chapter 4: Data Transmission and Channel Capacity 4.3 Block codes for data transmission over DMCs (1/3) | Watch Online | Download MP4 | |
Chapter 4: Data Transmission and Channel Capacity 4.3 Block codes for data transmission over DMCs (2/3) | Watch Online | Download MP4 | |
Chapter 4: Data Transmission and Channel Capacity 4.3 Block codes for data transmission over DMCs (3/3) | Watch Online | Download MP4 | |
Chapter 4: Data Transmission and Channel Capacity 4.5 Calculating channel capacity 4.5.1 Symmetric, Weakly Symmetric, and Quasi-symmetric Channels 4.5.2 Karush–Kuhn–Tucker conditions for channel capacity | Watch Online | Download MP4 | |
Chapter 4: Data Transmission and Channel Capacity 4.4 Example of Polar Codes for the BEC | Watch Online | Download MP4 | |
Chapter 4: Data Transmission and Channel Capacity 4.6 Lossless joint source-channel coding and Shannon's separation principle | Watch Online | Download MP4 | |
Chapter 5: Differential Entropy and Gaussian Channels 5.1 Differential entropy | Watch Online | Download MP4 | |
Chapter 5: Differential Entropy and Gaussian Channels 5.1 Differential entropy 5.2 Joint and conditional differential entropy, divergence and mutual information | Watch Online | Download MP4 | |
Chapter 5: Differential Entropy and Gaussian Channels 5.3 AEP for continuous memoryless sources 5.4 Capacity of the discrete memoryless Gaussian channel | Watch Online | Download MP4 | |
Chapter 5: Differential Entropy and Gaussian Channels 5.5 Capacity of Uncorrelated Parallel Gaussian Channels | Watch Online | Download MP4 | |
Chapter 5: Differential Entropy and Gaussian Channels 5.6 Capacity of correlated parallel Gaussian channels 5.7 Non-Gaussian discrete-time memoryless channels 5.8 Capacity of the band-limited white Gaussian channel | Watch Online | Download MP4 | |
Chapter 6: Lossy Data Compression and Transmission 6.1.1 Motivation 6.1.2 Distortion measures 6.1.3 Frequently used distortion measures 6.2 Fixed-length lossy data compression | Watch Online | Download MP4 | |
Chapter 6: Lossy Data Compression and Transmission 6.3 Rate-distortion theorem AEP for distortion typical sets Shannon's lossy source coding theorem | Watch Online | Download MP4 | |
Chapter 6: Lossy Data Compression and Transmission 6.4 Calculation of the rate-distortion function 6.4.2 Rate-distortion function for the squared error distortion | Watch Online | Download MP4 | |
Chapter 6: Lossy Data Compression and Transmission 6.5 Lossy joint source-channel coding theorem 6.6 Shannon limit of communication systems (1/2) | Watch Online | Download MP4 | |
Chapter 6: Lossy Data Compression and Transmission 6.6 Shannon limit of communication systems (2/2) | Watch Online | Download MP4 | |
Preface and Introduction Chapter 1: Generalized Information Measures for Arbitrary Systems with Memory | Watch Online | Download MP4 | |
Chapter 2: General Data Compression Theorems | Watch Online | Download MP4 | |
Chapter 3: Measure of Randomness for Stochastic Processes | Watch Online | Download MP4 | |
Chapter 4: Channel Coding Theorems and Approximations of Output Statistics for Arbitrary Channels | Watch Online | Download MP4
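As a small illustration of the prefix-code material listed for the Chapter 3 lectures above (Sections 3.3.1–3.3.3), the following sketch checks Kraft's inequality, which every uniquely decodable binary code's codeword lengths must satisfy. The function name and the example code are our own illustrative choices, not material from the course itself.

```python
def kraft_sum(lengths):
    """Kraft sum of binary codeword lengths: sum over i of 2^(-l_i).

    A prefix (instantaneous) binary code with these codeword lengths
    exists if and only if the sum is at most 1.
    """
    return sum(2.0 ** (-length) for length in lengths)

# Lengths of the prefix code {0, 10, 110, 111}: the sum equals 1,
# so the code is "complete" (no codeword can be shortened).
print(kraft_sum([1, 2, 3, 3]))  # 1.0

# Lengths [1, 1, 2] violate Kraft's inequality, so no uniquely
# decodable binary code with these lengths exists.
print(kraft_sum([1, 1, 2]))  # 1.25
```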
Course Objectives
The purpose of this course is to present a concise, yet mathematically rigorous, introduction to the main pillars of information theory. It thus naturally focuses on the foundational concepts and indispensable results of the subject for single-user systems, where a single data source or message needs to be reliably processed and communicated over a noiseless or noisy point-to-point channel.
The first part of this course comprises six core chapters with accompanying problems, emphasizing the key topics of information measures, lossless and lossy data compression, channel coding, and joint source-channel coding. Two appendices cover necessary and supplementary material in real analysis and in probability and stochastic processes.
The second part of the course covers advanced topics concerning the information-theoretic limits of discrete-time single-user stochastic systems with arbitrary statistical memory (i.e., systems that are not necessarily stationary, ergodic, or information stable).
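To give a concrete flavor of two of the course's central quantities, the sketch below computes the binary entropy function and the resulting capacity of a binary symmetric channel, C = 1 − H(ε). This is an illustrative snippet with function names of our own choosing, not code from the course or textbook.

```python
import math

def binary_entropy(p):
    """Entropy H(p) in bits of a Bernoulli(p) source."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(eps):
    """Capacity C = 1 - H(eps) of a binary symmetric channel
    with crossover probability eps."""
    return 1.0 - binary_entropy(eps)

print(binary_entropy(0.5))  # 1.0 bit: a fair coin is maximally uncertain
print(bsc_capacity(0.1))    # about 0.531 bits per channel use
```

A fair coin attains the maximum entropy of 1 bit, and noise on the channel (ε > 0) strictly reduces the achievable rate below 1 bit per use.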
Course Bibliography
Fady Alajaji and Po-Ning Chen, An Introduction to Single-User Information Theory, Springer, July 2018.
The following is a list of recommended references:
1. A Student’s Guide to Coding and Information Theory, Stefan M. Moser and Po-Ning Chen, Cambridge University Press, January 2012.
2. Elements of Information Theory, Thomas M. Cover and Joy A. Thomas, 2nd edition, John Wiley & Sons, Inc., July 2006.
3. A First Course in Information Theory (Information Technology: Transmission, Processing, and Storage), Raymond W. Yeung, Plenum Pub Corp., May 2002.
4. Principles and Practice of Information Theory, Richard E. Blahut, Addison-Wesley, 1987.
5. Information Theory and Reliable Communication, Robert G. Gallager, John Wiley & Sons, 1968.
6. Information Theory, Robert B. Ash, Dover Publications, Inc., 1965.
7. Mathematical Foundations of Information Theory, A. I. Khinchin, Dover Publications, Inc., 1957.
Grading
Item | Percentage |
---|---|
Midterm Exam | 50% |
Final Exam | 50% |
This course calendar provides course progress and exam information for reference.
Week | Date | Course Progress |
---|---|---|
Week 1 | 2019/02/22 | |
Week 2 | 2019/03/01 | |
Week 3 | 2019/03/08 | |
Week 4 | 2019/03/15 | |
Week 5 | 2019/03/22 | |
Week 6 | 2019/03/29 | |
Week 7 | 2019/04/05 | |
Week 8 | 2019/04/12 | |
Week 9 | 2019/04/19 | |
Week 10 | 2019/04/26 | |
Week 11 | 2019/05/03 | |
Week 12 | 2019/05/10 | |
Week 13 | 2019/05/17 | |
Week 14 | 2019/05/24 | |
Week 15 | 2019/05/31 | |
Week 16 | 2019/06/07 | |
Lecture Notes
Chapter | Download Link |
---|---|
Overview: The philosophy behind information theory Chapter 1 Introduction | |
Appendix A Overview on Suprema and Limits | |
Appendix B Overview in Probability and Random Processes | |
Chapter 2 Information Measures for Discrete Systems | |
Chapter 3 Lossless Data Compression | |
Chapter 4 Data Transmission and Channel Capacity | |
Chapter 5 Differential Entropy and Gaussian Channels | |
Chapter 6 Lossy Data Compression and Transmission | |
Preface and Introduction | |
Chapter 1 Generalized Information Measures for Arbitrary Systems with Memory | |
Chapter 2 General Data Compression Theorems | |
Chapter 3 Measure of Randomness for Stochastic Processes | |
Chapter 4 Channel Coding Theorems and Approximations of Output Statistics for Arbitrary Channels |