## EECS 428 - Information Theory

**CATALOG DESCRIPTION:** Information measures and their properties: entropy, divergence, mutual information, channel capacity. Shannon's fundamental theorems for data compression and coding for noisy channels. Applications in communications, statistical inference, probability, physics.

*Prerequisites by course: EECS 302 (Probabilistic Systems and Random Signals).*

*Prerequisites by topic: Good understanding of basic probability. (A review of probability theory will be given in Week 1.)*

*This course fulfills the Theory Depth requirement.*

**REQUIRED TEXT:** Cover & Thomas, Elements of Information Theory, 2nd ed., Wiley, 2006.

**REFERENCE TEXTS:**

- R. G. Gallager, Information Theory and Reliable Communication, Wiley, 1968.
- D. J. MacKay, Information Theory, Inference and Learning Algorithms, Cambridge, 2004.

**COURSE DIRECTOR**: **Prof. Dongning Guo**

**PROBLEM SETS:** There will be weekly problem sets. Problem sets must be handed in by the end of the class (usually on Friday) in which they are due. Once during the quarter, each student may turn in a homework up to 72 hours past the due time; otherwise, late problem sets will not be accepted. You are encouraged to work on the problem sets on your own, consulting the textbook. Working together in small groups is encouraged whenever it helps you learn the material better; however, each person must write up and hand in their own solutions.


**DETAILED COURSE TOPICS:**

- Overview of information theory and its applications.
- Review of probability theory.
- Information measures (entropy, divergence, mutual information) and basic properties.
- Typical sets and the Asymptotic Equipartition Property.
- Data compression/lossless source coding.
- Entropy rates for stochastic processes, Markov chains.
- Huffman coding, Lempel-Ziv compression.
- Channel coding, channel capacity.
- The channel coding theorem for discrete memoryless channels.
- Converse to the channel coding theorem, joint source channel coding.
- Discrete and continuous-time Gaussian channels, band-limited channels.
- Source-channel separation.
- Rate distortion theory.
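To preview two of the central quantities above, the short sketch below (an illustration, not course material) computes the Shannon entropy of a discrete distribution and the capacity of a binary symmetric channel, C = 1 - H(p), both as defined in Cover & Thomas:

```python
import math

def entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p:
    C = 1 - H(p) bits per channel use."""
    return 1 - entropy([p, 1 - p])

# A fair coin carries exactly 1 bit of entropy.
print(entropy([0.5, 0.5]))   # 1.0

# A noiseless channel (p = 0) has capacity 1 bit per use;
# at p = 0.5 the output is independent of the input and capacity is 0.
print(bsc_capacity(0.0))
print(bsc_capacity(0.5))
```

The convention 0 log 0 = 0 (handled here by skipping zero-probability terms) matches the textbook's definition of entropy.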

**GRADES:**

- Problem sets: 30%
- Midterm exam: 25%
- Final Exam: 45%
