ENEE 627: Information Theory
Spring 2015

The following schedule lists all the material discussed in class.
The division of material among the lectures is approximate.
The final exam is based on this list of topics.

Lecture 1. Introduction to the course. The notions of entropy and mutual information. (CT: Introduction)
Lecture 2. Entropy, joint and conditional entropy, divergence. (CT: 2.1-2.6)
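A quick numerical companion to these definitions, as a minimal Python sketch (the joint distribution below is invented for the example):

```python
import math

def H(p):
    """Entropy in bits of a distribution given as a list of probabilities."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def D(p, q):
    """Divergence D(p || q) in bits; assumes q[i] > 0 wherever p[i] > 0."""
    return sum(x * math.log2(x / y) for x, y in zip(p, q) if x > 0)

# Illustrative joint distribution p(x, y) on a 2x2 alphabet.
pxy = [[1/2, 1/4],
       [1/8, 1/8]]
px = [sum(row) for row in pxy]                # marginal of X
py = [sum(col) for col in zip(*pxy)]          # marginal of Y
joint = [p for row in pxy for p in row]

print("H(X)   =", H(px))                      # 0.8113 bits
print("H(X,Y) =", H(joint))                   # 1.75 bits
print("H(Y|X) =", H(joint) - H(px))           # chain rule: H(X,Y) - H(X)
print("I(X;Y) =", H(px) + H(py) - H(joint))   # mutual information
print("D(px || uniform) =", D(px, [1/2, 1/2]))
```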
Lecture 3. Convexity and inequalities (CT: 2.6, 2.7)
Lecture 4. Convexity of I(X;Y), Data processing, Fano's inequality (CT: 2.7, 2.8, 2.10)
Lecture 5. The Asymptotic Equipartition Property (CT: 3.1-3.3)
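The AEP is easy to observe numerically: for a long i.i.d. sequence, -(1/n) log2 p(X^n) concentrates near H(X). A small Monte Carlo sketch (Bernoulli source; parameters chosen for illustration):

```python
import math, random

p, n, trials = 0.3, 1000, 5
Hp = -p * math.log2(p) - (1 - p) * math.log2(1 - p)   # H(p) ~ 0.8813 bits

for _ in range(trials):
    x = [1 if random.random() < p else 0 for _ in range(n)]
    k = sum(x)                                        # number of ones
    # -(1/n) log2 p(x^n) for an i.i.d. Bernoulli(p) sequence
    rate = -(k * math.log2(p) + (n - k) * math.log2(1 - p)) / n
    print(f"-(1/n) log p(x^n) = {rate:.4f}   (H(p) = {Hp:.4f})")
```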
Lecture 6. Types: a refinement of typical sequences (CT: 11.1, 11.2)
Lecture 7. Data compression, unique decodability, Kraft's inequality (CT: 5.1, 5.2)
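Kraft's inequality is simple enough to check directly; a minimal sketch:

```python
def kraft_sum(lengths, D=2):
    """Kraft sum for codeword lengths over a D-ary code alphabet."""
    return sum(D ** (-l) for l in lengths)

# A prefix code with these lengths exists iff the sum is at most 1.
print(kraft_sum([1, 2, 3, 3]))   # 1.0   -> realizable, e.g. 0, 10, 110, 111
print(kraft_sum([1, 2, 2, 3]))   # 1.125 -> no uniquely decodable code
```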
Lecture 8. Optimal prefix codes (CT: 5.3, 5.4). Block codes. McMillan's theorem (CT: 5.5)
Lecture 9. Huffman codes, optimality.
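A compact illustration of the Huffman construction (the standard heap-based version; the example distribution is dyadic, so the average length meets the entropy exactly):

```python
import heapq
from itertools import count

def huffman(probs):
    """Binary Huffman code for {symbol: probability}; returns {symbol: codeword}."""
    tiebreak = count()                      # avoids comparing trees on equal probs
    heap = [(p, next(tiebreak), sym) for sym, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, t0 = heapq.heappop(heap)     # merge the two least likely subtrees
        p1, _, t1 = heapq.heappop(heap)
        heapq.heappush(heap, (p0 + p1, next(tiebreak), (t0, t1)))
    code = {}
    def walk(tree, prefix):
        if isinstance(tree, tuple):
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:
            code[tree] = prefix or "0"      # single-symbol edge case
    walk(heap[0][2], "")
    return code

print(huffman({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}))
# lengths 1, 2, 3, 3 -- average length equals H(X) for this dyadic source
```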
Lecture 10. Continuation of the above material (the preceding topics took ten lectures in total).
Lecture 11. Shannon-Fano-Elias codes, optimality (CT: 5.6-5.10). Universal coding (CT: Ch. 13). Arithmetic coding (CT: 13.2, 13.3)
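For reference, a minimal sketch of the Shannon-Fano-Elias construction (the distribution is a textbook-style example; double precision is adequate at this scale):

```python
import math

def sfe_code(probs):
    """Shannon-Fano-Elias code: truncate the midpoint of each symbol's
    cumulative interval to ceil(log2 1/p) + 1 bits."""
    code, F = {}, 0.0
    for sym, p in probs.items():
        mid = F + p / 2                         # modified cumulative F-bar(x)
        l = math.ceil(math.log2(1 / p)) + 1     # codeword length
        bits = ""
        for _ in range(l):                      # truncated binary expansion of mid
            mid *= 2
            bits += "1" if mid >= 1 else "0"
            mid -= int(mid)
        code[sym] = bits
        F += p
    return code

print(sfe_code({"a": 0.25, "b": 0.5, "c": 0.125, "d": 0.125}))
# {'a': '001', 'b': '10', 'c': '1101', 'd': '1111'} -- all prefix-free
```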
Lecture 12. LZ78 and its optimality.
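A minimal LZ78 parser (illustrative only; a real encoder would also binarize the (index, character) pairs):

```python
def lz78_parse(s):
    """LZ78 parsing: split s into phrases, each a previously seen phrase
    plus one new character; emit (phrase index, new char) pairs."""
    dictionary, phrases = {"": 0}, []
    w = ""
    for ch in s:
        if w + ch in dictionary:
            w += ch                      # extend the current match
        else:
            dictionary[w + ch] = len(dictionary)
            phrases.append((dictionary[w], ch))
            w = ""
    if w:                                # leftover phrase at end of input
        phrases.append((dictionary[w], ""))
    return phrases

print(lz78_parse("abbababbabb"))
# [(0, 'a'), (0, 'b'), (2, 'a'), (3, 'b'), (4, 'b')]
```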
Lecture 13. Channel capacity. Definition and examples (CT: 7.1 + notes).
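Two of the standard examples in closed form, as a small Python sketch:

```python
import math

def h2(p):
    """Binary entropy function in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# C(BSC(p)) = 1 - h(p),  C(BEC(e)) = 1 - e.
for p in (0.0, 0.11, 0.5):
    print(f"BSC({p}): C = {1 - h2(p):.4f} bits/use")
print("BEC(0.3): C =", 1 - 0.3, "bits/use")
```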
Lecture 14. Geometric proof of Shannon's theorem for the BSC (notes)
Lecture 15. Jointly typical sequences. Direct part of capacity theorem for DMC. (CT: 7.4-7.7)
Lecture 16. Converse for a DMC (CT: 7.9)
Lecture 17. Effective version of Shannon's theorem: Polar codes, I   (notes)
Lecture 18. Effective version of Shannon's theorem: Polar codes, II  (notes)
Lecture 19. Effective version of Shannon's theorem: Polar codes, III (notes)
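Polarization is easiest to see on the BEC, where one polar step turns BEC(e) into the pair BEC(2e - e^2) and BEC(e^2) in closed form. A minimal sketch (erasure probability, depth, and threshold chosen for illustration):

```python
# Polarization of BEC(eps): each step splits every synthetic channel BEC(e)
# into a worse channel BEC(2e - e^2) and a better channel BEC(e^2).
eps, steps = 0.5, 12            # 2**12 = 4096 synthetic channels

channels = [eps]
for _ in range(steps):
    channels = [z for e in channels for z in (2 * e - e * e, e * e)]

good = sum(1 for e in channels if e < 1e-3)
print(f"fraction with erasure prob < 1e-3: {good / len(channels):.3f}")
print(f"capacity 1 - eps = {1 - eps}")   # the good fraction approaches this
```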
   Recap of channel coding
Lecture 20. Feedback capacity. Source-channel separation. (CT: 7.12-7.13)
Lecture 21. Continuous RVs. Differential entropy (CT: 8.1, 8.2)
Lecture 22. Mutual information for continuous RVs. Among RVs with a given variance, the Gaussian has the largest h(X). (CT: 8.5-8.6)
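A closed-form comparison illustrating the theorem (uniform vs. Gaussian at equal variance; the variance value is chosen for illustration):

```python
import math

sigma2 = 1.0   # common variance

# Differential entropies in bits, both in closed form:
h_gauss = 0.5 * math.log2(2 * math.pi * math.e * sigma2)
a = math.sqrt(12 * sigma2)                  # Uniform[0, a] has variance a^2 / 12
h_unif = math.log2(a)

print(f"h(Gaussian) = {h_gauss:.4f} bits")  # ~2.0471
print(f"h(Uniform)  = {h_unif:.4f} bits")   # ~1.7925, smaller, as the theorem predicts
```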
Lecture 23. Discrete-time Gaussian channel. Geometric heuristics for Shannon's capacity theorem (CT: 9.1-9.2)
Lecture 24. Proof of the Capacity Theorem (CT: 9.2-9.4)
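The resulting formula C = (1/2) log2(1 + P/N), evaluated in a small sketch (SNR values chosen for illustration):

```python
import math

def awgn_capacity(P, N):
    """Capacity of the discrete-time Gaussian channel: power P, noise variance N."""
    return 0.5 * math.log2(1 + P / N)

for snr_db in (0, 10, 20):
    snr = 10 ** (snr_db / 10)
    print(f"SNR = {snr_db:2d} dB: C = {awgn_capacity(snr, 1.0):.3f} bits/use")
```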
Lecture 25. Rate-Distortion function; quantization, examples (CT: 10.2)
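The standard worked example: a Bernoulli(p) source with Hamming distortion has R(D) = h(p) - h(D) for 0 <= D <= min(p, 1-p), and R(D) = 0 beyond. A minimal sketch:

```python
import math

def h2(x):
    """Binary entropy function in bits."""
    return 0.0 if x in (0.0, 1.0) else -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def R(D, p=0.5):
    """Rate-distortion function of a Bernoulli(p) source, Hamming distortion."""
    return max(h2(p) - h2(D), 0.0) if D < min(p, 1 - p) else 0.0

for D in (0.0, 0.05, 0.11, 0.25):
    print(f"D = {D}: R(D) = {R(D):.4f} bits/symbol")
```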
Lecture 26. Proof of the Rate-Distortion theorem (CT: 10.3-10.4)
Lecture 27. Problems of Network Information Theory. The Slepian-Wolf theorem (encoding of correlated sources; CT: 15.4)
Lecture 28. The Slepian-Wolf theorem, continued (CT: 15.4). Implementation with linear codes. Generation of secret keys.
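A toy version of the syndrome ("binning") implementation with a linear code, assuming X and Y disagree in at most one of every 7 positions; the [7,4] Hamming code then compresses X to 3 bits per block given Y at the decoder (the matrix and setup are illustrative):

```python
import numpy as np

# Parity-check matrix of the [7,4] Hamming code (columns are 1..7 in binary).
H = np.array([[0, 0, 0, 1, 1, 1, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [1, 0, 1, 0, 1, 0, 1]], dtype=int)

def syndrome(v):
    return tuple(H.dot(v) % 2)

# Coset leaders: the minimum-weight pattern for each of the 8 syndromes.
leaders = {syndrome(np.zeros(7, dtype=int)): np.zeros(7, dtype=int)}
for i in range(7):
    e = np.zeros(7, dtype=int); e[i] = 1
    leaders[syndrome(e)] = e

rng = np.random.default_rng(0)
y = rng.integers(0, 2, 7)               # side information at the decoder
e = np.zeros(7, dtype=int); e[rng.integers(7)] = 1
x = (y + e) % 2                         # source; differs from y in one position

s = syndrome(x)                         # encoder sends only 3 bits instead of 7
e_hat = leaders[tuple((np.array(s) + H.dot(y)) % 2)]
x_hat = (y + e_hat) % 2                 # decoder: move y into the coset of s
print("recovered exactly:", np.array_equal(x, x_hat))
```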