ENEE 627(721): Information Theory
An analytical framework for modeling and evaluating
point-to-point communication systems is developed. The key concepts of
information content of a signal source and information capacity of a
transmission medium are precisely defined, and their relationship to
data compression algorithms and error control codes is examined in detail.
The course aims to instill an appreciation for the fundamental capabilities
and limitations of information transmission schemes, and to provide the
mathematical tools for applying these ideas to a broad class of problems.
Prerequisites. Required topics from probability and random processes
are: calculus of distributions including the multivariate Gaussian, the
Chebyshev and Jensen inequalities, conditional probability and independence,
sequences of independent random variables and laws of large numbers,
elements of Markov chains and stationary processes. It is important that
students be comfortable with the fundamentals of mathematical analysis or
advanced calculus (e.g., concepts of continuity, convexity, optimization
via Lagrange multipliers), and some linear algebra (up to diagonalization
of square symmetric matrices).
Texts:
- T.M. Cover and J.A. Thomas, Elements of Information Theory,
John Wiley (1991).
- R.G. Gallager, Information Theory and Reliable Communication,
John Wiley (1968).
- I. Csiszár and J. Körner, Information Theory: Coding Theorems
for Discrete Memoryless Systems, Academic Press (1981).
- R.J. McEliece, The Theory of Information and Coding,
Addison-Wesley (1977).
Course outline:
- Measures of information: entropy, relative entropy, and
mutual information. Basic inequalities (Jensen, log-sum, Fano, data
processing); the central definitions are written out after this outline.
- Data compression by fixed-to-variable-length codes:
Unique decodability and the prefix condition. Kraft inequality,
relationship of average codeword length to source entropy. Examples of
coding techniques: Huffman, Shannon-Fano-Elias, and universal (Lempel-Ziv)
coding; a Huffman/Kraft sketch appears after this outline.
- Data compression of discrete memoryless sources by
fixed-to-fixed-length codes. Typicality and the asymptotic equi-partition
property. Interpretation of entropy in this context (a small AEP
simulation appears below).
- Entropy rate of stationary sources. Stationary Markov sources:
entropy rate and data compression (a two-state example follows the outline).
- Discrete memoryless channels. Definition of capacity and its computation
(the binary symmetric channel is worked out after this outline).
- Communication over discrete channels: the channel coding
theorem and the physical significance of capacity. Feedback in
memoryless channels. The joint source-channel coding theorem. Elementary
parity-check codes; maximum likelihood decoding.
- Measures of information for sources with a continuous range.
- Discrete-time Gaussian channels and their capacity. Simple
and parallel configurations. White and colored noise. Extension to
band-limited waveform channels (a water-filling sketch appears below).
- Vector quantization of a discrete-time memoryless source
under a fidelity criterion. Definition of the rate distortion function
and its computation in simple cases. The rate distortion theorem (the
Gaussian case is written out after this outline).
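
Illustrative examples for several of the outline topics follow; the code
fragments are Python sketches added for illustration, not course materials.
First, the standard definitions behind the measures-of-information topic:
for discrete random variables X and Y, with p and q pmfs on a common
alphabet,

    H(X) = -\sum_x p(x)\log p(x), \qquad
    D(p\,\|\,q) = \sum_x p(x)\log\frac{p(x)}{q(x)},

    I(X;Y) = H(X) - H(X\mid Y) = D\bigl(p(x,y)\,\|\,p(x)\,p(y)\bigr).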
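
For fixed-to-variable-length coding, the sketch below (the function name
huffman_lengths and the example pmf are illustrative choices) computes
binary Huffman codeword lengths and checks the Kraft inequality
sum_i 2^(-l_i) <= 1; for the dyadic pmf used here the average length
equals the entropy.

    import heapq
    import math

    def huffman_lengths(probs):
        """Codeword lengths of a binary Huffman code for the pmf `probs`."""
        # Heap entries: (subtree probability, tie-breaker, symbols in subtree).
        heap = [(p, i, [i]) for i, p in enumerate(probs)]
        heapq.heapify(heap)
        lengths = [0] * len(probs)
        counter = len(probs)
        while len(heap) > 1:
            p1, _, s1 = heapq.heappop(heap)
            p2, _, s2 = heapq.heappop(heap)
            for i in s1 + s2:        # each merge adds one bit to these symbols
                lengths[i] += 1
            heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
            counter += 1
        return lengths

    probs = [0.5, 0.25, 0.125, 0.125]          # a dyadic pmf
    lengths = huffman_lengths(probs)           # -> [1, 2, 3, 3]
    kraft = sum(2.0 ** -l for l in lengths)    # = 1.0, satisfies Kraft
    avg = sum(p * l for p, l in zip(probs, lengths))
    H = -sum(p * math.log2(p) for p in probs)
    print(lengths, kraft, avg, H)              # avg equals H for a dyadic pmf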
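
For the AEP, a short simulation with an assumed Bernoulli(0.2) source shows
-(1/n) log2 p(X_1, ..., X_n) concentrating at the entropy H(X) as n grows:

    import math
    import random

    random.seed(0)
    p = 0.2                                     # Bernoulli(p) source
    H = -p * math.log2(p) - (1 - p) * math.log2(1 - p)   # about 0.722 bits

    for n in (10, 100, 10000):
        x = [1 if random.random() < p else 0 for _ in range(n)]
        k = sum(x)                              # number of ones observed
        # Log-probability of the observed sequence under the i.i.d. model.
        log_p = k * math.log2(p) + (n - k) * math.log2(1 - p)
        print(n, -log_p / n, H)                 # second column approaches H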
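
For a stationary Markov source, the entropy rate is the stationary average
of the row entropies of the transition matrix, H = sum_i pi_i H(P_i.); a
two-state illustration (the matrix is an assumed example):

    import math

    P = [[0.9, 0.1],
         [0.4, 0.6]]                 # transition matrix of a two-state chain

    # Stationary pmf of a two-state chain: pi_0 = P[1][0] / (P[0][1] + P[1][0]).
    pi0 = P[1][0] / (P[0][1] + P[1][0])          # = 0.8
    pi = [pi0, 1.0 - pi0]

    def row_entropy(row):
        return -sum(q * math.log2(q) for q in row if q > 0)

    # Entropy rate in bits per source symbol.
    rate = sum(pi[i] * row_entropy(P[i]) for i in range(2))
    print(rate)      # smaller than the marginal entropy H(pi) unless i.i.d.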
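
For discrete memoryless channels, capacity is the maximum of the mutual
information over input distributions, and the binary symmetric channel is
the standard worked case:

    C = \max_{p(x)} I(X;Y), \qquad
    C_{\mathrm{BSC}(p)} = 1 - h(p), \quad
    h(p) = -p\log_2 p - (1-p)\log_2(1-p).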
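
For parallel Gaussian channels, the capacity-achieving allocation is
water-filling: P_i = max(nu - N_i, 0), with the water level nu chosen so
the allocations sum to the power budget. A sketch using bisection on nu
(the function name water_fill and the numbers are illustrative):

    import math

    def water_fill(noise, P, iters=60):
        """Water-filling over channels with noise variances `noise`, budget P."""
        lo, hi = min(noise), max(noise) + P     # the water level lies in here
        for _ in range(iters):                  # bisection on the level nu
            nu = (lo + hi) / 2
            if sum(max(nu - N, 0.0) for N in noise) > P:
                hi = nu
            else:
                lo = nu
        return [max(nu - N, 0.0) for N in noise]

    noise = [1.0, 2.0, 4.0]
    powers = water_fill(noise, P=4.0)           # -> [2.5, 1.5, 0.0]
    C = sum(0.5 * math.log2(1 + p / N) for p, N in zip(powers, noise))
    print(powers, C)                            # capacity in bits per channel use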
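
For rate distortion, the memoryless Gaussian source of variance \sigma^2
under squared-error distortion is the simplest complete computation:

    R(D) = \begin{cases}
      \tfrac{1}{2}\log_2\frac{\sigma^2}{D}, & 0 < D \le \sigma^2,\\
      0, & D > \sigma^2.
    \end{cases}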
Additional topics:
- Arithmetic coding (CT: 5.8).
- Bandlimited Gaussian channels (CT: 10.3).
- Maximum entropy techniques (CT: 11).
- Asymptotics of hypothesis testing (CT: 12.7-12.9).
- Computation of channel capacity and the rate distortion function (a
Blahut-Arimoto sketch follows this list).
- Multiuser information theory (CT: 14).
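
The computation topic above (channel capacity and the rate distortion
function) is typically carried out with the Blahut-Arimoto alternating
optimization; below is a compact sketch of its channel-capacity form
(variable names are my own; the test channel is a BSC with crossover 0.1,
whose capacity is 1 - h(0.1), about 0.531 bits):

    import math

    def blahut_arimoto(W, iters=200):
        """Capacity (bits/use) of the DMC with transitions W[x][y] = p(y|x)."""
        m, n = len(W), len(W[0])
        r = [1.0 / m] * m                  # start from the uniform input pmf
        for _ in range(iters):
            # Output pmf induced by the current input pmf r.
            q = [sum(r[x] * W[x][y] for x in range(m)) for y in range(n)]
            # c[x] = exp( D( W(.|x) || q ) ), computed in nats.
            c = [math.exp(sum(W[x][y] * math.log(W[x][y] / q[y])
                              for y in range(n) if W[x][y] > 0))
                 for x in range(m)]
            Z = sum(r[x] * c[x] for x in range(m))
            r = [r[x] * c[x] / Z for x in range(m)]
        return math.log(Z, 2)              # log of the last normalizer -> C

    W = [[0.9, 0.1],
         [0.1, 0.9]]
    print(blahut_arimoto(W))               # about 0.531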