EE514A/EE515A - Information Theory I/II - Fall 2019 / Winter 2020 Quarters

Last updated: 12/02/2019-11:34:36


Prof. Jeff A. Bilmes --- Email me
Office: 418 EE/CS Bldg., +1 206 221 5236
Office hours: Tuesdays TBD, EEB-418 (+ online)

TA (for IT-I)

Neeraja Abhyankar, Office: EEB-417
Office hours: TBD


Class is held: Mo/We 2:30-4:20 MUE 154


Information Theory I and II

Description: This course will cover the basics of information theory.

Information Theory I: entropy, mutual information, asymptotic equipartition properties, data compression to the entropy limit (source coding theorem), Huffman, Lempel-Ziv, convolutional codes, communication at the channel capacity limit (channel coding theorem), method of types, differential entropy, maximum entropy.
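For a flavor of the first topic: the entropy of a discrete distribution is easy to compute directly from its definition. Here is a minimal illustrative sketch in Python (not part of the course materials):

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log2(p_i), in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# A fair coin carries 1 bit per flip; a biased coin carries less.
print(entropy([0.5, 0.5]))   # 1.0 bit
print(entropy([0.9, 0.1]))   # about 0.469 bits
```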

Information Theory II (EE515, Winter 2020): error-correcting codes (turbo, LDPC, and others), Kolmogorov complexity, spectral estimation, rate-distortion theory, alternating minimization for computing the rate-distortion curve and channel capacity, more on the Gaussian channel, network information theory, information geometry, and some recent results on the use of polymatroids in information theory.
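Several of the EE515 topics are algorithmic. For instance, the alternating-minimization computation of channel capacity (the Blahut-Arimoto algorithm) can be sketched in a few lines; this is an illustrative sketch only, not code from the course:

```python
import math

def blahut_arimoto(W, iters=200):
    """Capacity (in bits) of a discrete memoryless channel with matrix
    W[x][y] = P(y|x), via alternating maximization over the input
    distribution r(x)."""
    nx, ny = len(W), len(W[0])
    r = [1.0 / nx] * nx                       # start from the uniform input
    for _ in range(iters):
        # induced output distribution: q(y) = sum_x r(x) W(y|x)
        q = [sum(r[x] * W[x][y] for x in range(nx)) for y in range(ny)]
        # update: r(x) proportional to exp( sum_y W(y|x) log(W(y|x)/q(y)) )
        t = [math.exp(sum(w * math.log(w / q[y])
                          for y, w in enumerate(W[x]) if w > 0))
             for x in range(nx)]
        z = sum(t)
        r = [ti / z for ti in t]
    q = [sum(r[x] * W[x][y] for x in range(nx)) for y in range(ny)]
    # mutual information I(r; W) at the final input distribution
    return sum(r[x] * W[x][y] * math.log2(W[x][y] / q[y])
               for x in range(nx) for y in range(ny) if W[x][y] > 0)

# Binary symmetric channel, crossover 0.1: C = 1 - H(0.1), about 0.531 bits.
print(blahut_arimoto([[0.9, 0.1], [0.1, 0.9]]))
```

For the binary symmetric channel the uniform input is optimal, so the iteration converges immediately and the result matches the closed form C = 1 - H(p).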

Additional topics throughout will include information theory as it applies to pattern recognition, natural language processing, computer science and complexity, biological science, and communications.

Syllabus: see the slides from lecture 1.


Homework must be done and submitted electronically via the following link.

Lecture Slides

Lecture slides will be made available as they are prepared --- they will probably appear shortly before a given lecture, and they will be in PDF format (original source is LaTeX). Note that these slides are corrected after the lecture (and might also include some additional discussion we had during lecture). If you find bugs/typos in these slides, please email me.
Lec. # Slides Post Lecture Slides Lecture Date Contents
0 pdf pdf -- Preliminaries, probability, convexity, Jensen.
1 pdf pdf 9/25/19 Introduction, information, entropy
2 pdf pdf 9/30/19 Entropy, Mutual Information, KL-Divergence
3 pdf pdf 10/2/19 More KL, Jensen, more Venn, Log Sum, Data Proc. Inequality
4 pdf pdf 10/7/19 Data Proc. Ineq., thermodynamics, Stats, Fano, M. of Conv
5 pdf pdf 10/9/19 M. of Conv, AEP
6 pdf pdf 10/14/19 more AEP, Source Coding, Types
7 pdf pdf 10/16/19 Types, Univ. Src Coding, Stoc. Procs, Entropy Rates
8 pdf pdf 10/18/19 Entropy rates, HMMs, Coding
9 pdf pdf 10/28/19 Kraft ineq., Shannon Codes, Kraft ineq. II, Huffman
10 pdf pdf 10/30/19 Huffman, Shannon/Fano/Elias
11 pdf pdf 11/6/19 Huffman, Shannon/Fano/Elias, Games
12 pdf pdf 11/11/19 Arith. Coding, Background On Channel Capacity
13 pdf pdf 11/13/19 Channel Capacity, DMC
14 pdf pdf 11/18/19 Ex. DMC, Properties, Joint AEP, Shannon's 2nd Theorem.
15 pdf pdf 11/20/19 Joint AEP, Shannon's 2nd Theorem
16 pdf pdf 11/25/19 Zero Error Codes, 2nd Thm Conv, Zero Error, R=C, Feedback, Joint Thm, Coding, Hamming Codes
17 pdf pdf 11/27/19 Hamming Codes, Differential Entropy
18 pdf pdf 12/2/19 Diff. Entropy, Gaussian Diff. Entropy, Max Entropy, Gaussian Channel
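As a companion to the Huffman lectures above (lectures 9-11): Huffman's algorithm builds an optimal prefix code by repeatedly merging the two least-probable nodes. A minimal illustrative sketch (not from the course slides):

```python
import heapq

def huffman_lengths(probs):
    """Optimal prefix-code lengths for the given symbol probabilities,
    via Huffman's algorithm: repeatedly merge the two lightest nodes."""
    # heap entries: (probability, unique tiebreak id, symbol indices below)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, i2, s2 = heapq.heappop(heap)
        for s in s1 + s2:        # every symbol under the merge gains one bit
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, i2, s1 + s2))
    return lengths

# Dyadic probabilities: code lengths equal -log2(p), so L = H exactly.
print(huffman_lengths([0.5, 0.25, 0.125, 0.125]))  # [1, 2, 3, 3]
```

The expected code length this produces satisfies H(X) <= L < H(X) + 1, the bound shown in the source coding lectures.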

Discussion Board

You can post questions, discussion topics, or general information at this link.

Relevant Books

There are many books available that discuss some of the material we are covering in this course. See the end of the lecture slides for books/papers relevant to each specific lecture, and see lecture1.pdf for a description of our textbook (Cover and Thomas) and other books/papers relevant to this class.

Important Dates/Exceptions (also see the academic calendar)

Religious Accommodations