Applied Information Theory

CMPSCI 691GG • Fall 2006 • Tuesday and Thursday, 11:15-12:30 • Agricultural Engineering Building (AEBC) S308


Instructor: Erik Learned-Miller

Teaching Assistant: None


Prerequisites

Stats 515 or equivalent (basic college-level probability). Basic programming ability in C++, Matlab, or Java.

Reading Materials

• Required text: Elements of Information Theory, Second Edition, by Thomas Cover and Joy Thomas.

• NOTE: THIS IS THE SECOND EDITION, NOT THE FIRST EDITION! If you have a copy of the first edition, you may be able to use it, since the second edition is not too different. However, you should at least have access to the second edition to check page number references, problem set numbers, and other differences. I will not answer questions about the differences between the editions, as I do not have time to work them out. If you plan to use the first edition, make sure at least one friend has a copy of the second edition so you can look up the appropriate page references and problem numbers.


Handouts

• Test 2 Review

• Minimum entropy joint alignment

• Mutual Information Alignment

• David MacKay's Information Theory book (the electronic edition is free and on the web)

• Guest Lecturer John Fisher's slides

• Guest Lecturer Hossein Pishro-Nik's slides

• Assignment 3 Solutions (not quite complete)

Problem Sets

Problem sets are due on the day indicated on the course web page. If you turn them in after class, please slide them under my office door or put them in my faculty mailbox. The latest you can turn in a problem set is midnight on the due date. Not being able to get into the building after hours does not count as an excuse; if you're worried about that, turn it in earlier. I may take off 50% for problem sets turned in late.

If you want written feedback on your homework (such as how many points you got per problem), please hand it in on paper rather than by email.


Course Description

This course will introduce the basic concepts of information theory: entropy, relative entropy, mutual information, channel capacity, and rate distortion. Applications, rather than proofs, will be emphasized, as will the statistical problem of estimating information-theoretic quantities from data. Non-parametric and semi-parametric statistical models will also be covered.
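To give a flavor of the core quantities, they can all be computed directly from discrete distributions. A minimal sketch in Python (the function names and example distributions here are illustrative, not part of the course materials):

```python
import math

def entropy(p):
    """Shannon entropy H(X) in bits of a discrete distribution p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def relative_entropy(p, q):
    """KL divergence D(p||q) in bits; assumes q[i] > 0 wherever p[i] > 0."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def mutual_information(joint):
    """I(X;Y) = D( p(x,y) || p(x)p(y) ), with the joint given as a 2-D list."""
    px = [sum(row) for row in joint]                 # marginal of X (row sums)
    py = [sum(col) for col in zip(*joint)]           # marginal of Y (column sums)
    return sum(
        pxy * math.log2(pxy / (px[i] * py[j]))
        for i, row in enumerate(joint)
        for j, pxy in enumerate(row)
        if pxy > 0
    )

# A fair coin has 1 bit of entropy.
print(entropy([0.5, 0.5]))                                # 1.0
# Independent variables have zero mutual information.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))   # 0.0
# Perfectly correlated bits share 1 bit of information.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))       # 1.0
```

Note that mutual information is just the relative entropy between the joint distribution and the product of its marginals, which is why the three functions share the same log-ratio structure.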

Date | Lecture topic | New assignments | Assignments due | Reading
Sept. 7 | Introduction and Overview. Entropy as non-parametric likelihood. Mutual information as a measure of dependence. Dreams as decoding. Duality of compressibility and likelihood. Plus much, much more! | | | Chap. 2, C+T
Sept. 12 | Entropy, Relative entropy, Mutual Information | Assignment 1 posted | | Chap. 2, C+T
Sept. 14 | Entropy, Relative entropy, Mutual Information continued | | | Chap. 2, C+T
Sept. 19 | Convexity, Jensen's Inequality, Basic Information Theory Inequalities | | | Chap. 2
Sept. 21 | Fano's Inequality and the Asymptotic Equipartition Property | | Assignment 1 DUE | Chap. 3
Sept. 26 | Source Coding and Compression | | | Chap. 5
Sept. 28 | Source coding 2 | | | Chap. 5
Oct. 3 | Differential Entropy and Density Estimation with Parzen Windows | | | Beginning of Chap. 8
Oct. 5 | Minimum Entropy Joint Alignment Algorithms | | |
Oct. 10 | Intro to Independent Components Analysis | | |
Oct. 12 | Source Coding of Finite Sequences | Assignment 1 solutions posted | | Chap. 5
Oct. 17 | "Shannon codes" and achievability of near-entropy source coding. Huffman coding. | Assignment 2 posted: write a 1-page description of the paper, including your specific opinions about the validity of the theory presented. I want you to argue for it or against it. | | Chap. 5
Oct. 19 | Guest Lecture: John Fisher, MIT. Fusing information from different sensor types using mutual information. | | |
Oct. 24 | Intro to Channel Coding | | Assignment 2 DUE | Chap. 7
Oct. 26 | Channel coding theorem, part 1 | | | Chap. 7
Oct. 31 | Some simple practical coding schemes: Hamming codes | Assignment 3 posted: problems 1, 2, 3, 8, 13, 15, 20, and 30 from Chapter 7 | | Chap. 7
Nov. 2 | Guest Lecturer: Dennis Goeckel, ECE Department. Fading channels and wireless communication. ATTENDANCE REQUIRED! | | |
Nov. 9 | More Channel Coding | | |
Nov. 14 | Finish Channel Coding | | Assignment 3 DUE |
Nov. 16 | Independent Components Analysis, Tree-dependent components analysis, and other semi-parametric models | | |
Nov. 21 | TEST IN CLASS | | | Chapters 2, 3, 5, 7, plus class material through Channel Coding
Nov. 23 | Thanksgiving: NO CLASS | | |
Nov. 28 | Semi-parametric models continued | | | Tree-dependent Component Analysis (please read to the end of Section 3); also read Tree-structured Non-parametric distributions
Nov. 30 | Image alignment by maximization of mutual information | | |
Dec. 5 | | | |
Dec. 7 | | | |
Dec. 12 | Last day of class | | |