Instructor: Arya Mazumdar, arya@cs.umass.edu
Lectures: TuTh 11:30-12:45, CS 140
Office hours: Tu 15:00-16:00, CS 222
This course will introduce the basic concepts of Information Theory: entropy, relative entropy, mutual information, channel capacity, and rate distortion (data compression). Applications of these concepts will be emphasized. The notions of sparse-graph codes, message passing and belief propagation will also be introduced and studied in detail. Applications in inference algorithms and machine learning will be highlighted; in particular, topics of neural networks, statistical estimation, community detection and clustering will be covered.
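For reference, the central quantities named above have the following standard definitions (discrete random variables, logarithms base 2, notation as in Cover and Thomas):

\begin{align*}
H(X) &= -\sum_{x} p(x) \log p(x) && \text{entropy}\\
D(p\|q) &= \sum_{x} p(x) \log \frac{p(x)}{q(x)} && \text{relative entropy (KL divergence)}\\
I(X;Y) &= \sum_{x,y} p(x,y) \log \frac{p(x,y)}{p(x)\,p(y)} && \text{mutual information}\\
C &= \max_{p(x)} I(X;Y) && \text{channel capacity}
\end{align*}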
There is no required textbook; the following references are useful.
Thomas M. Cover and Joy A. Thomas, Elements of Information Theory, 2nd edition, Wiley.
David J. C. MacKay, Information Theory, Inference, and Learning Algorithms, Cambridge University Press; available as a free download from the author's website.
There will be four homework assignments, one midterm, and one final exam.
Homework: 25%
Scribing: 10%
Midterm: 30%
Final: 35%
Students will work in groups of two; each group submits a single homework reflecting the collaborative effort of both students. Assignments must be submitted at the beginning of class on the deadline. A submission up to one day late, delivered to the instructor's office, will incur a 20% penalty. Submissions will not be accepted after that without a substantial reason (such as a doctor's note).
Each group of two students will scribe the notes for one (or two) lectures in TeX; a template for the lecture notes will be provided. Scribing two lectures automatically earns the group full scribing points. Scribed notes must be submitted by email to the instructor within a week of the lecture. A submission up to one day late will incur a 20% penalty; any further delay without a substantial reason will incur a penalty at the instructor's discretion.
Lecture | Date | Topics | Notes |
1 | Tu Jan 19 | What is information, what to expect from this course, entropy | Lecture 1 |
2 | Th Jan 21 | Joint entropy, conditional entropy, relative entropy and mutual information | Lecture 2 |
3 | Tu Jan 26 | Properties of mutual information, data compression, uniquely decodable codes, Kraft's inequality, fundamental limit | Lecture 3 |
4 | Th Jan 28 | Instantaneous codes, Huffman tree, Shannon coding, Huffman code | Lecture 4 |
5 | Tu Feb 2 | Huffman code, Shannon-Fano-Elias code, Arithmetic coding, competitive optimality | Lecture 5 |
6 | Tu Feb 9 | Lempel-Ziv coding, basics of source coding, asymptotic equipartition property, typical set, large deviations | Lecture 6 |
7 | Th Feb 11 | Chernoff bound and KL-divergence, chain rules, Jensen's inequality, data-processing inequality | Lecture 7 |
8 | Th Feb 18 | Fano's inequality, multiple-hypothesis testing, binary hypothesis testing, Le Cam's identity, total variation distance, Pinsker's inequality | Lecture 8 |
9 | Tu Feb 23 | Pinsker's inequality, binary hypothesis testing, Neyman-Pearson test, optimality, Bayes' test | Lecture 9 |
10 | Th Feb 25 | Application of binary hypothesis testing, Gaussian noise, binary symmetric channel, codes; application of Pinsker's inequality: multi-armed bandit problems | Lecture 10 |
11 | Tu Mar 1 | Finding a biased coin, multi-armed bandit regret lower bound | Lecture 11 |
12 | Th Mar 3 | Multi-armed bandit intuitions, differential entropy, quantization, maximum entropy, parameter estimation | Lecture 12 |
13 | Tu Mar 8 | Secret sharing, parameter estimation, Fisher information, score function, Cramér-Rao bound | Lecture 13 |
Midterm | Th Mar 10 | Midterm | Midterm |
14 | Tu Mar 22 | One-time pad, the big picture, Shannon's communication model, discrete memoryless channel, BSC, codes, capacity, Shannon capacity converse | Lecture 14 |
15 | Th Mar 24 | Channel capacity, binary erasure channel, Gaussian channel and SNR, random codes achieve capacity | Lecture 15 |
16 | Tu Mar 29 | BSC and BEC, linear codes, a note on finite fields, parity check code and parity check matrix | Lecture 16 |
17 | Th Mar 31 | Midterm review, Hamming code, decoding, random linear codes achieve capacity | Lecture 17 |
18 | Tu Apr 5 | Linear codes, correcting errors and erasures, Singleton bound, iterative decoding on Tanner graph, stopping set, ML and bit-MAP decoding | Lecture 18 |
19 | Th Apr 7 | Iterative decoding, belief propagation, message-passing rules for LDPC codes, expander graphs | Lecture 19 |
20 | Tu Apr 12 | Expander graphs and codes, bit-flip decoding, network coding, max-flow min-cut, multicast | Lecture 20 |
21 | Th Apr 14 | Network coding for multicast: main theorem, linear codes achieve multicast capacity via the sparse-zeros lemma, introduction to compressed sensing, single-pixel camera | Lecture 21 |
22 | Tu Apr 19 | Compressible signal (Kolmogorov complexity) and compressed sensing, spark condition, Vandermonde matrix, RIP condition and sparse recovery | Lecture 22 |
23 | Th Apr 21 | Mixture of Gaussians, ML clustering, Newton-Raphson and soft K-means | Lecture 23 |
24 | Tu Apr 26 | Information theoretic clustering, review of the class | Best Wishes |
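As a small worked example tying together several rows of the schedule (channel capacity, the BSC, and the binary entropy function), here is the capacity of the binary symmetric channel; the crossover probability p = 0.11 is chosen purely for illustration:

\begin{align*}
h(p) &= -p \log_2 p - (1-p) \log_2 (1-p) && \text{binary entropy function}\\
C_{\mathrm{BSC}}(p) &= 1 - h(p)\\
C_{\mathrm{BSC}}(0.11) &\approx 1 - 0.500 = 0.500 \ \text{bits per channel use}
\end{align*}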
Solutions are posted on Moodle.