Algorithmic Introduction to Coding Theory
Lecture 1 (9/5): Introduction. Shannon's theorem. Information, Entropy. References.
Lecture 2 (9/10): Converse of Shannon's noisy coding theorem. Discussion on Shannon capacity. Hamming's theory. Error-correcting codes. Linear codes.
Lecture 2.5: Notes on algebra.
Lecture 3 (9/12): Hamming codes. Hamming bound. Perfect codes. Finite fields. Polynomials over finite fields.
Lecture 4 (9/19): Singleton bound. Reed Solomon codes, MDS codes. Reed Muller codes, Hadamard codes.
Lecture 4.5: BCH codes (REVISED 1/8/2002).
Lecture 5 (9/24): Asymptotically good codes over small alphabets. Random codes, random linear codes, Wozencraft ensemble.
Lecture 6 (9/26): Wozencraft ensemble (contd.). Operations on codes. Concatenated codes. Forney codes. Justesen codes.
Lecture 7 (10/1): Algebraic geometry codes: codes better than random codes.
Lecture 8 (10/3): Impossibility results for codes: Plotkin bound, Johnson bound, Elias-Bassalygo bound.
Lecture 9 (10/10): Bounds (contd.): MacWilliams identities. Linear programming bound. The asymptotic perspective.
Lecture 10 (10/22): Algorithmic questions: encoding, decoding, decoding from erasures, decoding RS codes.
Some additional material:
- My notes on rational function interpolation
- Michael Rosenblum's notes on rational approximation.
Lecture 11 (10/24): Abstract algorithm for decoding. Application: AG codes. Concatenated codes & GMD decoding.