ECE 5630 - Information Theory for Data Transmission, Security, and Machine Learning
For whom?
The course is intended for graduate students interested in the mathematical foundations of information theory and their applications to the study of data transmission, secure communication, and machine learning. Knowledge of probability theory and mathematical maturity are prerequisites.
Time and Location:
Lectures: TuTh 2:55-4:10pm, 206 Upson Hall
Office Hours: We 9-11am, 322 Rhodes Hall
Instructor: Ziv Goldfeld, 322 Rhodes Hall
News:
- Syllabus
- Instructions for writing lecture notes
- Jan. 31st, 2020: Midterm - Will be held on Tuesday, Mar. 17th, between 7:30-10:00pm at 203 Phillips Hall.
- Jan. 31st, 2020: Reminder - The lecture on Tuesday, Feb. 4th, is cancelled. The lecture of Thursday, Feb. 6th, is rescheduled to Friday, Feb. 7th, between 2:55-4:10pm at 203 Phillips Hall. This lecture will be recorded and made available online to class students.
- Jan. 31st, 2020: 1st homework sheet is now posted (see section below)
- Feb. 3rd, 2020: No office hours will be held on Wednesday, Feb. 5th. Instead, office hours will be held on Monday, Feb. 10th, between 1-3pm (as well as Wednesday, Feb. 12th, between 9-11am).
- Feb. 11th, 2020: Typo fixed in Question 7 of homework sheet 1
- Feb. 21st, 2020: 2nd homework sheet is now posted (see section below)
- Mar. 2nd, 2020: 2nd homework submission deadline extended from Mar. 3rd to the 5th.
- Mar. 9th, 2020: 3rd homework sheet is now posted (see section below). The exercises in this sheet are at prelim level - use them for practice. A reference for additional exercises is given at the top of the homework sheet.
- Mar. 9th, 2020: Upcoming prelim - Tuesday, Mar. 17th, 7:30-10:00pm, 203 Phillips Hall. [Cancelled]
- Mar. 25th, 2020: Updates and guidelines regarding remainder of the semester and virtual learning.
- Mar. 25th, 2020: Instructions for online submission of homework assignments.
- Mar. 30th, 2020: Office hours will be held on Wednesday, April 1st, between 9-10am via Zoom. To attend, click this link.
- Apr. 6th, 2020: Recording of Lecture 13 is available on Canvas.
- Apr. 6th, 2020: Invitations for Zoom lecture (discussion+question) sessions for the remainder of the semester are available on Canvas.
- Apr. 9th, 2020: Recording of Lecture 14 is available on Canvas.
- Apr. 13th, 2020: Recording of Lecture 15 is available on Canvas.
- Apr. 15th, 2020: Recording of Lecture 16 is available on Canvas.
- Apr. 15th, 2020: 4th homework sheet is now posted (see section below)
- Apr. 20th, 2020: Recording of Lecture 17 is available on Canvas.
Homework Sheets:
Lecture notes:
- Lectures 1 and 2 (probability spaces, and the Borel σ-algebra)
- Lecture 3 (random variables, CDFs, PMFs and PDFs)
- Lecture 4 (expectation, variance, LLN, transition kernels, and conditional expectation)
- Lecture 5 (intro to f-divergences, and a primer on convexity)
- Lecture 6 (absolute continuity, Radon-Nikodym Theorem, f-divergence definition, examples and properties)
- Lecture 7 (data processing inequality)
- Lecture 8 (convex conjugates, f-divergence dual form, and generative modeling)
- Lecture 9 (Shannon entropy, differential entropy, and mutual information)
- Lecture 10 (mutual information vs entropy, conditional mutual information, and the data processing inequality)
- Lecture 11 (introduction to letter typical sequences)
- Lecture 12 (letter typical set: definition and properties)
Overview:
Information theory studies the quantification, compression, communication, and encryption of information. Through elegant mathematical formulations of operational problems coupled with powerful analytical techniques, information theory characterizes fundamental performance limits and provides deep engineering insight into the design of the underlying systems. The course will cover both classical and modern topics, starting from f-divergences, information measures, and the relations between them. Using these tools, we will explore optimal data transmission rates over noisy channels. We will then cover distribution simulation and leverage it to study information-theoretically secure communication in the presence of a (passive or active) eavesdropper. Finally, we will explore connections between information theory and machine learning, examining how these fields can cross-fertilize each other.
Format:
Two 75-minute lectures per week, with a midterm and a final exam.
Tentative List of Topics:
- Information measures (f-divergences, entropy, mutual information) and their properties
- Typical sets and the 'asymptotic equipartition property'
- Channel coding theorem: fundamental limit of reliable data transmission over noisy channels
- Distribution simulation and channel resolvability
- Physical-layer security: reliable and (information-theoretically) secure communication
- Classic (passive eavesdropper) wiretap channel and active 'wiretap channel of type II'
- Estimating mutual information using neural networks
- Learning wiretap codes using neural networks
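To give a concrete feel for the information measures listed above, here is a minimal sketch (function names and the nested-list joint-PMF representation are illustrative choices, not part of the course material) computing Shannon entropy, KL divergence, and mutual information for discrete distributions, with mutual information obtained as the KL divergence between the joint PMF and the product of its marginals:

```python
import math

def entropy(p):
    """Shannon entropy H(p) in bits of a discrete PMF given as a list."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """KL divergence D(p||q) in bits; assumes support of p is contained in that of q."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def mutual_information(joint):
    """I(X;Y) = D(P_XY || P_X x P_Y) in bits, for a joint PMF joint[x][y]."""
    px = [sum(row) for row in joint]          # marginal of X
    py = [sum(col) for col in zip(*joint)]    # marginal of Y
    return sum(
        joint[x][y] * math.log2(joint[x][y] / (px[x] * py[y]))
        for x in range(len(px)) for y in range(len(py))
        if joint[x][y] > 0
    )

print(entropy([0.5, 0.5]))                               # a fair bit: 1.0
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # independent X, Y: 0.0
```

The identity I(X;Y) = D(P_XY || P_X × P_Y) used here is the same one relating mutual information to f-divergences in Lectures 9 and 10.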