The course is intended for graduate students interested in the mathematical foundations of information theory and their applications to the study of data transmission, secure communication, and machine learning. Knowledge of probability theory and mathematical maturity are prerequisites.

**Lectures:** TuTh 2:55-4:10pm, 206 Upson Hall

**Office Hours:** We 9-11am, 322 Rhodes Hall

**Instructor:** Ziv Goldfeld, 322 Rhodes Hall

- Syllabus
- Instructions for writing lecture notes
- __Jan. 31st, 2020:__ Midterm - Will be held on Tuesday, Mar. 17th, between 7:30-10:00pm at 203 Phillips Hall.
- __Jan. 31st, 2020:__ Reminder - The lecture on Tuesday, Feb. 4th, is cancelled. The lecture of Thursday, Feb. 6th, is rescheduled to Friday, Feb. 7th, between 2:55-4:10pm at 203 Phillips Hall. This lecture will be recorded and made available online to class students.
- __Jan. 31st, 2020:__ 1st homework sheet is now posted (see section below).
- __Feb. 3rd, 2020:__ No office hours will be held on Wednesday, Feb. 5th. Instead, office hours will be held on Monday, Feb. 10th, between 1-3pm (as well as Wednesday, Feb. 12th, between 9-11am).
- __Feb. 11th, 2020:__ Typo fixed in Question 7 of homework sheet 1.
- __Feb. 21st, 2020:__ 2nd homework sheet is now posted (see section below).
- __Mar. 2nd, 2020:__ 2nd homework submission deadline extended from Mar. 3rd to the 5th.
- __Mar. 9th, 2020:__ 3rd homework sheet is now posted (see section below). The exercises in this sheet are at prelim level - use them for practice. A reference for additional exercises is given at the top of the homework sheet.
- __Mar. 9th, 2020:__ Upcoming prelim - Tuesday, Mar. 17th, 7:30-10:00pm, 203 Phillips Hall. **[Cancelled]**
- __Mar. 25th, 2020:__ Updates and guidelines regarding the remainder of the semester and virtual learning.
- __Mar. 25th, 2020:__ Instructions for online submission of homework assignments.
- __Mar. 30th, 2020:__ Office hours will be held on Wednesday, April 1st, between 9-10am via Zoom. To attend, click this link.
- __Apr. 6th, 2020:__ Recording of Lecture 13 is available on Canvas.
- __Apr. 6th, 2020:__ Invitations for Zoom lecture (discussion+question) sessions for the remainder of the semester are available on Canvas.

- Homework sheet 1 (submission due Feb. 13th at 2:55pm in class) **[2nd version - Typo fixed in Q7]** [HW1 solutions]
- Homework sheet 2 (submission extended until Mar. 5th at 2:55pm in class) [HW2 solutions]
- Homework sheet 3 (submission due Apr. 13th via Canvas) **[Exercises in the homework sheet are at prelim level]**

- Lectures 1 and 2 (probability spaces, and the Borel σ-algebra)
- Lecture 3 (random variables, CDFs, PMFs and PDFs)
- Lecture 4 (expectation, variance, LLN, transition kernels, and conditional expectation)
- Lecture 5 (intro to f-divergences, and primer on convexity)
- Lecture 6 (absolute continuity, Radon-Nikodym Theorem, f-divergence definition, examples and properties)
- Lecture 7 (data processing inequality)
- Lecture 8 (convex conjugates, f-divergence dual form, and generative modeling)
- Lecture 9 (Shannon entropy, differential entropy, and mutual information) **[In preparation]**
- Lecture 10 (mutual information vs entropy, conditional mutual information, and data processing inequality) **[In preparation]**
- Lecture 11 (introduction to letter typical sequences) **[In preparation]**
- Lecture 12 (letter typical set: definition and properties)

Information theory studies the quantification, compression, communication, and encryption of information. Through elegant mathematical formulations of operational problems coupled with powerful techniques, information theory characterizes fundamental performance limits and provides deep engineering insight into the design of the underlying systems. The course will cover both classical and modern topics, starting from f-divergences, information measures, and relations between them. Using these tools, we will explore optimal data transmission rates over noisy channels. We will then cover distribution simulation and leverage it to study information-theoretically secure communication in the presence of a (passive or active) eavesdropper. Finally, we will explore connections between information theory and machine learning, examining how these fields can cross-fertilize each other.
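As a taste of the first topic, the following is the standard definition of an f-divergence (the notation here is ours, not taken from the course notes); the Kullback-Leibler divergence arises as a special case:

```latex
% f-divergence between distributions P and Q (with P absolutely continuous
% w.r.t. Q), for a convex function f with f(1) = 0:
D_f(P\|Q) \triangleq \mathbb{E}_Q\!\left[f\!\left(\frac{dP}{dQ}\right)\right].
% The choice f(t) = t \log t recovers the Kullback-Leibler divergence:
D_{\mathrm{KL}}(P\|Q) = \mathbb{E}_Q\!\left[\frac{dP}{dQ}\log\frac{dP}{dQ}\right]
                      = \mathbb{E}_P\!\left[\log\frac{dP}{dQ}\right].
```

Other choices of $f$ yield total variation, chi-squared, and squared Hellinger distances, which is why f-divergences serve as a unifying starting point.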

Two 1:15hr lectures per week, with a midterm and a final exam.

- Information measures (f-divergences, entropy, mutual information) and their properties
- Typical sets and the "asymptotic equipartition property"
- Channel coding theorem: fundamental limit of reliable data transmission over noisy channels
- Distribution simulation and channel resolvability
- Physical-layer security: reliable and (information-theoretically) secure communication
- Classic (passive eavesdropper) wiretap channel and active "wiretap channel of type II"
- Estimating mutual information using neural networks
- Learning wiretap codes using neural networks
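As a quick illustration of the first topic above (the example, joint PMF, and function names are ours, not course material), mutual information for a discrete pair can be computed from entropies via the identity I(X;Y) = H(X) + H(Y) - H(X,Y); here X is a uniform bit sent through a binary symmetric channel with crossover probability 0.1:

```python
# Hypothetical illustration: Shannon entropy and mutual information
# for a small joint PMF (uniform input over a BSC(0.1)).
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a PMF given as a flat array."""
    p = np.asarray(p).ravel()
    p = p[p > 0]  # use the convention 0 * log 0 = 0
    return float(-np.sum(p * np.log2(p)))

# Joint PMF P_{XY}(x, y): rows index X, columns index Y.
P_xy = np.array([[0.45, 0.05],
                 [0.05, 0.45]])

P_x = P_xy.sum(axis=1)  # marginal of X
P_y = P_xy.sum(axis=0)  # marginal of Y

# I(X;Y) = H(X) + H(Y) - H(X,Y)
I_xy = entropy(P_x) + entropy(P_y) - entropy(P_xy)
print(round(I_xy, 4))  # ~0.531 bits, i.e. 1 - h_b(0.1)
```

The result matches the BSC capacity formula 1 - h_b(0.1), where h_b is the binary entropy function, consistent with the channel coding theorem listed above.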