ECE 6970 - Statistical Distances for Modern Machine Learning

For whom?

The course is intended for graduate students interested in the mathematical aspects of some modern machine learning techniques. Knowledge of probability theory and mathematical maturity are prerequisites. Familiarity with functional analysis is helpful.


Time and Location:

Lectures: TuTh 2:55-4:10pm, 202 Upson Hall

Office Hours: We 9-11am, 322 Rhodes Hall

Instructor: Ziv Goldfeld, 322 Rhodes Hall




Homework Sheets:



Statistical distances such as optimal transport (particularly, the Wasserstein metric), total variation, Kullback-Leibler (KL) divergence, Chi-squared divergence, and others, are used to design and analyze a variety of machine learning systems. Applications include anomaly and outlier detection, ordinal regression, generative adversarial networks (GANs), and many more. This course will establish the mathematical foundations of these important measures, explore their statistical properties (e.g., convergence rates of empirical measures), and focus on GANs and, more generally, on deep neural networks (DNNs) as applications (design and analysis).
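As a concrete illustration of two of the distances named above, the following minimal Python sketch computes the KL divergence and the total variation distance between two discrete distributions (the specific distributions are made up for illustration):

```python
import math

def kl_divergence(p, q):
    """KL divergence D(p||q) in nats for discrete distributions.

    Assumes the support of p is contained in the support of q
    (otherwise D(p||q) is infinite).
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def total_variation(p, q):
    """Total variation distance: half the L1 distance, always in [0, 1]."""
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

# Illustrative example: two distributions on a two-point alphabet.
p, q = [0.5, 0.5], [0.9, 0.1]
print(kl_divergence(p, q))   # = log(5/3), about 0.51 nats
print(total_variation(p, q))
```

Note the asymmetry: D(p||q) generally differs from D(q||p), while total variation is a true metric and hence symmetric.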



The format is based on paper reading and presentation assignments performed by the students. Each student will present a work of their choice from a prescribed list. The course instructor will deliver the first 3-4 lectures, as well as a few lectures throughout the semester. The final project will consist of a scientific assignment based on another chosen article. Choices for project assignments include extending existing results, implementation tasks, a critical summary of a paper, etc. The last 4 lectures will be dedicated to presentations of final project synopses.


List of Tentative Topics: