Machine Learning and Pattern Recognition

ECE4950

Instructor: Jayadev Acharya, 382 Rhodes Hall
Office hours: Mo 10-11, Rhodes 310

Course Staff:
Sourbh Bhadane, OH: We 4-5, 380 Rhodes Hall
Ziteng Sun, OH: Tu 3:30-4:30, 380 Rhodes Hall
Huanyu Zhang, OH: Th 4-5, 380 Rhodes Hall

Lectures: MWF 11:15-12:05, NEW ROOM: Phillips Hall 101
Discussion: Tu 9:05-9:55, Phillips Hall 101

Overview

The course is devoted to understanding how machine learning works. We will gain hands-on experience (yes, there is coding) as well as an understanding of the theoretical underpinnings of the methods.

Prerequisites: Linear Algebra (Math2940 or equivalent), Basic Probability and Statistics (ECE3100, STSCI3080, ECE3250 or equivalent). The discussion sessions will be devoted to reinforcing various concepts that we will use in the lectures but not cover there.

Assignments and Grading

Assignments: 25%
Almost weekly assignments.

Miniproject: 22.5%

Midterm: 20%

Final: 27.5%

Late submission policy: TBD

We will use Piazza for announcements, discussions, and for posting materials. We will use CMS for uploading assignments.

Materials

We will draw materials from various sources. Some books are:
  • Pattern Recognition and Machine Learning, Christopher Bishop
  • A Course in Machine Learning, Hal Daume III (available here)
  • Machine Learning: a Probabilistic Perspective, Kevin Murphy (available online at Cornell Library)
  • The Elements of Statistical Learning: Trevor Hastie, Robert Tibshirani, Jerome Friedman (available here)
  • Machine Learning, Tom Mitchell

Tentative Schedule

(some ordering subject to change)
Week   Topic                                          References
1      Introduction, Decision trees                   Mitchell Ch. 3; CIML Ch. 1
2      Naive Bayes                                    Mitchell 6.9-6.10; Murphy 3.5
3      Perceptron                                     Mitchell Ch. 4; CIML Ch. 4
4      Regression, regularization                     ESL 3.2, 3.2.2, 3.4, 3.4.2; notes-cuny
5, 6   SVM, Kernel trick                              Bishop Ch. 7.1; Andrew-Ng Notes
7      Nearest neighbor methods                       Mitchell Ch. 8
8      Bagging, boosting                              notes-breiman; tutorial-boosting
9      Neural Networks                                Mitchell Ch. 4
10     HMM, Viterbi algorithm                         Rabiner-Tutorial; Murphy Ch. 15.1-15.3
11     Dimensionality reduction: PCA, JL transforms   dasgupta-gupta-jl; poczos-notes; parr-notes
12     k-means, EM algorithm                          Bishop Ch. 9
13     Learning Theory                                Mitchell Ch. 7

Coding

We will use Python as the programming language. Familiarity with Python will be helpful, but not necessary. Please install Python and play around with it. We will use SciPy and scikit-learn: both come bundled with Anaconda, but if you installed Python some other way, you will need to install them manually (use pip).
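Once the packages are installed, a quick sanity check is to import them and fit a tiny model. The toy dataset below is purely illustrative (not course material), just a minimal sketch to confirm everything imports and runs:

```python
# Sanity check: the course packages import, and a tiny model trains.
# The toy dataset here is illustrative only.
import scipy
from sklearn.tree import DecisionTreeClassifier

# Two one-feature points with two labels.
X = [[0.0], [1.0]]
y = [0, 1]

clf = DecisionTreeClassifier().fit(X, y)
print(clf.predict([[0.0], [1.0]]))  # should recover the training labels
```

If the imports fail, the packages are not on the Python you are running; `pip install scipy scikit-learn` (or installing Anaconda) should fix it.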
Please also install Jupyter. It is an interactive (and awesome) way to do ML in Python. It is easiest to install through Anaconda, but if you installed Python the hard way, you should be able to install Jupyter too.