Machine Learning and Pattern Recognition

Instructor: Jayadev Acharya, 304 Rhodes Hall
Office hours: MoTh 1500-1600 hours, Location: 310 Rhodes Hall (Starting 1/30/17)

Teaching Assistant: Nirmal Vijay Shende
Office hours: Wed 1600-1700 hours, Location: 310 Rhodes Hall

Lectures: MWF 1115-1205 hours, NEW ROOM: Phillips Hall 101
Discussion: Tu 905-955 hours, Phillips Hall 101
The class has reached its enrollment capacity. Please add your name to the waitlist if interested.


The course is devoted to understanding how machine learning works. We will gain hands-on experience (yes, there is coding) as well as an understanding of the theoretical underpinnings of the methods. [flyer]

Prerequisites: Linear Algebra (Math2940 or equivalent), Basic Probability and Statistics (ECE3100, STSCI3080, ECE3250 or equivalent). The discussion sessions will be devoted to reinforcing concepts that we will use, but not cover, in the lectures.


  1. Assignment 1 on CMS. Due 2/27
  2. Assignment 2 on CMS. Due 3/5
  3. Assignment 3 on CMS. Due 3/24.
  4. Assignment 4,5 on CMS, Due 4/28
  5. Kaggle Competition is now up, Due 5/10


Assignments: 50%
6-7 assignments. Due 2 weeks after upload.

Miniproject: 25%
Kaggle competition.

Examination: 25%

The final grades will be computed in two ways: one based on the percentages above, and another that we will decide at the end of the course. The final grade is the higher of the two. Attendance is mandatory.

We will use Piazza for announcements, discussions, and for posting materials. We will use CMS for uploading assignments.


We will take materials from various sources. Some books are:
  • Pattern Recognition and Machine Learning, Christopher Bishop
  • A Course in Machine Learning, Hal Daume III (available here)
  • Machine Learning: a Probabilistic Perspective, Kevin Murphy (available online at Cornell Library)
  • The Elements of Statistical Learning: Trevor Hastie, Robert Tibshirani, Jerome Friedman (available here)
  • Machine Learning, Tom Mitchell
  • Introduction to Machine Learning, Nils Nilsson (available here)

Tentative Schedule

(some ordering subject to change)
Week  | Topic                                        | References
1     | Introduction, Decision trees                 | Mitchell Ch. 3, CIML Ch. 1
2     | Naive Bayes                                  | Mitchell 6.9-6.10; Murphy 3.5
3     | Perceptron                                   | Mitchell Ch. 4, CIML Ch. 4
4     | Regression, regularization                   | ESL 3.2, 3.2.2, 3.4, 3.4.2, notes-cuny
5, 6  | SVM, Kernel trick                            | Bishop Ch. 7.1, Andrew-Ng Notes
7     | Nearest neighbor methods                     | Mitchell Ch. 8
8     | Bagging, boosting                            | notes-breiman, tutorial-boosting
9     | Neural Networks                              | Mitchell Ch. 4
10    | HMM, Viterbi algorithm                       | Rabiner-Tutorial, Murphy Ch. 15.1-15.3
11    | Dimensionality reduction: PCA, JL transforms | dasgupta-gupta-jl, poczos-notes, parr-notes
12    | k-means, EM algorithm                        | Bishop Ch. 9
13    | Learning Theory                              | Mitchell Ch. 7


We will use Python as the programming language. Familiarity with Python will be helpful, but not necessary. Please install Python and play around. How to install: we will use SciPy and scikit-learn, which come bundled with Anaconda; if you instead install Python directly, install them manually (use pip).
Please install Jupyter. It is an interactive (and awesome) way to do ML in Python. It is easiest to install with Anaconda, but if you installed Python the hard way, you should be able to install Jupyter too.
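Once everything is installed, a quick sanity check like the following (an illustrative snippet, not a course assignment; the toy AND-gate data is made up for this example) confirms that NumPy, SciPy, and scikit-learn all import and work together:

```python
# Verify that the scientific Python stack is installed and functional.
import numpy
import scipy
import sklearn
from sklearn.tree import DecisionTreeClassifier

print(numpy.__version__, scipy.__version__, sklearn.__version__)

# Fit a tiny decision tree on a toy AND-gate dataset
# (illustrative data, not from the course).
X = [[0, 0], [0, 1], [1, 0], [1, 1]]
y = [0, 0, 0, 1]
clf = DecisionTreeClassifier().fit(X, y)
print(clf.predict([[1, 1]]))
```

If the imports fail under a plain Python install, `pip install numpy scipy scikit-learn` should fix them; under Anaconda they are typically already present.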