ECE4950
Machine Learning and Pattern Recognition

Instructor: Jayadev Acharya, 304 Rhodes Hall
Office hours: MoTh 1500-1600 hours, Location: 310 Rhodes Hall

Teaching Assistant: Nirmal Vijay Shende
Office hours: TBA

Lectures: MWF 1115-1205 hours, Snee Hall Geological Sci 1150
Discussion: Tu 905-955 hours, Phillips Hall 407
The class has reached its enrollment capacity. Please add your name to the waitlist if interested.

Overview

The course is devoted to understanding how machine learning works. We will gain hands-on experience (yes, there is coding) as well as an understanding of the theoretical underpinnings of the methods. [flyer]

Prerequisites: Linear Algebra (Math2940 or equivalent), Basic Probability and Statistics (ECE3100, STSCI3080, ECE3250 or equivalent). The discussion sessions will be devoted to reinforcing concepts that we will use, but not cover, in the lectures.

Grading

Assignments: 50%
6-7 assignments, each due two weeks after it is posted.

Miniproject: 25%
Kaggle competition.

Examination: 25%

The final grade will be computed in two ways: one based on the percentages above, and another that we will decide at the end of the course. Your final grade is the higher of the two. Attendance is mandatory.

We will use Piazza for announcements, discussions, and posting materials. We will use CMS for submitting assignments.

Materials

We will draw material from various sources. Some books we will use are:
  • Pattern Recognition and Machine Learning, Christopher Bishop
  • A Course in Machine Learning, Hal Daume III (available here)
  • Machine Learning: a Probabilistic Perspective, Kevin Murphy (available online at Cornell Library)
  • The Elements of Statistical Learning, Trevor Hastie, Robert Tibshirani, Jerome Friedman (available here)
  • Machine Learning, Tom Mitchell
  • Introduction to Machine Learning, Nils Nilsson (available here)

Tentative Schedule

(some ordering subject to change)
Week     Topic                                  References
1        Introduction, decision trees
2        Naive Bayes
3        Regression, regularization
4, 5, 6  Perceptron, logistic regression, SVM
7        Nearest neighbor methods
8        Bagging, boosting, random forests
9        HMM, Viterbi algorithm
10       Bayesian networks, graphical models
11       Dimensionality reduction
12       k-means, EM algorithm
13       Neural networks, deep learning

Coding

We will use Python as the programming language. Familiarity with Python will be helpful, but not necessary. Please install Python and play around. How to install: we will use SciPy and scikit-learn, which come bundled with Anaconda but require manual installation (use pip) if you choose the second option.
Please install Jupyter. It is an interactive (and awesome) way to do ML using Python. It is easiest to install with Anaconda, but if you installed Python the hard way, you should be able to install Jupyter too.
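
Once everything is installed, you can verify your setup with a short script like the sketch below. The iris dataset and the decision tree classifier are purely illustrative choices (decision trees happen to be the first topic on the schedule); any small scikit-learn example would do. Run it from the command line or inside a Jupyter notebook.

    # Sanity check: confirm that NumPy, SciPy, and scikit-learn are importable.
    import numpy
    import scipy
    import sklearn
    print("numpy", numpy.__version__)
    print("scipy", scipy.__version__)
    print("scikit-learn", sklearn.__version__)

    # Fit a small decision tree on scikit-learn's built-in iris dataset
    # (illustrative only; not a course assignment).
    from sklearn.datasets import load_iris
    from sklearn.tree import DecisionTreeClassifier
    X, y = load_iris(return_X_y=True)
    clf = DecisionTreeClassifier(max_depth=2).fit(X, y)
    print("training accuracy:", clf.score(X, y))

If the imports fail under a plain-Python install, running pip install scipy scikit-learn should fix them.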